Data Governance Overview

Apache Atlas features

Apache Atlas is a low-level service in the Hadoop stack that provides core metadata services.

Atlas currently provides metadata services for the following components:

  • Hive

  • Ranger

  • Sqoop

  • Storm/Kafka (limited support)

  • Falcon (limited support)

Apache Atlas provides the following features:
  • Knowledge store that leverages existing Hadoop metastores: Metadata is categorized into a business-oriented taxonomy of data sets, objects, tables, and columns, and can be exchanged between HDP foundation components and third-party applications or governance tools.

  • Data lifecycle management: Leverages existing investment in Apache Falcon with a focus on provenance, multi-cluster replication, data set retention and eviction, late data handling, and automation.

  • Audit store: Historical repository for all governance events, including security events (access, grant, deny) and operational events related to data provenance and metrics. The Atlas audit store is indexed and searchable, providing ready access to governance events.

  • Security: Integration with HDP security that enables you to establish global security policies based on data classifications, leveraging the Apache Ranger plug-in architecture for security policy enforcement.

  • Policy engine: Fully extensible policy engine that supports metadata-based, geo-based, and time-based rules that are evaluated at runtime.

  • RESTful interface: Supports extensibility through REST APIs exposed to third-party applications, so you can use your existing tools to view and manipulate metadata in the HDP foundation components (a minimal request sketch follows this list).
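The REST interface can be exercised with any HTTP client. The following is a minimal sketch only: the host name, port, credentials, and the v2 basic-search endpoint path are assumptions that vary by Atlas version and deployment, and are not taken from this document. It queries the metadata store for registered Hive table entities.

```python
import requests

# Assumptions (not from this document): Atlas server location, basic-auth
# credentials, and the v2 basic-search endpoint. Adjust for your cluster.
ATLAS_BASE = "http://atlas-host.example.com:21000/api/atlas"
AUTH = ("admin", "admin")  # placeholder credentials

# Search the Atlas knowledge store for Hive table entities.
resp = requests.get(
    f"{ATLAS_BASE}/v2/search/basic",
    params={"typeName": "hive_table", "limit": 10},
    auth=AUTH,
)
resp.raise_for_status()

# Print the display name and GUID of each returned entity.
for entity in resp.json().get("entities", []):
    print(entity.get("displayText"), entity.get("guid"))
```

The same pattern applies to the other Atlas REST resources (types, lineage, entity details): authenticate, issue an HTTP request against the Atlas server, and parse the JSON response.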

Figure 1. Atlas Overview