- 1. Using Apache Hive
- Hive Documentation
- New Feature: Temporary Tables
- New Feature: Cost-based SQL Optimization
- New Feature: ORC Format Improvement
- Streaming Data Ingestion
- Query Vectorization
- Comparing Beeline to the Hive CLI
- Hive JDBC and ODBC Drivers
- Configuring HiveServer2 for LDAP and for LDAP over SSL
- Troubleshooting Hive
- Hive JIRAs
- 2. SQL Compliance
- New Feature: INSERT ... VALUES, UPDATE, and DELETE SQL Statements
- Hive 0.13 Feature: SQL Standard-based Authorization with GRANT and REVOKE SQL Statements
- Hive 0.13 Feature: Transactions
- Hive 0.13 Feature: Subqueries in WHERE Clauses
- Hive 0.13 Feature: Common Table Expressions
- Hive 0.13 Feature: Quoted Identifiers in Column Names
- Hive 0.13 Feature: CHAR Data Type Support
- 3. Running Pig with the Tez Execution Engine
- 4. Using HDP for Metadata Services (HCatalog)
- 5. Using Apache HBase
- 6. Using HDP for Workflow and Scheduling (Oozie)
- 7. Using Apache Sqoop
- Apache Sqoop Connectors
- Sqoop Import Table Commands
- Netezza Connector
- Sqoop-HCatalog Integration
- Controlling Transaction Isolation
- Automatic Table Creation
- Delimited Text Formats and Field and Line Delimiter Characters
- HCatalog Table Requirements
- Support for Partitioning
- Schema Mapping
- Support for HCatalog Data Types
- Providing Hive and HCatalog Libraries for the Sqoop Job
- Examples