Data Services
Contents
1. Using Apache Hive
Hive Documentation
Features Overview
Temporary Tables
Cost-Based SQL Optimization
Optimized Row Columnar (ORC) Format
Streaming Data Ingestion
Query Vectorization
Comparing Beeline to the Hive CLI
Moving Data into Hive
Moving Data from HDFS to Hive Using an External Table
Using Sqoop to Move Data into Hive
Incrementally Updating a Hive Table Using Sqoop and an External Table
Hive JDBC and ODBC Drivers
Configuring HiveServer2 for Transactions (ACID Support)
Configuring HiveServer2 for LDAP and for LDAP over SSL
Troubleshooting Hive
Hive JIRAs
2. SQL Compliance
INSERT ... VALUES, UPDATE, and DELETE SQL Statements
SQL Standard-based Authorization with GRANT and REVOKE SQL Statements
Transactions
Subqueries
Common Table Expressions
Quoted Identifiers in Column Names
CHAR Data Type Support
3. Running Pig with the Tez Execution Engine
4. Using HDP for Metadata Services (HCatalog)
Using HCatalog
Using WebHCat
Security for WebHCat
5. Using Apache HBase and Apache Phoenix
Cell-level Access Control Lists (ACLs)
Column Family Encryption
Tuning Region Server
Using Phoenix with HBase
6. Using HDP for Workflow and Scheduling (Oozie)
7. Using Apache Sqoop
Apache Sqoop Connectors
Sqoop Import Table Commands
Netezza Connector
Sqoop-HCatalog Integration
Controlling Transaction Isolation
Automatic Table Creation
Delimited Text Formats and Field and Line Delimiter Characters
HCatalog Table Requirements
Support for Partitioning
Schema Mapping
Support for HCatalog Data Types
Providing Hive and HCatalog Libraries for the Sqoop Job
Examples
Hive JIRAs
Issue tracking for Hive bugs and improvements can be found on the Apache Hive site.
© 2012–2021 by Cloudera, Inc. Document licensed under the Creative Commons Attribution ShareAlike 4.0 License.