Accessing data using Apache Druid

Securing Apache Druid using Kerberos

It is important to place the Apache Druid (incubating) endpoints behind a firewall and configure authentication.

You can configure Druid nodes to integrate with a Kerberos-secured Hadoop cluster. Such an integration provides authentication between Druid and other HDP services. If Kerberos is enabled on your cluster, you must set up LLAP before you can query Druid data sources that are imported from Hive.

In Ambari 2.5.0 and later, you can secure the Druid HTTP endpoints by enabling authentication through a SPNEGO-based Druid extension. After enabling authentication in this way, you can connect the Druid Web UIs to the core Hadoop cluster while maintaining Kerberos protection. The main Web UIs to use with HDP are the Coordinator Console and the Overlord Console.
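When you enable Kerberos for Druid, Ambari writes the SPNEGO authenticator settings into the Druid runtime properties. The following is a minimal sketch of what such a configuration looks like, using the Apache Druid `druid-kerberos` extension; the realm, keytab path, and authenticator name shown here are placeholders, and the exact values on your cluster are generated by Ambari.

```
# Load the Kerberos (SPNEGO) security extension
druid.extensions.loadList=["druid-kerberos"]

# Register a Kerberos authenticator in the authentication chain
druid.auth.authenticatorChain=["kerberos"]
druid.auth.authenticator.kerberos.type=kerberos

# SPNEGO service principal and keytab (placeholder realm and path)
druid.auth.authenticator.kerberos.serverPrincipal=HTTP/_HOST@EXAMPLE.COM
druid.auth.authenticator.kerberos.serverKeytab=/etc/security/keytabs/spnego.service.keytab
```

With this configuration in place, browsers and clients accessing the Coordinator and Overlord consoles must present a valid Kerberos ticket (for example, obtained with `kinit`) to authenticate via SPNEGO.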

To use Hive and Druid together under Kerberos, you must run the Hive Interactive Server, which runs Hive in low-latency analytical processing (LLAP) mode.
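As an illustration of querying Druid-backed tables through the Hive Interactive Server on a Kerberized cluster, a client first obtains a ticket and then connects with a JDBC URL that names the Hive service principal. The host name, port, and realm below are placeholders; the Hive Interactive Server on HDP typically listens on a different port than the regular HiveServer2, so check your cluster's configuration.

```
# Obtain a Kerberos ticket for your user (placeholder realm)
kinit user@EXAMPLE.COM

# Connect to the Hive Interactive Server with Beeline,
# passing the Hive service principal in the JDBC URL
beeline -u "jdbc:hive2://hsi-host.example.com:10500/default;principal=hive/_HOST@EXAMPLE.COM"
```

Once connected, you can query Druid data sources that were imported from Hive as ordinary Hive tables.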