Known Issues in Data Analytics Studio

Learn about the known issues in Data Analytics Studio (DAS), the impact or changes to the functionality, and the workaround.

  • CDPD-49281: DAS WebApp logs are not captured in the /var/log/das/ directory as expected.

    Workaround: To obtain the DAS WebApp logs, check the stderr.log file in the runtime process directory for the DAS WebApp.
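    For example, on a host managed by Cloudera Manager, the runtime process directories are typically located under /var/run/cloudera-scm-agent/process. The path and role directory name below are shown only as an illustration and may differ in your deployment:
      ls -dt /var/run/cloudera-scm-agent/process/*DAS_WEBAPP*
      less /var/run/cloudera-scm-agent/process/[***PROCESS-DIRECTORY***]/logs/stderr.log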

  • CDPD-40579: DAS does not display database or table information during a repl dump on a High Availability cluster.

    DAS may display "No Tables Available" on the Database page, or "No tables found" when you run a "show tables" query, and you may see the following error in the das-event-processor logs: Error while compiling statement: FAILED: Execution Error, return code 40000 from org.apache.hadoop.hive.ql.exec.repl.ReplDumpTask. Operation category READ is not supported in state standby. This happens when the active NameNode transitions to the standby state while DAS is processing the repl dump operation.

    Workaround: Clear the db_replication_info table as follows:
    1. Stop the DAS service from Cloudera Manager.
    2. SSH into the DAS database host, connect to the DAS database, and run the following SQL statement:
      delete from db_replication_info;
    3. Start the DAS service from Cloudera Manager.
    If you see the "Notification events are missing in the meta store" error after starting DAS, reset the PostgreSQL database by running the following command (an example with sample values follows this list):
    curl -H 'X-Requested-By: das' -H 'Cookie: JSESSIONID=[***SESSION-ID-COOKIE***]' http(s)://[***HOSTNAME***]:[***PORT***]/api/replicationDump/reset
    Where:
    • [***SESSION-ID-COOKIE***] is the JSESSIONID cookie value for an admin user, which you can obtain from your browser after logging in to the DAS UI
    • [***HOSTNAME***] is the DAS WebApp hostname
    • [***PORT***] is the DAS WebApp port
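    For example, on a PostgreSQL-backed deployment, the commands used in this workaround might look like the following. The database name, hostname, port, and cookie value shown here are hypothetical; substitute your own values:
      # hypothetical values: DAS database "das", DAS WebApp at das-host.example.com:30800
      sudo -u postgres psql -d das -c "delete from db_replication_info;"
      curl -H 'X-Requested-By: das' -H 'Cookie: JSESSIONID=node01abcdef1234567890' http://das-host.example.com:30800/api/replicationDump/reset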
  • You may not be able to add or delete columns or change the table schema after creating a new table using the upload table feature.
  • For clusters secured using Knox, you see the "HTTP 401: Forbidden" error message when you click the DAS quick link from Cloudera Manager, and you are unable to log in to DAS.

    Workaround: The admin user must provide the DAS URL from the Knox proxy topology to users who need access to DAS. A typical form of this URL is shown below.
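    As a rough illustration, a Knox-proxied DAS URL usually follows the general Knox gateway pattern shown below; the host, gateway port, topology name, and service path are placeholders that depend on your Knox topology configuration:
      https://[***KNOX-GATEWAY-HOST***]:[***KNOX-GATEWAY-PORT***]/gateway/[***TOPOLOGY-NAME***]/das/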

  • The download logs feature may not return the YARN application logs on a Kerberized cluster. When you download the logs, the downloaded bundle contains an error-reports.json file stating that no valid Kerberos tokens are available.

    Workaround: An admin user with access to the machine can run the kinit command as the hive user by using the hive service user keytab, and then trigger the download, as shown in the example below.
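    The keytab location and Kerberos principal in the following example are typical of a Cloudera Manager managed host and are shown only as an illustration; adjust them for your environment:
      # obtain a Kerberos ticket as the hive service user using its keytab, then verify the ticket
      kinit -kt /var/run/cloudera-scm-agent/process/[***NNN-hive-HIVESERVER2***]/hive.keytab hive/$(hostname -f)@[***REALM***]
      klist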

  • The task logs for a particular task may not be available in the task swimlane. In addition, the ZIP file generated by the download logs feature may not contain the task logs; instead, it may contain an error-reports.json file with the error log of the download failures.
  • You may not see any report data for new queries that you run. This is especially likely for the report covering the last one day.

    Workaround:
    1. Shut down the DAS Event Processor.
    2. Run the following command from the PostgreSQL server (an example is shown after these steps):
      update das.report_scheduler_run_audit set status = 'FAILED' where status = 'READING';
    3. Start the DAS Event Processor.
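    For example, on a PostgreSQL-backed deployment, you can run the update in step 2 through psql. The database name "das" shown here is hypothetical; substitute the name of your DAS database:
      sudo -u postgres psql -d das -c "update das.report_scheduler_run_audit set status = 'FAILED' where status = 'READING';"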
  • On clusters secured with Knox proxy only: you might not be able to save changes to the JDBC URL in the DAS UI when you change the server interface (HS2 or LLAP) on which you run your queries.
  • On a cluster secured using Knox proxy, you may be unable to upload tables, or you may get an error while browsing files to upload tables in DAS.
  • DAS does not parse semicolons (;) and double hyphens (--) in strings and comments.

    For example, if you have a semicolon in a query, such as the following, the query might fail: select * from properties where prop_value = "name1;name2";

    If a semicolon is present in a comment, run the query after removing the semicolon from the comment, or after removing the comment altogether. For example:
    select * from test; -- select * from test;
    select * from test; /* comment; comment */
    Queries with double hyphens (--) might also fail. For example:
    select * from test where option = '--name';
  • You might face UI issues on Google Chrome while using faceted search. We recommend that you use the latest version of Google Chrome (version 71.x or later).
  • Visual Explain for the same query shows different graphs on the Compose page and the Query Details page.
  • If you restart HiveServer2 Interactive (HSI) while queries are running, query execution is stopped. However, DAS does not reflect this change, and the queries appear to remain in the same state indefinitely.
  • After a fresh installation, when there is no data and you try to access the Reports tab, DAS displays an "HTTP 404 Not Found" error.
  • Join count does not get updated for tables with partitioned columns.

Technical Service Bulletins

TSB 2022-581: Issues with “DAG ID” and “APP ID” visibility when exploring jobs in Data Analytics Studio
When using Data Analytics Studio (DAS) with Cloudera Data Platform (CDP) Private Cloud Base, sometimes the DAG ID and APP ID will not be visible to DAS.
Knowledge article:
For the latest update on this issue, see the corresponding Knowledge article: TSB 2022-581: Issues with “DAG ID” and “APP ID” visibility when exploring jobs in Data Analytics Studio