Known issues

Learn about the known issues in Data Catalog, their impact on functionality, and the available workarounds.

CDPDSS-342: Deleted table is still shown in dataset
Problem: When a table created in a dataset is deleted, it is not removed completely and continues to appear in the dataset. If the same table is created again, the edit and search options are enabled. When you save the table, two tables are displayed, even though the first instance of the table was deleted.
CDPDSS-362: dpprofiler.submitter.batch.size property is not taking effect
Problem: Setting the dpprofiler.submitter.batch.size property in the application.conf safety valve under the profiler scheduler does not take effect. The value defaults to 50.
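For reference, a minimal sketch of how the property would typically be entered in the application.conf safety valve (the value 100 is illustrative only); because of this issue, the setting is ignored and the batch size remains 50:
    dpprofiler.submitter.batch.size=100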
CDPDSS-1150: A running job can remain stuck in the running state in certain scenarios (for example, after restarting the workload cluster)
Problem: A running job is not killed or timed out after a certain duration.
Workaround: N/A
CDPDSS-1897: Show process nodes option is displayed for entity types that do not have process nodes
Problem: The toggle option to show process nodes is present even for entity types that do not have process nodes.
Workaround: N/A
CDPDSS-1953: Location field does not load absolute path for entities in contents tab
Problem: After navigating to an Azure Blob that is created within an Azure directory and then clicking the Content tab, the path for the Azure Blob is displayed as "/" instead of the full path in the form containers/directory/Blob.
CDPDSS-1954: User cannot delete profilers when no entities are displayed in search result
Problem: The "Action" button containing the option to delete profilers is also removed when the search results yeild no entity type.
CDPDSS-1956: Issue with the count on the search page for AWS S3 V2 Object
Problem: The count against the Data Lake displays only the count of the AWS S3 V2 Object entity type.
CDPDSS-1965: Cannot traverse to the Asset Details page of "AWS S3 V2 Directory" by clicking its node
Problem: The node for "AWS S3 V2 Directory" is not clickable.
CDPDSS-31173: Atlas API fails to save classifications in Data Catalog when using Cloudera Runtime version 7.2.12
Problem: When a classification suggested by the Custom Sensitivity Profiler is saved, the Atlas API returns an error response and the classification is not saved in Data Catalog. This behavior is observed specifically in Cloudera Runtime version 7.2.12.
Workaround: If you are using version 7.2.12, in Cloudera Manager, go to Atlas > Configuration, search for 'application.prop', and enter the value "atlas.tasks.enabled=false". Restart Atlas, then log in to Data Catalog to use the classification feature.
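A minimal sketch of the property as it would typically be added to the Atlas configuration snippet found by searching for 'application.prop' (the exact safety valve field name can vary by Cloudera Manager version):
    atlas.tasks.enabled=false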