Collect and send diagnostic bundle using CLI

Learn how to collect diagnostic logs and upload them using the CLI to help troubleshoot both environment and deployment issues.

cdp df start-get-diagnostics-collection \
--df-service-crn '[***DATAFLOW SERVICE CRN***]' \
--description 'some description' \
--destination 'CLOUD_STORAGE' \
--environment-components '["CFM_OPERATOR", "CERT_MANAGER"]' \
--start-time [***YYYY-MM-DDTHH:MM***] \
--end-time [***YYYY-MM-DDTHH:MM***] \
--deployments '["[***CRN1***]", "[***CRN2***]"]' \
--collection-scope ALL
--df-service-crn
Description
Provide the Cloud Resource Name (CRN) of the DataFlow service for which you want to collect diagnostic data.
Possible values
Valid CRN
Importance
Required
--description
Description
Provide a free text description of the case for which diagnostic data is being collected. The provided description is persisted in the database and returned in the listing for future reference.
Possible values
Free text
Importance
Required
--destination
Description
Specify the upload destination for the generated bundle.
Possible values
  • CLOUD_STORAGE - uploads the bundle to cloud storage associated with the DataFlow Service environment. A bundle path is included in the listing output with the format <bucket/container>/<datalake-name>/cdf/….
  • SUPPORT - uploads the bundle to the UDX backend, where it is associated with the provided case number and automatically attached to the case.
Importance
Required
--environment-components
Description

Specify a list of DataFlow Environment Components for which you want to collect logs. When present, this list operates as an allowed list: logs are collected only for the specified components. If the list is omitted, logs are collected from all environment components. See the example below.

Possible values
Environment component      Namespace
CFM_OPERATOR               cfm-operator-system
CADENCE                    cadence
CERT_MANAGER               cert-manager
DFX_LOCAL                  dfx-local
DFX_LOGGING                dfx-logging
FLUXCD                     flux-system
NFS_PROVISIONER            nfs-provisioner-system
NGINX                      nginx-ingress
REDIS                      redis
VAULT                      vault
VPA                        vpa
ZOOKEEPER_OPERATOR         zookeeper-operator-system
Importance
Optional
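
For example, the following command collects environment logs only for the CFM operator and cert-manager components. It is a minimal sketch built from the flags documented above; the description text and placeholder CRN are illustrative.

cdp df start-get-diagnostics-collection \
--df-service-crn '[***DATAFLOW SERVICE CRN***]' \
--description 'CFM operator and cert-manager logs' \
--destination 'CLOUD_STORAGE' \
--environment-components '["CFM_OPERATOR", "CERT_MANAGER"]'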
--start-time
Description

Specify the time from which you want logs to be collected. No logs generated prior to the start time are collected.

If neither a start time nor an end time is specified, log collection defaults to logs from the past 24 hours. The time zone is assumed to be UTC. Collection fails with an error and does not begin if the provided end time is earlier than the provided start time.

Possible values
Date and time in YYYY-MM-DDThh:mm format. For example, 2023-02-03T11:00
Importance
Optional
--end-time
Description
Specify the time to which you want logs to be collected. No logs generated after the end time are collected.

If neither a start time nor an end time is specified, log collection defaults to logs from the past 24 hours. The time zone is assumed to be UTC. Collection fails with an error and does not begin if the provided end time is earlier than the provided start time.

Possible values
Date and time in YYYY-MM-DDThh:mm format. For example, 2023-02-03T11:01
Importance
Optional
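
For example, the following command limits collection to a one-hour UTC window. It is a minimal sketch built from the flags documented above; the times, description text, and placeholder CRN are illustrative.

cdp df start-get-diagnostics-collection \
--df-service-crn '[***DATAFLOW SERVICE CRN***]' \
--description 'logs for incident window' \
--destination 'CLOUD_STORAGE' \
--start-time 2023-02-03T11:00 \
--end-time 2023-02-03T12:00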
--deployments
Description

Specify a list of DataFlow Deployment CRNs for which logs should be collected. When present, this list operates as an allowed list: logs are collected only for the specified deployments. If the list is omitted, logs are collected from all Flow Deployments in the environment. See the example below.

Possible values
List of valid CRNs
Importance
Optional
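
For example, the following command collects deployment logs only for two specific Flow Deployments. It is a minimal sketch built from the flags documented above; the description text and placeholder CRNs are illustrative.

cdp df start-get-diagnostics-collection \
--df-service-crn '[***DATAFLOW SERVICE CRN***]' \
--description 'logs for two flow deployments' \
--destination 'CLOUD_STORAGE' \
--deployments '["[***CRN1***]", "[***CRN2***]"]'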
--collection-scope
Description
Specify the scope of data collection.
Possible values
  • ALL - collects logs from environment components and deployments, honoring the provided allowed lists.
  • ENVIRONMENT - collects logs from environment components, honoring the environment-components allowed list, and no logs from deployments except for those specified in the deployments allowed list.
  • DEPLOYMENT - collects logs from deployments, honoring the deployments allowed list, and no logs from environment components except for those specified in the environment-components allowed list.
Importance
Optional
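
For example, the following command restricts collection to environment components only. It is a minimal sketch built from the flags documented above; the description text and placeholder CRN are illustrative.

cdp df start-get-diagnostics-collection \
--df-service-crn '[***DATAFLOW SERVICE CRN***]' \
--description 'environment components only' \
--destination 'CLOUD_STORAGE' \
--collection-scope ENVIRONMENT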
Default collection:
cdp df start-get-diagnostics-collection \
--df-service-crn '[***DATAFLOW SERVICE CRN***]' \
--description 'some description' \
--destination 'CLOUD_STORAGE'

Support upload:

cdp df start-get-diagnostics-collection \
--df-service-crn '[***DATAFLOW SERVICE CRN***]' \
--description 'some description' \
--destination 'SUPPORT' \
--case-number '123456'

Note that --case-number is required when --destination is SUPPORT.

List collection attempts:
cdp df list-diagnostics \
--df-service-crn '[***DATAFLOW SERVICE CRN***]'