Deploying workflows as model endpoints
The Cloudera AI Agent Studio's deployment system enables you to transform AI workflows into production-ready endpoints. When deployed, each workflow operates as an independent service, leveraging both a Cloudera AI Workbench model and a Cloudera AI Workbench application. For more information, see Models overview and Analytical Applications.
- A Cloudera AI Workbench Model that functions as the workflow engine to execute tasks.
- A Cloudera AI Workbench Application that provides a user interface for interacting with the workflow.
Workflows are fundamentally asynchronous. When a workflow start request is sent to a deployment, a trace ID is immediately returned to track the workflow’s status.
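As a minimal sketch of this kickoff step, the snippet below sends a JSON request to a deployed workflow's model endpoint and reads back the trace ID. The endpoint URL, the `{"request": ...}` payload shape, and the `trace_id` field name are assumptions for illustration; substitute the values from your own deployment.

```python
import json
import urllib.request

# Hypothetical endpoint -- replace with your deployed workflow's model endpoint URL.
MODEL_ENDPOINT = "https://example.cloudera.site/model"


def build_kickoff_payload(inputs: dict) -> bytes:
    """Serialize workflow inputs into a JSON request body (assumed shape)."""
    return json.dumps({"request": inputs}).encode("utf-8")


def extract_trace_id(response_body: bytes) -> str:
    """Pull the trace ID out of a kickoff response (assumed field name)."""
    return json.loads(response_body)["trace_id"]


def kickoff(inputs: dict) -> str:
    """Send a workflow start request and return the trace ID immediately."""
    req = urllib.request.Request(
        MODEL_ENDPOINT,
        data=build_kickoff_payload(inputs),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return extract_trace_id(resp.read())
```

Because the workflow runs asynchronously, the call returns as soon as the trace ID is issued; the workflow itself continues executing inside the model deployment.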
- The workflow can be initiated either by starting it from the deployed workflow Application or by sending a kickoff request directly to the model endpoint.
- The workflow runs asynchronously within the model deployment, simultaneously streaming events and logs to the Operations and Metrics server.
- The workflow status can be monitored for completion by polling the /events endpoint on the Operations and Metrics server.
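The polling step above can be sketched as follows. The Operations and Metrics server URL, the `trace_id` query parameter, and the terminal event type names are assumptions for illustration, not the documented API; adapt them to the event schema your deployment actually emits.

```python
import json
import time
import urllib.request

# Hypothetical URL -- replace with your Operations and Metrics server address.
OPS_METRICS_URL = "https://example.cloudera.site/ops"


def parse_events(body: bytes) -> list:
    """Decode the /events response body into a list of event dicts."""
    return json.loads(body)


def is_complete(events: list) -> bool:
    """Check for a terminal event (assumed event type names)."""
    return any(
        e.get("type") in ("workflow_completed", "workflow_failed") for e in events
    )


def poll_until_done(trace_id: str, interval_s: float = 5.0,
                    timeout_s: float = 600.0) -> list:
    """Poll the /events endpoint for the trace ID until the workflow finishes."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        url = f"{OPS_METRICS_URL}/events?trace_id={trace_id}"
        with urllib.request.urlopen(url) as resp:
            events = parse_events(resp.read())
        if is_complete(events):
            return events
        time.sleep(interval_s)
    raise TimeoutError(f"workflow {trace_id} did not finish within {timeout_s}s")
```

Polling with a bounded timeout keeps clients from waiting forever on a workflow that has stalled; the returned event list can then be inspected for the workflow's final result.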
For more information, see Monitoring feature in Agent Studio.
