Open Inference Protocol Using Curl
Consider the following instructions for using the Open Inference Protocol with curl.
You can construct the input of the inference call based on the Open Inference Protocol V2.
Your input payload will look something like the following:

# cat ./examples/wine-input.json
{
  "parameters": {
    "content_type": "pd"
  },
  "inputs": [
    {
      "name": "float_input",
      "shape": [1, 11],
      "datatype": "FP32",
      "data": [7.4, 7.4, 7.4, 7.4, 7.4, 7.4, 7.4, 7.4, 7.4, 7.4, 7.4]
    }
  ]
}
curl -H "Content-Type: application/json" \
  -H "Authorization: Bearer ${CDP_TOKEN}" \
  "https://${DOMAIN}/namespaces/serving-default/endpoints/[***ENDPOINT_NAME***]/v2/models/model/infer" \
  -d @./examples/wine-input.json
You will receive a response similar to the following:
{
  "model_name": "model",
  "model_version": "1",
  "outputs": [
    {
      "name": "variable",
      "datatype": "FP32",
      "shape": [1, 1],
      "data": [5.535987377166748]
    }
  ]
}
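If you prefer to call the endpoint from code rather than curl, the same request can be sketched in Python using only the standard library. This is a minimal sketch, not an official client: it assumes the DOMAIN and CDP_TOKEN environment variables from the curl example above, and an ENDPOINT_NAME environment variable standing in for the [***ENDPOINT_NAME***] placeholder.

```python
import json
import os
import urllib.request


def build_v2_payload(values):
    # Build an Open Inference Protocol V2 request body for a single
    # row of float features, matching wine-input.json above.
    return {
        "parameters": {"content_type": "pd"},
        "inputs": [
            {
                "name": "float_input",
                "shape": [1, len(values)],
                "datatype": "FP32",
                "data": values,
            }
        ],
    }


payload = build_v2_payload([7.4] * 11)

# DOMAIN, CDP_TOKEN, and ENDPOINT_NAME are assumed to be set in the
# environment, as in the curl example; the request is skipped otherwise.
domain = os.environ.get("DOMAIN")
token = os.environ.get("CDP_TOKEN")
endpoint = os.environ.get("ENDPOINT_NAME")

if domain and token and endpoint:
    url = (
        f"https://{domain}/namespaces/serving-default/"
        f"endpoints/{endpoint}/v2/models/model/infer"
    )
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
        # The V2 response carries predictions in outputs[0]["data"].
        print(body["outputs"][0]["data"])
```

The payload builder mirrors the JSON file exactly, so you can verify the generated body against wine-input.json before sending it.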