Monitoring
Publish inference
Publish an inference data point to an inference pipeline.
Use this endpoint to stream individual inference data points to Openlayer. If you want to upload many inferences in one go, please use the batch upload method instead.
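As an illustration, the sketch below streams a single inference data point with Python's requests library. The base URL, endpoint path, and the "rows"/"config" field names are assumptions made for this example, not values confirmed by this page; see the Body section below for the request schema.

```python
# Sketch only: stream one inference data point as it is produced.
# The base URL, endpoint path, and JSON field names ("rows", "config")
# are assumptions for illustration, not confirmed by this reference.
import os
import requests

API_KEY = os.environ["OPENLAYER_API_KEY"]              # workspace API key
PIPELINE_ID = "3fa85f64-5717-4562-b3fc-2c963f66afa6"   # inference pipeline UUID

response = requests.post(
    f"https://api.openlayer.com/v1/inference-pipelines/{PIPELINE_ID}/data-stream",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "rows": [  # one inference data point: model inputs plus its output
            {"user_query": "What is the capital of France?", "output": "Paris"}
        ],
        "config": {"outputColumnName": "output"},  # depends on your task type
    },
    timeout=10,
)
response.raise_for_status()
```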
Authorizations
Bearer authentication header of the form Bearer <token>, where <token> is your workspace API key. See Find your API key for more information.
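For example, the header can be built from an environment variable, as in the sketch below; the OPENLAYER_API_KEY variable name is just a convention assumed for this example.

```python
# Sketch: attach the workspace API key as a Bearer token.
# OPENLAYER_API_KEY is an assumed environment variable name, not mandated here.
import os

api_key = os.environ["OPENLAYER_API_KEY"]
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}
```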
Path Parameters
The inference pipeline id (a UUID).
Body
application/json
A list of inference data points with inputs and outputs.
Configuration for the data stream. Depends on your Openlayer project task type.
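As a rough sketch of what the request body might look like, the example below builds a payload with a list of data points and a stream configuration. The field names ("rows", "config", "inputVariableNames", "outputColumnName") are assumptions for an LLM-style task, not a schema guaranteed by this page; adjust them to your project's task type.

```python
# Sketch of an application/json request body for an LLM-style project.
# All field names here are assumptions for illustration; the actual schema
# depends on your Openlayer project task type.
import json
import time

body = {
    # A list of inference data points with inputs and outputs.
    "rows": [
        {
            "user_query": "Summarize this ticket for me.",
            "output": "The customer reports a billing error on their latest invoice.",
            "timestamp": int(time.time()),  # optional metadata, assumed field
        }
    ],
    # Configuration for the data stream; fields depend on the task type.
    "config": {
        "inputVariableNames": ["user_query"],
        "outputColumnName": "output",
    },
}

print(json.dumps(body, indent=2))
```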
Response
200 - application/json
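A successful publish returns 200 with a JSON body. The helper below is a minimal sketch of handling the response; the shape of the returned JSON is not specified here, so it only checks the status code and surfaces the raw body on failure.

```python
# Sketch: minimal handling of the data-stream response.
# The structure of the 200 JSON body is not specified here, so this only
# checks the status code and prints the raw payload on failure.
import requests


def check_stream_response(response: requests.Response) -> dict:
    """Return the parsed JSON body on 200, raise with details otherwise."""
    if response.status_code == 200:
        return response.json()
    raise RuntimeError(
        f"Publish failed with status {response.status_code}: {response.text}"
    )
```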