Publish inference
Publish an inference data point to an inference pipeline.
Use this endpoint to stream individual inference data points to Openlayer. If you want to upload many inferences in one go, please use the batch upload method instead.
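For example, a single data point can be streamed with a plain HTTP POST. The sketch below uses Python's requests library; the base URL, row fields, and config keys are illustrative assumptions rather than part of this reference, so adapt them to your project and task type.

```python
# Minimal sketch, assuming the standard Openlayer REST base URL and an
# LLM-style project; the column names and config keys are hypothetical.
import os
import requests

OPENLAYER_API_KEY = os.environ["OPENLAYER_API_KEY"]  # workspace API key
INFERENCE_PIPELINE_ID = "11111111-2222-3333-4444-555555555555"  # hypothetical UUID

# Assumed endpoint path, built from the path parameter documented below.
url = (
    "https://api.openlayer.com/v1/inference-pipelines/"
    f"{INFERENCE_PIPELINE_ID}/data-stream"
)

payload = {
    # One inference data point; the keys are illustrative column names.
    "rows": [
        {
            "user_query": "What is the capital of France?",
            "output": "The capital of France is Paris.",
        }
    ],
    # Config tells Openlayer how to interpret the columns above.
    "config": {
        "inputVariableNames": ["user_query"],
        "outputColumnName": "output",
    },
}

response = requests.post(
    url,
    json=payload,
    headers={"Authorization": f"Bearer {OPENLAYER_API_KEY}"},
    timeout=10,
)
response.raise_for_status()
print(response.json())  # expected: {"success": true}
```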
Authorizations

Authorization
string (header, required)
Bearer authentication header of the form "Bearer <token>", where <token> is your workspace API key. See Find your API key for more information.
Path Parameters

inferencePipelineId
string (required)
The inference pipeline id (a UUID).
Body
application/json

rows
object[] (required)
A list of inference data points with inputs and outputs.

config
object (required)
Configuration for the data stream. Depends on your Openlayer project task type.
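Because config depends on the task type, a pipeline for, say, tabular classification would describe its columns differently than the LLM-style config sketched above. The shape below is an assumption for illustration only; the keys and column names are hypothetical.

```python
# Illustrative rows and config for a tabular-classification-style project;
# exact keys come from your project's task type in the Openlayer docs.
tabular_rows = [{"age": 42, "income": 55000, "predictions": 1}]
tabular_config = {
    "classNames": ["no_churn", "churn"],
    "featureNames": ["age", "income"],
    "predictionsColumnName": "predictions",
}
```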
Response
200 - application/json

success
boolean (required)
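Continuing the request sketch above, a caller might verify the documented success flag explicitly rather than relying on the HTTP status alone; this is a usage suggestion, not documented behavior.

```python
# Sketch: double-check the success flag on a 200 response.
result = response.json()
if not result.get("success", False):
    raise RuntimeError(f"Openlayer did not accept the data point: {result}")
```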