viam.gen.app.mlinference.v1.ml_inference_pb2
@generated by mypy-protobuf. Do not edit manually! isort:skip_file
Attributes
- DESCRIPTOR
- global___GetInferenceRequest
- global___GetInferenceResponse
Classes
- GetInferenceRequest – Abstract base class for protocol messages.
- GetInferenceResponse – Abstract base class for protocol messages.
Module Contents
- viam.gen.app.mlinference.v1.ml_inference_pb2.DESCRIPTOR: google.protobuf.descriptor.FileDescriptor
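The DESCRIPTOR attribute is the compiled google.protobuf.descriptor.FileDescriptor for this module's .proto file and can be inspected with the standard descriptor API. A minimal sketch (the printed values in the comments are illustrative, not guaranteed):

    from viam.gen.app.mlinference.v1 import ml_inference_pb2

    fd = ml_inference_pb2.DESCRIPTOR
    print(fd.name)     # path of the source .proto file
    print(fd.package)  # proto package name

    # Message descriptors declared in this file, keyed by message name,
    # e.g. GetInferenceRequest and GetInferenceResponse.
    for name, desc in fd.message_types_by_name.items():
        print(name, [field.name for field in desc.fields])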
- class viam.gen.app.mlinference.v1.ml_inference_pb2.GetInferenceRequest(*, registry_item_id: str = ..., registry_item_version: str = ..., binary_id: viam.gen.app.data.v1.data_pb2.BinaryID | None = ..., organization_id: str = ...)
Bases:
google.protobuf.message.Message
Abstract base class for protocol messages.
Protocol message classes are almost always generated by the protocol compiler. These generated types subclass Message and implement the methods shown below.
- registry_item_id: str
The model framework and model type are inferred from the ML model registry item. For valid model types (classification, detection), we will return the formatted labels or annotations from the associated inference outputs.
- registry_item_version: str
- organization_id: str
- property binary_id: viam.gen.app.data.v1.data_pb2.BinaryID
- HasField(field_name: Literal['binary_id', b'binary_id']) → bool
Checks if a certain field is set for the message.
For a oneof group, checks if any field inside is set. Note that if the field_name is not defined in the message descriptor, ValueError will be raised.
- Parameters:
field_name (str) – The name of the field to check for presence.
- Returns:
Whether a value has been set for the named field.
- Return type:
bool
- Raises:
ValueError – if the field_name is not a member of this message.
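A minimal construction sketch for GetInferenceRequest. It assumes the BinaryID message from viam.gen.app.data.v1.data_pb2 exposes file_id, organization_id, and location_id fields; all ID values below are hypothetical placeholders.

    from viam.gen.app.data.v1.data_pb2 import BinaryID
    from viam.gen.app.mlinference.v1.ml_inference_pb2 import GetInferenceRequest

    # Hypothetical IDs for illustration only.
    request = GetInferenceRequest(
        registry_item_id='viam:yolov8',
        registry_item_version='2024-01-01',
        binary_id=BinaryID(
            file_id='file-id',
            organization_id='org-id',
            location_id='location-id',
        ),
        organization_id='org-id',
    )

    # HasField reports whether the binary_id submessage was explicitly set.
    assert request.HasField('binary_id')
    assert not GetInferenceRequest().HasField('binary_id')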
- viam.gen.app.mlinference.v1.ml_inference_pb2.global___GetInferenceRequest
- class viam.gen.app.mlinference.v1.ml_inference_pb2.GetInferenceResponse
Bases:
google.protobuf.message.Message
Abstract base class for protocol messages.
Protocol message classes are almost always generated by the protocol compiler. These generated types subclass Message and implement the methods shown below.
- viam.gen.app.mlinference.v1.ml_inference_pb2.global___GetInferenceResponse
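A round-trip sketch using the standard protobuf Message API (SerializeToString and FromString), which both messages inherit from google.protobuf.message.Message; the field values are hypothetical.

    from viam.gen.app.mlinference.v1.ml_inference_pb2 import (
        GetInferenceRequest,
        GetInferenceResponse,
    )

    # Hypothetical values for illustration only.
    request = GetInferenceRequest(
        registry_item_id='viam:yolov8',
        registry_item_version='2024-01-01',
        organization_id='org-id',
    )

    # Serialize to the wire format and parse it back; generated messages
    # compare equal field by field.
    payload = request.SerializeToString()
    assert GetInferenceRequest.FromString(payload) == request

    # No fields are documented on GetInferenceResponse above; an empty
    # instance still round-trips the same way.
    response = GetInferenceResponse()
    assert GetInferenceResponse.FromString(response.SerializeToString()) == response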