viam.services.mlmodel

Package Contents

Classes

File: Abstract base class for protocol messages.

LabelType

Metadata: Abstract base class for protocol messages.

TensorInfo: Abstract base class for protocol messages.

MLModelClient: MLModel represents a Machine Learning Model service.

MLModel: MLModel represents a Machine Learning Model service.

class viam.services.mlmodel.File(*, name: str = ..., description: str = ..., label_type: global___LabelType = ...)

Bases: google.protobuf.message.Message

Abstract base class for protocol messages.

Protocol message classes are almost always generated by the protocol compiler. These generated types subclass Message and implement the methods shown below.

name: str

name of the file, with file extension

description: str

description of what the file contains

label_type: global___LabelType

How to associate the arrays/tensors to the labels in the file

class viam.services.mlmodel.LabelType

Bases: _LabelType

class viam.services.mlmodel.Metadata(*, name: str = ..., type: str = ..., description: str = ..., input_info: collections.abc.Iterable[global___TensorInfo] | None = ..., output_info: collections.abc.Iterable[global___TensorInfo] | None = ...)

Bases: google.protobuf.message.Message

Abstract base class for protocol messages.

Protocol message classes are almost always generated by the protocol compiler. These generated types subclass Message and implement the methods shown below.

property input_info: google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___TensorInfo]

the necessary input arrays/tensors for an inference, order matters

property output_info: google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___TensorInfo]

the output arrays/tensors of the model, order matters

name: str

name of the model

type: str

type of model e.g. object_detector, text_classifier

description: str

description of the model

class viam.services.mlmodel.TensorInfo(*, name: str = ..., description: str = ..., data_type: str = ..., shape: collections.abc.Iterable[int] | None = ..., associated_files: collections.abc.Iterable[global___File] | None = ..., extra: google.protobuf.struct_pb2.Struct | None = ...)

Bases: google.protobuf.message.Message

Abstract base class for protocol messages.

Protocol message classes are almost always generated by the protocol compiler. These generated types subclass Message and implement the methods shown below.

property shape: google.protobuf.internal.containers.RepeatedScalarFieldContainer[int]

shape of the array/tensor (-1 for unknown)

property associated_files: google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___File]

files associated with the array/tensor, like for category labels

property extra: google.protobuf.struct_pb2.Struct

any additional, unstructured information about the array/tensor

name: str

name of the data in the array/tensor

description: str

description of the data in the array/tensor

data_type: str

data type of the array/tensor, e.g. float32, float64, uint8

HasField(field_name: Literal['extra', b'extra']) → bool

Checks if a certain field is set for the message.

For a oneof group, checks if any field inside is set. Note that if the field_name is not defined in the message descriptor, ValueError will be raised.

Parameters:

field_name (str) – The name of the field to check for presence.

Returns:

Whether a value has been set for the named field.

Return type:

bool

Raises:

ValueError – if the field_name is not a member of this message.
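The presence-check contract above (a bool for known fields, ValueError for unknown names) can be sketched with a plain-Python stand-in. The class below is illustrative only, not the generated protobuf type, which tracks field presence internally:

```python
# Illustrative stand-in for the HasField contract described above.
# Not the generated protobuf class; protobuf tracks presence internally.
class FakeTensorInfo:
    _fields = {"extra"}  # fields that support presence checks

    def __init__(self):
        self._set = set()  # names of fields explicitly set

    def set_extra(self, value):
        self.extra = value
        self._set.add("extra")

    def HasField(self, field_name: str) -> bool:
        # Mirror protobuf behavior: unknown names raise ValueError.
        if field_name not in self._fields:
            raise ValueError(f"{field_name!r} is not a member of this message")
        return field_name in self._set


info = FakeTensorInfo()
print(info.HasField("extra"))   # False until the field is set
info.set_extra({"labels": "imagenet"})
print(info.HasField("extra"))   # True after assignment
```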

class viam.services.mlmodel.MLModelClient(name: str, channel: grpclib.client.Channel)[source]

Bases: viam.services.mlmodel.mlmodel.MLModel, viam.resource.rpc_client_base.ReconfigurableResourceRPCClientBase

MLModel represents a Machine Learning Model service.

This acts as an abstract base class for any drivers representing specific ML model service implementations. This cannot be used on its own. If the __init__() function is overridden, it must call the super().__init__() function.

async infer(input_tensors: Dict[str, numpy.typing.NDArray], *, timeout: Optional[float] = None) → Dict[str, numpy.typing.NDArray][source]

Take already-ordered input tensors, run an inference on the model, and return a map of output tensors.

Parameters:

input_tensors (Dict[str, NDArray]) – A dictionary of input flat tensors as specified in the metadata

Returns:

A dictionary of output flat tensors as specified in the metadata

Return type:

Dict[str, NDArray]
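The call pattern for infer can be sketched with a stub client. The stub class and the tensor names ("image", "scores") below are assumptions for illustration only; a real MLModelClient sends the tensors over gRPC and returns NumPy arrays keyed as described in the model's metadata:

```python
import asyncio
from typing import Dict, List, Optional

# Stub standing in for MLModelClient; a real client sends the tensors
# over gRPC and returns numpy arrays keyed as described in the metadata.
class StubMLModelClient:
    async def infer(
        self,
        input_tensors: Dict[str, List[float]],
        *,
        timeout: Optional[float] = None,
    ) -> Dict[str, List[float]]:
        # "image" and "scores" are hypothetical tensor names for illustration.
        image = input_tensors["image"]
        return {"scores": [v * 0.5 for v in image]}

async def main() -> None:
    client = StubMLModelClient()
    outputs = await client.infer({"image": [0.2, 0.8]})
    print(outputs["scores"])  # [0.1, 0.4]

asyncio.run(main())
```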

async metadata(*, timeout: Optional[float] = None) → viam.services.mlmodel.mlmodel.Metadata[source]

Get the metadata (such as name, type, expected tensor/array shape, inputs, and outputs) associated with the ML model.

Returns:

The metadata

Return type:

Metadata
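What the returned Metadata carries can be sketched with dataclass stand-ins for the Metadata and TensorInfo protobuf messages. The field values below are invented example data, not output from a real model:

```python
from dataclasses import dataclass, field
from typing import List

# Plain dataclasses standing in for the TensorInfo/Metadata protobuf messages.
@dataclass
class TensorInfoSketch:
    name: str
    data_type: str
    shape: List[int]  # -1 marks an unknown dimension

@dataclass
class MetadataSketch:
    name: str
    type: str
    input_info: List[TensorInfoSketch] = field(default_factory=list)
    output_info: List[TensorInfoSketch] = field(default_factory=list)

meta = MetadataSketch(
    name="my_detector",          # invented example values
    type="object_detector",
    input_info=[TensorInfoSketch("image", "uint8", [-1, 224, 224, 3])],
    output_info=[TensorInfoSketch("scores", "float32", [-1, 10])],
)

# Order matters for both lists, so tensors are fed in declaration order.
for t in meta.input_info:
    print(t.name, t.data_type, t.shape)  # image uint8 [-1, 224, 224, 3]
```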

async do_command(command: Mapping[str, viam.utils.ValueTypes], *, timeout: Optional[float] = None, **kwargs) → Mapping[str, viam.utils.ValueTypes][source]

Send/Receive arbitrary commands to the Resource

Parameters:

command (Mapping[str, ValueTypes]) – The command to execute

Raises:

NotImplementedError – Raised if the Resource does not support arbitrary commands

Returns:

Result of the executed command

Return type:

Mapping[str, ValueTypes]
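The do_command contract (a command map in, a result map out, NotImplementedError for unsupported commands) can be sketched with a stub resource. The "reload" command name is a hypothetical example, not part of the SDK:

```python
import asyncio
from typing import Any, Mapping, Optional

# Stub resource sketching the do_command contract described above.
class StubResource:
    async def do_command(
        self,
        command: Mapping[str, Any],
        *,
        timeout: Optional[float] = None,
        **kwargs,
    ) -> Mapping[str, Any]:
        # "reload" is a hypothetical command name for illustration.
        if command.get("command") == "reload":
            return {"status": "reloaded"}
        # Per the docs, unsupported commands raise NotImplementedError.
        raise NotImplementedError("unsupported command")

async def main() -> None:
    res = StubResource()
    print(await res.do_command({"command": "reload"}))  # {'status': 'reloaded'}

asyncio.run(main())
```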

classmethod from_robot(robot: viam.robot.client.RobotClient, name: str) → typing_extensions.Self

Get the service named name from the provided robot.

Parameters:
  • robot (RobotClient) – The robot

  • name (str) – The name of the service

Returns:

The service, if it exists on the robot

Return type:

Self
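The shape of the from_robot lookup (a classmethod that resolves a service by name on a connected robot) can be sketched with stdlib stand-ins. Both classes below are invented for illustration; the real RobotClient resolves resources through typed ResourceNames over gRPC:

```python
# Minimal sketch of looking a service up by name, as from_robot does.
# FakeRobot stands in for a connected RobotClient and its resources.
class FakeRobot:
    def __init__(self):
        self._services = {}

    def register(self, name, service):
        self._services[name] = service

    def get_service(self, name):
        return self._services[name]

class FakeMLModelService:
    @classmethod
    def from_robot(cls, robot, name):
        # Real code resolves a ResourceName and returns the typed client.
        return robot.get_service(name)

robot = FakeRobot()
robot.register("mlmodel", FakeMLModelService())
svc = FakeMLModelService.from_robot(robot, "mlmodel")
print(type(svc).__name__)  # FakeMLModelService
```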

classmethod get_resource_name(name: str) → viam.proto.common.ResourceName

Get the ResourceName for this Resource with the given name

Parameters:

name (str) – The name of the Resource

get_operation(kwargs: Mapping[str, Any]) → viam.operations.Operation

Get the Operation associated with the currently running function.

When writing custom resources, you should get the Operation by calling this function and check whether it has been cancelled. If the Operation is cancelled, then you can perform any necessary cleanup (terminating long-running tasks, closing connections, etc.).

Parameters:

kwargs (Mapping[str, Any]) – The kwargs object containing the operation

Returns:

The operation associated with this function

Return type:

viam.operations.Operation
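The cancellation-check pattern described above can be sketched with asyncio primitives. OperationSketch is a minimal stand-in exposing only a cancel flag, not the viam.operations.Operation type:

```python
import asyncio

# Minimal stand-in for viam.operations.Operation: just a cancel flag.
class OperationSketch:
    def __init__(self):
        self._cancelled = False

    def cancel(self):
        self._cancelled = True

    async def is_cancelled(self) -> bool:
        return self._cancelled

async def long_running_task(op: OperationSketch) -> str:
    for step in range(100):
        # Check for cancellation each iteration and clean up early.
        if await op.is_cancelled():
            return f"cancelled at step {step}"
        await asyncio.sleep(0)  # yield control to the event loop

    return "completed"

async def main() -> None:
    op = OperationSketch()
    task = asyncio.create_task(long_running_task(op))
    await asyncio.sleep(0)  # let the task start
    op.cancel()
    print(await task)

asyncio.run(main())
```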

async close()

Safely shut down the resource and prevent further use.

Close must be idempotent. Later configuration may allow a resource to be “open” again. If a resource does not want or need a close function, it is assumed that the resource does not need to return errors when future non-Close methods are called.
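The idempotency requirement can be sketched as a guarded close flag. This is a stand-in illustrating the contract, not the SDK's implementation:

```python
import asyncio

# Sketch of an idempotent close(): repeated calls are safe no-ops.
class ClosableResource:
    def __init__(self):
        self.closed = False
        self.close_count = 0  # track how many times cleanup actually ran

    async def close(self) -> None:
        if self.closed:
            return  # already closed: idempotent no-op
        self.closed = True
        self.close_count += 1
        # ... release connections, stop background tasks, etc.

async def main() -> None:
    res = ClosableResource()
    await res.close()
    await res.close()  # safe to call again
    print(res.close_count)  # 1

asyncio.run(main())
```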

class viam.services.mlmodel.MLModel(name: str)[source]

Bases: viam.services.service_base.ServiceBase

MLModel represents a Machine Learning Model service.

This acts as an abstract base class for any drivers representing specific ML model service implementations. This cannot be used on its own. If the __init__() function is overridden, it must call the super().__init__() function.

SUBTYPE: Final

abstract async infer(input_tensors: Dict[str, numpy.typing.NDArray], *, timeout: Optional[float]) → Dict[str, numpy.typing.NDArray][source]

Take already-ordered input tensors, run an inference on the model, and return a map of output tensors.

Parameters:

input_tensors (Dict[str, NDArray]) – A dictionary of input flat tensors as specified in the metadata

Returns:

A dictionary of output flat tensors as specified in the metadata

Return type:

Dict[str, NDArray]

abstract async metadata(*, timeout: Optional[float]) → viam.proto.service.mlmodel.Metadata[source]

Get the metadata (such as name, type, expected tensor/array shape, inputs, and outputs) associated with the ML model.

Returns:

The metadata

Return type:

Metadata
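Implementing the two abstract methods can be sketched without the SDK by mirroring the contract with an abc base class. In a real module you would subclass viam.services.mlmodel.MLModel instead of the ABC below, take NumPy arrays as tensors, and return a Metadata protobuf message; the toy model and its values are invented:

```python
import asyncio
from abc import ABC, abstractmethod
from typing import Dict, List, Optional

# ABC mirroring the MLModel contract so the sketch runs without the SDK.
class MLModelSketch(ABC):
    @abstractmethod
    async def infer(
        self, input_tensors: Dict[str, List[float]], *, timeout: Optional[float]
    ) -> Dict[str, List[float]]: ...

    @abstractmethod
    async def metadata(self, *, timeout: Optional[float]) -> Dict[str, str]: ...

class DoublingModel(MLModelSketch):
    """Toy model that doubles every input value."""

    async def infer(self, input_tensors, *, timeout=None):
        return {name: [v * 2 for v in t] for name, t in input_tensors.items()}

    async def metadata(self, *, timeout=None):
        # A real implementation returns a Metadata protobuf message.
        return {"name": "doubler", "type": "toy_model"}

async def main() -> None:
    model = DoublingModel()
    print(await model.infer({"x": [1.0, 2.0]}))  # {'x': [2.0, 4.0]}

asyncio.run(main())
```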

classmethod from_robot(robot: viam.robot.client.RobotClient, name: str) → typing_extensions.Self

Get the service named name from the provided robot.

Parameters:
  • robot (RobotClient) – The robot

  • name (str) – The name of the service

Returns:

The service, if it exists on the robot

Return type:

Self

abstract async do_command(command: Mapping[str, viam.utils.ValueTypes], *, timeout: Optional[float] = None, **kwargs) → Mapping[str, viam.utils.ValueTypes]

Send/Receive arbitrary commands to the Resource

Parameters:

command (Mapping[str, ValueTypes]) – The command to execute

Raises:

NotImplementedError – Raised if the Resource does not support arbitrary commands

Returns:

Result of the executed command

Return type:

Mapping[str, ValueTypes]

classmethod get_resource_name(name: str) → viam.proto.common.ResourceName

Get the ResourceName for this Resource with the given name

Parameters:

name (str) – The name of the Resource

get_operation(kwargs: Mapping[str, Any]) → viam.operations.Operation

Get the Operation associated with the currently running function.

When writing custom resources, you should get the Operation by calling this function and check whether it has been cancelled. If the Operation is cancelled, then you can perform any necessary cleanup (terminating long-running tasks, closing connections, etc.).

Parameters:

kwargs (Mapping[str, Any]) – The kwargs object containing the operation

Returns:

The operation associated with this function

Return type:

viam.operations.Operation

async close()

Safely shut down the resource and prevent further use.

Close must be idempotent. Later configuration may allow a resource to be “open” again. If a resource does not want or need a close function, it is assumed that the resource does not need to return errors when future non-Close methods are called.