viam.services.mlmodel
=====================
.. py:module:: viam.services.mlmodel
Submodules
----------
.. toctree::
:maxdepth: 1
/autoapi/viam/services/mlmodel/client/index
/autoapi/viam/services/mlmodel/mlmodel/index
/autoapi/viam/services/mlmodel/service/index
/autoapi/viam/services/mlmodel/utils/index
Classes
-------
.. autoapisummary::
viam.services.mlmodel.File
viam.services.mlmodel.LabelType
viam.services.mlmodel.Metadata
viam.services.mlmodel.TensorInfo
viam.services.mlmodel.MLModelClient
viam.services.mlmodel.MLModel
Package Contents
----------------
.. py:class:: File(*, name: str = ..., description: str = ..., label_type: global___LabelType = ...)
Bases: :py:obj:`google.protobuf.message.Message`
Abstract base class for protocol messages.
Protocol message classes are almost always generated by the protocol
compiler. These generated types subclass Message and implement the methods
shown below.
.. py:attribute:: name
:type: str
name of the file, with file extension
.. py:attribute:: description
:type: str
description of what the file contains
.. py:attribute:: label_type
:type: global___LabelType
How to associate the arrays/tensors to the labels in the file
.. py:class:: LabelType
Bases: :py:obj:`_LabelType`
.. py:class:: Metadata(*, name: str = ..., type: str = ..., description: str = ..., input_info: collections.abc.Iterable[global___TensorInfo] | None = ..., output_info: collections.abc.Iterable[global___TensorInfo] | None = ...)
Bases: :py:obj:`google.protobuf.message.Message`
Abstract base class for protocol messages.
Protocol message classes are almost always generated by the protocol
compiler. These generated types subclass Message and implement the methods
shown below.
.. py:attribute:: name
:type: str
name of the model
.. py:attribute:: type
:type: str
type of model e.g. object_detector, text_classifier
.. py:attribute:: description
:type: str
description of the model
.. py:property:: input_info
:type: google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___TensorInfo]
the necessary input arrays/tensors for an inference, order matters
.. py:property:: output_info
:type: google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___TensorInfo]
the output arrays/tensors of the model, order matters
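The ``input_info`` list can drive tensor allocation: each entry declares a name, dtype, and shape. A minimal sketch, using ``SimpleNamespace`` stand-ins for ``TensorInfo`` (the real objects come from a service's ``metadata()`` call); the tensor names, dtypes, and shapes here are illustrative assumptions:

```python
import numpy as np
from types import SimpleNamespace

# Stand-ins for TensorInfo entries as they might appear in Metadata.input_info;
# a real program would read these from the service's metadata() result.
input_info = [
    SimpleNamespace(name="image", data_type="uint8", shape=[1, 384, 384, 3]),
    SimpleNamespace(name="mask", data_type="float32", shape=[-1, 10]),
]

def allocate_inputs(infos, batch_size=1):
    """Build zero-filled arrays matching each declared tensor shape,
    substituting batch_size for any -1 (unknown) dimension."""
    tensors = {}
    for info in infos:
        shape = [batch_size if d == -1 else d for d in info.shape]
        tensors[info.name] = np.zeros(shape, dtype=np.dtype(info.data_type))
    return tensors

tensors = allocate_inputs(input_info)
print(tensors["image"].shape)  # (1, 384, 384, 3)
print(tensors["mask"].dtype)   # float32
```

Because ordering matters for inference, iterating ``input_info`` in its given order (as above) preserves the ordering the model expects.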
.. py:class:: TensorInfo(*, name: str = ..., description: str = ..., data_type: str = ..., shape: collections.abc.Iterable[int] | None = ..., associated_files: collections.abc.Iterable[global___File] | None = ..., extra: google.protobuf.struct_pb2.Struct | None = ...)
Bases: :py:obj:`google.protobuf.message.Message`
Abstract base class for protocol messages.
Protocol message classes are almost always generated by the protocol
compiler. These generated types subclass Message and implement the methods
shown below.
.. py:attribute:: name
:type: str
name of the data in the array/tensor
.. py:attribute:: description
:type: str
description of the data in the array/tensor
.. py:attribute:: data_type
:type: str
data type of the array/tensor, e.g. float32, float64, uint8
.. py:property:: shape
:type: google.protobuf.internal.containers.RepeatedScalarFieldContainer[int]
shape of the array/tensor (-1 for unknown)
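A ``-1`` dimension can be treated as a wildcard when validating an array against a declared shape. A minimal sketch of such a check (the helper name ``shape_matches`` is hypothetical, not part of the SDK):

```python
def shape_matches(declared, actual):
    """Check an array's shape against a declared tensor shape,
    where -1 in the declared shape means 'unknown/any size'."""
    if len(declared) != len(actual):
        return False
    return all(d == -1 or d == a for d, a in zip(declared, actual))

print(shape_matches([-1, 224, 224, 3], (8, 224, 224, 3)))  # True
print(shape_matches([-1, 224, 224, 3], (8, 224, 224, 1)))  # False
```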
.. py:property:: associated_files
:type: google.protobuf.internal.containers.RepeatedCompositeFieldContainer[global___File]
files associated with the array/tensor, like for category labels
.. py:property:: extra
:type: google.protobuf.struct_pb2.Struct
anything else you want to say
.. py:method:: HasField(field_name: Literal['extra', b'extra']) -> bool
Checks if a certain field is set for the message.
For a oneof group, checks if any field inside is set. Note that if the
field_name is not defined in the message descriptor, :exc:`ValueError` will
be raised.
:param field_name: The name of the field to check for presence.
:type field_name: str
:returns: Whether a value has been set for the named field.
:rtype: bool
:raises ValueError: if the `field_name` is not a member of this message.
.. py:class:: MLModelClient(name: str, channel: grpclib.client.Channel)
Bases: :py:obj:`viam.services.mlmodel.mlmodel.MLModel`, :py:obj:`viam.resource.rpc_client_base.ReconfigurableResourceRPCClientBase`
MLModel represents a Machine Learning Model service.
This acts as an abstract base class for any drivers representing specific
ML model implementations. This cannot be used on its own. If the ``__init__()`` function is
overridden, it must call the ``super().__init__()`` function.
For more information, see the ML model service documentation.
.. py:attribute:: channel
.. py:attribute:: client
.. py:method:: infer(input_tensors: Dict[str, numpy.typing.NDArray], *, extra: Optional[Mapping[str, viam.utils.ValueTypes]] = None, timeout: Optional[float] = None, **kwargs) -> Dict[str, numpy.typing.NDArray]
:async:
Take an already ordered input tensor as an array, make an inference on the model, and return an output tensor map.
::
import numpy as np
my_mlmodel = MLModelClient.from_robot(robot=machine, name="my_mlmodel_service")
image_data = np.zeros((1, 384, 384, 3), dtype=np.uint8)
# Create the input tensors dictionary
input_tensors = {
    "image": image_data
}
output_tensors = await my_mlmodel.infer(input_tensors)
:param input_tensors: A dictionary of input flat tensors as specified in the metadata
:type input_tensors: Dict[str, NDArray]
:returns: A dictionary of output flat tensors as specified in the metadata
:rtype: Dict[str, NDArray]
For more information, see the ML model service documentation.
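The returned tensors are plain NumPy arrays, so ordinary NumPy post-processing applies. A minimal sketch for a hypothetical classifier output (the tensor name ``"scores"`` and its values are illustrative assumptions, not part of the SDK):

```python
import numpy as np

# Hypothetical result as infer() might return it for a classifier:
# one row of raw scores (logits) per input in the batch.
output_tensors = {"scores": np.array([[0.1, 2.5, 0.3]], dtype=np.float32)}

scores = output_tensors["scores"]
# Softmax turns raw scores into probabilities; argmax picks the best label index.
probs = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
best = int(np.argmax(probs, axis=1)[0])
print(best)         # 1
print(probs.shape)  # (1, 3)
```

Which tensor names and shapes to expect comes from the service's ``metadata()`` call, not from the SDK itself.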
.. py:method:: metadata(*, extra: Optional[Mapping[str, viam.utils.ValueTypes]] = None, timeout: Optional[float] = None, **kwargs) -> viam.services.mlmodel.mlmodel.Metadata
:async:
Get the metadata (such as name, type, expected tensor/array shape, inputs, and outputs) associated with the ML model.
::
my_mlmodel = MLModelClient.from_robot(robot=machine, name="my_mlmodel_service")
metadata = await my_mlmodel.metadata()
:returns: The metadata
:rtype: Metadata
For more information, see the ML model service documentation.
.. py:method:: from_robot(robot: viam.robot.client.RobotClient, name: str) -> typing_extensions.Self
:classmethod:
Get the service named ``name`` from the provided robot.
::
async def connect() -> RobotClient:
    # Replace "<API-KEY>" (including brackets) with your API key and "<API-KEY-ID>" with your API key ID
    options = RobotClient.Options.with_api_key("<API-KEY>", "<API-KEY-ID>")
    # Replace "<MACHINE-ADDRESS>" (including brackets) with your machine's connection URL or FQDN
    return await RobotClient.at_address("<MACHINE-ADDRESS>", options)

async def main():
    robot = await connect()
    # Can be used with any resource, using the motion service as an example
    motion = MotionClient.from_robot(robot=robot, name="builtin")
    await robot.close()
:param robot: The robot
:type robot: RobotClient
:param name: The name of the service
:type name: str
:returns: The service, if it exists on the robot
:rtype: Self
.. py:method:: do_command(command: Mapping[str, viam.utils.ValueTypes], *, timeout: Optional[float] = None, **kwargs) -> Mapping[str, viam.utils.ValueTypes]
:abstractmethod:
:async:
Send/receive arbitrary commands.
::
service = SERVICE.from_robot(robot=machine, name="builtin")  # replace SERVICE with the appropriate class

my_command = {
    "cmnd": "dosomething",
    "someparameter": 52
}

# Can be used with any resource, using the motion service as an example
await service.do_command(command=my_command)
:param command: The command to execute
:type command: Dict[str, ValueTypes]
:returns: Result of the executed command
:rtype: Dict[str, ValueTypes]
.. py:method:: get_resource_name(name: str) -> viam.proto.common.ResourceName
:classmethod:
Get the ResourceName for this Resource with the given name
::
# Can be used with any resource, using an arm as an example
my_arm_name = Arm.get_resource_name("my_arm")
:param name: The name of the Resource
:type name: str
:returns: The ResourceName of this Resource
:rtype: ResourceName
.. py:method:: get_operation(kwargs: Mapping[str, Any]) -> viam.operations.Operation
Get the ``Operation`` associated with the currently running function.
When writing custom resources, you should get the ``Operation`` by calling this function and check to see if it's cancelled.
If the ``Operation`` is cancelled, then you can perform any necessary cleanup (terminating long-running tasks, cleaning up connections, etc.).
:param kwargs: The kwargs object containing the operation
:type kwargs: Mapping[str, Any]
:returns: The operation associated with this function
:rtype: viam.operations.Operation
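The cancellation check described above can be sketched with a stand-in operation object (``FakeOperation`` is hypothetical; a real custom resource would call ``self.get_operation(kwargs)`` and poll ``await operation.is_cancelled()`` inside its long-running loop):

```python
import asyncio

class FakeOperation:
    """Minimal stand-in for viam.operations.Operation, for illustration only."""
    def __init__(self):
        self._cancelled = asyncio.Event()

    def cancel(self):
        self._cancelled.set()

    async def is_cancelled(self) -> bool:
        return self._cancelled.is_set()

async def long_running_task(operation) -> int:
    """Do work in small steps, checking for cancellation between steps."""
    steps = 0
    for _ in range(100):
        if await operation.is_cancelled():
            break  # stop early: terminate work, clean up connections, etc.
        steps += 1
        await asyncio.sleep(0)
        if steps == 5:
            operation.cancel()  # simulate an external cancellation
    return steps

result = asyncio.run(long_running_task(FakeOperation()))
print(result)  # 5
```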
.. py:method:: close()
:async:
Safely shut down the resource and prevent further use.
Close must be idempotent. Later configuration may allow a resource to be "open" again.
If a resource does not want or need a close function, it is assumed that the resource does not need to return errors when future
non-Close methods are called.
::
await component.close()
.. py:class:: MLModel(name: str, *, logger: Optional[logging.Logger] = None)
Bases: :py:obj:`viam.services.service_base.ServiceBase`
MLModel represents a Machine Learning Model service.
This acts as an abstract base class for any drivers representing specific
ML model implementations. This cannot be used on its own. If the ``__init__()`` function is
overridden, it must call the ``super().__init__()`` function.
For more information, see the ML model service documentation.
.. py:attribute:: API
:type: Final
The API of the Resource
.. py:method:: infer(input_tensors: Dict[str, numpy.typing.NDArray], *, extra: Optional[Mapping[str, viam.utils.ValueTypes]] = None, timeout: Optional[float] = None) -> Dict[str, numpy.typing.NDArray]
:abstractmethod:
:async:
Take an already ordered input tensor as an array, make an inference on the model, and return an output tensor map.
::
import numpy as np
my_mlmodel = MLModelClient.from_robot(robot=machine, name="my_mlmodel_service")
image_data = np.zeros((1, 384, 384, 3), dtype=np.uint8)
# Create the input tensors dictionary
input_tensors = {
    "image": image_data
}
output_tensors = await my_mlmodel.infer(input_tensors)
:param input_tensors: A dictionary of input flat tensors as specified in the metadata
:type input_tensors: Dict[str, NDArray]
:returns: A dictionary of output flat tensors as specified in the metadata
:rtype: Dict[str, NDArray]
For more information, see the ML model service documentation.
.. py:method:: metadata(*, extra: Optional[Mapping[str, viam.utils.ValueTypes]] = None, timeout: Optional[float] = None) -> viam.proto.service.mlmodel.Metadata
:abstractmethod:
:async:
Get the metadata (such as name, type, expected tensor/array shape, inputs, and outputs) associated with the ML model.
::
my_mlmodel = MLModelClient.from_robot(robot=machine, name="my_mlmodel_service")
metadata = await my_mlmodel.metadata()
:returns: The metadata
:rtype: Metadata
For more information, see the ML model service documentation.
.. py:method:: from_robot(robot: viam.robot.client.RobotClient, name: str) -> typing_extensions.Self
:classmethod:
Get the service named ``name`` from the provided robot.
::
async def connect() -> RobotClient:
    # Replace "<API-KEY>" (including brackets) with your API key and "<API-KEY-ID>" with your API key ID
    options = RobotClient.Options.with_api_key("<API-KEY>", "<API-KEY-ID>")
    # Replace "<MACHINE-ADDRESS>" (including brackets) with your machine's connection URL or FQDN
    return await RobotClient.at_address("<MACHINE-ADDRESS>", options)

async def main():
    robot = await connect()
    # Can be used with any resource, using the motion service as an example
    motion = MotionClient.from_robot(robot=robot, name="builtin")
    await robot.close()
:param robot: The robot
:type robot: RobotClient
:param name: The name of the service
:type name: str
:returns: The service, if it exists on the robot
:rtype: Self
.. py:method:: do_command(command: Mapping[str, viam.utils.ValueTypes], *, timeout: Optional[float] = None, **kwargs) -> Mapping[str, viam.utils.ValueTypes]
:abstractmethod:
:async:
Send/receive arbitrary commands.
::
service = SERVICE.from_robot(robot=machine, name="builtin")  # replace SERVICE with the appropriate class

my_command = {
    "cmnd": "dosomething",
    "someparameter": 52
}

# Can be used with any resource, using the motion service as an example
await service.do_command(command=my_command)
:param command: The command to execute
:type command: Dict[str, ValueTypes]
:returns: Result of the executed command
:rtype: Dict[str, ValueTypes]
.. py:method:: get_resource_name(name: str) -> viam.proto.common.ResourceName
:classmethod:
Get the ResourceName for this Resource with the given name
::
# Can be used with any resource, using an arm as an example
my_arm_name = Arm.get_resource_name("my_arm")
:param name: The name of the Resource
:type name: str
:returns: The ResourceName of this Resource
:rtype: ResourceName
.. py:method:: get_operation(kwargs: Mapping[str, Any]) -> viam.operations.Operation
Get the ``Operation`` associated with the currently running function.
When writing custom resources, you should get the ``Operation`` by calling this function and check to see if it's cancelled.
If the ``Operation`` is cancelled, then you can perform any necessary cleanup (terminating long-running tasks, cleaning up connections, etc.).
:param kwargs: The kwargs object containing the operation
:type kwargs: Mapping[str, Any]
:returns: The operation associated with this function
:rtype: viam.operations.Operation
.. py:method:: close()
:async:
Safely shut down the resource and prevent further use.
Close must be idempotent. Later configuration may allow a resource to be "open" again.
If a resource does not want or need a close function, it is assumed that the resource does not need to return errors when future
non-Close methods are called.
::
await component.close()
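The idempotency requirement can be sketched with a guard flag; ``Resource`` here is a hypothetical stand-in, not an SDK class:

```python
import asyncio

class Resource:
    """Sketch of an idempotent close(): calling it repeatedly is safe."""
    def __init__(self):
        self.closed = False
        self.close_calls = 0

    async def close(self):
        self.close_calls += 1
        if self.closed:
            return  # already closed: do nothing, raise nothing
        self.closed = True  # real code would release connections/tasks here

r = Resource()
asyncio.run(r.close())
asyncio.run(r.close())  # second call is a harmless no-op
print(r.closed, r.close_calls)  # True 2
```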