Package ej.microai

Class MLInferenceEngine

  • All Implemented Interfaces:
    java.io.Closeable, java.lang.AutoCloseable

    public class MLInferenceEngine
    extends java.lang.Object
    implements java.io.Closeable
    The MLInferenceEngine class provides essential functionality for interacting with Machine Learning models, specifically for running inferences.

    It allows users to load a trained model from a resource (using MLInferenceEngine(String)) or from an InputStream (using MLInferenceEngine(InputStream)).

    Users can set input tensors, execute inference, and retrieve output tensor values.

    An MLInferenceEngine allocates native resources when it is opened. It should be closed with the close() method in order to free these native allocations.
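
    The sketch below illustrates a typical inference flow. It is a minimal illustration, not part of this API: the "/model.tflite" resource path is an assumption, and the setInputData and getOutputData calls stand in for the InputTensor and OutputTensor methods used to exchange data with the model (see the documentation of those classes for the actual API).

        import ej.microai.InputTensor;
        import ej.microai.MLInferenceEngine;
        import ej.microai.OutputTensor;

        public class InferenceExample {
            public static void main(String[] args) {
                // "/model.tflite" is an assumed application resource path.
                try (MLInferenceEngine engine = new MLInferenceEngine("/model.tflite")) {
                    InputTensor input = engine.getInputTensor(0);
                    float[] inputData = new float[] {0.1f, 0.2f, 0.3f}; // example values; real inputs depend on the model
                    input.setInputData(inputData); // assumed InputTensor method

                    if (engine.run() == 0) {
                        OutputTensor output = engine.getOutputTensor(0);
                        float[] outputData = new float[10]; // size depends on the model's output shape
                        output.getOutputData(outputData); // assumed OutputTensor method
                        // Post-process outputData here.
                    }
                }
            }
        }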

    • Constructor Summary

      Constructors 
      Constructor Description
      MLInferenceEngine​(java.io.InputStream is)
      Initializes the model from a given InputStream.
      MLInferenceEngine​(java.lang.String modelPath)
      Initializes the model from a pre-trained model resource.
    • Method Summary

      Modifier and Type Method Description
      void close()
      Closes this model and its associated resources.
      InputTensor getInputTensor​(int index)
      Gets the input tensor specified by the index.
      int getInputTensorCount()
      Gets the number of input Tensors.
      OutputTensor getOutputTensor​(int index)
      Gets the output tensor specified by the index.
      int getOutputTensorCount()
      Gets the number of output Tensors.
      boolean isClosed()
      Returns whether this model has been closed.
      int reset()
      Resets the state of the model interpreter.
      int run()
      Runs an inference on the model.
      • Methods inherited from class java.lang.Object

        clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
    • Constructor Detail

      • MLInferenceEngine

        public MLInferenceEngine​(java.lang.String modelPath)
        Initializes the model from a pre-trained model resource.

        Maps the model into a usable data structure, builds an interpreter to run the model, and allocates memory for the model's tensors.

        Parameters:
        modelPath - the path of the pre-trained model resource.
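
        A minimal sketch of constructing an engine from a model resource; the "/model.tflite" resource path is an assumption:

            // "/model.tflite" is an assumed application resource path.
            MLInferenceEngine engine = new MLInferenceEngine("/model.tflite");
            try {
                // Set inputs, run inferences, read outputs...
            } finally {
                engine.close(); // free the native resources
            }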
      • MLInferenceEngine

        public MLInferenceEngine​(java.io.InputStream is)
                          throws java.io.IOException
        Initializes the model from a given InputStream.

        This constructor blocks until the model has been completely retrieved on the native side from the input stream, or until an IOException occurs.

        Maps the model into a usable data structure, builds an interpreter to run the model, and allocates memory for the model's tensors.

        Parameters:
        is - the input stream to read the model from.
        Throws:
        java.io.IOException - if an I/O error occurs while reading the model from the input stream.
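
        A minimal sketch of constructing an engine from an InputStream; the class and resource name used to open the stream are assumptions:

            // Open a stream on an assumed model resource and build the engine from it.
            try (java.io.InputStream is = MyApp.class.getResourceAsStream("/model.tflite"); // hypothetical class and resource
                 MLInferenceEngine engine = new MLInferenceEngine(is)) {
                // Set inputs, run inferences, read outputs...
            } catch (java.io.IOException e) {
                // The model could not be read from the stream.
            }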
    • Method Detail

      • reset

        public int reset()
        Resets the state of the model interpreter.
        Returns:
        0 if the execution succeeds, 1 otherwise.
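
        A short sketch of checking the return code; calling reset() between independent inferences may be useful for models that keep internal state:

            // engine is an open MLInferenceEngine instance.
            if (engine.reset() != 0) {
                // The interpreter state could not be reset; handle the error.
            }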
      • run

        public int run()
        Runs an inference on the model.
        Returns:
        0 if the execution succeeds, 1 otherwise.
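
        A short sketch of running an inference and checking the return code:

            // engine is an open MLInferenceEngine instance with its input tensors already set.
            int status = engine.run();
            if (status != 0) {
                // The inference failed; handle the error.
            }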
      • getInputTensor

        public InputTensor getInputTensor​(int index)
        Gets the input tensor specified by the index.
        Parameters:
        index - the index of the tensor; usually index = 0, except for models that have more than one input tensor.
        Returns:
        the input tensor specified by index.
        Throws:
        java.lang.IllegalArgumentException - if index is invalid.
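
        A short sketch for a single-input model; the setInputData call is an assumed InputTensor method, shown only to illustrate where input data is provided:

            // engine is an open MLInferenceEngine instance.
            InputTensor input = engine.getInputTensor(0); // single-input model
            float[] inputData = new float[] {0.1f, 0.2f, 0.3f}; // example values
            input.setInputData(inputData); // assumed InputTensor method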
      • getOutputTensor

        public OutputTensor getOutputTensor​(int index)
        Gets the output tensor specified by the index.
        Parameters:
        index - the index of the tensor; usually index = 0, except for models that have more than one output tensor.
        Returns:
        the output tensor specified by index.
        Throws:
        java.lang.IllegalArgumentException - if index is invalid.
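
        A short sketch for a single-output model; the getOutputData call is an assumed OutputTensor method, shown only to illustrate where output data is read:

            // engine is an open MLInferenceEngine instance on which run() has completed.
            OutputTensor output = engine.getOutputTensor(0); // single-output model
            float[] outputData = new float[10]; // size depends on the model's output shape
            output.getOutputData(outputData); // assumed OutputTensor method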
      • getInputTensorCount

        public int getInputTensorCount()
        Gets the number of input Tensors.
        Returns:
        the number of input Tensors.
      • getOutputTensorCount

        public int getOutputTensorCount()
        Gets the number of output Tensors.
        Returns:
        the number of output Tensors.
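
        A short sketch that iterates over all input and output tensors using the tensor counts:

            // engine is an open MLInferenceEngine instance.
            for (int i = 0; i < engine.getInputTensorCount(); i++) {
                InputTensor input = engine.getInputTensor(i);
                // Inspect or fill each input tensor...
            }
            for (int i = 0; i < engine.getOutputTensorCount(); i++) {
                OutputTensor output = engine.getOutputTensor(i);
                // Inspect or read each output tensor...
            }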
      • isClosed

        public boolean isClosed()
        Returns whether this model has been closed.
        Returns:
        whether this model has been closed.
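
        A short sketch guarding against use after close:

            // engine is an MLInferenceEngine instance that may have been closed elsewhere.
            if (!engine.isClosed()) {
                engine.run();
            }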
      • close

        public void close()
        Closes this model and its associated resources.

        This method releases the native resources allocated when opening this model. Calling this method on a model that has already been closed has no effect.

        Specified by:
        close in interface java.lang.AutoCloseable
        Specified by:
        close in interface java.io.Closeable
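
        Because MLInferenceEngine implements AutoCloseable, a try-with-resources statement releases the native resources automatically; the "/model.tflite" resource path below is an assumption:

            try (MLInferenceEngine engine = new MLInferenceEngine("/model.tflite")) { // assumed resource path
                engine.run();
            } // close() is invoked automatically here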