public class Tensor extends Object
The Tensor class offers services to deal with MicroAI tensors. Tensors are the input/output of your MLInferenceEngine.
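For illustration, a minimal sketch of inspecting a tensor with the accessors documented below. How the Tensor instance is obtained from the MLInferenceEngine, and the ej.microai package name, are assumptions here and may differ in your environment.

```java
import ej.microai.Tensor; // package name assumed; adjust to your MicroAI library

public class TensorInspector {

	// Prints the basic properties of a tensor, using only the accessors
	// documented on this page. The Tensor is assumed to have been obtained
	// from an MLInferenceEngine (as one of its input or output tensors).
	public static void describe(Tensor tensor) {
		System.out.println("Data type code: " + tensor.getDataType());
		System.out.println("Dimensions:     " + tensor.getNumberDimensions());
		System.out.println("Elements:       " + tensor.getNumberElements());
		System.out.println("Bytes:          " + tensor.getNumberBytes());
		System.out.println("Quantized:      " + tensor.isQuantized());
	}
}
```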
Modifier and Type | Class and Description |
---|---|
static class | Tensor.DataType: The Tensor.DataType class enumerates the MicroAI data types. |
static class | Tensor.QuantizationParameters: The QuantizationParameters class represents the quantization parameters used by an MLInferenceEngine. |
Modifier and Type | Method and Description |
---|---|
int | getDataType(): Gets the tensor data type. |
int | getNumberBytes(): Gets the number of bytes of the tensor. |
int | getNumberDimensions(): Gets the number of dimensions of the tensor. |
int | getNumberElements(): Gets the number of elements of the tensor. |
Tensor.QuantizationParameters | getQuantizationParams(): Gets the parameters of asymmetric quantization for the tensor. |
void | getShape(int[] sizes): Gets the tensor shape. |
boolean | isQuantized(): Gets the quantization status of the tensor. |
public int getDataType()
Gets the tensor data type (see Tensor.DataType).

public int getNumberBytes()
Gets the number of bytes of the tensor.

public int getNumberDimensions()
Gets the number of dimensions of the tensor.

public int getNumberElements()
Gets the number of elements of the tensor.
public Tensor.QuantizationParameters getQuantizationParams()
Gets the parameters of asymmetric quantization for the tensor (see Tensor.QuantizationParameters). Real values can be quantized using: quantized_value = real_value / scale + zero_point.
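As a hedged sketch, the formula above can be applied as follows. The variable tensor stands for an already obtained Tensor, and the accessor names getScale() and getZeroPoint() on Tensor.QuantizationParameters are assumptions, not part of this page.

```java
// Sketch: quantize a real value for a quantized tensor.
// getScale()/getZeroPoint() are assumed accessor names on
// Tensor.QuantizationParameters and may differ in the actual API.
if (tensor.isQuantized()) {
	Tensor.QuantizationParameters params = tensor.getQuantizationParams();
	float scale = params.getScale();       // assumed accessor
	int zeroPoint = params.getZeroPoint(); // assumed accessor

	float realValue = 0.42f;
	// quantized_value = real_value / scale + zero_point
	int quantizedValue = Math.round(realValue / scale) + zeroPoint;
	System.out.println("Quantized value: " + quantizedValue);
}
```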
public void getShape(int[] sizes)
Gets the tensor shape. The n-th element of the array corresponds to the n-th dimension of the tensor.
Parameters: sizes - an array that contains the size of each dimension of the tensor.
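A short usage sketch: the sizes array is presumably allocated from getNumberDimensions() before the call (an assumption, since this page does not state the required length).

```java
// Read the shape of a tensor into a freshly allocated array.
static int[] shapeOf(Tensor tensor) {
	int[] sizes = new int[tensor.getNumberDimensions()]; // one slot per dimension (assumed sufficient)
	tensor.getShape(sizes); // n-th slot receives the size of the n-th dimension
	return sizes;
}
```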
public boolean isQuantized()
Gets the quantization status of the tensor. Returns true if the tensor is quantized, false otherwise.