
inferencemanaged Lua Interface


get

Get a pointer to the InferenceManaged object

Arguments:
inferencemanaged ptr Pointer to InferenceManaged object
Returns:
inferencemanaged ptr Pointer to the cast InferenceManaged object

Example

ptr = inferencemanaged.get(ptr)


getConfig

Get the configuration object for the plugin

Arguments:
inferencemanaged ptr Pointer to InferenceManaged object
Returns:
table obj Map with configurations

Example

obj = inferencemanaged.getConfig(ptr)


getStats

Get the statistics object for the plugin

Arguments:
inferencemanaged ptr Pointer to InferenceManaged object
Returns:
table obj Map with statistics

Example

obj = inferencemanaged.getStats(ptr)
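
A fuller sketch, assuming ptr already holds the InferenceManaged pointer: the key names inside the returned statistics map are not documented here, so the loop simply prints whatever the plugin reports.

-- Dump the statistics map; the key names depend on the plugin.
local stats = inferencemanaged.getStats(ptr)
for key, value in pairs(stats) do
    print(key, value)
end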


saveConfig

Save the configuration object for the plugin

Arguments:
inferencemanaged ptr lua Pointer to InferenceManaged object
table new_config Map with configurations

Example

inferencemanaged.saveConfig(ptr,new_config)
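
A fuller sketch combining getConfig and saveConfig, assuming ptr already holds the InferenceManaged pointer. The key "threshold" is purely hypothetical; the real keys depend on the plugin configuration.

-- Read the configuration, change a value, and write it back.
local cfg = inferencemanaged.getConfig(ptr)
cfg.threshold = 0.5  -- hypothetical key, for illustration only
inferencemanaged.saveConfig(ptr, cfg)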


getName

Get the plugin name

Arguments:

Returns:
string name Plugin Name

Example

name = inferencemanaged.getName()


loadModel

Load an inference model on the plugin

Arguments:
inferencemanaged ptr Pointer to InferenceManaged object
string model Model file path to load
boolean disableExceptions When set to true, the function returns false upon failure instead of raising an error
Returns:
boolean success True if the model was loaded successfully

Example

success = inferencemanaged.loadModel(ptr,model,disableExceptions)
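
A minimal error-handling sketch, assuming ptr already holds the InferenceManaged pointer and using a placeholder model path: with disableExceptions set to true, a failed load can be detected from the return value.

-- Load a model without raising on failure; the path is a placeholder.
local ok = inferencemanaged.loadModel(ptr, "/path/to/model", true)
if not ok then
    print("model failed to load")
end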


loadModelFromConfig

Load an inference model on the plugin using the default model from the config

Arguments:
inferencemanaged ptr Pointer to InferenceManaged object
Returns:
boolean success True if the model was loaded successfully

Example

success = inferencemanaged.loadModelFromConfig(ptr)


getLoadedModelArch

Get the architecture of the currently loaded model

Arguments:
inferencemanaged ptr Pointer to InferenceManaged object
Returns:
string Current model architecture

Example

arch = inferencemanaged.getLoadedModelArch(ptr)


runInference

Run inference on a set of input data

Arguments:
inferencemanaged plugin Pointer to InferenceManaged object
InferenceSourceData data Region and source on which to run inference
Returns:
table result Output result

Example

result = inferencemanaged.runInference(plugin,data)


runInference

Run inference on a set of input data

Arguments:
inferencemanaged plugin Pointer to InferenceManaged object
InferenceSourceData[] data Regions and sources on which to run inference
Returns:
table result Output result

Example

result = inferencemanaged.runInference(plugin,data)


runInference

Run inference on a set of input data

Arguments:
inferencemanaged plugin Pointer to InferenceManaged object
Rect region Region in which to run inference
buffer source The source buffer to use for the Rect region
Returns:
table result Output result

Example

result = inferencemanaged.runInference(plugin,region,source)


runInference

Run inference on a set of input data

Arguments:
inferencemanaged plugin Pointer to InferenceManaged object
Rect[] regions Regions in which to run inference
buffer source The source buffer to use for all Rect regions
Returns:
table result Output result

Example

result = inferencemanaged.runInference(plugin,regions,source)
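
A sketch of the Rect[] overload, assuming plugin holds the InferenceManaged pointer and that regions (a list of Rect values) and source (a buffer) are provided by the host application; how those values are constructed is not documented in this interface, and the layout of the result table depends on the loaded model.

-- Run inference on several regions of one source buffer and dump the result.
local result = inferencemanaged.runInference(plugin, regions, source)
for key, value in pairs(result) do
    print(key, value)
end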


runInferenceAsync

Run inference on a set of input data asynchronously

Arguments:
inferencemanaged plugin Pointer to InferenceManaged object
InferenceSourceData data Region and source on which to run inference

Example

inferencemanaged.runInferenceAsync(plugin,data)


runInferenceAsync

Run inference on a set of input data asynchronously

Arguments:
inferencemanaged plugin Pointer to InferenceManaged object
InferenceSourceData[] data Regions and sources on which to run inference

Example

inferencemanaged.runInferenceAsync(plugin,data)


runInferenceAsync

Run inference on a set of input data asynchronously

Arguments:
inferencemanaged plugin Pointer to InferenceManaged object
Rect region Region in which to run inference
buffer source The source buffer to use for the Rect region

Example

inferencemanaged.runInferenceAsync(plugin,region,source)


runInferenceAsync

Run inference on a set of input data asynchronously

Arguments:
inferencemanaged plugin Pointer to InferenceManaged object
Rect[] regions Regions in which to run inference
buffer source The source buffer to use for all Rect regions

Example

inferencemanaged.runInferenceAsync(plugin,regions,source)


getAsyncInferenceResults

Get the async inference results of the active instance

Arguments:
inferencemanaged plugin Pointer to InferenceManaged object
Returns:
table result Output result

Example

result = inferencemanaged.getAsyncInferenceResults(plugin)
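
A sketch of the asynchronous call order, assuming plugin holds the InferenceManaged pointer and data is an InferenceSourceData value provided by the host application; whether getAsyncInferenceResults blocks until the job finishes is not documented here.

-- Submit work asynchronously and collect the results later.
inferencemanaged.runInferenceAsync(plugin, data)
-- ... other work can run here while inference is in progress ...
local result = inferencemanaged.getAsyncInferenceResults(plugin)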


getOutputs

Get a list of current job outputs

Arguments:
inferencemanaged ptr Pointer to InferenceManaged object
Returns:
table detections Map with job outputs

Example

detections = inferencemanaged.getOutputs(ptr)


getOutput

Returns a table pointing to a specific entry in the results

Arguments:
inferencemanaged ptr Pointer to InferenceManaged object
number id Target entry ID
Returns:
table result Output result

Example

result = inferencemanaged.getOutput(ptr,id)


getModelConfig

Returns a table with the configuration of the loaded model

Arguments:
inferencemanaged ptr Pointer to InferenceManaged object
Returns:
table result Configuration of the loaded model

Example

result = inferencemanaged.getModelConfig(ptr)


getOutputLen

Get the size of the output list

Arguments:
inferencemanaged ptr Pointer to InferenceManaged object
Returns:
integer Number of outputs found

Example

len = inferencemanaged.getOutputLen(ptr)
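
A sketch that walks the job outputs by index, assuming ptr holds the InferenceManaged pointer and that entry IDs are 1-based; the actual numbering scheme is not documented here.

-- Iterate over all job outputs using getOutputLen and getOutput.
local n = inferencemanaged.getOutputLen(ptr)
for id = 1, n do
    local entry = inferencemanaged.getOutput(ptr, id)  -- assumes 1-based IDs
    print(id, entry)
end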


inputBatchsize

Get the input batch size of the loaded model

Arguments:
inferencemanaged ptr Pointer to InferenceManaged object
Returns:
integer Input batch size

Example

batchsize = inferencemanaged.inputBatchsize(ptr)


inputWidth

Get the input width of the loaded model

Arguments:
inferencemanaged ptr Pointer to InferenceManaged object
Returns:
integer Input width

Example

width = inferencemanaged.inputWidth(ptr)


inputHeight

Get the input height of the loaded model

Arguments:
inferencemanaged ptr Pointer to InferenceManaged object
Returns:
integer Input height

Example

height = inferencemanaged.inputHeight(ptr)


inputChannels

Get the input channels of the loaded model

Arguments:
inferencemanaged ptr Pointer to InferenceManaged object
Returns:
integer Input channels

Example

channels = inferencemanaged.inputChannels(ptr)
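
A sketch that reads the full input shape of the loaded model, assuming ptr holds the InferenceManaged pointer.

-- Query the model input dimensions and print them as batch x channels x height x width.
local batch    = inferencemanaged.inputBatchsize(ptr)
local channels = inferencemanaged.inputChannels(ptr)
local height   = inferencemanaged.inputHeight(ptr)
local width    = inferencemanaged.inputWidth(ptr)
print(string.format("model input: %d x %d x %d x %d", batch, channels, height, width))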