Mech-DLK SDK C# API 2.0.2
C# API reference documentation for secondary development with Mech-DLK
MMind.DL.InferEngine Class Reference

This class defines the inference engine.

Public Member Functions

 InferEngine ()
 Initializes a new instance of the InferEngine class.
 
StatusCode Create (string packPath, BackendType backendType=BackendType.GpuDefault, uint deviceId=0)
 Create an inference engine.
 
StatusCode SetBatchSizeAndFloatPrecision (uint batchSize, FloatPrecisionType floatPrecision, uint moduleIdx)
 Set the batch size and floating-point precision of the inference engine.
 
List< DLAlgoType > GetModuleTypes ()
 Get the list of module types in the model package.
 
StatusCode Infer (List< MMindImage > images)
 Perform inference on images using the model package inference engine.
 
StatusCode GetResults (out List< Result > results)
 Get the model inference result.
 
StatusCode ResultVisualization (List< MMindImage > images)
 Draw all the model results onto the images.
 
StatusCode ModuleResultVisualization (List< MMindImage > images, uint moduleIndex)
 Draw the results of the model at the specified index onto the images.
 
void Release ()
 Release the memory of the inference engine.
 

Detailed Description

This class defines the inference engine.

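As an orientation, the sketch below strings the public members of this class together into a typical inference flow. It is only a sketch: the model package path is a placeholder, the way the MMindImage list is filled depends on the MMindImage class and is not shown, and the using directive assumes that the related types (StatusCode, BackendType, MMindImage, Result) live in the MMind.DL namespace.

    using System;
    using System.Collections.Generic;
    using MMind.DL;

    class InferEngineDemo
    {
        static void Main()
        {
            // Create the engine from a model package exported from Mech-DLK.
            var engine = new InferEngine();
            StatusCode status = engine.Create(@"C:\models\model_package.dlkpack");  // placeholder path
            Console.WriteLine($"Create: {status}");

            // Prepare the input images (loading is done through MMindImage and is not shown here).
            var images = new List<MMindImage>();

            // Run inference, fetch the results, and draw them onto the images.
            Console.WriteLine($"Infer: {engine.Infer(images)}");
            engine.GetResults(out List<Result> results);
            Console.WriteLine($"Module results: {results.Count}");
            engine.ResultVisualization(images);

            // Release the engine once inference is finished.
            engine.Release();
        }
    }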
Constructor & Destructor Documentation

◆ InferEngine()

MMind.DL.InferEngine.InferEngine ( )

Initializes a new instance of the InferEngine class.

Member Function Documentation

◆ Create()

StatusCode MMind.DL.InferEngine.Create ( string packPath, BackendType backendType = BackendType.GpuDefault, uint deviceId = 0 )

Create an inference engine.

Parameters
packPath: The path to the model package exported from Mech-DLK.
backendType: See BackendType for details.
deviceId: The index of the specified GPU during model inference.
Returns
See StatusCode for details.
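A minimal call sketch (the package path is a placeholder; the returned status is simply printed rather than compared against a particular StatusCode member):

    var engine = new InferEngine();

    // Load the model package on GPU 0 with the default GPU backend.
    StatusCode status = engine.Create(
        packPath: @"C:\models\model_package.dlkpack",   // placeholder path
        backendType: BackendType.GpuDefault,
        deviceId: 0);
    Console.WriteLine(status);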

◆ GetModuleTypes()

List< DLAlgoType > MMind.DL.InferEngine.GetModuleTypes ( )

Get the list of module types in the model package.

Returns
See DLAlgoType for details.
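For example, the modules of a cascaded model package can be listed as follows (assuming an engine on which Create has already succeeded):

    // Print the algorithm type of every module in the loaded model package.
    List<DLAlgoType> moduleTypes = engine.GetModuleTypes();
    for (int i = 0; i < moduleTypes.Count; i++)
        Console.WriteLine($"Module {i}: {moduleTypes[i]}");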

◆ GetResults()

StatusCode MMind.DL.InferEngine.GetResults ( out List< Result > results )

Get the model inference result.

Parameters
results: See Result for details.
Returns
See StatusCode for details.
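A fragment that retrieves the results produced by a preceding Infer call; the fields available on each Result depend on the module's algorithm type, so only the count is printed here:

    StatusCode status = engine.GetResults(out List<Result> results);
    Console.WriteLine($"{status}: {results.Count} module result(s)");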

◆ Infer()

StatusCode MMind.DL.InferEngine.Infer ( List< MMindImage > images )

Perform inference on images using the model package inference engine.

Parameters
images: See MMindImage for details.
Returns
See StatusCode for details.
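A fragment continuing from the Create example above. LoadImages is a hypothetical placeholder for whatever image-loading code your application uses to obtain MMindImage instances; it is not part of the SDK.

    // LoadImages is a hypothetical helper, not an SDK call.
    List<MMindImage> images = LoadImages();

    StatusCode status = engine.Infer(images);
    Console.WriteLine(status);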

◆ ModuleResultVisualization()

StatusCode MMind.DL.InferEngine.ModuleResultVisualization ( List< MMindImage > images, uint moduleIndex )

Draw the results of the model at the specified index onto the images.

Parameters
images: See MMindImage for details.
moduleIndex: The index of the specified model in the model package.
Returns
See StatusCode for details.
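For instance, to overlay only the results of the first module on the images that were passed to Infer (assuming the engine and images from the earlier examples):

    // Draw only the results of module 0 onto the input images.
    StatusCode status = engine.ModuleResultVisualization(images, moduleIndex: 0);
    Console.WriteLine(status);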

◆ Release()

void MMind.DL.InferEngine.Release ( )

Release the memory of the inference engine.
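One way to make sure the engine's resources are released even when inference throws is a try/finally pattern; this is a usage suggestion, not something the SDK requires:

    var engine = new InferEngine();
    try
    {
        engine.Create(@"C:\models\model_package.dlkpack");  // placeholder path
        // ... Infer, GetResults, visualization ...
    }
    finally
    {
        // Always release the engine's memory when done.
        engine.Release();
    }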

◆ ResultVisualization()

StatusCode MMind.DL.InferEngine.ResultVisualization ( List< MMindImage > images )

Draw all the model results onto the images.

Parameters
images: See MMindImage for details.
Returns
See StatusCode for details.
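A fragment that runs after a successful Infer call on the same images (engine and images as in the earlier examples); displaying or saving the annotated images afterwards is handled by the MMindImage class and is not shown here:

    // Overlay the results of every module onto the input images.
    StatusCode status = engine.ResultVisualization(images);
    Console.WriteLine(status);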

◆ SetBatchSizeAndFloatPrecision()

StatusCode MMind.DL.InferEngine.SetBatchSizeAndFloatPrecision ( uint batchSize, FloatPrecisionType floatPrecision, uint moduleIdx )

Set the batch size and floating-point precision of the inference engine.

Parameters
batchSize: The batch size of the model package.
floatPrecision: See FloatPrecisionType for details.
moduleIdx: The index of the specified model in the model package.
Returns
See StatusCode for details.
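A configuration sketch for the first module of the package. The FloatPrecisionType member name used below (Fp16) is an assumption for illustration; check the FloatPrecisionType enumeration in your SDK version for the actual names.

    // Run module 0 with batch size 4 and (assumed) half-precision floats.
    StatusCode status = engine.SetBatchSizeAndFloatPrecision(
        batchSize: 4,
        floatPrecision: FloatPrecisionType.Fp16,  // assumed member name
        moduleIdx: 0);
    Console.WriteLine(status);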
