Plugin Quick Start Guide

This guide helps you get started with CVEDIA-RT plugins quickly.

What are Plugins?

CVEDIA-RT plugins are modular components that extend the platform's functionality. Each plugin provides specific capabilities like:

  • Input plugins - Connect to cameras, video files, and streams
  • Inference plugins - Run AI models on various hardware
  • Output plugins - Export data to different destinations
  • Processing plugins - Transform and analyze data
  • Tracking plugins - Track objects and analyze behavior

Basic Plugin Usage

1. Loading a Plugin

Plugins are discovered in the Plugins/ directory and loaded automatically when CVEDIA-RT starts.
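An illustrative layout of the discovery directory (plugin names are taken from the table below; the exact file names and extensions depend on your platform and build):

```
CVEDIA-RT/
├── Plugins/
│   ├── GStreamerReader.so    (.dll on Windows)
│   ├── Tracker.so
│   └── WriteData.so
└── ...
```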

2. Using Plugins in Lua Scripts

-- Get current instance
local instance = api.thread.getCurrentInstance()

-- Create a plugin instance; [PLUGIN_NAME] is a placeholder for the
-- factory name, e.g. api.factory.input or api.factory.inference
local pluginInstance = api.factory.[PLUGIN_NAME].create(instance, "MyPlugin")

-- Configure the plugin
pluginInstance:configure(config)

-- Use plugin methods
pluginInstance:someMethod()

3. Common Plugin Patterns

Input Plugin Example

-- Create input plugin
local input = api.factory.input.create(instance, "VideoInput")

-- Configure input source
input:setInputType("rtsp")
input:setInputSource("rtsp://camera.example.com/stream")

-- Start input
input:start()

Inference Plugin Example

-- Create inference plugin
local inference = api.factory.inference.create(instance, "AIInference")

-- Configure model
inference:setModel("models/yolov8.onnx")
inference:setDevice(0)  -- GPU 0

-- Run inference
local detections = inference:runInference(frameBuffer)

Output Plugin Example

-- Create output plugin
local output = api.factory.writedata.create(instance, "DataWriter")

-- Configure output
output:setOutputPath("output/")
output:setFormat("json")

-- Write data
output:writeData(analysisResults)

Plugin Categories

Most Common Plugins

  • IP Camera Input: GStreamerReader (set the RTSP URL)
  • Video File Input: FFmpegReader (set the file path)
  • NVIDIA GPU Inference: TensorRT, via Engines (set the model path)
  • Object Tracking: Tracker (enable tracking)
  • Data Export: WriteData (set the output format)
  • MQTT Integration: MQTT (configure the broker)

By Platform

Windows

  • All plugins except platform-specific ones
  • Use Screencap for desktop capture

Linux

  • All plugins available
  • Platform-specific plugins for embedded devices

NVIDIA Jetson

  • Linux plugin set applies
  • Use TensorRT (via Engines) for on-device GPU inference

Configuration Patterns

JSON Configuration

Most plugins accept JSON configuration:

{
  "pluginName": {
    "parameter1": "value1",
    "parameter2": 123,
    "nested": {
      "option": true
    }
  }
}

Lua Configuration

local config = {
  parameter1 = "value1",
  parameter2 = 123,
  nested = {
    option = true
  }
}

pluginInstance:configure(config)

Common Workflows

Basic Video Analytics Pipeline

-- 1. Input: Connect to camera
local input = api.factory.input.create(instance, "CameraInput")
input:setInputSource("rtsp://camera.ip/stream")

-- 2. Inference: Run AI model
local inference = api.factory.inference.create(instance, "ObjectDetection")
inference:setModel("models/person_detection.onnx")

-- 3. Tracking: Track detected objects
local tracker = api.factory.tracker.create(instance, "ObjectTracker")

-- 4. Analytics: Analyze in zones
local zone = api.factory.zone.create(instance, "AnalyticsZone")
zone:addZone("entrance", {x1=100, y1=100, x2=300, y2=300})

-- 5. Output: Export results
local output = api.factory.mqtt.create(instance, "MQTTOutput")
output:setBroker("mqtt://broker.example.com")

Multi-Camera Setup

-- Create multiple input instances
for i = 1, 4 do
  local input = api.factory.input.create(instance, "Camera" .. i)
  input:setInputSource("rtsp://camera" .. i .. ".example.com/stream")

  -- Each camera gets its own inference
  local inference = api.factory.inference.create(instance, "Inference" .. i)
  inference:setModel("models/detection.onnx")
end

Troubleshooting

Plugin Not Loading

  1. Check plugin is in Plugins/ directory
  2. Verify dependencies are available
  3. Check CVEDIA-RT logs for errors

Configuration Issues

  1. Validate JSON syntax
  2. Check parameter names match documentation
  3. Verify data types (string vs number vs boolean)
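
When a configuration is rejected, wrapping the call in Lua's pcall surfaces the error instead of halting the script. A minimal sketch using the configure method shown earlier (the exact error behavior depends on the plugin):

```lua
local config = {
  parameter1 = "value1",
  parameter2 = 123,
}

-- pcall catches any error raised by configure so it can be logged
local ok, err = pcall(function()
  pluginInstance:configure(config)
end)

if not ok then
  print("Plugin configuration failed: " .. tostring(err))
end
```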

Performance Issues

  1. Check hardware requirements
  2. Reduce input resolution or frame rate
  3. Use appropriate inference precision (FP16 vs FP32)

Next Steps

  1. Browse by category - Find plugins for your specific needs
  2. Check examples - Look at plugin-specific examples
  3. Read configuration docs - Understand all available options
  4. Join community - Get help from other users

See Also