Nabrio Help


Feature Matching

Slot Usage: 3

Overview

The Feature Matching node finds local feature correspondences between the input frame and one or more reference images by detecting keypoints in each image and matching their descriptors.

Use this node to identify known objects or patterns regardless of scale or rotation changes, verify whether a specific part or template is present, or measure feature alignment quality.

Input

Input Image

image required

The live frame to match against. Connect this to a camera or upstream image output.

Reference Frames

array required

One or more reference images to match against. Each entry is compared independently and produces its own match result.

Detector

string

Algorithm used to detect and describe local features in each image.

Values:

  • ORB (default) — fast binary descriptor; good general-purpose choice for real-time flows.
  • SIFT — scale and rotation invariant; more accurate but slower than ORB.
  • AKAZE — fast and scale-invariant; a good balance between speed and robustness.
  • BRISK — binary descriptor optimized for speed on lower-power hardware.

Matcher

string required

Strategy used to pair descriptors between the input and reference images.

Values:

  • DEFAULT — automatically selects the appropriate matcher for the chosen detector.
  • BRUTEFORCE — exhaustively compares all descriptor pairs; most accurate.
  • FLANNBASED — approximate nearest-neighbour search; faster for large descriptor sets.

Match Number Threshold

integer required

Minimum number of passing matches required for a reference to be considered a successful match. If the number of good matches is below this value, the match is treated as a failure.

Match Ratio Threshold

number required

Lowe's ratio test threshold applied to filter ambiguous matches. A match is kept only when the best match is significantly closer than the second-best match. Lower values are stricter (fewer but more reliable matches); typical values are 0.7–0.8.
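The ratio test itself is simple: for each keypoint, compare the distance to its best match against the distance to its second-best match. A minimal illustration on plain distance pairs:

```python
def ratio_filter(knn_pairs, ratio=0.75):
    """Lowe's ratio test.

    knn_pairs: list of (best_distance, second_best_distance) tuples,
    with best_distance <= second_best_distance.
    Keeps a pair only when the best match is clearly closer
    than the runner-up.
    """
    return [p for p in knn_pairs if p[0] < ratio * p[1]]

# (10, 50) and (5, 100) are unambiguous and survive;
# (30, 32) is ambiguous and is discarded.
kept = ratio_filter([(10, 50), (30, 32), (5, 100)], ratio=0.75)
```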

Output

Overlay Image

image

Output frame showing lines connecting matched keypoints between the input and reference images.

Matches

array

Array of match result objects, one per reference image. Each object contains:

  • referenceIndex — index of the matched reference in the Reference Frames list.
  • matchCount — number of good matches found.
  • isMatched — whether matchCount met or exceeded Match Number Threshold.
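Downstream logic typically scans this array for the best passing reference. A small sketch of consuming the documented fields (the helper name is hypothetical):

```python
def best_reference(matches):
    """Return the referenceIndex of the strongest passing match,
    or None when no reference passed the threshold."""
    hits = [m for m in matches if m["isMatched"]]
    if not hits:
        return None
    return max(hits, key=lambda m: m["matchCount"])["referenceIndex"]

# Example Matches payload with two reference images
matches = [
    {"referenceIndex": 0, "matchCount": 4,  "isMatched": False},
    {"referenceIndex": 1, "matchCount": 23, "isMatched": True},
]
```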

