Feature Matching
Overview
The Feature Matching node finds local feature correspondences between the input frame and one or more reference images by detecting keypoints in each image and matching their descriptors.
Use this node to identify known objects or patterns regardless of scale or rotation changes, verify whether a specific part or template is present, or measure feature alignment quality.
Input
Input Image
image, required — The live frame to match against. Connect this to a camera or upstream image output.
Reference Frames
array, required — One or more reference images to match against. Each entry is compared independently and produces its own match result.
Detector
string — Algorithm used to detect and describe local features in each image.
Values:
- ORB (default) — fast binary descriptor; a good general-purpose choice for real-time flows.
- SIFT — scale- and rotation-invariant; more accurate but slower than ORB.
- AKAZE — fast and scale-invariant; a good balance between speed and robustness.
- BRISK — binary descriptor optimized for speed on lower-power hardware.
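The detector you pick determines the descriptor type, which in turn constrains how descriptors should be compared: ORB, AKAZE, and BRISK emit binary descriptors (compared with Hamming distance), while SIFT emits floating-point descriptors (compared with Euclidean distance). A minimal illustrative sketch of that mapping (the helper below is not part of the node's API):

```python
# Illustrative mapping, not part of the node's API: each Detector value
# paired with the distance metric its descriptors are conventionally
# compared with (binary -> Hamming, floating-point -> L2).
DETECTOR_METRIC = {
    "ORB": "hamming",
    "AKAZE": "hamming",
    "BRISK": "hamming",
    "SIFT": "l2",
}

def descriptor_metric(detector: str) -> str:
    """Return the distance metric conventionally used with a detector."""
    try:
        return DETECTOR_METRIC[detector]
    except KeyError:
        raise ValueError(f"unknown detector: {detector!r}")
```

This is why mixing, say, SIFT descriptors with a Hamming-based matcher produces meaningless distances: the metric must match the descriptor type.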
Matcher
string, required — Strategy used to pair descriptors between the input and reference images.
Values:
- DEFAULT — automatically selects the appropriate matcher for the chosen detector.
- BRUTEFORCE — exhaustively compares all descriptor pairs; most accurate.
- FLANNBASED — approximate nearest-neighbour search; faster for large descriptor sets.
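One plausible way DEFAULT could resolve, assuming the conventional pairing of binary descriptors with brute-force Hamming matching and floating-point descriptors with FLANN; the node's actual selection logic is not documented here, so treat this as a sketch:

```python
def resolve_matcher(matcher: str, detector: str) -> str:
    # Hypothetical resolution of the DEFAULT matcher value: binary
    # descriptors (ORB/AKAZE/BRISK) pair naturally with brute-force
    # Hamming matching; SIFT's float descriptors suit FLANN's
    # approximate nearest-neighbour search.
    if matcher != "DEFAULT":
        return matcher  # explicit choice always wins
    binary_detectors = {"ORB", "AKAZE", "BRISK"}
    return "BRUTEFORCE" if detector in binary_detectors else "FLANNBASED"
```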
Match Number Threshold
integer, required — Minimum number of good matches required for a reference to be considered a successful match. If the number of good matches is below this value, the match is treated as a failure.
Match Ratio Threshold
number, required — Lowe's ratio test threshold used to filter ambiguous matches. For each keypoint, a candidate match is kept only when its best descriptor distance is smaller than this threshold times the distance to the second-best candidate. Lower values are stricter (fewer but more reliable matches); typical values are 0.7–0.8.
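The two thresholds compose: the ratio test filters individual candidate matches, then the count threshold decides whether the reference as a whole matched. A minimal sketch in plain Python, operating on (best, second-best) distance pairs as a nearest-neighbour search would return them (helper names are hypothetical):

```python
def good_matches(knn_pairs, ratio_threshold):
    """Apply Lowe's ratio test.

    knn_pairs: list of (best_distance, second_best_distance) tuples,
    one per query keypoint. A match passes only when the best candidate
    is clearly closer than the runner-up.
    """
    return [best for best, second in knn_pairs
            if second > 0 and best < ratio_threshold * second]

def is_matched(knn_pairs, ratio_threshold, match_number_threshold):
    """Reference matches when enough candidates survive the ratio test."""
    return len(good_matches(knn_pairs, ratio_threshold)) >= match_number_threshold
```

For example, with a ratio threshold of 0.75, a pair (15, 16) is rejected because the two candidates are nearly equidistant (an ambiguous match), while (5, 50) passes comfortably.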
Output
Overlay Image
image — Output frame showing lines connecting matched keypoints between the input and reference images.
Matches
array — Array of match result objects, one per reference image. Each object contains:
- referenceIndex — index of the matched reference in the Reference Frames list.
- matchCount — number of good matches found.
- isMatched — whether matchCount met or exceeded Match Number Threshold.
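Putting the output shape together: a sketch of how the Matches array could be assembled from a per-reference list of good-match counts (the function name and input shape are hypothetical, but the result objects follow the fields documented above):

```python
def build_matches(good_counts, match_number_threshold):
    """Assemble the Matches output array.

    good_counts: number of good matches found for each reference frame,
    in the same order as the Reference Frames input (assumed shape).
    """
    return [
        {
            "referenceIndex": i,
            "matchCount": count,
            "isMatched": count >= match_number_threshold,
        }
        for i, count in enumerate(good_counts)
    ]
```

So with a Match Number Threshold of 10, a reference that produced 12 good matches yields isMatched true, while one that produced 3 yields false.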