HUSKYLENS 2 Hand Recognition Function Description
This article provides a comprehensive guide to the hand recognition function of HUSKYLENS 2, explaining its ability to detect 21 key points on the palm, learn, recognize, and track hand gestures. It includes detailed instructions on using the hand recognition feature, adjusting parameters for gesture recognition, and the process of exporting and importing models for gesture learning and recognition across different devices. The article aims to help users understand and fully utilize the hand recognition capabilities of HUSKYLENS 2 to enhance their projects involving gesture learning and recognition.
1.Introduction to Hand Recognition
This function can detect the palm and its 21 key points in the image, locate and visualize each palm key point, and support learning, recognizing, and tracking hand gestures. The 21 key points comprise 1 wrist point plus 4 key points on each finger (thumb, index, middle, ring, and little finger), corresponding to each finger's root, first segment, second segment, and fingertip.
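The 1 + 5×4 = 21 layout can be sketched in code. The indexing below (0 for the wrist, then four points per finger from root to fingertip) is a hypothetical convention chosen for illustration; the actual index order used by the HUSKYLENS 2 output may differ.

```python
# Illustrative only: a hypothetical 0-based ordering of the 21 hand key
# points (wrist first, then 4 points per finger, root to fingertip).
FINGERS = ["thumb", "index", "middle", "ring", "little"]
SEGMENTS = ["root", "first_segment", "second_segment", "fingertip"]

def keypoint_name(index: int) -> str:
    """Map a key-point index (0-20) to a readable name."""
    if index == 0:
        return "wrist"
    finger, segment = divmod(index - 1, 4)
    return f"{FINGERS[finger]}_{SEGMENTS[segment]}"

# 1 wrist + 5 fingers x 4 points = 21 key points in total
names = [keypoint_name(i) for i in range(21)]
```

This mirrors the count given above: one wrist point and four points per finger.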

2.Hand Recognition Usage Instructions
In this section, we will learn how to use HUSKYLENS 2 to detect hands and hand keypoints in the image, and train it to recognize specified hand gestures.
2.1 Selecting Hand Recognition Function
Power on HUSKYLENS 2. After it initializes successfully, locate and select the "Hand Recognition" function.
2.2 Observe Hand Detection Effect
Point HUSKYLENS 2 at an image containing a palm. When a palm is detected, the screen will display white bounding boxes enclosing all detected palms, with white dots marking the 21 key points on the palm.
After learning, if a pre-learned gesture is recognized, the screen will frame the gesture with a colored bounding box and display the gesture name, ID number, and confidence level above it.
For example, "Gesture: ID1 93%". Here, "Gesture" is the default name; "ID1" refers to the first gesture learned; "93%" is the confidence level, i.e., the model's estimated probability that the detected hand matches that learned gesture.
So "ID1 93%" means the model is 93% confident that the gesture is ID1. The same logic applies to any additional learned gestures.
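The on-screen result can be read as three fields: name, ID, and confidence. The small parser below is illustrative only; it assumes the exact "Name: IDn nn%" string format shown above, which is the display format, not necessarily what any device API returns.

```python
import re

def parse_result(label: str):
    """Split an on-screen result string such as 'Gesture: ID1 93%'
    into (name, gesture_id, confidence). Illustrative only: assumes
    the display format described in the text."""
    m = re.match(r"(.+?):\s*ID(\d+)\s+(\d+)%", label)
    if m is None:
        raise ValueError(f"unrecognized label: {label!r}")
    name, gesture_id, conf = m.groups()
    return name, int(gesture_id), int(conf) / 100.0

# "Gesture: ID1 93%" -> the model is 93% confident the gesture is ID1
name, gid, conf = parse_result("Gesture: ID1 93%")
```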
Multi-angle Learning: When learning a gesture, long-press Button-A to adjust the viewing angle of HUSKYLENS 2 and perform multi-angle learning for this gesture. Learning progress will be displayed during multi-angle learning.
3.Parameter Settings for Hand Gesture Recognition
The factory default parameters of HUSKYLENS 2 meet the requirements of basic functionality. For more refined control, individual parameters can be adjusted manually. All of the following parameters apply to the "Hand Recognition" function; therefore, ensure you have first entered the "Hand Recognition" mode, as shown in the figure.
3.1 Forgetting IDs
To forget all previously learned gestures: Step 1: Tap "Forget IDs" on the screen. Step 2: When the "Forget All IDs and Names" pop-up appears, tap "Yes" to confirm. Now point HUSKYLENS 2 at a previously learned gesture; if a white box appears on the screen without the gesture's ID, the "forget" action has completed.
3.2 Detection Threshold
The detection threshold controls the sensitivity of hand detection: a lower threshold makes the "determination of whether it is a hand" criterion more lenient (easily misidentifies non-hand shapes as hands but rarely misses real hands), while a higher threshold makes the criterion stricter (less prone to misjudgment but may miss true hands).
Setting steps: Click "Detection Threshold", and a parameter adjustment slider will appear above it. Sliding left reduces the value, and sliding right increases it. The effect is shown in the figure.
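Conceptually, the detection threshold acts as a score cut-off on candidate hand boxes, which can be sketched as follows. The scores and boxes are made-up illustrative data; the device applies this filtering internally.

```python
def filter_detections(detections, threshold):
    """Keep only candidate hand boxes whose detection score reaches the
    threshold. 'detections' is a list of (box, score) pairs. A lower
    threshold keeps more candidates (lenient: rarely misses real hands,
    but may accept hand-like shapes); a higher one keeps fewer (strict:
    fewer false positives, but may drop real hands)."""
    return [(box, score) for box, score in detections if score >= threshold]

candidates = [((10, 10, 60, 60), 0.92),   # a clear hand
              ((80, 20, 120, 70), 0.41)]  # an ambiguous, hand-like shape
lenient = filter_detections(candidates, 0.3)  # both boxes kept
strict = filter_detections(candidates, 0.6)   # ambiguous shape dropped
```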
3.3 Recognition Threshold
The "Recognition Threshold" controls the strictness of gesture recognition: a lower threshold makes the condition for matching a gesture in the screen to a previously learned gesture more lenient (prone to misrecognition but less likely to miss recognition), while a higher threshold makes the condition stricter (prone to missing recognition but less likely to misrecognize).
Setting steps: Click "Recognition Threshold", and a parameter adjustment slider will appear above it. Sliding left reduces the value, and sliding right increases it. The effect is shown in the figure.
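The recognition threshold can likewise be sketched as a cut-off, this time on how well a detected hand matches each learned gesture. The per-ID confidence values below are made-up; the point is that a hand whose best match falls below the threshold is still framed as a hand, but no gesture ID is reported.

```python
def recognize(confidences, threshold):
    """Given per-ID match confidences for one detected hand
    (e.g. {1: 0.93, 2: 0.05}), return the best-matching learned
    gesture ID, or None if even the best match falls below the
    recognition threshold."""
    best_id = max(confidences, key=confidences.get)
    if confidences[best_id] < threshold:
        return None  # detected as a hand, but matches no learned gesture
    return best_id

scores = {1: 0.93, 2: 0.05}
match_lenient = recognize(scores, 0.80)  # confident match: reports ID1
match_strict = recognize(scores, 0.95)   # stricter threshold: no ID reported
```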
3.4 NMS Threshold
NMS Threshold is a common parameter in visual recognition used to filter detection boxes. In visual recognition tasks, models often predict multiple overlapping detection boxes around the same target. Without filtering, a single object may be boxed by multiple overlapping frames. Adjusting the NMS threshold can remove overlapping duplicate boxes, keeping only the optimal one.
In simple terms, the NMS threshold determines how much overlap between two boxes counts as "duplicate". For example, a low threshold (e.g., 0.3) considers two boxes as duplicate if they slightly overlap, removing one. A high threshold (e.g., 0.7) requires significant overlap to be considered duplicate, leaving more boxes.
A high threshold works well in dense or occluded scenes, keeping boxes for hands that genuinely overlap each other, but may leave multiple boxes on a single hand. A low threshold suits clear single-hand scenes, ensuring each hand gets exactly one box, but may suppress one of two overlapping hands.
Setting steps: Click the "NMS Threshold" option, and a parameter adjustment slider will appear above it. Sliding left reduces the value; sliding right increases it.
3.5 Set Name
This parameter allows you to set names for the learned hand gestures, which can be in both Chinese and English.
Setting steps:
- Tap "Set Name".
- Slide up and down the ID number in the top-left corner to select which ID gesture to name.
- Tap the on-screen keyboard to enter the name (see left image for reference).
- After setting, tap the √ key in the bottom-right corner to save. A green checkmark will appear in the top-right corner once saved successfully.
3.6 Display Name
This parameter controls whether to display the name when a gesture is recognized; the default is display.
Setting steps: Click "Display Name" to toggle the switch above it. A blue switch indicates the on state, in which the gesture name is displayed (see left figure); a white switch indicates the off state, in which the gesture name is not displayed (see Figure 4-6-3-6).
3.7 Restore Defaults
This parameter restores all settings to their default states and forgets the learned IDs and names, but does not clear the exported models (see the following section for details on exporting models).
Setting steps: Click "Restore Defaults", and after the "Restore Default Configuration" pop-up window appears, click "Confirm".
3.8 Export Model
This parameter saves and exports the currently set parameters, learned IDs, and set names to the local memory of HUSKYLENS 2. It is applicable to scenarios such as migrating parameters to another HUSKYLENS 2. This operation does not require inserting a TF card.
Export steps: Click "Export Model". When the "Save Configuration To" pop-up appears, slide the number up and down to select which model to save as (up to 5 models can be saved), then click the "Confirm" button in the lower-left corner of the pop-up to save. After confirmation, the export will be automatic, as shown in the figure.
View Exported Model: After the "Exporting" pop-up disappears, you can view the exported model file on your computer.
First, connect HUSKYLENS 2 to your computer's USB port. Then, access HUSKYLENS 2's memory using the path shown in the following figure.
You will see two model-related files with extensions .json and .bin. The number before the extension is the "Model Number" you selected when saving the configuration. Both files can be copied and pasted to other places.
3.9 Import Model
This parameter allows importing the exported model from HUSKYLENS 2 (hereinafter referred to as "HUSKYLENS 2 A") to another HUSKYLENS 2 (hereinafter referred to as "HUSKYLENS 2 B"), so that HUSKYLENS 2 B can replicate the gestures learned by HUSKYLENS 2 A and the adjusted parameters without re-adjusting or re-learning.
Import steps:
Step 1: Connect HUSKYLENS 2 A to the computer and copy the exported files to the computer desktop.
Step 2: Connect HUSKYLENS 2 B to the computer and paste the files from the previous step into the specified folder of HUSKYLENS 2 B, as shown in the figure. (If the "hand-recognition" folder is not found, perform Step 3 first; importing a model creates the folder automatically. Then return to Step 2.)
Step 3: First, confirm that you have entered the "Hand Recognition" function. Then, click "Import Model". When the "Load Configuration" pop-up appears, slide the number up and down to select which model to load; this must match the number of the model file pasted in the previous step. For example, if the file pasted into HUSKYLENS 2 B is config1.json, select number 1. Finally, click "OK" to confirm the import.
Step 4: Wait for the "Loading" pop-up to disappear; the import is then completed. Next, verify if the parameters and learned gestures of HUSKYLENS 2 B are consistent with those of HUSKYLENS 2 A. The figure shows the recognition status of HUSKYLENS 2 B before and after importing the model.
