Example Code for FireBeetle 2 ESP32-E - Get Gesture Data
Last revision 2026/03/05
Learn how to use the FireBeetle 2 ESP32-E for gesture recognition with the AI Vision Sensor. This guide covers hardware and software setup, wiring diagrams, and Arduino code to obtain real-time gesture data, including the gesture ID, score, and keypoint coordinates.
Hardware Preparation
- FireBeetle 2 ESP32-E (SKU: DFR0654) ×1
- Gravity: AI Vision Posture and Gesture Sensor (SKU: SEN0670) ×1
- PH2.0-4P Cable ×1
- USB Data Cable ×1
Software Preparation
- Download Arduino IDE: Click to Download Arduino IDE
- Install SDK: Visit the FireBeetle 2 ESP32-E WIKI page for SDK installation instructions
- Download Arduino Library: Click to download DFRobot_HumanPose and refer to the guide: How to Install a Library?
Wiring Diagram
I2C Wiring Diagram (This connection method is used in the Sample Code)

Pin Connection Description:
- Sensor: + Pin --- (Connects to) --- Main Controller: 3V3
- Sensor: - Pin --- (Connects to) --- Main Controller: GND
- Sensor: SCL Pin --- (Connects to) --- Main Controller: 22/SCL
- Sensor: SDA Pin --- (Connects to) --- Main Controller: 21/SDA
UART Wiring Diagram (Note: the Sample Code below uses the I2C connection)

Pin Connection Description:
- Sensor: + Pin --- (Connects to) --- Main Controller: 3V3
- Sensor: - Pin --- (Connects to) --- Main Controller: GND
- Sensor: RX Pin --- (Connects to) --- Main Controller: 26/D3
- Sensor: TX Pin --- (Connects to) --- Main Controller: 25/D2
When switching the sensor from the I2C connection to the UART connection, make sure the communication mode switch on the sensor is also set to UART.
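If you wire the sensor for UART, the I2C constructor in the Sample Code must be replaced with the library's UART variant. The fragment below is a minimal sketch only: the `DFRobot_HumanPose_UART` class name, its constructor signature, and the 9600 baud rate are assumptions to be checked against the DFRobot_HumanPose library's own UART example; the pin numbers match the wiring table above.

```
#include <DFRobot_HumanPose.h>

// Assumption: the library ships a UART variant alongside the I2C class;
// verify the exact class name and constructor in the library's examples.
DFRobot_HumanPose_UART humanPose(&Serial2);

void setup()
{
  Serial.begin(115200);
  // Remap ESP32 hardware serial to the wired pins:
  // sensor TX -> GPIO 25 (RX here), sensor RX -> GPIO 26 (TX here).
  // The 9600 baud rate is an assumption; check the sensor's documentation.
  Serial2.begin(9600, SERIAL_8N1, /*rx=*/25, /*tx=*/26);
  while (!humanPose.begin()) {
    Serial.println("Sensor init fail!");
    delay(1000);
  }
  humanPose.setModelType(DFRobot_HumanPose::eHand);
}

void loop()
{
  // Same result-reading loop as the I2C Sample Code below.
}
```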
Sample Code
Function: Initializes the gesture sensor over I2C, sets it to hand mode, and prints the gesture ID, name, score, bounding box, and 21 keypoint coordinates.
#include <DFRobot_HumanPose.h>

#define I2C_ADDR 0x3A

DFRobot_HumanPose_I2C humanPose(&Wire, I2C_ADDR);

void setup()
{
  Serial.begin(115200);
  // Retry until the sensor answers on the I2C bus
  while (!humanPose.begin()) {
    Serial.println("Sensor init fail!");
    delay(1000);
  }
  Serial.println("Sensor init success!");
  // Switch the sensor to hand (gesture) detection mode
  humanPose.setModelType(DFRobot_HumanPose::eHand);
}

void loop()
{
  if (humanPose.getResult() == DFRobot_HumanPose::eOK) {
    // Drain every hand result from the current frame
    while (humanPose.availableResult()) {
      HandResult *result = static_cast<HandResult *>(humanPose.popResult());
      Serial.print("Gesture ID: "); Serial.println(result->id);
      Serial.print("Gesture: "); Serial.println(result->name);
      Serial.println("score: " + String(result->score));
      // Bounding box of the detected hand
      Serial.println("xLeft: " + String(result->xLeft));
      Serial.println("yTop: " + String(result->yTop));
      Serial.println("width: " + String(result->width));
      Serial.println("height: " + String(result->height));
      // 21 hand keypoints, each as (x, y) pixel coordinates
      Serial.println("wrist: " + String(result->wrist.x) + ", " + String(result->wrist.y));
      Serial.println("thumbCmc: " + String(result->thumbCmc.x) + ", " + String(result->thumbCmc.y));
      Serial.println("thumbMcp: " + String(result->thumbMcp.x) + ", " + String(result->thumbMcp.y));
      Serial.println("thumbIp: " + String(result->thumbIp.x) + ", " + String(result->thumbIp.y));
      Serial.println("thumbTip: " + String(result->thumbTip.x) + ", " + String(result->thumbTip.y));
      Serial.println("indexFingerMcp: " + String(result->indexFingerMcp.x) + ", " + String(result->indexFingerMcp.y));
      Serial.println("indexFingerPip: " + String(result->indexFingerPip.x) + ", " + String(result->indexFingerPip.y));
      Serial.println("indexFingerDip: " + String(result->indexFingerDip.x) + ", " + String(result->indexFingerDip.y));
      Serial.println("indexFingerTip: " + String(result->indexFingerTip.x) + ", " + String(result->indexFingerTip.y));
      Serial.println("middleFingerMcp: " + String(result->middleFingerMcp.x) + ", " + String(result->middleFingerMcp.y));
      Serial.println("middleFingerPip: " + String(result->middleFingerPip.x) + ", " + String(result->middleFingerPip.y));
      Serial.println("middleFingerDip: " + String(result->middleFingerDip.x) + ", " + String(result->middleFingerDip.y));
      Serial.println("middleFingerTip: " + String(result->middleFingerTip.x) + ", " + String(result->middleFingerTip.y));
      Serial.println("ringFingerMcp: " + String(result->ringFingerMcp.x) + ", " + String(result->ringFingerMcp.y));
      Serial.println("ringFingerPip: " + String(result->ringFingerPip.x) + ", " + String(result->ringFingerPip.y));
      Serial.println("ringFingerDip: " + String(result->ringFingerDip.x) + ", " + String(result->ringFingerDip.y));
      Serial.println("ringFingerTip: " + String(result->ringFingerTip.x) + ", " + String(result->ringFingerTip.y));
      Serial.println("pinkyFingerMcp: " + String(result->pinkyFingerMcp.x) + ", " + String(result->pinkyFingerMcp.y));
      Serial.println("pinkyFingerPip: " + String(result->pinkyFingerPip.x) + ", " + String(result->pinkyFingerPip.y));
      Serial.println("pinkyFingerDip: " + String(result->pinkyFingerDip.x) + ", " + String(result->pinkyFingerDip.y));
      Serial.println("pinkyFingerTip: " + String(result->pinkyFingerTip.x) + ", " + String(result->pinkyFingerTip.y));
      Serial.println("--------------------------------");
    }
  }
  delay(50);
}
Result
After successful initialization, the sensor continuously detects hands, and the sketch prints the gesture ID, name, confidence score, bounding box, and 21 keypoint coordinates in real time, polling every 50 ms.
As shown in the serial output below, an unknown gesture has been recognized.

As shown in the serial output below, the learned gesture 'yes' has been recognized.
