Demo and evidence

From Detection to Physical Sorting

This demo shows the robot performing detection, target following, grasping, and categorized placement.

Featured workflows

Bottle and cup sorting runs

These videos capture live detection, arm alignment, dexterous grasping, and final release into the correct category.

Validation

Two object classes

Both bottles and cups appear in the full workflow videos, matching the current scope of the prototype.

Validation

Depth-backed localization

The arm moves against measured 3D target positions rather than only 2D image-space detections.
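Moving against a measured 3D position typically means back-projecting the 2D detection through a depth reading with the standard pinhole camera model. The sketch below illustrates that step; the intrinsic values (`fx`, `fy`, `cx`, `cy`) and the pixel/depth inputs are placeholder assumptions, not the project's actual calibration.

```python
def pixel_to_point(u, v, depth, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with a metric depth reading into a
    3D point in the camera frame, using the pinhole camera model."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Hypothetical intrinsics and detection, for illustration only.
point = pixel_to_point(400, 260, 0.85, fx=615.0, fy=615.0, cx=320.0, cy=240.0)
```

The resulting camera-frame point would then be transformed into the robot base frame (via the hand-eye calibration) before commanding the arm.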

Validation

Continuous arm alignment

The follow stage keeps the end effector positioned over the target before grasp execution begins.
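A common way to implement such a follow stage is a proportional controller on the image-space error between the target and the end effector. This is a minimal sketch of that idea; the gain value and pixel coordinates are illustrative assumptions, not the project's actual controller.

```python
def follow_step(target_px, ee_px, gain=0.1):
    """One proportional visual-servo update: move the end-effector
    projection a fraction of the pixel error toward the target."""
    ex = target_px[0] - ee_px[0]
    ey = target_px[1] - ee_px[1]
    return ex * gain, ey * gain

ee = [100.0, 100.0]       # current end-effector position in the image
target = (320.0, 240.0)   # detected object center
for _ in range(100):
    dx, dy = follow_step(target, ee)
    ee[0] += dx
    ee[1] += dy
# After repeated updates, ee has converged onto the target,
# which is the condition for handing off to grasp execution.
```

Running the loop until the residual error falls below a threshold is what lets grasp execution start from a well-aligned pose.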

Outcome

Physical grasp and release

The manipulation sequence closes with a grip, transfer, and categorized placement.

Run summary

What the featured run shows

The main demo stands on its own so the motion reads cleanly: detect the object, align the arm, grasp with the dexterous hand, then release into the target category.

Primary object

Bottle workflow

The featured clip shows the bottle sequence end to end, from recognition to placement.

Secondary object

Cup workflow

The second clip confirms the same control pipeline can generalize across the two supported classes.

Supporting media

Additional views of the system in motion.

Supporting clips and stills highlight the follow stage, the detection interface, hardware integration, and the physical setup around the robot.

Seek and Follow

End-effector tracking before grasp

This clip isolates the tracking behavior and makes the arm-control stage easier to review.

Detection

Annotated localization output

This clip shows the visual front end directly, including labels, overlays, and the target used for robot alignment.

Hand Setup

Dexterous hand installation

This hardware snapshot shows the integration work behind the grasp sequence seen in the main videos.

Team members installing and adjusting the dexterous hand on the robot arm.

Workspace

Calibration and robot setup

This workspace image ties the demo results back to the calibrated setup that makes physical targeting possible.

Team member calibrating the robot workspace with the calibration board.

Evidence (1 item)

Featured robot run

A featured run captures the full detect-follow-grasp-place cycle in a single sequence.

Evidence (3 items)

Annotated system frames

Annotated frames separate perception, localization, and manipulation evidence for clear review.

Evidence (1 item)

Prototype outcomes

The prototype already validates bottle/cup recognition, coordinate alignment, dynamic following, and categorized placement.
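The validated cycle can be summarized as a small staged pipeline. The sketch below is an assumption for clarity: the stage names and the class-to-bin mapping are illustrative, not the project's actual identifiers.

```python
# Illustrative sketch of the detect -> follow -> grasp -> place cycle.
# Stage names and the class-to-bin mapping are assumptions for clarity.
BINS = {"bottle": "bin_a", "cup": "bin_b"}

def sort_cycle(detected_class):
    """Run one sorting cycle and return the ordered stage log."""
    if detected_class not in BINS:
        return ["detect", "reject"]  # unsupported class: no grasp attempted
    return ["detect", "follow", "grasp", f"place:{BINS[detected_class]}"]

log = sort_cycle("bottle")  # -> ['detect', 'follow', 'grasp', 'place:bin_a']
```

Keeping the per-class difference confined to the final placement step is what lets the same control pipeline serve both supported classes.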

Downloads

Support material

The project deck and supporting documents provide additional technical background for the prototype.

Repository

Technical source of truth

The repository contains the implementation details behind the demo pipeline.