Robotic Arm Pouring Demonstration
Precision manipulation demo: a robotic arm pours liquid under computer-vision guidance.
Demo Video
Overview
This project demonstrates advanced robotic manipulation through precise liquid pouring using vision-guided control. The system combines computer vision for target detection, inverse kinematics for motion planning, and trajectory optimization for smooth, controlled pouring movements.
The demonstration showcases the integration of perception, planning, and control—three fundamental pillars of modern robotics.
Software Architecture
The software stack is built on ROS (Robot Operating System) with OpenCV for computer vision and MoveIt for motion planning. The vision pipeline uses color-based segmentation to detect the target cup position and orientation in 3D space using calibrated camera parameters.
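The project's exact detection code isn't shown here, so the following is a minimal NumPy-only sketch of the idea: threshold pixels in a color range (the equivalent of OpenCV's cv2.inRange), take the centroid of the detected blob, and back-project that pixel to a 3D camera-frame point using the pinhole model with calibrated intrinsics. The color bounds, function names, and intrinsics matrix are illustrative assumptions, not values from the project.

```python
import numpy as np

def segment_cup(hsv_img, lower, upper):
    # Binary mask of pixels inside the HSV color range
    # (illustrative stand-in for cv2.inRange).
    return np.all((hsv_img >= lower) & (hsv_img <= upper), axis=-1)

def cup_pixel_center(mask):
    # Centroid (u, v) of the mask pixels; None if nothing was detected.
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return xs.mean(), ys.mean()

def pixel_to_3d(u, v, depth, K):
    # Pinhole back-projection: pixel (u, v) at a known depth to a
    # camera-frame point, using the calibrated intrinsics matrix K.
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    return np.array([(u - cx) * depth / fx, (v - cy) * depth / fy, depth])
```

Depth here is assumed known (e.g. from a depth camera or a known table plane); a real pipeline would also estimate the cup's orientation, which this sketch omits.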
Once the cup is localized, the system calculates the optimal pouring trajectory using inverse kinematics to determine joint angles for the UFactory robotic arm. MoveIt generates collision-free paths and smooth joint-space trajectories. The pouring action uses feedforward control with tilt rate adjusted based on liquid properties.
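The feedforward tilt controller itself isn't published with the demo, but a common shape for such a schedule is a trapezoidal tilt-rate profile: ramp the wrist tilt rate up, hold it, then ramp it back down. The durations and the peak rate below are illustrative placeholders; in the real system the peak rate would be the quantity tuned to the liquid's properties (e.g. viscosity).

```python
def tilt_rate(t, ramp=1.0, hold=3.0, max_rate=0.2):
    # Trapezoidal tilt-rate profile in rad/s: linear ramp up over `ramp`
    # seconds, constant at `max_rate` for `hold` seconds, linear ramp down.
    # All parameters are illustrative, not taken from the project.
    total = 2.0 * ramp + hold
    if t < 0.0 or t > total:
        return 0.0
    if t < ramp:
        return max_rate * t / ramp
    if t < ramp + hold:
        return max_rate
    return max_rate * (total - t) / ramp
```

Integrating this profile gives a total pour angle of max_rate * (ramp + hold); a smooth, bounded tilt rate is what keeps the liquid from surging out at the start of the pour.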
The system runs in a closed-loop configuration: visual servoing allows real-time adjustments during the pour as the detected cup position changes.
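One simple way to realize such a closed-loop correction, sketched below under assumed gains, is a proportional update: each servo cycle, move the commanded pour target a bounded step toward the freshly detected cup position. The function name, gain, and step limit are hypothetical; the project's actual servoing law may differ.

```python
import numpy as np

def update_pour_target(current_target, cup_pos, gain=0.5, max_step=0.02):
    # Proportional visual-servoing update: step the commanded end-effector
    # target toward the re-detected cup position, capped at `max_step`
    # meters per cycle to keep the correction smooth. Gains are illustrative.
    error = np.asarray(cup_pos, dtype=float) - np.asarray(current_target, dtype=float)
    step = gain * error
    norm = np.linalg.norm(step)
    if norm > max_step:
        step *= max_step / norm
    return np.asarray(current_target, dtype=float) + step
```

The step cap matters during pouring: it lets the arm track a nudged cup without a jerky correction that would slosh the liquid.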
Results & Achievements
The system achieved a 95% success rate (liquid in the cup with no spills) across varied cup positions, adapting to cups placed anywhere within a 30 cm workspace radius. Average task completion time is 8 seconds from detection to pour completion.
The project validates visual servoing techniques for dynamic manipulation tasks and demonstrates the maturity of open-source robotics frameworks like ROS and MoveIt.