Project Assessment

While we ultimately pivoted away from the granular "Pet Feeder" application due to gripper limitations, our final "Fetch" implementation successfully met all the core technical design criteria. The robot demonstrated autonomous perception of a target, smooth trajectory planning using Bézier curves, and multi-stage manipulation to interact with the environment.
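The Bézier-based trajectory planning mentioned above can be sketched as follows. This is a minimal illustration of cubic Bézier sampling, not the actual Mars codebase; the control points and sample count are invented for the example.

```python
# Illustrative cubic Bezier evaluation for smooth point-to-point paths.
# Control points p1 and p2 shape the departure and approach directions.

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1]."""
    u = 1.0 - t
    return tuple(
        u**3 * a + 3 * u**2 * t * b + 3 * u * t**2 * c + t**3 * d
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

# Sample 21 evenly spaced points from start (0, 0) to target (2, 0).
path = [cubic_bezier((0, 0), (0.5, 1.0), (1.5, 1.0), (2.0, 0.0), i / 20)
        for i in range(21)]
print(path[0], path[-1])  # endpoints coincide with p0 and p3
```

Because the curve interpolates its endpoints exactly and stays inside the control polygon, it gives smooth motion without the velocity discontinuities of straight-line waypoint following.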

The system proved that the Innate Mars platform, despite its documentation gaps and hardware quirks, is capable of complex tasks when governed by a robust software architecture.

Limitations & Future Improvements

Given the timeline constraints, several "hacks" were implemented to ensure stability. With additional time, we would address the following technical debt to move toward a production-ready system.

Robust Perception

Current Flaw: Our vision system relies on simple color thresholding and plane filtering, making it sensitive to lighting changes and effectively limited to white objects on uncluttered backgrounds.
Improvement: Implement machine learning-based object detection (e.g., YOLO) and fuse depth data from the LiDAR sensor, allowing detection of non-white objects against complex backgrounds.
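For reference, the current thresholding approach is roughly the following. This is an assumed reconstruction, not the shipped code: the threshold value, function name, and centroid logic are illustrative.

```python
import numpy as np

# Sketch of the current detector: flag pixels whose RGB channels are
# all near-white, then report the centroid of that mask as the target.
# This is exactly why lighting shifts and non-white objects break it.

def detect_white_target(rgb, thresh=200):
    """Return the (row, col) centroid of near-white pixels, or None."""
    mask = np.all(rgb >= thresh, axis=-1)
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())

# A dark frame with one white 2x2 blob at rows 4-5, cols 6-7.
frame = np.zeros((10, 10, 3), dtype=np.uint8)
frame[4:6, 6:8] = 255
print(detect_white_target(frame))  # -> (4.5, 6.5)
```

A learned detector would replace the fixed threshold with a model that is invariant to illumination, and LiDAR depth would disambiguate the object from similarly colored background regions.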

True Localization

Current Flaw: The "Return to Base" feature currently relies on simple odometry tracking (reversing distance d), which accumulates drift over time.
Improvement: Implement SLAM (Simultaneous Localization and Mapping) to allow the robot to navigate back to a specific (x, y) coordinate on a persistent map, regardless of the path taken.
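The drift problem can be seen in a minimal dead-reckoning sketch like the one below. The class and noise-free motion model are illustrative, not the actual odometry code.

```python
import math

# Sketch of dead-reckoning odometry (the current "Return to Base"
# approach): integrate commanded motions into a pose estimate.
# In the noise-free case below the robot returns home exactly; on real
# hardware, small per-step errors compound, which is the drift SLAM
# would eliminate by re-localizing against a map.

class Odometry:
    def __init__(self):
        self.x = self.y = self.theta = 0.0

    def step(self, distance, turn):
        """Apply a forward move, then an in-place turn (radians)."""
        self.x += distance * math.cos(self.theta)
        self.y += distance * math.sin(self.theta)
        self.theta += turn

odo = Odometry()
for dist, turn in [(1.0, math.pi / 2)] * 4:   # drive a 1 m square
    odo.step(dist, turn)
# odo.x, odo.y are back near (0, 0) only because no step had any error
```

The key limitation is that every correction must be expressed as "undo my commands"; SLAM instead maintains an absolute pose, so the return path can be planned independently of the outbound one.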

Visual Servoing

Current Flaw: The arm moves to a pre-calculated coordinate and grabs blindly. If the robot slips during the approach, the grasp fails.
Improvement: Implement visual servoing (closed-loop control), in which the arm camera continuously corrects the hand position during the descent phase.
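In its simplest proportional form, the closed loop described above looks like this. The gain, coordinates, and function name are assumptions for illustration; a real implementation would convert pixel error to metres via camera calibration.

```python
# Proportional image-based correction for the descent phase: each
# control tick moves the hand a fraction of the observed error toward
# the target, instead of committing to one pre-calculated coordinate.

def servo_step(hand, target, gain=0.3):
    """One closed-loop correction toward the target seen by the camera."""
    ex, ey = target[0] - hand[0], target[1] - hand[1]
    return hand[0] + gain * ex, hand[1] + gain * ey

hand, target = (0.0, 0.0), (0.12, -0.05)  # metres, illustrative values
for _ in range(20):
    hand = servo_step(hand, target)
# the residual error shrinks by (1 - gain) per tick: 0.7**20 is ~8e-4
```

Because the error is re-measured every tick, a slip during the approach simply shows up as a larger error on the next frame and is corrected, rather than causing a blind grasp to miss.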