1. The Original Vision: Autonomous Pet Sitter

Our initial goal was to expand the Innate Mars robot's portfolio by developing a "Pet Sitter" application designed to assist owners with feeding needs throughout the day. The concept involved a complex integration of monitoring and actuation.

Hourly Monitoring

Using onboard cameras to visually inspect water and food bowls to ensure sufficient hydration and nutrition.

Physical Refilling

Using the 6-DOF robotic arm to physically grab a utensil (scoop), fill it from a dispenser, and empty it into the pet's bowl.

Navigation

Autonomously navigating between food storage, water sources, and feeding areas without human intervention.

The Pivot: From Feeder to Fetcher

While the Pet Sitter concept was ambitious, the reality of working with the hardware revealed significant constraints. The complexity of manipulating granular objects (dog food) and liquids with the provided end-effector, combined with the tight project timeline, required a scope adjustment.

We pivoted from the specific "feeding" task to a generalized Pick-and-Place (Fetch) task. Instead of managing food scoops, the robot was tasked with locating a specific target object (a paper ball), picking it up with the arm, and delivering it to a set location. This allowed us to retain the core technical requirements—Computer Vision, Navigation, and Arm Manipulation—while eliminating the mechanical variables of handling loose food.

Innate Robot Vision

2. Shortened Timeline

While all the other teams started working on their projects as soon as the assignments came in, we had to wait for one very critical part: the robot itself. It simply wasn't there, and no one knew when or how it would arrive. When we finally received it the day before Thanksgiving break, our timeline for delivery became very tight. Below you can see our project timeline:

Project Timeline

From initial concept to final demo: a journey of debugging and iteration.

Week 1: Submission of project proposal

10/22

We started with an ambitious goal: An autonomous pet feeder that detects food, scoops it, and delivers it to a bowl.

Week 2: Assignment to Innate project

10/30

We got official confirmation that we would receive the Mars robot as our project.

Week 3: Initial contact with Innate

11/3-11/4

We were introduced to Acel Peytavin as our point of contact. As co-founder and CEO of Innate, he was able to give us great insight into their vision for Mars and into what they hoped to learn from us as hardware testers.

Week 4: No news about arrival

11/5

The course instructor said they had received a robot, but it would have to be shared with all the other remaining industry groups and needed to be rented. As this seemed to be a complicated process, we continued focusing on finishing other coursework to free up time down the line.

Week 5: "Robot will arrive later this week hopefully"

11/12

Check-In 1

11/20

Still no robot, so we unfortunately could not share much progress.

Week 6: Receipt of hardware

11/21

On the Friday before Thanksgiving week, we finally received the hardware from the course instructors. Unfortunately, the team then spread out across the country to be with friends and family for the holidays, so Week 7 was lost to the break.

Week 8 Day 1: Setting up connection to the robot

Something that had not been communicated to us was that, in order to connect the robot to Wi-Fi and hence obtain its IP address, we needed to pair it with a phone app. This required requesting TestFlight access, but after some troubleshooting we finally established a connection with the robot.

Week 8 Day 2: Teleop and path tracking

Mars is based on a slightly different platform from the one we were used to in the lab: its topic list was very different, and we had to get used to interacting with those topics. Our first challenge was therefore to run something natively on the robot and eventually make it follow a predetermined path. Here we already noticed inconsistencies in motor torque.

Week 8 Day 3: Vision | Change in objective and killing of the LiDAR

From discussions between the other teams and the CEO on the Discord, we learned that the robot's camera comes with no depth-perception model or visualization tools, so building those became our next task. Working on this made us realize that the 2D LiDAR would not add much benefit to our revised goal given the time constraints, so we stuck with a vision-only approach.
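In the absence of depth sensing, one standard vision-only workaround is to estimate range from the apparent size of a known object using the pinhole-camera model. A minimal sketch; the focal length and ball diameter below are illustrative placeholders, not Mars camera specifications:

```python
def estimate_distance(focal_px: float, real_width_m: float, pixel_width: float) -> float:
    """Pinhole-camera range estimate: Z = f * W / w,
    where f is the focal length in pixels, W the object's real width,
    and w its width in the image."""
    return focal_px * real_width_m / pixel_width

# Illustrative numbers: a ~7 cm paper ball appearing 50 px wide
# with an assumed 600 px focal length.
d = estimate_distance(600.0, 0.07, 50.0)  # -> 0.84 (metres)
```

This breaks down for non-rigid objects like a crumpled paper ball, but it is accurate enough to decide when to stop and trigger a pickup.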

Week 8 Day 3: Vision + detection + driving

With solid detection and visualization code in place, we connected it to the planning and driving pipeline.
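Hooking detection up to driving can be as simple as mapping the detected object's horizontal pixel offset to a steering command. A minimal proportional sketch; the gain and forward speed are assumed values, not our tuned parameters:

```python
def steer_from_detection(cx_px: float, img_width_px: int,
                         k_ang: float = 1.5, v_forward: float = 0.15):
    """Map the detected object's horizontal pixel position to
    (linear, angular) velocity commands. Positive angular = turn left."""
    # Normalized horizontal error in [-1, 1]: 0 when the object is centered.
    err = (cx_px - img_width_px / 2) / (img_width_px / 2)
    return v_forward, -k_ang * err  # steer toward the object

# Object right of image center -> drive forward while turning right.
v, w = steer_from_detection(400, 640)  # -> (0.15, -0.375)
```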

Week 8 Day 4: Gripper control

Now that the robot could drive forward to the target and back to its initial position, we added gripper control to pick up the object. The challenge here was accounting for the arm-mounted camera, which we used to track the object during pickup.
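Accounting for the arm camera amounts to transforming the target's position from the wrist-camera frame into the gripper frame. A minimal sketch, assuming the two frames are axis-aligned and differ only by a fixed translation; the offset values are hypothetical, not measured Mars geometry:

```python
# Fixed camera-to-gripper offset in metres (x, y, z) -- hypothetical values.
CAM_TO_GRIPPER_OFFSET = (0.0, -0.03, 0.05)

def target_in_gripper_frame(target_cam):
    """Shift a point observed in the wrist-camera frame into the gripper
    frame, assuming axis-aligned frames (rotation neglected)."""
    return tuple(t + o for t, o in zip(target_cam, CAM_TO_GRIPPER_OFFSET))
```

If the camera is tilted relative to the gripper, the offset must be replaced by a full rigid transform (rotation plus translation); the translation-only version is the simplest case.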

Week 8 Day 5: Assembling it all

Now we had to put it all together, test and optimize the final result.

Week 8 Day 6: Finalization

It's done. Everything came together and worked.

3. Further Challenges

Beyond the shortened timeline described above, we ran into further unexpected issues that took considerable time and resources to resolve.

1 charger | 1 battery

We were supplied with only one working charger for all Innate groups, which initially complicated logistics, since we were told never to let the battery die. After some communication with the other groups, we found a mutually satisfactory solution.

Connection issues and outdated software

As mentioned above, connecting to the robot required an external phone app, and establishing the connection sometimes took hours of retries. We also had to update the OS running on the robot, since our unit was delivered with an older version. Through constant communication with the CEO, we were able to navigate these hurdles.

Issues with motors and wheel design

When moving the robot around, we noticed that it often drifted off its path even on what should have been a straight trajectory. This swerving led us to implement a PD controller to compensate for the imperfections.
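The heading correction can be sketched as a standard PD loop on the heading error; the gains below are illustrative placeholders, not the values we tuned on the robot:

```python
class PDController:
    """PD controller on heading error to counteract motor-torque asymmetry.

    The proportional term corrects the current error; the derivative term
    damps the oscillation that a pure P controller would cause.
    """

    def __init__(self, kp: float, kd: float):
        self.kp, self.kd = kp, kd
        self.prev_err = 0.0

    def update(self, err: float, dt: float) -> float:
        """Return the angular-velocity correction for this time step."""
        derr = (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.kd * derr
```

Each control cycle, the heading error (e.g. from odometry or the tracked target) is fed into `update`, and the output is added to the commanded angular velocity.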

No documentation for the topics

While Mars comes with an extensive list of pre-existing topics, there was no documentation for them yet, so we had to work through the topics on the robot ourselves to understand how they behaved and how we could use them.

End-effector wasn't strong enough

Our initial task was to pick up and transport a scoop of dog food, but during testing we realized that the Innate arm would not be strong enough for it.

Inverse Kinematics

Since Mars came with a pre-installed IK model, we decided to work with that. Unfortunately, it had some inherently weird behavior.

© Intro to Robotics Project.