
Ball Launcher With Facial Tracking

This robot uses a remote to send a signal to an Arduino fitted with IR sensors, which determines which quadrant the base of the machine should rotate toward. A Raspberry Pi running OpenCV then uses a camera to track the face of whoever is standing in that area, allowing the base to center itself on that face and aim at that person. Pressing another button on the remote spins up the launch wheels and releases a ball, shooting it at the target.
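As a rough illustration of the tracking step, the sketch below detects a face with OpenCV and measures how far it sits from the center of the frame, which is the error the base would turn to cancel. The camera index, Haar cascade file, and console output are assumptions for illustration rather than details taken from this build.

```cpp
#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

int main() {
    cv::VideoCapture cam(0);                                   // assumed camera index
    cv::CascadeClassifier faceDetector;
    if (!cam.isOpened() ||
        !faceDetector.load("haarcascade_frontalface_default.xml")) {
        return 1;                                              // camera or cascade file missing
    }

    cv::Mat frame, gray;
    while (cam.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        std::vector<cv::Rect> faces;
        faceDetector.detectMultiScale(gray, faces, 1.1, 5);
        if (!faces.empty()) {
            // Horizontal distance between the face center and the frame center;
            // the base rotates until this error is close to zero.
            int faceCenterX = faces[0].x + faces[0].width / 2;
            int error = faceCenterX - frame.cols / 2;
            std::cout << "offset from center: " << error << " px\n";
        }
    }
    return 0;
}
```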

This video demonstrates the robot responding to commands from the remote and then launching the ball after targeting a face.

This video demonstrates the facial tracking rotating the base as the person walks by.

How Does It Work?


The ball return system


Arduino responding to commands sent from the remote and received by the IR sensors

The whole system rotates on this central axis. A motor controlled by the Raspberry Pi is attached to a gear; when that gear turns, it drives a second gear attached to the launcher's plate, rotating the entire system at once. The base is made from laser-cut acrylic and laser-cut wood, and additional parts were custom designed and 3D printed.
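As a sketch of that control step, the snippet below drives such a motor from the face-tracking error, assuming an H-bridge on two GPIO pins controlled through the wiringPi library. The pin numbers, library choice, and dead band are assumptions, not details from the build.

```cpp
#include <wiringPi.h>

const int MOTOR_CW  = 0;   // hypothetical H-bridge input pins (wiringPi numbering)
const int MOTOR_CCW = 1;
const int DEAD_BAND = 20;  // ignore errors smaller than this many pixels

void turnToward(int error) {
    // error > 0: face is right of center, spin the base clockwise;
    // error < 0: face is left of center, spin counter-clockwise;
    // otherwise stop and hold aim.
    digitalWrite(MOTOR_CW,  error >  DEAD_BAND ? HIGH : LOW);
    digitalWrite(MOTOR_CCW, error < -DEAD_BAND ? HIGH : LOW);
}

int main() {
    wiringPiSetup();
    pinMode(MOTOR_CW, OUTPUT);
    pinMode(MOTOR_CCW, OUTPUT);

    // In the real system the error would come from the face-tracking loop above.
    turnToward(150);   // example: face well to the right, base turns clockwise
    delay(200);
    turnToward(0);     // centered, stop
    return 0;
}
```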

Certain buttons on the remote are mapped to specific commands. For instance, pressing the left arrow on the remote turns the system toward the person's right to look for a face. Another example, shown in this video, is the center button, which tells the robot to start the spinning wheels and release a ball to be fired.
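A minimal Arduino sketch of that button mapping might look like the following, assuming the classic IRremote v2 API, a receiver on pin 11, and placeholder button codes; the real codes depend on the remote used, and the real sketch would drive the turret or signal the Raspberry Pi instead of just printing the chosen command.

```cpp
#include <IRremote.h>

const int RECV_PIN = 11;           // assumed IR receiver pin
IRrecv irrecv(RECV_PIN);
decode_results results;

// Placeholder codes; read the actual values from your remote first.
const unsigned long BTN_LEFT   = 0xFF10EF;
const unsigned long BTN_RIGHT  = 0xFF5AA5;
const unsigned long BTN_CENTER = 0xFF38C7;

void setup() {
  Serial.begin(9600);
  irrecv.enableIRIn();             // start the IR receiver
}

void loop() {
  if (irrecv.decode(&results)) {
    switch (results.value) {
      case BTN_LEFT:   Serial.println("TURN_TOWARD_RIGHT"); break;  // left arrow: look to the person's right
      case BTN_RIGHT:  Serial.println("TURN_TOWARD_LEFT");  break;
      case BTN_CENTER: Serial.println("FIRE");              break;  // spin wheels and release a ball
    }
    irrecv.resume();               // ready for the next code
  }
}
```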

A closer look at the release and firing mechanisms of the launcher
