# Googley Googley Eyes

Googley eyes, but they Google what they see.

26 July 2023 · 4 minute read · By Kevin McAleer

Code repository: [googley_eyes on GitHub](https://www.github.com/kevinmcaleer/googley_eyes)
You’ve seen googley eyes or xeyes, but what if the googley eyes could Google what they see (and then announce it)? I present googley googley eyes. (OK, they don’t actually ‘Google’ the image; they use something far cooler - OpenAI.)

The eyes rotate using motors with encoders for accurate positioning. This means they can rotate all the way round, but can also turn to a specific position, which is needed for making the eyes look in a given direction accurately.

## Bill of Materials

| Item | Description | Qty | Cost |
|---|---|---|---|
| Pi Zero 2W | Raspberry Pi Zero 2W | 1 | £17.10 |
| Pi Camera | Raspberry Pi Camera | 1 | £36.90 |
| Camera Cable | Camera Cable - Pi Zero Edition, 150mm | 1 | £3.90 |
| Motor Controller | Pimoroni Inventor HAT Mini | 1 | £24.00 |
| Motors | MMME motors with encoders | 2 | £9.90 |
| Motor Cable | Pimoroni JST-SH cable - 6 pin (pack of 4) - 300mm | 1 | £3.90 |
| M2 Screws | Screws for the camera mount | 2 | £0.50 |
| M2.5 Screws | Screws for the Raspberry Pi Zero 2W | 4 | £0.50 |
| Speaker | Mini Oval Speaker - 8 Ohm 1 Watt | 1 | £2.70 |

## 3D printed parts

Here are the 3D printed parts for the project:

- `pi_holder.stl` - Holds the Raspberry Pi; this part is from a modular robotics system I’m working on
- `eye_holder.stl` - Holds the motors and provides a desktop mount for the project
- `eye.stl` - The white eye pieces; you’ll need to print two of them
- `pupil.stl` - The black pupil pieces; you’ll need to print two of them

## Assembly

1. Set up the Raspberry Pi with a fresh install of Raspberry Pi OS 64-bit (click here for a how-to video)
2. Push the Pimoroni Inventor HAT Mini onto the top of the Raspberry Pi Zero 2W
3. Connect the motors to the Inventor HAT Mini
4. Push the motors into the eye_holder
5. If you have access to a vinyl cutting machine such as a Cricut Maker 3, you can cut out a circle 50mm in diameter for the white eye and 30mm for the black pupil; the white eye has a small 7mm hole in the centre
6. The 3D printed parts may require some filing to ensure a smooth motion
7. Push the black pupil part into the white eye piece
8. Glue the eye disk onto the eye_holder with some super-glue, taking care not to glue the pupil part so that it can still turn friction-free
9. Screw the Raspberry Pi Zero 2W into the pi_holder using the M2.5 screws
10. Carefully push the Raspberry Pi camera cable into both the Raspberry Pi camera module and the Raspberry Pi Zero 2W
11. Screw the Raspberry Pi camera module into the eye_holder with the M2 screws
12. Push the small PicoBlade speaker cable into the speaker connector on the Inventor HAT Mini

## Setting up the image recognition

OpenAI has a nice automatic image captioning service; you can get it up and running as follows:

1. Download the project code: `git clone https://www.github.com/kevinmcaleer/googley_eyes`
2. Sign up for an account on https://platform.openai.com
3. Create a new API key: click on your profile picture at the top right, then View API Keys, then click Create new secret key
4. Copy the secret API key into a file named `.env` in the downloaded code (there is a sketch after these steps showing how it can be read at runtime): `OPENAI_API_KEY=<INSERT YOUR OPENAI API KEY HERE>`
5. Create a virtual environment: `python3 -m venv venv`
6. Activate the environment: `source venv/bin/activate`
7. Install the dependencies: `pip install -r requirements.txt`
8. Run the demo program `caption_this.py`: `python3 caption_this.py`

You can use your own image by updating the code that loads the image.
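As a pointer for how the key in `.env` can be picked up by the captioning script, here is a minimal sketch. It assumes `python-dotenv` is among the project dependencies; check `caption_this.py` in the repository for the actual implementation.

```python
# Minimal sketch: load the OpenAI API key from the .env file created above.
# Assumes python-dotenv is installed (check requirements.txt for the real list).
import os

from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from .env into the environment
api_key = os.getenv("OPENAI_API_KEY")

if api_key is None:
    raise RuntimeError("OPENAI_API_KEY not found - check your .env file")

# The OpenAI client libraries typically read OPENAI_API_KEY from the
# environment automatically, so no further configuration is usually needed.
```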
## Tweak the code

You can also tweak the query to change how OpenAI describes the image. Here is a response generated from the query `result = index.query('describe what is in the image, be nonchalant and snarky')`:

## Results

“A cat and dog lounging on a blanket, like they own the place.”

## Making the eyes move

The eyes can be moved using the Pimoroni Inventor HAT Mini Python library; for instructions on how to install it, click here. Once you’ve installed it, head over to the motors example code and run the `motor_wave.py` program: `cd InventorHATMini/examples/motors` then `python3 motor_wave.py`. (A rough sketch of driving the motors from Python is included at the end of this post.)

## Code

Here is the code you will need; please note this is a work in progress, so the code will improve over time.
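The full, up-to-date listing lives in the GitHub repository linked above. To give an idea of what driving the eye motors can look like, here is a rough sketch using the `inventorhatmini` library. The names used here (`InventorHATMini`, `MOTOR_A`, `MOTOR_B`, and the motor methods) are assumptions based on Pimoroni’s published examples; treat `motor_wave.py` in the examples folder as the authoritative reference.

```python
# Rough sketch of spinning both eye motors with the Pimoroni Inventor HAT Mini.
# The imports and method names below are assumptions - compare against
# Pimoroni's motor_wave.py example for the real API.
import time

from inventorhatmini import InventorHATMini, MOTOR_A, MOTOR_B

board = InventorHATMini()          # initialise the HAT

left_eye = board.motors[MOTOR_A]
right_eye = board.motors[MOTOR_B]

for motor in (left_eye, right_eye):
    motor.enable()

try:
    # Sweep both eyes back and forth a few times
    for _ in range(3):
        for motor in (left_eye, right_eye):
            motor.speed(0.5)       # forward at half speed
        time.sleep(1)
        for motor in (left_eye, right_eye):
            motor.speed(-0.5)      # reverse at half speed
        time.sleep(1)
finally:
    for motor in (left_eye, right_eye):
        motor.stop()
```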