# Buddy Jr.

Build a tiny robot arm and learn how to control it with a Raspberry Pi and Python.

6 April 2024 | 13 minute read | By Kevin McAleer

Tags: robot arms, inverse kinematics, raspberry pi, python
Difficulty: beginner
Code: https://www.github.com/kevinmcaleer/buddy_jr

## Buddy Jr.

I created Buddy Jr as a troubleshooting step for a larger robot arm project I'm working on called Buddy (Buddy is the name of one of the robot arms Tony Stark uses in the Iron Man films). I wanted something that is really quick to print so I can test out different methods of moving the arm. The larger robot arm, Buddy, takes around 8 to 10 hours to print each individual part (with each arm piece being made up of around 4 parts), whereas Buddy Jr's arm takes around 40 minutes to print.

## Design goals

When designing Buddy Jr, I wanted to create a robot arm that was:

- Conceptually simple
- Quick to print
- Easy to assemble
- Fun to play with
- Aesthetically pleasing to look at
- Cheap to build
- Functional
- Educational
- Able to hold a Raspberry Pi camera module

## Design

I used Fusion 360 to design Buddy Jr. The design is simple and consists of 4 servos, a base, a shoulder, an elbow, and a camera holder. The servos are controlled by a PCA9685 servo driver board, which is connected to a Raspberry Pi 5 (though any Pi with header pins will do).

## Base

## Assembly

The assembly of Buddy Jr is straightforward. The servos are screwed into the base, shoulder, and elbow, and the camera holder is attached to the elbow. The servos are then connected to the PCA9685 board, which is connected to the Raspberry Pi.

## Bill of Materials

| Item | Description | Quantity | Price | Total |
|---|---|---|---|---|
| Servos | SG90 servos | 4 | $2.50 | $10.00 |
| Servo Driver | PCA9685 | 1 | $14.00 | $14.00 |
| Camera | Raspberry Pi Camera Module | 1 | $25.00 | $25.00 |
| Raspberry Pi | Raspberry Pi 5 | 1 | $80.00 | $80.00 |

Note: You can use any Raspberry Pi with header pins, and any Raspberry Pi camera module compatible with that model of Raspberry Pi.

## Python code

The latest version of the Python code for Buddy Jr can be found on GitHub. The code uses the adafruit-circuitpython-servokit library to control the servos. For now I'm not using the camera in the Python code, but I plan to add this in the future. You can watch the output of the camera using the `rpicam-hello -t 0` command in the terminal.
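Before running the full class below, it can help to do a quick smoke test to confirm the PCA9685 and servos respond at all. The snippet below is just a minimal sketch, assuming the board is wired to the Pi's I2C pins and the library has been installed (for example with `pip install adafruit-circuitpython-servokit`); it simply moves the base servo to its middle position.

```python
# Minimal ServoKit smoke test: move the servo on channel 0 (the base) to 90 degrees.
from adafruit_servokit import ServoKit

kit = ServoKit(channels=16)  # the PCA9685 provides 16 PWM channels
kit.servo[0].angle = 90      # mid position for a standard SG90 servo
```

If the base servo moves to its middle position, the driver board is working and you're ready for the full `Arm` class: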
```python
from adafruit_servokit import ServoKit
from math import atan2, sqrt, pi, acos, degrees
import asyncio


class Arm():
    # Set the servo channels for each part of the arm
    BASE = 0
    SHOULDER = 1
    ELBOW = 2
    CAMERA = 3

    # Base is the 0,0 origin
    SHOULDER_LENGTH = 80
    ELBOW_LENGTH = 80

    def __init__(self):
        self.kit = ServoKit(channels=16)
        self.base = self.kit.servo[self.BASE]
        self.shoulder = self.kit.servo[self.SHOULDER]
        self.elbow = self.kit.servo[self.ELBOW]
        self.camera = self.kit.servo[self.CAMERA]

        # Set initial angles
        self.base.angle = 90
        self.shoulder.angle = 90
        self.elbow.angle = 90
        self.camera.angle = 90

    async def calibrate(self):
        """ Moves all the servos to the middle position """
        print('Calibrating\n')
        print('setting base')
        await self.move_to(channel=self.BASE, duration=2, target_angle=90)
        print('setting shoulder')
        await self.move_to(self.SHOULDER, 2, 90)
        print('setting elbow')
        await self.move_to(self.ELBOW, 2, 90)
        print('setting camera')
        await self.move_to(self.CAMERA, 2, 90)

    async def move_to(self, channel: int, duration: float, target_angle: int):
        """ Move the servo to a target angle over a duration """
        print(f"Channel is {channel}")
        if channel < 0 or channel > 15:
            raise ValueError(f'The servo channel was outside the valid range of 0 to 15 - it was {channel}')

        if target_angle > 180 or target_angle < 0:
            # raise ValueError(f'The angle value was outside the valid range of 0 to 180 - was {target_angle}')
            target_angle = 0

        print(f'self.kit.servo[{channel}].angle is {self.kit.servo[channel].angle}')
        current_angle = self.kit.servo[channel].angle
        if current_angle is None:
            current_angle = 0  # the angle is unknown until it has been set at least once
        current_angle = round(current_angle, 0)

        # Calculate the total distance to move
        angle_difference = target_angle - current_angle

        # Define the number of steps you want to use
        steps = 1000

        # Calculate the time to wait between steps to achieve the overall duration
        sleep_duration = duration / steps

        # Calculate the angle to move at each step
        angle_step = angle_difference / steps

        range_limit = self.kit.servo[channel].actuation_range

        for _ in range(steps):
            # Update the current angle
            current_angle += angle_step
            if current_angle > range_limit or current_angle < 0:
                continue
            self.kit.servo[channel].angle = current_angle

            # Async wait before the next step
            await asyncio.sleep(sleep_duration)

    def calculate_position(self, x, y):
        """ Returns Shoulder, Elbow and Camera angles to move to """
        shoulder_length = self.SHOULDER_LENGTH
        elbow_length = self.ELBOW_LENGTH

        # Calculate distance to target point
        d = sqrt(x**2 + y**2)

        # Law of Cosines to find angles
        angle_to_target = atan2(y, x)
        cos_angle_shoulder = (shoulder_length**2 + d**2 - elbow_length**2) / (2 * shoulder_length * d)
        shoulder_angle = acos(cos_angle_shoulder) + angle_to_target

        cos_angle_elbow = (shoulder_length**2 + elbow_length**2 - d**2) / (2 * shoulder_length * elbow_length)
        elbow_angle = acos(cos_angle_elbow)

        # Assuming a simple scenario where the camera angle needs to compensate the
        # shoulder and elbow movement to stay level. This needs adjustment based on your setup.
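        # The camera is carried on the end of the elbow link, so its tilt follows the
        # combined rotation of the shoulder and elbow joints. Counter-rotating the
        # camera servo by that sum (offset by pi/2) is intended to keep the lens
        # roughly level; the exact offset depends on how the camera holder is
        # mounted on your particular build.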
        camera_angle = -(shoulder_angle + elbow_angle - pi/2)
        print(f'Cam angle: {degrees(camera_angle)}')

        # Convert radians to degrees
        shoulder_angle_deg = degrees(shoulder_angle)
        elbow_angle_deg = degrees(elbow_angle)
        camera_angle_deg = degrees(camera_angle) + 90
        # camera_angle_deg = degrees(camera_angle)

        return shoulder_angle_deg, elbow_angle_deg, camera_angle_deg
```

This code outlines a basic framework for a robotic arm controlled by a Raspberry Pi, using the Adafruit CircuitPython ServoKit library. It moves parts of the arm (base, shoulder, elbow, and camera) to specific angles smoothly over a set duration, and calculates positions using inverse kinematics. Here's a breakdown of how it works.

## Setup and Initialization

The `Arm` class initializes the servo controller with 16 channels, assuming you're using a PCA9685 PWM servo driver, which is common for such projects. This setup supports controlling up to 16 individual servo motors. Four servos are defined for different parts of the arm, each assigned to a specific channel. Initial angles are set to 90 degrees so the servos start in a neutral position.

## Calibration

The async `calibrate` method moves all the servos (base, shoulder, elbow, camera) to the middle position (90 degrees). This serves as a starting reference point, ensuring all parts are aligned before any further operations.

## Moving the Servos

The `move_to` method moves a servo on a specified channel to a target angle smoothly over a given duration. It calculates the number of steps from the duration and gradually moves the servo towards the target angle in those steps. The method includes several checks: the channel must be within the expected range (0 to 15) and the target angle within the valid range (0 to 180 degrees). If the target angle is outside the valid range, it defaults to 0 degrees as a safety measure.

## Calculating Position with Inverse Kinematics

The `calculate_position` method applies inverse kinematics to work out the angles the shoulder, elbow, and camera need in order to reach a target point (x, y). This is what determines how the arm should move to place the end effector (in this case, a camera) at a specific location in space. It calculates the distance to the target point and then uses the law of cosines (for a triangle with sides a, b and c, c^2 = a^2 + b^2 - 2ab*cos(C), where C is the angle opposite side c) to find the shoulder and elbow angles. The camera angle is adjusted to maintain a fixed orientation, keeping the camera facing forward by compensating for the movement of the other two joints.

## Implementation Details and Considerations

The Adafruit CircuitPython ServoKit library simplifies interfacing with the PCA9685 module, providing an easy way to control multiple servos. Asynchronous programming (asyncio) keeps the arm movements non-blocking, making it possible to run other tasks concurrently or manage multiple joints (or arms) at the same time. The code also includes mechanisms to deal with potential issues, such as invalid angles or servo channels, by setting safe defaults or skipping adjustments that would exceed the servo's capabilities.

## Using on Raspberry Pi 5

To run this on a Raspberry Pi 5, ensure that the CircuitPython libraries are installed and that the I2C interface is enabled on the Pi. This can usually be done through the raspi-config tool or the Raspberry Pi Configuration tool in the desktop interface.
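If the servos don't respond, it's worth confirming that the Pi can actually see the PCA9685 on the I2C bus before digging into the code. The snippet below is a small sketch that uses the Blinka libraries ServoKit already depends on; the PCA9685 normally appears at its default address, 0x40 (unless its address jumpers have been changed).

```python
# List every device address visible on the Pi's I2C bus.
import board
import busio

i2c = busio.I2C(board.SCL, board.SDA)
while not i2c.try_lock():
    pass
try:
    print("I2C devices found:", [hex(address) for address in i2c.scan()])
finally:
    i2c.unlock()
```

If `0x40` isn't listed, check the wiring described below before going any further.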
Make sure to connect the PCA9685 module correctly to the Pi's I2C pins (SCL, SDA, VCC, and GND), and supply adequate power to the servos, especially if they require more current than the Pi can provide directly.

## Demo program

Here is a simple demo program that moves the arm around. Note how `asyncio.gather` is used to run the async methods in parallel.

```python
from buddyjr.arm import Arm
import asyncio

arm = Arm()

async def main():
    # Reset all parts simultaneously
    speed = .05   # duration in seconds passed to move_to
    angle = 180
    await asyncio.gather(
        arm.move_to(Arm.CAMERA, speed, angle),
        arm.move_to(Arm.ELBOW, speed, angle),
        arm.move_to(Arm.BASE, speed, angle),
        arm.move_to(Arm.SHOULDER, speed, angle)
    )
    await asyncio.sleep(1)  # Use asyncio.sleep for async compatibility

    print('Testing IK')
    x, y = 10, 20
    shoulder_pos, elbow_pos, camera_pos = arm.calculate_position(x, y)

    # Move all parts based on IK calculations simultaneously
    await asyncio.gather(
        arm.move_to(Arm.SHOULDER, 1, shoulder_pos),
        arm.move_to(Arm.ELBOW, 1, elbow_pos),
        arm.move_to(Arm.CAMERA, 1, camera_pos),
        arm.move_to(Arm.BASE, 1, 90)
    )
    await asyncio.sleep(2)  # Use asyncio.sleep for async compatibility

    # Choose where the arm should move to
    x, y = 10, 90
    shoulder_pos, elbow_pos, camera_pos = arm.calculate_position(x, y)

    # Move all parts based on IK calculations simultaneously
    await asyncio.gather(
        arm.move_to(Arm.SHOULDER, 1, shoulder_pos),
        arm.move_to(Arm.ELBOW, 1, elbow_pos),
        arm.move_to(Arm.CAMERA, 1, camera_pos),
        arm.move_to(Arm.BASE, 1, 90)
    )
    await asyncio.sleep(2)  # Use asyncio.sleep for async compatibility

    # Reset all parts simultaneously again
    await asyncio.gather(
        arm.move_to(Arm.CAMERA, 1, 90),
        arm.move_to(Arm.ELBOW, 1, 90),
        arm.move_to(Arm.BASE, 1, 90),
        arm.move_to(Arm.SHOULDER, 1, 90)
    )
    await asyncio.sleep(1)  # Use asyncio.sleep for async compatibility

asyncio.run(main())
```

## Powering the arm

I used a Pimoroni Yukon with the Bench Power supply module to provide power to the servos via the PCA9685 board. I could have used the Yukon to control the servos too, but I wanted to run the project from the Raspberry Pi, as the camera is attached to the arm.

## Code

View the code repository on GitHub: https://www.github.com/kevinmcaleer/buddy_jr

## 3D Models

Here are the 3D printable STL files:

- Base - the base of the robot arm, with space to mount the PCA9685 board. Download `buddy_jr_base.stl`
- Pivot - Download `buddy_jr_base_pivot.stl`
- Arm - the elbow of the robot arm, print 2 of these. Download `buddy_jr_basic_arm.stl`
- Camera Holder - the camera holder for the Raspberry Pi camera module. Download `buddy_jr_camera_holder.stl`