The primary objective of Piano Hand is to explore how robotics can replicate human motion. This interest drew us to tasks that deepen our understanding of biomechanics, an area of robotics that has gained significant traction in recent years.
The goal of the project is to build a fully autonomous robot arm that can play the piano. The human hand, with 27 degrees of freedom (DOF), remains the most dexterous mechanism for playing the piano; the closer we come to replicating its freedom of movement, the better the arm will be able to move and play.
This semester (Fall 2022), we plan to refine the working animatronic hand built last semester by improving the accuracy of its servo motors and flex sensors, and we hope to extend the design to two hands. Additionally, we are starting a new software effort in machine learning and optical music recognition: building a model that can read sheet music from an image.
Along the way, we will publish our progress, code, tutorials and workshops.
Project GitHub Repository: https://github.com/purdue-arc/arc-piano-hand
What we have done
With guidance from our PhD student mentor, Sheeraz Athar, we decided to focus first on the fingers hitting the keys. We designed and built a single-phalange finger that includes some curvature and bends to make it look and move more like a human finger. We also gained more clarity on how to actuate the hand, and established conventions for the file formats shared between software subteams. A new subteam, Algorithms, was formed to work with hand and finger states as the hand plays the piano.
Ideated and designed new mechanisms for the hand that use linear actuators and bevel gears. This design will be assembled and tested in Fall 2022 as a new iteration of the Spring 2022 hand design.
Worked on fixing software issues that arose in Spring 2022, and on preparing the design for implementation on the hand in Fall 2023. Introduced a new initiative alongside the existing software work: machine learning and model development for Optical Music Recognition.
Clip of the servos in operation:
Worked on improving the model developed in Fall 2021 through printing and testing. An add-on for attaching the servo motors was developed and 3D printed.
Software work primarily included setting up the environment on the Arduino/Raspberry Pi along with the flex sensors. Issues with the use of continuous-rotation servos were addressed, and 8-bit ADCs were used to improve testing.
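To illustrate the kind of mapping this setup involves, the sketch below converts an 8-bit flex-sensor ADC reading into a servo angle and pulse width. This is a minimal illustration with assumed calibration values, not the project's actual code; the pure-math form runs anywhere, with no GPIO library required.

```python
# Minimal sketch: map an 8-bit flex-sensor ADC reading (0-255) to a
# servo angle (0-180 degrees) and a pulse width in microseconds
# (1000-2000 us is a common hobby-servo range).
# FLEX_MIN / FLEX_MAX are assumed calibration values for illustration.

FLEX_MIN = 0      # ADC reading with the finger fully extended (assumed)
FLEX_MAX = 255    # ADC reading with the finger fully bent (assumed)

def flex_to_angle(reading: int) -> float:
    """Linearly map a clamped 8-bit ADC reading to a 0-180 degree angle."""
    reading = max(FLEX_MIN, min(FLEX_MAX, reading))
    return (reading - FLEX_MIN) * 180.0 / (FLEX_MAX - FLEX_MIN)

def angle_to_pulse_us(angle: float) -> int:
    """Convert a servo angle to a pulse width in microseconds."""
    return int(1000 + (angle / 180.0) * 1000)

if __name__ == "__main__":
    for raw in (0, 128, 255):
        angle = flex_to_angle(raw)
        print(f"ADC {raw:3d} -> {angle:6.1f} deg -> {angle_to_pulse_us(angle)} us")
```

On a Raspberry Pi the resulting pulse width would typically be handed to a PWM library; on the Arduino side the equivalent call is the Servo library's `writeMicroseconds()`.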
Hardware worked primarily on developing models and identifying the different parts that needed to be 3D printed, producing the following first iteration of the hand by the end of the semester.
Software primarily worked on simulation and testing, producing the following simulation in Tinkercad (Tinkercad's electronic-component simulator supports Arduino testing, which made it useful for a first stage of testing). Tinkercad's value for simulation testing was evident from early tests of the multi-finger software using MG90S servos.
Subteams and Roster
Revanth Krishna Senthilkumaran, Computer Engineering
Hardware primarily works on making and refining CAD models with the tools available, 3D printing them, assembling and testing the hand, and identifying points of improvement in the model.
- Rugved Dikay, Aeronautical and Aerospace Engineering
- Ian Ou, Computer Engineering
- Archis Behere, Mechanical Engineering
Software primarily works on developing the code and algorithms that move the hand to computed locations, along with setting up the electrical systems. More recent initiatives include model development for optical music recognition and migrating from Arduino to Raspberry Pi.
- Dhruv Sujatha, Data Science
- Jacob Aldridge, Computer Science
- Manas Paranjape, Computer Science
- Visuwanaath Selvam, Computer Engineering