Piano Hand

Building a human-like hand capable of playing piano

Goal

The primary objective of Piano Hand is to explore how robotics can replicate human motion. We were drawn to tasks that would help us understand biomechanics, an area of robotics that has gained a lot of traction recently.

The goal of the project is to build a fully autonomous robot arm that can play the piano. The human hand, with its 27 degrees of freedom (DOF), remains the most dexterous mechanism for playing the piano; the closer we come to replicating that freedom of movement, the better the arm will be able to position itself and play.

This semester (Fall 2022), we plan to refine the working model of the animatronic hand built last semester by improving servo motor and flex sensor accuracy, and we expect to extend functionality to two hands. Additionally, we are starting a new software effort in machine learning and optical music recognition (OMR) by building a model that can read sheet music from an image.
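
As a rough illustration of the OMR direction, the hedged sketch below uses OpenCV to binarize a scanned page and estimate staff-line positions from the rows dominated by ink; the file name and threshold ratio are placeholders rather than part of our pipeline.

```python
import cv2
import numpy as np

def find_staff_lines(image_path, peak_ratio=0.6):
    """Estimate staff-line rows in a scanned sheet-music image.

    Rows whose count of ink pixels exceeds `peak_ratio` of the page
    width are treated as staff lines. This is only a starting point;
    a full OMR model would also segment and classify note symbols.
    """
    page = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if page is None:
        raise FileNotFoundError(image_path)

    # Invert and binarize so ink pixels become 1s.
    _, binary = cv2.threshold(page, 0, 1,
                              cv2.THRESH_BINARY_INV | cv2.THRESH_OTSU)

    # Horizontal projection: number of ink pixels in each row.
    row_sums = binary.sum(axis=1)
    threshold = peak_ratio * binary.shape[1]

    # Rows dominated by ink are likely staff lines.
    return [y for y, total in enumerate(row_sums) if total > threshold]

if __name__ == "__main__":
    print(find_staff_lines("sheet_music.png"))  # placeholder file name
```

A learned model trained on the OMR datasets listed under References (Deepscores, Primus) would likely replace these heuristics for actual symbol recognition.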

Along the way, we will publish our progress, code, tutorials and workshops.

Project GitHub Repository: https://github.com/purdue-arc/arc-piano-hand


What we have done

Spring 2023

Fall 2022

With the guidance of our PhD student mentor, Sheeraz Athar, we decided to focus first on the fingers hitting the keys. We designed and built a single-phalange finger that includes some curvature and bends to make it look and move more like a human finger. We also gained more clarity on how to actuate the hand and agreed on the file formats to be exchanged between the software subteams. A new subteam, Algorithms, was established to work with hand and finger states as the hand plays the piano (a minimal sketch of such a state representation is shown below).
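
As an illustration of what the Algorithms subteam tracks, the sketch below defines a hypothetical finger-state structure and a naive lowest-to-highest assignment of notes to fingers; the class and function names are ours for illustration and are not taken from the project code.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional

class FingerState(Enum):
    REST = "rest"          # hovering above the keys
    PRESSING = "pressing"  # actively holding a key down
    RELEASING = "releasing"

@dataclass
class Finger:
    name: str
    state: FingerState = FingerState.REST
    key: Optional[int] = None  # MIDI note currently assigned, if any

@dataclass
class Hand:
    fingers: List[Finger] = field(default_factory=lambda: [
        Finger(n) for n in ("thumb", "index", "middle", "ring", "pinky")
    ])

    def assign_chord(self, midi_notes: List[int]) -> None:
        """Naively assign the lowest note to the thumb, then work upward."""
        for finger, note in zip(self.fingers, sorted(midi_notes)):
            finger.state = FingerState.PRESSING
            finger.key = note

hand = Hand()
hand.assign_chord([60, 64, 67])  # C major triad
for f in hand.fingers:
    print(f.name, f.state.value, f.key)
```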

Summer 2022

Ideated and designed new mechanisms for the hand that involve linear actuators and bevel gears. This design will be assembled and tested in Fall 2022 as a new iteration of the Spring 2022 hand design.

Spring2022_Design

Worked on fixing software issues that came up in Spring 2022 and on preparing the design for implementation on the hand in Fall 2023. A new software direction was also introduced to begin machine learning and model development for Optical Music Recognition.

Clip of the servos in operation:

Spring 2022

Spring2022_Design

Worked on improving the model developed in Fall 2021 by printing and testing it. An add-on for attaching the servo motors was developed and 3D printed.

Spring2022_ServoAttach

Software work primarily included setting up the environment on the Arduino and Raspberry Pi and integrating the flex sensors. Issues with the use of continuous servos were addressed, and 8-bit ADCs were added to improve testing.
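
On the Raspberry Pi side, a minimal sketch of the flex-sensor-to-servo loop might look like the following; it assumes an MCP3008-style SPI ADC on channel 0 and a hobby servo signal on GPIO 17 as stand-ins for the team's actual ADC and wiring.

```python
from time import sleep
from gpiozero import MCP3008, AngularServo

# Placeholder wiring: flex-sensor voltage divider on ADC channel 0,
# finger servo signal wire on GPIO 17.
flex = MCP3008(channel=0)
servo = AngularServo(17, min_angle=0, max_angle=90)

def flex_to_angle(reading, angle_range=90):
    """Map a normalized ADC reading (0.0-1.0) to a servo angle."""
    return max(0.0, min(1.0, reading)) * angle_range

try:
    while True:
        servo.angle = flex_to_angle(flex.value)
        sleep(0.02)  # ~50 Hz update, matching typical servo PWM
except KeyboardInterrupt:
    servo.detach()
```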

Fall 2021

Worked primarily on developing models and identifying the different parts that would need to be 3D printed. Hardware produced the following first iteration of the hand by the end of the semester.

Fall2021_Design

Software primarily worked on simulation and testing, and the following simulation was produced in TinkerCAD (its electronic component simulator supports Arduino code and was therefore useful for the first stage of testing). TinkerCAD's value for simulation testing was clear from early software tests of multiple fingers driven by MG90S servos.


Subteams and Roster

Project Manager

Revanth Krishna Senthilkumaran, Computer Engineering

Hardware

Hardware primarily works on creating and refining CAD models with the tools available, 3D printing and assembling the parts, testing functionality, and identifying points of improvement in the model.

  • Rugved Dikay, Aeronautical and Aerospace Engineering
  • Ian Ou, Computer Engineering
  • Archis Behere, Mechanical Engineering

Software

Software primarily works on developing the code and algorithms that move the hand to computed locations, along with setting up the electrical systems. More recent initiatives include model development for optical music recognition and migrating from Arduino to Raspberry Pi (a sketch of multi-channel servo control on the Pi follows the roster below).

  • Dhruv Sujatha, Data Science
  • Jacob Aldridge, Computer Science
  • Manas Paranjape, Computer Science
  • Visuwanaath Selvam, Computer Engineering
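
As a hedged sketch of how the Raspberry Pi could move several fingers at once, the example below drives a multi-channel PWM servo controller (such as the 16-channel board listed under References) through Adafruit's ServoKit library, assuming a PCA9685-style board; the channel-to-finger mapping and angles are illustrative, not the project's configuration.

```python
import time
from adafruit_servokit import ServoKit

# 16-channel PWM controller over I2C; channel assignments are placeholders.
kit = ServoKit(channels=16)
FINGER_CHANNELS = {"thumb": 0, "index": 1, "middle": 2, "ring": 3, "pinky": 4}

def press(finger, angle=60):
    """Curl one finger toward the key, then lift it back to rest."""
    channel = FINGER_CHANNELS[finger]
    kit.servo[channel].angle = angle
    time.sleep(0.15)          # hold long enough to sound the note
    kit.servo[channel].angle = 0

# Play a simple three-note sequence with different fingers.
for finger in ("thumb", "middle", "pinky"):
    press(finger)
    time.sleep(0.25)
```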

Resources

Inspiration / Other Projects

  • A team of researchers attempted to replicate the pressure applied in the grasping mechanism and achieved 17 DOF in a 5-finger hand. The actuators used are the most interesting part: McKibben actuators, which move based on differences in air pressure. ScienceDirect article
  • The ILDA Robot Hand: A 15 DOF, highly tactile robot hand, motion along surface of palm, and stepper motor actuation. Some of its capabilities include crushing cans, delicate grasping, and tactile tasks such as tapping. Overview and Videos of Functioning | Nature article
  • Soft actuated robots - Harvard paper
  • WPI - MQP
  • Allegro Hand - ROS Documentation | MIT Grad Student's Piano Hand
  • Robot Nano Hand - YouTube | Site
  • InMoov Robot Hand - Parts and Paper Link
  • Other similar project resources: Arduino Project Hub | Instructables
  • Video Links/Pages referred: Will Cogley | Automation Robotics' Clone
References

  • Piano Keys Research - Dimensions
  • Joint Design - Ball and Socket
  • Optical Music Recognition Datasets - Apacha Database List | Deepscores | Primus
  • Raspberry Pi-Arduino Connectivity - AranaCorp | Pi4 GPIO
  • Parts - ADCs | Multi-channel Servo Controller (16, 18) | Linear Servo Actuators - CAD Files 1, CAD Files 2
  • Add-ons: Strings/Thread 1 , 2 | Gloves
  • DOF Analysis - +Real-time Hand Tracking | Calculator | Mecharithm | Technopedia
  • Interesting Actuation Methods - Voltage-Controlled Linear Actuation