Engineering-Based Medicine

Merging Robotics and Neuroscience to Develop 3D-Printed Prosthetics

CSL Professor Tim Bretl investigates two distinct areas of research: robotics and neuroscience. Yet Bretl’s research, enabled by a multidisciplinary team of students, merges the two in applications that aim to positively impact society.

“It has been wonderful to work in these two areas because although they sound different, all my projects overlap. A lot of the same tools can be applied to all my projects,” said Bretl, who received his Ph.D. in 2006 from Stanford in aeronautics and astronautics and is now an associate professor of aerospace engineering. “I enjoy working with a diverse group of people who have expertise different from my own, and I’m lucky to have a great group of students who come to me with good ideas.”

One of Bretl’s four active projects involves developing upper-limb prosthetic devices. He and graduate student Aadeel Akhtar are building a prosthetic hand that connects to the user with electrodes that read muscle activity (called an electromyographic [EMG]-based interface) and incorporates sensory feedback.

“Developing prosthetics is all about building effective communication with the device,” said Bretl. “There’s the EMG interface that translates muscle activity into commands for the device—so how the person tells the device what to do—and then there’s how the device tells the person what it’s doing—so that’s sensory feedback—and we do all of that.”

Researchers from a number of disciplines are involved in this project, including John Rogers, professor of materials science and engineering, who specializes in the development of the electrodes used for the EMG interface. The group also works with the Rehabilitation Institute of Chicago, where the prosthetics will ultimately be applied and tested on amputees.

3D Printed Prosthetic Hand in Action

Students from Engineering at Illinois have created the first 3D-printed prosthetic with pattern recognition capability.

In August 2014, Akhtar led a team of Illinois graduate students, advised by Bretl, to South America, where they put their open-source dexterous artificial hand to the test on an Ecuadorian man.

The group has created one of the first 3D-printed prosthetic hands with pattern recognition capability. A machine-learning algorithm allows the hand to do more than just open and close: it learns additional hand positions for greater functionality. It can also be built for a mere $270, compared with the average myoelectric prosthetic, which retails for between $30,000 and $40,000. Even taking markup into consideration, that represents a significant cost decrease for the patient.

The hand is trained to replicate several motions by taking the electrical signals from muscles in the arm and sending them to an EMG board, which passes them to a microprocessor running a machine-learning algorithm. Based on those signals, the microprocessor sends commands to motor drivers, which turn the motors and make the hand move. Although the EMG board used in the current prototype is the size of a standard audio mixing board, it will eventually shrink to a size that can fit into the socket of a residual limb.
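The article does not publish the team's firmware, but the pipeline it describes maps naturally onto a short control loop: read a window of EMG samples, extract features, classify the intended grip, and command the motors. The Python sketch below is illustrative only; the names (emg_source, motor_driver, GRIP_CLASSES) and the feature choices are assumptions, not the team's actual code.

    # Illustrative sketch of the EMG-to-motion pipeline described above.
    # All device names here are hypothetical placeholders.
    import numpy as np

    GRIP_CLASSES = ["rest", "open", "tool_grip", "three_finger_grasp", "fine_pinch"]

    def extract_features(window: np.ndarray) -> np.ndarray:
        """Standard time-domain EMG features per channel: mean absolute
        value, waveform length, and zero-crossing count."""
        mav = np.mean(np.abs(window), axis=0)
        wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
        zc = np.sum(np.diff(np.signbit(window).astype(int), axis=0) != 0, axis=0)
        return np.concatenate([mav, wl, zc])

    def control_loop(classifier, emg_source, motor_driver):
        """Read a short window of EMG samples, classify the intended grip,
        and command the finger motors to match it."""
        while True:
            window = emg_source.read_window()          # e.g. 200 ms x N channels
            label = classifier.predict([extract_features(window)])[0]
            motor_driver.move_to(GRIP_CLASSES[label])  # drive the finger motors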

Akhtar’s team has created a mathematical model of five actions: a hand at rest, an open hand, a closed hand (tool grip), a three-finger grasp, and a fine pinch. The initial training takes about one to two minutes and involves the patient going through each of the gestures.

“Using the machine-learning algorithm based off the signals it picks up from the muscles, it can figure out which of these grips he is actually doing,” explained Akhtar. “The microcontroller with the machine-learning algorithm will then replicate the grip he’s trying to make.”
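As a rough sketch of what that one-to-two-minute training session might look like in code (continuing the hypothetical sketch above), labeled EMG windows are collected while the patient holds each grip, then a classifier is fit. Linear discriminant analysis is a common choice for EMG pattern recognition, though the article does not say which algorithm the team used.

    # Hypothetical training session: collect labeled EMG windows while
    # the patient holds each of the five grips, then fit a classifier.
    # Reuses GRIP_CLASSES and extract_features from the sketch above.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def train_grip_classifier(emg_source, seconds_per_grip=15):
        X, y = [], []
        for label, grip in enumerate(GRIP_CLASSES):
            print(f"Hold the '{grip}' gesture...")
            for window in emg_source.windows(seconds=seconds_per_grip):
                X.append(extract_features(window))
                y.append(label)
        clf = LinearDiscriminantAnalysis()   # simple, fast, widely used for EMG
        clf.fit(np.array(X), np.array(y))
        return clf

Five grips at roughly 15 seconds each comes to a little over a minute, consistent with the training time the team reports.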

Through 3D-Printed Prosthetic, Illinois Students Lending a Hand in Ecuador

A connection last spring with David Krupa, an Illinois alumnus, accelerated the project even more. Krupa co-founded the Range of Motion Project (ROMP), a non-profit organization in Guatemala and Ecuador that provides prosthetics and orthotics to those without access to rehabilitative care. Krupa was back on campus to receive the International Young Humanitarian Award from the U of I, and Akhtar met with him to discuss the team’s research. After hearing about the work, Krupa approached the U.S. embassy in Ecuador about sponsoring team members’ travel to Ecuador to test the device on a patient as soon as August. That put the project on a much quicker timeline.

Akhtar and Mary Nguyen, a master’s student in aerospace engineering, spent two weeks in Quito, Ecuador, putting the final touches on the prototype, demonstrating it for members of the embassy, and working with patient Juan Suquillo, who has a below-elbow amputation of his left arm from an injury suffered 33 years ago in a war with Peru. The team first demonstrated the product by having Adam Namm, the U.S. ambassador to Ecuador, successfully control the arm.

The event attracted its share of attention, including national media from Ecuador. The hand itself takes about 30 hours to print, then another two hours to assemble. All the electronics necessary to convert the neural signals into movements are located within the hand. Through a mechanical connection from one of the artificial fingers directly to the skin, the patient will also be able to better feel the position of the hand without looking at it.

“No commercial prosthetic device has any sort of feedback,” Akhtar said. “We’re going to put sensors in the fingers. Based on the amount of force that the fingertips are detecting, we are going to send a proportional amount of electrical current across your skin to stimulate your sensory nerves. By stimulating your sensory nerves in different ways with different amounts of current, we can make it feel like vibration, tingling, pain, or pressure.”
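One simple way to realize “a proportional amount of electrical current” is a linear map from fingertip force to stimulation current, clamped to a safe range. The sketch below is purely illustrative; the numeric ranges and function name are assumptions, not the team’s calibration.

    def force_to_current_mA(force_N, max_force_N=10.0, min_mA=0.5, max_mA=4.0):
        """Map measured fingertip force to a stimulation current.
        All ranges here are illustrative placeholders, not clinical values."""
        frac = min(max(force_N / max_force_N, 0.0), 1.0)  # clamp to [0, 1]
        return min_mA + frac * (max_mA - min_mA)

    # e.g. a 5 N grip maps to 0.5 + 0.5 * 3.5 = 2.25 mA of stimulation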

"It’s really awesome to be able to help people,” Nguyen said. “I didn’t imagine doing something that has this direct impact on the world while still in college.”

Watch Prof. Tim Bretl's TED Talk

The last of Bretl’s active projects, which are all funded by the National Science Foundation, involves brain-computer interfaces (BCIs). His team is developing an interface that uses brain activity gathered from an electroencephalogram (EEG) to allow people to do anything from typing words to flying an aircraft.
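The article gives no implementation detail, but a typical EEG pipeline of this kind extracts band-power features from each channel and classifies them into discrete commands, such as a typed letter or a control input. A minimal sketch, with assumed filter bands and names that are not details of Bretl's system:

    # Minimal sketch of an EEG band-power feature pipeline. The bands
    # and names are assumptions, not details of Bretl's system.
    import numpy as np
    from scipy.signal import welch

    def band_power(eeg_window, fs, band):
        """Average spectral power of each EEG channel within a band."""
        freqs, psd = welch(eeg_window, fs=fs, axis=0)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[mask].mean(axis=0)

    def eeg_features(eeg_window, fs=256):
        # Mu (8-12 Hz) and beta (13-30 Hz) bands, classic for motor imagery.
        return np.concatenate([band_power(eeg_window, fs, (8, 12)),
                               band_power(eeg_window, fs, (13, 30))])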

Though the potential applications are broad, Bretl says one of the main objectives is to help people with severe motor disabilities interact more fully with the world around them. Making BCIs work reliably would also be a significant scientific accomplishment.

“You have to understand a lot about brain function and how that’s reflected in EEG activity,” said Bretl. “So in thinking about how to get BCIs to work, it causes you to ask questions about the brain that tend to be a little different than what has traditionally been asked in neuroscience, and this could provide new insight for the scientific community.”

Robotics has always been a passion for Bretl, who, during his graduate studies at Stanford, worked on the algorithms that helped control the movement of the robotic legs that were used in Mars rovers. His interest in neuroscience began from collaborations with Todd Coleman, a former University of Illinois professor who specialized in network information theory.

“I mean, who doesn’t love robots? You figure out how to tell them what to do, and they do it—it’s a lot of fun,” Bretl said. “Even on the neuroscience side, a prosthetic hand is a robot that happens to be attached to a person. Robotics opens up a whole set of humanitarian applications and interesting science.”

***

Contributors: Mike Koon (mkoon@illinois.edu) | Josh Nielsen (jniels@illinois.edu) | August Schiess (aschiess@illinois.edu)