
Development of a 3D Printed Prosthetic Myoelectric Hand Driven by

DC Actuators

Emanuel de Jesus Lima, Armando S. Sanca

Technology Department, State University of Feira de Santana, Feira de Santana/BA, 44036-900, Brazil

Upper limb amputees receiving myoelectric devices are currently limited to either highly restrictive single-actuator devices or extremely expensive multi-digit actuation designs. Technological advances have led to increasingly sophisticated prostheses, and control of such devices necessitates real-time classification of electromyographic (EMG) signals. In this paper, we show that it is possible to construct a hand prosthesis with a varied, multi-digit gesture set that is affordable for low-income amputees. The system includes a noninvasive device placed on the forearm muscles to capture EMG signals and an intelligent system for classification of hand gestures, packaged in a prosthetic device that is easy to use, assemble, and maintain. This paper describes a real-time, portable system based on the Myo armband and a 3D printed prosthesis. The results demonstrate that this approach represents a significant step towards more intuitive, low-cost myoelectric prostheses, with possible extension to other assistive robotic devices.

Index Terms—Hand, Prosthetics, Myoelectric Signals, k-NN, 3D Printing.

I. INTRODUCTION

LIMB loss has a significant, negative impact on amputees, both personal and social. With issues ranging from psychosocial relationships and social dynamics to the functional ability, or inability, to engage in prior activities, amputees face significant challenges achieving their desired outcomes and capabilities. This can be particularly profound for upper limb amputees: they are generally less able to disguise their limb loss, and their loss generally has a more substantial negative impact on their daily activities.

In medicine, a prosthesis is an artificial device that replaces a missing body part, which may be lost through trauma, disease, or congenital conditions. These devices can be designed to provide a better aesthetic appearance and a psychological feeling of wholeness to the patient, to restore the functions of the lost limb, or some combination of these goals. For years, development was stymied by limitations of the available technology [1]. However, in the late twentieth and early twenty-first centuries, upper limb prosthetic devices that provide natural control based on remaining neuromuscular connections became commercially available, first using analog control systems [2], and more recently, digital signal processing systems [3]-[4].

Modern prosthetists can offer a range of devices that leverage different, and in many cases highly advanced, technologies, but these same prostheses often have a high cost, severely limiting people's access to this type of equipment. According to ABOTEC (Brazilian Association of Technical Orthopedics), less than 3% of Brazilian disabled people have access to high-tech prostheses [5]. There are now numerous devices available to upper limb amputees (arms and hands) that use sensors to capture information from the muscle contractions responsible for activation of human motor units, and send this information to a control system to activate the electro-mechanism of the prosthesis. Devices that make use of this kind of interface are commonly referred to as myoelectric arms and myoelectric hands [6]. The cost of a highly dexterous (providing individual control of digits) commercially available myoelectric hand is extremely high, reaching values over R$150.000.

Currently, there are many efforts to address this market failure with low-cost, high-efficiency options. In Brazil, researchers from the Federal University of Goiás (UFG) in Catalão are developing a 3D printed prosthetic device intended for low-income people. The product is a mobile application designed for controlling a bionic hand prosthesis that is printed on 3D printers [7]. In Europe and the United States, prosthetics projects like The Open Hand Project and the Enable Community Foundation (ECF) also develop accessible, low-cost assistive technology. The latter provides the 3D prototype project files for free so that anyone with access to a 3D printer can make parts for prosthetics.

In preceding research aimed at human-like control of prostheses, electromyogram (EMG) signals have been widely used as an interface tool for prosthetic hands [8], [9]. Improvements to EMG recognition and processing would bring superior control and replicate the neural control of the human hand. However, at present most of these systems operate exclusively as binary (on/off) states that depend on the EMG data [10], [11]. Current control schemes are often non-intuitive and introduce a "cognitive burden" in the sense that the user is required to learn to associate remnant muscle actions with unrelated postures of the prosthesis [12]. In order to bridge the gap towards human-like control, a Local Hand Control (LHC) replicating the musculoskeletal control of the human hand is needed.

Following this line of thought, this project was begun in collaboration with Seattle Pacific University, with the goal of developing a myoelectric hand that has features comparable to high-end dexterous prosthetic hands but is accessible at a substantially lower cost than available on the market. The initial step in this process was the development of a control system that uses electromyographic (EMG) signals. This system transfers the signals to an instrumented embedded system, which in turn effects the control of electric actuators moving a dedicated set of joints, or "fingers", thus affording fully dexterous operation. As an additional study, the use of three-dimensional additive manufacturing (3D printing) was investigated for producing the mechanical components of the hand. The application of these technologies enables a significant contribution to the social sphere.

In this paper, a brief introduction was first provided in Section I. Section II presents a literature review, addressing existing prosthetic hands both in academic research and in the commercial marketplace, a succinct explanation of electromyography and its history, the improvements in signal processing achieved by machine learning algorithms, and the benefits of 3D printing. Section III describes the materials and methods, presenting the components used to build the prototype as well as the control system. Numerical analysis and experimental results, including data collection and gesture recognition, are presented in Section IV to show the performance of the system. Finally, Section V provides some conclusions and potential future work.

II. LITERATURE REVIEW

A. Prosthetics Hands in the World

Upper limb prostheses are used following amputation at any level from the hand to the shoulder. The major goals of an upper limb prosthesis are to restore natural appearance and function. Reaching these goals also requires sufficient comfort and ease of use for continued acceptance by the user. The level of amputation (digits, hand, wrist, elbow, or shoulder), and therefore the complexity of the joint movement to be replaced, results in significantly greater technical challenges for higher-level amputations [13].

There are different types of prosthetic limbs, designed with different goals in mind. These goals often depend on the needs of the patient and the site of the amputation. Among the main companies developing prostheses, Motion Control and Otto Bock are responsible for 90% of the myoelectric device market, and a wide range of manufacturers produce products for passive or cosmetic needs [14].

1) Aesthetic Prostheses

Aesthetic prostheses are designed with appearance rather than mechanical function in mind. The prosthetic part is created from a standard mold and resembles an intact limb. They may be covered with expensive, detailed cosmetic covers (cosmesis) that incorporate advanced plastics and pigments uniquely matched to the patient's skin tone [14].

2) Body Powered Prostheses

Body-powered prostheses rely on cables, typically controlled by the trapezius muscles, to control the function of the terminal device. This kind of prosthesis is designed to provide assistive motions and operations supporting the opposing, intact limb. It can be manual (used with the assistance of the healthy member) or cable-operated [14].

3) Myoelectric Prostheses

Myoelectric prostheses are prosthetic limbs with a control system in which operation of the output apparatus is controlled by the processing of the electromyography signals which accompany the contraction of one or more muscles [14].

B. Electromyography

Electromyography (EMG) is an electrodiagnostic technique for evaluating and recording the electrical activity produced by skeletal muscles. This can be achieved invasively, by wires or needles inserted directly into the muscles (intramuscular electromyography), or noninvasively, by recording electrodes placed on the skin surface overlying the investigated muscles (surface electromyography) [15]. Many factors affect the EMG signal, such as motion artifacts, electrode misplacement, and noise interpolation. In order to obtain more information, signal processing is applied to EMG signals, such as filtering, rectification, baseline drift removal, and threshold leveling. In this development we used a surface device; thus, we will focus on surface electromyography (SEMG).
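The conditioning steps mentioned above (rectification, smoothing, baseline removal, threshold leveling) can be sketched as follows. This is a minimal illustration, not the pipeline used in this work; the window length and threshold value are arbitrary choices for the example.

```python
import numpy as np

def condition_emg(raw, window=20, threshold=0.1):
    """Basic surface-EMG conditioning sketch: rectify, smooth, threshold.

    raw: 1-D array of raw EMG samples (arbitrary units).
    window: length of the moving-average smoothing window, in samples.
    threshold: activation level; envelope values below it are zeroed.
    """
    rectified = np.abs(raw)                                  # full-wave rectification
    kernel = np.ones(window) / window
    envelope = np.convolve(rectified, kernel, mode="same")   # smoothing
    envelope = envelope - envelope.min()                     # crude baseline removal
    envelope[envelope < threshold] = 0.0                     # threshold leveling
    return envelope
```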

Recent research has resulted in a better understanding of the properties of SEMG recording. Over time, corrections have been made to this system so that the amplifiers currently have high input impedance and attenuate noise levels, which allows the reproduction of experiments without interference with the results [16]. Improved control options are now available to those who previously could not be fitted with such devices.

SEMG includes a device with surface electrodes that record electrical impulses of nerves at rest and during activity in order to characterize the electrical potential of a specific muscle or group of muscles. Electrical activity can be assessed by computer analysis of the frequency spectrum, amplitude, or root mean square of the electrical action potentials.
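As a small illustration of the amplitude and root-mean-square measures mentioned above, the features for one window of samples could be computed as follows (the window contents here are toy values, not real recordings):

```python
import numpy as np

def emg_features(window):
    """Compute common SEMG amplitude features over one window of samples."""
    window = np.asarray(window, dtype=float)
    mav = np.mean(np.abs(window))         # mean absolute value (amplitude)
    rms = np.sqrt(np.mean(window ** 2))   # root mean square
    return mav, rms

mav, rms = emg_features([3, -4, 0, 0])
print(mav, rms)   # -> 1.75 2.5
```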

In the recent past, SEMG has been applied to develop multi-degree of freedom robotic mechanisms that can effectively imitate the motion of the human limb. Work [17] at UCLA approached the multifunctional (MF) control problem using a large number of electrodes, though still considering only a limited part of the EMG spectrum. Researchers from Mepco Schlenk Engineering College in India [18] have also developed a human hand prosthesis.

The use of SEMG has many advantages. SEMG recordings provide a safe, easy, and noninvasive method that allows objective quantification of the energy of the muscle. The numerous applications of EMG include the diagnosis of neuromuscular disease or trauma in clinical practice, rehabilitation, and the study of kinesiological muscle function in specific activities [16].

C. Machine Learning Algorithms

Recent innovations in signal processing techniques and mathematical models have made it practical to develop advanced EMG recognition and analysis methods [19]. Different mathematical description methods and machine learning techniques, such as Artificial Neural Networks (ANN), fuzzy systems, probabilistic model algorithms, metaheuristic and swarm intelligence algorithms, and some hybrid algorithms, are used for characterization of EMG signals.

Machine learning is one of the fastest growing areas of computer science, with far-reaching applications including EMG signal classification. The term machine learning refers to the automated detection of meaningful patterns in data. In the past couple of decades it has become a common tool in almost any task that requires information extraction from large data sets [20]. A learning algorithm is an adaptive method by which a network of computing units self-organizes to realize the target (or desired) behavior.

Common machine learning algorithm types include [21]:

• Supervised Learning (Figure 1) - the algorithm generates a function that maps inputs to desired outputs. One standard formulation of the supervised learning task is the classification problem: the learner is required to learn (to approximate the behavior of) a function which maps a vector into one of several classes by looking at several input-output examples of the function.

• Unsupervised Learning - the algorithm models a set of inputs for which labeled examples are not available: there are no target outputs, and the learner receives no feedback from the environment.

• Semi-supervised Learning - the algorithm constructs a function or classifier from both labeled and unlabeled examples.

• Transduction - similar to supervised learning, but it does not explicitly construct a function: instead, it tries to predict new outputs based on training inputs, training outputs, and new inputs.

• Learning to Learn - the algorithm learns its own inductive bias based on previous experience.

In this development we focus on the supervised learning algorithm k-Nearest Neighbors (k-NN). Its simplicity and effectiveness have led it to be widely used in a large number of classification problems [21].

Fig. 1: Supervised Learning Algorithm

The k-NN classification rule is one of the most well-known and widely used nonparametric pattern classification methods. It classifies objects based on the closest examples in the feature space of the training set (Figure 2). The training sets are mapped into a multi-dimensional feature space, which is partitioned into regions based on the categories of the training set. A point in the feature space is assigned to a particular class if that class is the most frequent among the k nearest training data. Generally, Euclidean distance is used to compute the distance between the vectors [21].

Fig. 2: k-NN Classification

The scikit-learn project [22] provides an open source machine learning library for the Python programming language, and it was used to classify the EMG data in this development. To reduce the required number of distance calculations by efficiently encoding aggregate distance information for the sample, this library uses a data structure named KD tree (short for K-dimensional tree). The KD tree is a binary tree, each of whose nodes represents an axis-aligned hyper-rectangle. Each node specifies an axis and splits the set of points based on whether their coordinate along that axis is greater than or less than a particular value. The basic idea is that if point A is very distant from point B, and point B is very close to point C, then points A and C are known to be very distant without explicitly calculating their distance. In this way, the computational cost of a nearest-neighbors search can be reduced.
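The KD-tree structure described here is exposed directly in scikit-learn. A minimal sketch of building a tree and querying the nearest neighbors, using toy 2-D points in place of real EMG features:

```python
import numpy as np
from sklearn.neighbors import KDTree

# Eight toy "feature" points forming two well-separated clusters.
points = np.array([[0, 0], [0, 1], [1, 0], [1, 1],
                   [5, 5], [5, 6], [6, 5], [6, 6]])
tree = KDTree(points)

# Query the 3 nearest neighbors of a point near the first cluster.
dist, idx = tree.query([[0.2, 0.2]], k=3)
print(idx[0])   # indices of the 3 closest points
```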

In scikit-learn, neighbors-based classification is a type of instance-based learning or non-generalizing learning: it does not attempt to construct a general internal model, but simply stores instances of the training data. Classification is computed from a simple majority vote of the nearest neighbors of each point: a query point is assigned the data class which has the most representatives within the nearest neighbors of the point [23].
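A minimal sketch of the neighbors-based classification described above, again with toy clusters standing in for gesture classes rather than real EMG data:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Toy training set: two clusters standing in for two gesture classes.
X_train = np.array([[0, 0], [0, 1], [1, 0],     # class 0 (e.g. "rest")
                    [5, 5], [5, 6], [6, 5]])    # class 1 (e.g. "fist")
y_train = np.array([0, 0, 0, 1, 1, 1])

clf = KNeighborsClassifier(n_neighbors=3, algorithm="kd_tree")
clf.fit(X_train, y_train)

# A new reading is assigned the majority class of its 3 nearest neighbors.
print(clf.predict([[0.5, 0.5], [5.5, 5.5]]))   # -> [0 1]
```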


III. MATERIAL AND METHOD

A. Components Description

In this section, we provide hardware and electronics details, device configuration, and features of the components used to build the prototype.

1) Myo Armband and Gestures

Developed by Thalmic Labs, Myo (Figure 3) is a lightweight elastic armband that registers gesture commands. Myo consists of a number of EMG sensors that measure electrical activity in the forearm muscles to transmit gestures made with the hand to a connected device via Bluetooth. The armband has an 8-channel EMG sensor group supported by an Inertial Measurement Unit (IMU), which reports linear and rotational acceleration as well as the rotation angles around 3 axes [24]. The sensor readings are transferred over a low-power Bluetooth adapter to the computer.

Fig. 3: Myo Armband

To access the EMG signals and motion parameters on the worn arm, Thalmic Labs provides a Software Development Kit (SDK) which offers complete facilities for writing applications that make use of the Myo armband's capabilities. Currently, the SDK is available for four different platforms: Windows, Mac, iOS, and Android. To use Myo with Windows and Mac, it is necessary to attach the Bluetooth adapter to the computer, while using Myo with iOS and Android is done by installing an app. There is no official support for the Linux platform.

In addition to the SDK, some tasks can also be accomplished through scripting. Myo Connect is an API that runs connectors that can handle Myo events and use them to control applications. Myo Scripts are connectors written in Lua, a simple scripting language that is often used for application scripting [25].

Due to the lack of an SDK and support, access to the EMG signals on a Linux system can only be accomplished through scripting. Instead of using Myo Connect, which is supported only on the Windows and Mac OS platforms, we use the myo-raw library, a Linux alternative to what Myo Connect scripting does under Windows. While Myo Connect uses Lua scripting, myo-raw uses Python. According to its creator, the purpose of this library is to be as close as possible to the original Lua scripting in Myo Connect. More about this library will be discussed in the software section.

As the measured EMG signals depend on the sensor location on the muscles, Myo armband applications require special calibration gestures that must be performed every time the armband is put on or taken off the arm. In this way, the application calibrates the sensor locations depending on the sensed direction and rotation of the armband.

The IMU reports the motion-related parameters at a frequency of 50 Hz, whereas the frequency for the EMG signal is 200 Hz. The SDK provides a gesture object by which five different hand gestures are reported. The hand gesture set and the motions are composed of Hand Fist, Wave In, Wave Out, Spread Fingers, Double Tap, Rotate, and Pan (Figure 4).

Fig. 4: Gesture Set

2) Intel Edison

The Intel Edison is a system on a chip (SoC) powered by a dual-core 500 MHz Intel Atom x86 CPU and an Intel Quark processor, supported by 1 GB of LPDDR3 RAM, 4 GB of flash, and integrated WiFi and Bluetooth (2.1 and 4.0/BLE) [26]. It comes pre-installed with a Yocto Linux distribution to assist in hardware management.

Interfacing with the Intel Edison board is done by connecting a device to its fine-pitch, 70-pin Hirose connector. On that connector, the Edison breaks out I2C, SPI, multiple UARTs, and multiple GPIOs. In this development we also use three blocks: a Base Block, a GPIO Block, and a Battery Block, all procured from SparkFun, Inc. The Base Block serves as an add-on for the Intel Edison, allowing attachment of the USB dongle used to connect with Myo and a USB cable to connect to a terminal screen. The GPIO Block is a breakout board that brings the GPIO from the Intel Edison to the user. The Battery Block provides a single-cell LiPo charger and a 400 mAh battery to power the Intel Edison and expansion blocks.

3) 3D Printed Hand with Actuators

The 3D printed hand used in this development is an adaptation of the Raptor Hand created by the e-NABLE project. Developed collaboratively by some of e-NABLE's [27] top designers, the Raptor Hand is designed with ease of printing and assembly in mind. Features include 3D printed snap pins, a modular tensioning system, and compatibility with both velcro and leather palm enclosures. The Raptor Hand is licensed under the Creative Commons Attribution-ShareAlike license, which allows the material to be transformed and built upon for any purpose, even commercially.

Figure 5(a) and Figure 5(b) show the top and bottom views of the 3D printed Raptor Hand. The printing was done using Printrbot FDM printers at Seattle Pacific University, Seattle, WA, USA, from polylactide (PLA), a biodegradable and bioactive thermoplastic aliphatic polyester derived from renewable resources.

(a) Top View

(b) Bottom View

Fig. 5: 3D Printed Hand

As stated in [28], the most common actuator used in prosthetics today, excluding body power, is the direct current (DC) servomotor. DC servomotors are powerful, inexpensive, and small motors that are widely used in practical applications, such as automobiles, aircraft, portable electronics, and speed control applications. Being small and lightweight makes DC servomotors easy to package in the hand or forearm. The speed of a DC servomotor is proportional to the voltage applied to the motor. When using digital control, a pulse-width modulated (PWM) signal is used to generate an average voltage. The motor winding acts as a low-pass filter, so a PWM waveform of sufficient frequency will generate a stable current in the motor winding [29].

B. System Development

In this section, we describe the procedures for capturing, conditioning, and processing EMG signals, and for operating the servomotors using PWM signals. Figure 6 presents the software overview.

1) Signals Capture

As stated in Section III, there is no official support for the Myo armband on Linux. Instead, this project used myo-raw, a third-party library written in Python that is able to extract raw signals from the Myo and operates on a Linux platform. The library provides an interface to communicate with the Myo, providing the ability to scan for and connect to a nearby Myo, and giving access to the 200 Hz EMG sensor data and the 50 Hz IMU data.

Fig. 6: Software Overview

Once the system is started, the program continuously streams the EMG readings and passes them to a thread where they are processed.
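The stream-and-process pattern described above can be sketched with a queue feeding a worker thread. This is an illustrative skeleton only: in the real system the myo-raw EMG handler would push each 8-channel reading into the queue, whereas here simulated readings and a placeholder "classification" (taking the maximum channel value) stand in for the armband and the classifier.

```python
import queue
import threading

emg_queue = queue.Queue()
results = []

def worker():
    # Consume EMG readings until a None sentinel signals shutdown.
    while True:
        reading = emg_queue.get()
        if reading is None:
            break
        # Placeholder for classification of one 8-channel reading.
        results.append(max(reading))

t = threading.Thread(target=worker)
t.start()

# Simulated readings; the real producer is the myo-raw EMG callback.
for reading in [(1, 2, 3, 4, 5, 6, 7, 8), (8, 7, 6, 5, 4, 3, 2, 1)]:
    emg_queue.put(reading)
emg_queue.put(None)
t.join()
print(results)   # -> [8, 8]
```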

2) Signals Classification and Output Signals

The main goal here is to determine the performed hand gesture, based on the received EMG data from the forearm, while maintaining real-time response.

The signal classification is done by implementing the k-NN algorithm available in the Python machine learning module scikit-learn. The k-NN algorithm [23] implements learning based on the k nearest neighbors of each query point, where k is an integer value specified by the user. The optimal choice of the value of k is highly data-dependent: in general, a larger k suppresses the effects of noise but makes the classification boundaries less distinct. The value of k also depends strongly on the size of the database, and in the literature there is no consensus on how it should be calculated. The alternative adopted in this work was to follow a recommendation that sets k equal to the square root of the size of the training base divided by two, adjusted to an odd value to decrease the chances of a tie. As the training base is defined in the calibration stage, where approximately 900 samples are collected for each gesture, the value of k in this case was set to 15. The classification accuracy obtained with this value was considered good. The algorithm was chosen because it is simple enough to be implemented in real time with low computational cost.
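The rule of thumb described above (k equal to the square root of the training-base size divided by two, forced odd) can be written directly as:

```python
import math

def choose_k(n_samples):
    """k = sqrt(n)/2, forced odd to reduce the chance of voting ties."""
    k = max(1, int(round(math.sqrt(n_samples) / 2)))
    if k % 2 == 0:
        k += 1
    return k

print(choose_k(900))   # -> 15, matching the value used in this work
```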

The Calibration Process

To classify data we first have to train the classifier by defining gestures and assigning numbers to them. The Myo is placed at the top of the subject's forearm and the subject is instructed to execute each gesture (resting, making a fist, index finger, thumb finger) for approximately twenty seconds. It is necessary to start the program while the gesture is being held, not while the limb is moving to or from the gesture, and to move the limb around a little while holding the gesture during recording, to give the program a more flexible idea of what the gesture is. As long as the algorithm receives a gesture number as an argument, the current EMG readings are labeled and recorded as belonging to the gesture of that number. This is done by holding down a number key on the keyboard.

The Classification Process

With the program running, any time a new reading comes in, the program classifies it based on the trained values to determine which gesture it most resembles. If running in an environment with a screen, the screen can display the number of samples currently labeled as belonging to each gesture, and a histogram displaying the classifications of the last 25 inputs. The most common classification among the last 25 is shown and should be taken as the program's best estimate of the current gesture.
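The "most common of the last 25" rule can be sketched with a fixed-length deque and a counter; the simulated label stream below is illustrative:

```python
from collections import Counter, deque

WINDOW = 25
recent = deque(maxlen=WINDOW)   # holds the last 25 classifications

def update_estimate(label):
    """Record one classification and return the current best estimate."""
    recent.append(label)
    return Counter(recent).most_common(1)[0][0]

# Simulated stream: mostly gesture 1 with a few spurious classifications.
for label in [1] * 20 + [3, 1, 1, 0, 1]:
    estimate = update_estimate(label)
print(estimate)   # -> 1: spurious labels are outvoted
```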

Servomotors Positioning Process

When the system is started, three threads are created to manage each servo unit. Once a gesture is classified, the number assigned to that gesture in the calibration stage is sent to the threads, and the program checks whether the current servo position needs to be changed based on the received number (Figure 7). The servomotors then pull the cords to achieve the desired position.
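The check-and-update step can be sketched as a mapping from gesture number to a target position per servo. The angles and gesture-to-position table below are hypothetical examples, not the values used in the prototype:

```python
# Hypothetical target angles (degrees) per gesture for three servo units.
GESTURE_TARGETS = {
    0: (0, 0, 0),       # rest: all fingers relaxed
    1: (90, 90, 90),    # fist: all servos pulled
    2: (0, 90, 90),     # index finger extended
    3: (90, 0, 0),      # thumb hidden
}

servo_positions = [0, 0, 0]

def apply_gesture(gesture):
    """Update each servo only if its current position differs from the target."""
    changed = []
    for i, target in enumerate(GESTURE_TARGETS[gesture]):
        if servo_positions[i] != target:
            servo_positions[i] = target   # real system: send a PWM command here
            changed.append(i)
    return changed

print(apply_gesture(1))   # -> [0, 1, 2]: all three servos move to make a fist
print(apply_gesture(2))   # -> [0]: only the first servo needs to change
```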

Fig. 7: Software Overview Workflow

The servomotor units are operated by commands received from the Edison board in the form of pulse-width modulation (PWM). The actuators can rotate approximately 180 degrees (90 in each direction). Position "0" (1.5 ms pulse) is the middle, "90" (2 ms pulse) is all the way to the right, and "-90" (1 ms pulse) is all the way to the left.
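The angle-to-pulse relationship stated above is linear between 1 ms and 2 ms around a 1.5 ms center, so the mapping can be sketched as:

```python
def angle_to_pulse_ms(angle):
    """Map a servo angle in [-90, 90] degrees to a pulse width in ms."""
    if not -90 <= angle <= 90:
        raise ValueError("angle out of range")
    return 1.5 + (angle / 90.0) * 0.5

print(angle_to_pulse_ms(0))     # -> 1.5 (middle)
print(angle_to_pulse_ms(90))    # -> 2.0 (all the way right)
print(angle_to_pulse_ms(-90))   # -> 1.0 (all the way left)
```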

The Intel Edison provides a 1.8 V amplitude PWM output, which is not enough to operate the servo units. To solve this problem we built a signal conditioning circuit to bring the signal to the necessary voltage.

IV. EXPERIMENTS AND RESULTS

A. Hand Control System

The constructed system is composed of three main elements: the Myo armband (EMG sensors); the Intel Edison and signal conditioning boards, powered by batteries; and a 3D printed prosthetic hand actuated by servomotors (Figure 8). The solution allows the user to operate the prosthesis by contracting the forearm muscles in an intuitive way. The prototype was tested and demonstrated high classification success rates and support for multiple gestures at a low cost.

Fig. 8: Myoelectric Hand Driven by Servomotors

1) Data Collection

Data collection experiments were carried out in order to evaluate the streaming performance of the device, and to compare different sets of data for the same gesture in different arm positions. To collect data, the device (Myo) was placed on the forearm and the gestures were executed in three different arm positions (Figure 9).

Fig. 9: Arm positions for data collecting

For each arm position, the following gestures were performed:

• Arm relaxed, pointing down, fingers extended and relaxed.
• Arm relaxed, pointing forward, fingers extended and relaxed.
• Arm relaxed, pointing up, fingers extended and relaxed.
• Arm pointing down, making a fist.
• Arm pointing forward, making a fist.
• Arm pointing up, making a fist.
• Arm pointing down, closing the hand.
• Arm pointing forward, closing the hand.
• Arm pointing up, closing the hand.

Fig. 10: EMG Graph for the Closing Hand Movement

The EMG data for each gesture was stored in a file, which was then used to plot graphs using MATLAB (MathWorks, Inc., Natick, MA, USA).

Figure 10 shows in detail the sensor responses for the closing hand gesture with the arm positioned in three directions: up, down, and forward. The blue line shows the response when the arm is pointing down, the green line when the arm is pointing forward, and the red line when the arm is pointing up. S1, S2, S3, S4, S5, S6, S7, and S8 are the sensor numbers, where S1 represents the values of sensor 1, S2 the values of sensor 2, and so on. Although there are differences between the lines when a gesture is executed in different arm positions, the classifier can handle them and predict the correct gesture, as we demonstrate in the next section (Figure 11). We can conclude that the effect of arm position is not highly significant for gesture identification. Furthermore, we notice that as the hand is being closed, which is done by increasing the muscular contraction, the values of the sensors increase as well. In addition, it is visible that some sensors contribute (increase in value) to a given gesture more than others. These values may change if the sensor is moved or placed on a different portion of the arm, thus necessitating the calibration stage every time the sensor is moved or removed from the arm.

The graphs for the other gestures can be seen in the Appendices section.


Fig. 11: EMG graph for tests

2) Gesture Recognition

After data collection was complete, experiments were conducted using the classifier to recognize gestures. In all the experiments described here, the gestures were performed by a person with no amputation or malformation in the arm, and a reasonably hair-free arm (hair on the arm makes the readings less accurate, since it decreases the contact of the sensor with the skin). The Myo was placed at the top of the subject's forearm and the subject was instructed to execute the commands as described in each experiment. Figure 12 presents the implemented gestures, which include (from top to bottom) the rest position, making a fist, extending the index finger, and hiding the thumb.

Figure 11 shows the result of an experiment in which all four gestures (rest, making a fist, index finger, thumb finger) were performed for a period of time while the input signals from the eight sensors and the predicted class for each input signal were recorded.

In Figure 11, the blue lines represent the sensor values over time, the red line represents the predicted class, the green line shows the time interval in which the gestures were performed, and the yellow circle marks a transition between gestures.

When the recording began, the hand was in the "rest" position (class 0). A few seconds later, the "making a fist" (class 1) gesture was performed and, as can be seen, there is a change in the sensor level (blue line) and in the predicted class (red line); the same occurs when the other gestures were executed over time. We also notice some peaks and fast predicted-class changes (yellow circle). These happen mostly in the transition between gestures, because the algorithm faces an unknown gesture and tries to classify that input as belonging to the class that is most likely to match it. This effect is attenuated in the output signal to the servo by sending the signal only when a class is the most common among the last twenty-five classified classes. Because the sample rate is high relative to the speed of operation, this did not have an effect on the response.

Fig. 12: Hand Gestures Set


For visualization purposes, since the Intel Edison does not have a graphic screen, the second gesture recognition test was carried out on a system with a similar configuration (Linux operating system, Python 2.7, and the necessary libraries) but with a screen to visualize the classification output.

The gesture set for this experiment was as follows:

• Fingers and arm relaxed, no contraction, moving the arm in different planes (up, forward, down) (Gesture 0).
• Making a fist, light contraction, moving the arm around in different planes (up, forward, down) (Gesture 3).
• Hand opened and curved back, moving the arm around in different planes (up, forward, down) (Gesture 5).

In this experiment, the screen displays the number of samples currently labeled as belonging to each pose, and a histogram displaying the classifications of the last 25 inputs. In the calibration stage, the number assigned to each pose was held down for approximately 20 seconds. The average number of samples per file was 611. Once calibration was finished, the gestures were repeated and the screen displayed the results in real time. The results of the execution are seen in Figure 13(a), Figure 13(b), and Figure 13(c).

(a) Arm relaxing

(b) Making a fist

(c) Hand opened and curved back

Fig. 13: Gesture Detection

As can be seen in Figure 13(a), Figure 13(b), and Figure 13(c), 609, 611, and 613 samples were taken for the poses assigned the numbers 0, 3, and 5, respectively. The green line shows that the current reading matches the number assigned to that pattern, which means that it is the most common classification among the last 25 and should be taken as the program’s best estimate of the current pose.

The k-NN algorithm computes the k nearest neighbors for the current reading, which means that if an untrained pose is performed, the algorithm will estimate the pose in the training set most likely to match it. Future works may address this issue by, for example, limiting the allowed distance between two points. Figure 14 represents an untrained pose. This can be verified by the length of the green line (not fully filled) and the presence of two white lines, which mean that poses 3, 1, and 2 were all detected among the last 25 classifications, with pose 2 being the most common.
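The distance-limiting idea suggested above for rejecting untrained poses can be sketched with a nearest-neighbor classifier that returns no class when the closest training sample is too far away. The training data and threshold below are purely illustrative (the project's actual classifier uses the scikit-learn k-NN implementation on 8-channel EMG features):

```python
import numpy as np

# Toy 2-D training set: two samples each of two hypothetical poses.
X_train = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [1.1, 1.0]])
y_train = np.array([0, 0, 1, 1])

def classify_with_reject(x, max_distance=0.5):
    """1-NN classification with rejection: if the nearest training sample
    is farther than max_distance, treat the input as an untrained pose."""
    d = np.linalg.norm(X_train - np.asarray(x, dtype=float), axis=1)
    nearest = d.argmin()
    if d[nearest] > max_distance:  # too far from every trained pose
        return None                # reject as an untrained gesture
    return int(y_train[nearest])

print(classify_with_reject([0.05, 0.0]))  # near class 0 -> 0
print(classify_with_reject([5.0, 5.0]))   # far from all poses -> None
```

With such a rejection rule, an untrained gesture could be mapped to a safe “no command” state instead of being forced into the nearest trained class.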

Fig. 14: Untrained Pose

The final experiment was performed on the Intel Edison environment. At this point we chose the gestures to be implemented and put all parts of the system together.

In the calibration stage, the implemented gestures were performed and a number was assigned to identify the class each gesture belongs to: “rest position” was labeled as number 0, “making a fist” as number 1, extending the index finger as number 2, and hiding the thumb as number 3. The algorithm was then executed and, as each gesture was performed, the hand prosthesis responded with the same gesture. A video demonstration is available at the following link: (https://youtu.be/W8blCFG8PAI)
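The calibration stage described above can be sketched as a loop that records EMG samples while each gesture is held and tags them with the assigned class number. `record_emg_sample` is a hypothetical stand-in for the Myo armband acquisition, not the project's actual code:

```python
import random

# Class numbers assigned during calibration, as described in the text.
GESTURES = {0: "rest", 1: "fist", 2: "index extended", 3: "thumb hidden"}

def record_emg_sample():
    # Placeholder: a real system would return the 8 EMG channel readings
    # streamed from the Myo armband while the gesture is held.
    return [random.random() for _ in range(8)]

def calibrate(samples_per_gesture=100):
    """Collect labeled (sample, class) pairs for every gesture in turn."""
    X, y = [], []
    for label in sorted(GESTURES):
        # In practice the user holds the gesture while samples stream in.
        for _ in range(samples_per_gesture):
            X.append(record_emg_sample())
            y.append(label)
    return X, y

X, y = calibrate()
print(len(X), len(y))  # 400 400
```

The labeled pairs collected this way form the training set the k-NN classifier queries at run time.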

As demonstrated in the video, the system responds with very low latency.

Using the same gesture set, which had already been calibrated, we executed each gesture for approximately five seconds and recorded the number of samples taken in that time, the number of times the most common gesture was predicted, and the servo angle commands sent to the actuator units. For example, gesture 1 was executed for 5 seconds; during this time 235 samples were taken, 227 of which were predicted as belonging to class 0 (the first gesture), a success rate of 96.5%. The same calculation was made for the other gestures, and the results are presented in Table I.

Servo 1 controls the thumb, servo 2 controls the index finger, and servo 3 controls a joint shared by the three other fingers (middle, ring, and pinky). For servos 1 and 2, the “-90” angle command means the finger is open while “90” means the finger is closed. Due to mechanical and spatial limitations, servo 3 operates in the opposite direction relative to servos 1 and 2, so the angle commands are switched: “90” means the finger is open while “-90” means the finger is


PERFORMED   PREDICTED   PREDICTION   COMMAND OF      COMMAND OF      COMMAND OF
GESTURE     GESTURE     ACCURACY     SERVO 1 ANGLE   SERVO 2 ANGLE   SERVO 3 ANGLE

REST (0)    REST        96.5%            90             -90             -90
FIST (1)    FIST        92.5%           -90              90              90
INDEX (2)   INDEX       89.5%           -90             -90              90
THUMB (3)   THUMB       91.5%           -90             -90             -90

TABLE I: Input to output conversion table

closed. Observing Table I, the rest gesture was identified, so all the fingers were open at the time considered.
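The class-to-servo conversion in Table I can be expressed as a small lookup table; the angle values below are taken directly from the table, with servo 3 mechanically inverted as described. The function name is illustrative:

```python
# Gesture class -> (servo 1, servo 2, servo 3) angle commands, per Table I.
# Servos 1 and 2: -90 = open, 90 = closed; servo 3 is inverted.
SERVO_COMMANDS = {
    0: (90, -90, -90),   # rest
    1: (-90, 90, 90),    # fist
    2: (-90, -90, 90),   # index finger extended
    3: (-90, -90, -90),  # thumb hidden
}

def command_for(gesture):
    """Return the (servo1, servo2, servo3) angle commands for a class."""
    return SERVO_COMMANDS[gesture]

print(command_for(1))  # (-90, 90, 90)
```

Keeping the mapping in one table makes it straightforward to add gestures later or to recalibrate the angle limits for a different mechanical build.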

3) Affordability of Hardware System

The costs of the components of the prototyped system appear in Table II.

PART                              COST (R$)
3D printed hand                      200.00
Actuator units (MG90S Servo)          65.70
Sensor Myo Armband                  1100.00
Microprocessor Intel Edison          380.30
Base, GPIO, and Battery blocks       254.90
TOTAL                               2000.90

TABLE II: Prototype investment cost

Comparing the cost of prototyping the system with the price of commercial myoelectric hand prostheses available on the market suggests that the proposed solution may increase the opportunity for low-income people to gain access to a sophisticated device to replace the missing limb.

V. CONCLUSION AND FUTURE WORKS

The Myo armband is a promising interface for the development of prosthetic devices. Our results show that the quality of the motion and muscle sensing data is sufficient for EMG signal classification. The greatest limitation is that, as of now, there is no official support for the Linux platform; this resulted in a significant increase in debugging time and would be addressed by commercial support through a manufacturer-provided SDK. In this project, our goal was to design an affordable solution to the hand prosthesis problem. The experimental results of the trials described in this paper demonstrate that this myoelectric interface and control system has great potential to become a usable means for amputees to achieve both ease of use and dexterous functionality, by making it affordable for low-income individuals and by allowing them at last to control their hand prosthesis in a more intuitive and natural way. In the video accessible at this link (https://youtu.be/9s-8xCSUViU), we show an amputee using the system.

Future work will address known weaknesses of this solution. First, we plan to build the entire system on an embedded platform for aesthetic purposes. We additionally plan to examine different classification algorithms to achieve higher gesture recognition accuracy. Further, we plan to use the EMG signals to find conditions to regulate the movements rather than just identify them, and to extend this solution so that each finger can be controlled separately.

ACKNOWLEDGMENT

First of all, I would like to thank God for giving me the power and wisdom to complete this development, and my parents for their love and support throughout my life.

The authors are immensely grateful to Dr. Adam Arabian for advising the student during the academic exchange and for providing the components needed to build and complete this project; without his support, we would never have succeeded in getting this project done.

The student also thanks the State University of Feira de Santana, AERI, and CAPES for making his stay in the United States of America possible by providing academic and financial assistance.

The student places on record his sincere gratitude to Dr. Delmar B. Carvalho for his orientation and advice in scientific research over the years.


APPENDIX A

GRAPHS PLOTTED IN MATLAB

A graph containing the result of the EMG signal readings for a rest gesture is attached at the end of this paper, in Figures 15 and 16.

A graph containing the result of the EMG signal readings while executing the 4 gestures with the arm in 3 different positions is shown in Figure 17.

