One of the main themes in my writing has been helping people with special needs in every way that I can. I encourage developers to add as many accessibility features as possible to their applications. In fact, I wrote Accessibility for Everybody with that specific goal in mind. It shouldn’t surprise you that I keep track of developments in robotics that could potentially help those with special needs. My last post on the topic, The Bionic Person, One Step Closer, discussed the use of new technology to give sight to those without it. I recently read an article on an entirely different facet of the topic: helping people who can’t move their own bodies much, if at all, namely quadriplegics.
The article, “Paralyzed Mom Controls Robotic Arm Using Her Thoughts,” tells of a mother who would just love to be able to feed herself a bar of chocolate. A new robotic setup can read the required movements directly from her brain and direct a robotic arm to perform them. I find this amazing! Imagine not being able to do anything for yourself one day and then being able to perform little tasks that we all take for granted the next. Can you imagine what this woman goes through every time her nose itches? The thought has entered my mind more than once.
This technology has been in the works for quite some time, and engineers are steadily making it more natural to use. Before now, people who could master the techniques and had the money could use voice-controlled robotic arms. However, these devices are incredibly clumsy and difficult to work with. You can even try one out yourself for the low cost of $55.00 (check out “How to make a voice-controlled robot arm for $55”). This particular device is limited compared to the robotic arms used by those with special needs, but it would be enough to give you the idea.
The most important part of this new technology is that it keeps the user involved. Even when robotic arms of the past achieved their goals, they often left the user feeling out of control or possibly out of the picture entirely. The article “Quadriplegics Prefer Robot Arms on Manual, not Automatic” explains the issue. These older technologies are advanced enough to get a glass of water, feed the user, and even scratch that itchy nose, but the user wants to be involved in the process. Mind-controlled robots can keep users involved in their own lives.
We’re living in an incredibly exciting time. It’s a time when it’s becoming possible for everyone to participate in life more fully. People who would have lived diminished lives in the past are now starting to engage in activities that everyone else performs. The playing field of life is becoming more level, which helps humanity as a whole. Let me know your thoughts on robotic technologies at John@JohnMuellerBooks.com.
It wasn’t very long ago (see Robotics in Your Future) that I wrote about the role of robotics in accessibility, especially with regard to the exoskeleton. At that time, universities and several vendors were experimenting with exoskeletons and showing how they could help people walk. The software solutions I provide in Accessibility for Everybody are still part of the answer, but it increasingly appears that technology will provide more direct answers, which is the point of this post. Imagine my surprise when I opened the September 2011 National Geographic and found an article about eLEGS in it. You can get the flavor of the article in video form on the National Geographic site. Let’s just say that I’m incredibly excited about this turn of events. Imagine, people who had no hope of ever walking again are now doing it!
This technology has moved beyond the experimental stage: clinical trials for the device have already begun. The exoskeleton does have limits for now. You need to be under 6 feet 4 inches tall, weigh less than 220 pounds, and have good upper body strength. Even so, it’s a great start. As the technology evolves, you can expect to see people doing a lot more walking. Of course, no one with special needs is running a marathon in this gear yet. However, I can’t even begin to imagine the emotion these people feel when they get up and walk for the first time. The application of this technology is wide ranging: over 6 million people currently have some form of paralysis that it could help.
eLEGS is gesture-based. The way a person moves their arms and upper body determines how the device reacts. Training is required. The person still needs to know how to balance their body and must expend the effort to communicate effectively with the device. I imagine the requirements for using this device will decrease as time goes on. The gestures will become less complex and the strength requirements less arduous.
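Out of curiosity, I sketched what gesture-based control might look like in principle. This little Python example is purely illustrative (the actual eLEGS control system is proprietary and far more sophisticated); the gesture inputs, thresholds, and commands are my own invention, not anything documented by the vendor:

```python
# Hypothetical sketch of gesture-based gait control, NOT the real eLEGS logic.
# The idea: simple upper-body gestures (crutch position, torso lean) are
# translated into discrete walking commands for the exoskeleton.

def gait_command(left_crutch_forward: bool, right_crutch_forward: bool,
                 torso_lean_deg: float) -> str:
    """Translate gesture readings into a walking command (illustrative only)."""
    if torso_lean_deg < 5.0:
        return "stand"       # upright torso: hold position
    if left_crutch_forward and not right_crutch_forward:
        return "step_right"  # advancing the left crutch cues a right step
    if right_crutch_forward and not left_crutch_forward:
        return "step_left"   # advancing the right crutch cues a left step
    return "stand"           # ambiguous input: do nothing, for safety

print(gait_command(True, False, 12.0))  # step_right
print(gait_command(False, False, 2.0))  # stand
```

Even in this toy form, you can see why training matters: the wearer has to learn to produce clean, unambiguous gestures, and the safe default when the input is unclear is to stand still.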
So, what’s next? Another technology I’ve been watching for a while now is the electronic eye. As far as I know, this device hasn’t entered clinical trials yet, but scientists are working on it. (It has been tested in Germany and could be entering trials in the UK.) The concept is simple. A camera in a special set of glasses transmits visual information to a chip implanted in the person’s eyeball. The chip then sends the required signals to the person’s brain through the optic nerve. However, the implementation must be terribly hard because our understanding of precisely how all of this works is still incomplete.
Even so, look for people who couldn’t walk to walk again soon, and for those who couldn’t see to see again sometime in the future. There will eventually be technologies to help people hear completely as well. (I haven’t heard of any technology that restores the senses of smell, taste, or touch to those who lack them.) This is an exciting time to live in. An aging population will have an increasing number of special needs. Rather than making the end of life a drudgery, these devices promise to keep people active. Where do you think science will go next? Let me know at John@JohnMuellerBooks.com.