Remember the Six Million Dollar Man TV series? When ace test pilot Steve Austin crashed, he was in such poor shape that his only chance of survival was to replace damaged body parts with technology. That, I suppose, was the beginning of “wearables.” The government decided to invest millions in the latest technology to rebuild him with cybernetic parts, giving him superhuman speed, reflexes, and strength, and turning him into a super-agent.
Thirty-five years later, DARPA's Reliable Neural-Interface Technology (RE-NET) program is working to improve the speed and accuracy of neural interfaces. Such interfaces can sit in the brain, as in the case of Hemmes, or somewhere farther down the line, closer to the missing limb. Translating those neural commands into the muscle signals that drive a myoelectric arm is a highly complicated task. Most importantly, none of these approaches requires brain surgery or an implant. Challenges remain, though: weather changes and other external factors can affect the myoelectric arm, and there is the issue of electrodes slipping on the skin.
The idea of implanted electrodes is very attractive, with the goal of providing near-natural hand movement. Another sign of progress: the FDA just approved the world's first bionic eye. The Argus II, which treats patients with the rare genetic condition retinitis pigmentosa, was approved after twenty years of research and $200 million of investment. The Argus II Retinal Prosthesis System, approved to treat people with severe to profound retinitis pigmentosa, doesn't actually restore vision; it gives patients just enough to detect light and dark, and that contrast lets them identify the movement or location of objects. It's almost like the first generation of cell phones: we are a few years away from advanced features. For people whose retinal photoreceptor cells are damaged, the device restores the ability to tell the difference between light and dark.
To restore light-and-dark vision, a tiny chip containing many electrodes is implanted on the patient's retina. The patient wears a pair of glasses with a video camera attached. The camera sends the recorded image wirelessly to the electrodes in the chip, which turn the information into a signal the brain can understand. The result is only a low-resolution, 60-pixel image, but it is enough to give people a sense of what's in front of them.
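To get an intuition for how coarse 60 pixels is, here is a minimal sketch of the kind of downsampling such a system might perform. The 6×10 electrode layout, threshold value, and function names are illustrative assumptions for this example, not the actual Argus II design:

```python
import numpy as np

def to_electrode_grid(image, rows=6, cols=10):
    """Downsample a grayscale frame (2D array, values 0-255) to a
    rows x cols grid of average brightness values, one per electrode.
    The 6x10 = 60-electrode layout is an illustrative assumption."""
    h, w = image.shape
    grid = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            # Average the block of camera pixels covered by this electrode.
            block = image[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            grid[r, c] = block.mean()
    return grid

def stimulation_pattern(grid, threshold=128):
    """Reduce brightness to a binary light/dark pattern, mimicking the
    coarse contrast the prosthesis conveys."""
    return grid >= threshold

# Example: a camera frame that is dark on the left, bright on the right.
frame = np.zeros((60, 100), dtype=np.uint8)
frame[:, 50:] = 255
pattern = stimulation_pattern(to_electrode_grid(frame))
print(pattern.astype(int))  # left half of the grid dark, right half light
```

Even this toy version shows why patients can locate a bright doorway or a moving silhouette but cannot read text: all spatial detail within each electrode's block is averaged away.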
Breakthroughs in nanochips will open up plenty of opportunities to improve our senses: to replace them in medical cases, or to enhance them in others. In less than ten years, those who can afford to spend $25,000 may be able to get a bionic eye that captures a wider light spectrum or augments visual search through Google. I won't be an early adopter, for sure.