Artificial nervous systems: Can robots finally feel?

Image credit: iStock

Artificial nervous systems might finally give prosthetic and robotic limbs a sense of touch.
    • Author: Quantumrun Foresight
    • November 24, 2023

    Insight summary

    Artificial nervous systems, drawing inspiration from human biology, are transforming how robots interact with the sensory world. From a seminal 2018 study in which a sensory nerve circuit discerned Braille, to the National University of Singapore's 2019 artificial skin that outpaces human tactile feedback, these systems are advancing rapidly. South Korean research in 2021 further demonstrated a light-responsive system that controls robotic movement. These technologies promise enhanced prosthetic senses, human-like robots, improved rehabilitation for neurological impairments, tactile robotic training, and even augmented human reflexes, with the potential to reshape medicine, the military, and space exploration.

    Artificial nervous systems context

    One of the first studies in artificial nervous systems came in 2018, when researchers from Stanford University and Seoul National University created a nerve system that could recognize the Braille alphabet. The feat was enabled by a sensory nerve circuit that can be embedded in a skin-like covering for prosthetic devices and soft robotics. The circuit had three components: a touch sensor that detects small pressure points, a flexible electronic neuron that receives the sensor's signals, and an artificial synaptic transistor, activated by the first two, that mimics a biological synapse (the junction through which one neuron relays signals to another). The researchers tested the circuit by connecting it to a cockroach leg and applying different pressure levels to the sensor; the leg twitched in proportion to the pressure applied.
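
    To make that signal chain concrete, below is a minimal Python sketch of the three-stage pipeline. The function names, constants, and the linear pressure-to-voltage model are hypothetical illustrations, not the researchers' published circuit.

        # Toy model of the 2018 circuit: pressure -> sensor voltage ->
        # pulse frequency -> integrated synaptic drive. All values are
        # illustrative assumptions.

        def touch_sensor(pressure_kpa: float) -> float:
            """Map applied pressure to a sensor voltage (toy linear model)."""
            return min(pressure_kpa * 0.05, 1.0)  # saturates at 1.0 V

        def electronic_neuron(voltage: float) -> float:
            """Convert sensor voltage to a pulse rate: more pressure,
            faster pulses (as a ring oscillator might)."""
            return 100.0 * voltage  # pulses per second

        def synaptic_transistor(pulse_rate_hz: float, window_s: float = 0.1) -> float:
            """Integrate pulses over a short window, mimicking how a
            synapse sums presynaptic activity into an output drive."""
            return pulse_rate_hz * window_s

        for pressure in (5.0, 10.0, 20.0):  # kPa, light to firm touch
            drive = synaptic_transistor(electronic_neuron(touch_sensor(pressure)))
            print(f"{pressure:5.1f} kPa -> actuation drive {drive:5.1f}")
        # Stronger presses yield a larger drive, just as the cockroach leg
        # twitched harder under greater pressure.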

    One of the main advantages of artificial nervous systems is that they can mimic how humans respond to external stimuli, something traditional computers struggle to do. Conventional architectures can't react quickly enough to changing environments, a capability that's essential for tasks like prosthetic limb control and robotics. Artificial nervous systems manage it through a technique called "spiking": transmitting information as discrete electrical events, modeled on how neurons communicate in the brain. Because a spiking system responds only when something happens, rather than continuously sampling and processing digital signals, it can react faster and more efficiently. This advantage makes artificial nervous systems well suited to tasks that require quick reactions, such as robotic manipulation. They can also be used for jobs that require learning from experience, such as facial recognition or navigating complex environments.
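
    The sketch below illustrates spiking with a leaky integrate-and-fire neuron, the textbook model of event-based signaling. It is a generic illustration of the technique, not the circuitry used in any study cited here.

        # Leaky integrate-and-fire (LIF) neuron: input accumulates, leaks
        # a little each step, and a spike fires when the potential
        # crosses a threshold. Constants are arbitrary.

        def lif_spikes(inputs, threshold=1.0, leak=0.9):
            """Return a spike train (True = spike) for a sequence of inputs."""
            potential, spikes = 0.0, []
            for x in inputs:
                potential = potential * leak + x  # integrate with leak
                if potential >= threshold:
                    spikes.append(True)
                    potential = 0.0  # reset after firing
                else:
                    spikes.append(False)
            return spikes

        # A strong stimulus fires quickly; weak input leaks away without
        # ever firing, so nothing is transmitted when nothing happens.
        print(lif_spikes([0.2, 0.2, 0.9, 0.9, 0.1, 0.0, 0.0]))
        # -> [False, False, True, False, False, False, False]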

    Disruptive impact

    In 2019, researchers at the National University of Singapore developed one of the most advanced artificial nervous systems to date, one that can give robots a sense of touch even better than human skin. Called the Asynchronous Coded Electronic Skin (ACES), the device processes each sensor pixel individually to transmit "feeling data" rapidly. Earlier artificial skins processed these pixels sequentially, which introduced lag. In the team's experiments, ACES outperformed human skin in tactile feedback, detecting pressure more than 1,000 times faster than the human sensory nervous system.
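
    The toy comparison below shows why event-driven readout scales better than sequential scanning. The timing constants and function names are assumptions chosen for illustration, not the published ACES protocol.

        # Sequential scanning vs. asynchronous (event-driven) readout of
        # a tactile pixel array. All timings are assumed values.

        POLL_TIME_S = 0.001  # assumed time to poll one pixel

        def sequential_latency(pixel_index: int) -> float:
            """A scanning skin polls pixels one by one, so a touch on
            pixel N waits behind N earlier polls."""
            return (pixel_index + 1) * POLL_TIME_S

        def asynchronous_latency(pixel_index: int) -> float:
            """An event-driven skin reports a touch the moment it occurs;
            latency stays constant regardless of array size."""
            return 0.00006  # ~60 microseconds, an assumed constant

        pixel = 900  # a touch near the end of a 1,000-pixel array
        print(f"sequential: {sequential_latency(pixel) * 1000:7.2f} ms")
        print(f"async:      {asynchronous_latency(pixel) * 1000:7.2f} ms")
        # The gap grows with every added pixel, which is why asynchronous
        # coding supports large, high-resolution skins without lag.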

    Meanwhile, in 2021, researchers from three South Korean universities developed an artificial nervous system that responds to light and performs basic tasks. The setup comprised a photodiode that converts light into an electrical signal, a robotic hand, a neuron circuit, and a transistor that acts as a synapse. When a light turns on, the photodiode translates it into signals that travel through the synaptic transistor; the neuron circuit then processes them and commands the robotic hand to catch a ball programmed to drop the moment the light switches on. The researchers hope to refine the technology until the hand can catch the ball as soon as it drops. The study's broader goal is to help people with neurological conditions regain control of limbs that no longer respond as quickly as they once did.
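
    Below is a minimal end-to-end sketch of that light-to-motion pipeline. Each function is a hypothetical stand-in for a hardware stage, not the Korean team's actual implementation.

        # Light -> photodiode -> synaptic transistor -> neuron circuit ->
        # robotic hand. Thresholds and weights are illustrative only.

        def photodiode(light_on: bool) -> float:
            """Convert incident light into a photocurrent (arbitrary units)."""
            return 1.0 if light_on else 0.0

        def synaptic_transistor(current: float, weight: float = 0.8) -> float:
            """Scale the signal the way a synapse weights its input."""
            return current * weight

        def neuron_circuit(signal: float, threshold: float = 0.5) -> bool:
            """Issue a motor command when the weighted signal crosses threshold."""
            return signal >= threshold

        def robotic_hand(fire: bool) -> str:
            """Actuate the gripper in response to the neuron's command."""
            return "close grip (catch the ball)" if fire else "stay open"

        for light in (False, True):
            command = neuron_circuit(synaptic_transistor(photodiode(light)))
            print(f"light {'on ' if light else 'off'} -> {robotic_hand(command)}")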

    Implications of artificial nervous systems

    Wider implications of artificial nervous systems may include: 

    • The creation of humanoid robots with human-like skin that can respond to stimuli as quickly as humans.
    • Stroke patients and people with paralysis-related conditions regaining their sense of touch through sensory circuits interfaced with their nervous systems.
    • Robotic training becoming more tactile, with remote operators able to feel what the robots are touching. This feature can be handy for space exploration.
    • Advancements in touch recognition where machines can identify objects by simultaneously seeing and touching them.
    • Humans having augmented or enhanced nervous systems with quicker reflexes. This development can be advantageous for athletes and soldiers.

    Questions to comment on

    • Would you be interested in having an enhanced nervous system?
    • What are the other potential benefits of robots that can feel?
