Hyundai Wearable Robotics for Walking Assistance Offer a Full Spectrum of Mobility

DongJin Hyun, PhD, Hyundai Motor Company

"Using LabVIEW and the LabVIEW RIO architecture allowed us to reduce development and test time for our new robot control algorithm to just one week, compared to one month with a text-based approach. We are able to prototype with software and hardware faster and adapt to rapidly changing control requirements."

- DongJin Hyun, PhD, Hyundai Motor Company

The Challenge:

Developing a system that can run complex control algorithms, acquire data remotely from various sensors simultaneously, and perform real-time control of multiple actuators for a wearable walking-assistance robot.

The Solution:

Using the LabVIEW RIO platform, including a CompactRIO embedded system and the real-time processor and FPGA control architecture provided by Single-Board RIO, to acquire data from various sensors and to control peripheral units, high-speed communication devices, and actuators; and using LabVIEW software to acquire reliable data through real-time analysis and to apply various robot control algorithms, dramatically reducing development time.

The Central Advanced Research and Engineering Institute at Hyundai Motor Company develops future mobility technologies. Rather than provide conventional vehicle products to customers, this research center creates new mobility devices with a wide range of speeds for a variety of people, including the elderly and the disabled. As our society ages, there is a greater need for systems that can aid mobility. Thus, we are developing wearable exoskeleton robots with NI embedded controllers for the elderly and patients with spinal cord injuries to use.


In the field of wearable robotics, the physical interface between the human body and a robot raises engineering issues across mechanical design, control architecture construction, and actuation algorithm design. The space and weight allowed for electrical devices are extremely limited because a wearable robot must be put on like a suit. Additionally, the robot's overall control sampling rate must be fast enough that the robot neither impedes human motion nor fails to react properly to external forces. Also, although robotics researchers' efforts have produced many successful wearable robots, many questions remain regarding human augmentation and assistance control algorithms. Our group therefore focused on the following requirements when selecting a main controller for our wearable robots:

  • High-speed processing of data from various types of sensors
  • Compact size and low weight
  • Real-time data visualization for developing control algorithms
  • Connectivity to other smart devices for more convenient functions


System Configuration

The real-time control and FPGA hardware environment ensures reliability and stability by providing I/O compatible with a variety of robotic control devices. For instance, while building our wearable robots, the overall control architecture changed drastically several times because sensors were replaced or the control communication method changed. However, the unique onboard combination of real-time controller and FPGA in NI products allowed our group to manage these changes promptly, which shortened our development period.
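
The idea of absorbing sensor and bus changes without rewriting the control loop can be sketched in software terms. The following minimal Python sketch is illustrative only (the actual system is LabVIEW code on NI RIO hardware, and all class names, units, and scale factors here are assumptions): each sensor type hides its hardware details behind a common interface, so replacing a sensor or its communication method only adds a new subclass.

```python
from abc import ABC, abstractmethod

class Sensor(ABC):
    """Common interface: swapping hardware only requires a new subclass."""
    @abstractmethod
    def read(self) -> float: ...

class AnalogEncoder(Sensor):
    """Hypothetical quadrature encoder read through FPGA counters."""
    def __init__(self, raw_counts: int, counts_per_rev: int = 4096):
        self.raw_counts = raw_counts
        self.counts_per_rev = counts_per_rev
    def read(self) -> float:
        # Convert raw encoder counts to a joint angle in degrees.
        return 360.0 * self.raw_counts / self.counts_per_rev

class CanTorqueSensor(Sensor):
    """Hypothetical torque sensor delivered over a CAN frame."""
    def __init__(self, frame_bytes: bytes):
        self.frame_bytes = frame_bytes
    def read(self) -> float:
        # Decode a 16-bit signed torque value (0.01 N*m per bit, assumed).
        raw = int.from_bytes(self.frame_bytes[:2], "big", signed=True)
        return raw * 0.01

def control_step(sensors):
    # The control loop depends only on the Sensor interface, so a
    # hardware change leaves this code untouched.
    return [s.read() for s in sensors]
```

The same separation is what the real-time/FPGA split provides in hardware: the FPGA absorbs I/O and protocol changes while the real-time control logic stays stable.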


In addition, adopting the compact sbRIO-9651 System on Module (SOM) device helped us reduce the robot’s weight to less than 10 kg while maximizing battery efficiency through a low-power base system configuration.


Why We Chose LabVIEW

As robots take on more complex tasks, the number of sensors and actuators grows significantly, and the complexity of the control algorithms grows exponentially. Simultaneously processing the data from every sensor and sending instructions to every actuator therefore becomes one of the most important challenges in robotics. LabVIEW supports concurrent visualization, giving us intuitive signal processing for the sensors installed on our robots and supporting control algorithm design during the experimental stages. Finally, NI products are expandable and compatible, so we may be able to use smart devices as user interfaces (UIs) in the future.
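
To make the "read everything, command everything" pattern concrete, here is a minimal Python sketch of one fixed-rate control cycle. It is a simplification under stated assumptions: the gains, the PD control law, and the function names are illustrative placeholders, not Hyundai's published algorithm, and the production system runs as LabVIEW code on the RIO target.

```python
def pd_torque(angle, velocity, target, kp=40.0, kd=2.0):
    """Simple per-joint PD law; the gains are illustrative only."""
    return kp * (target - angle) - kd * velocity

def control_cycle(joint_states, targets):
    """One fixed-rate cycle: sample all joints, command all actuators.

    joint_states: list of (angle_deg, velocity_deg_per_s) tuples
    targets: list of target angles, one per joint
    Returns one torque command per actuator, computed together so no
    joint's update lags behind the others within a cycle."""
    return [pd_torque(a, v, t) for (a, v), t in zip(joint_states, targets)]

# Example: two joints, the first 10 degrees away from its target.
commands = control_cycle([(10.0, 0.0), (45.0, 5.0)], [20.0, 45.0])
```

In the real robot this cycle must complete within the control sampling period so the suit never impedes the wearer's motion; that deterministic timing is what the real-time controller provides.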


Wearable Robotics for Walking Assistance

We initially built the following types of wearable robots:

  • Hip Modular Exoskeleton—A modular robot that provides walking assistance to people with discomfort in the hip area
  • Knee Modular Exoskeleton—A modular robot that provides walking assistance to people with discomfort in the knee area
  • Life-Caring Exoskeleton—A modular robot that combines the hip and knee parts to provide walking assistance to the elderly or people with difficulties moving the lower half of their bodies
  • Medical Exoskeleton—A modular robot that combines the hip and knee parts to provide walking assistance to patients who do not have the ability to move the lower half of their bodies on their own


Following the demonstration of the wearable Life-Caring Exoskeleton for walking assistance for the elderly at NIWeek 2015, we unveiled a wearable Medical Robot for people with paraplegia, also designed using LabVIEW and CompactRIO. In a joint clinical demonstration with the Korea Spinal Cord Injury Association in January 2016, a paraplegic patient equipped with this Medical Robot succeeded in sitting down, standing up, and walking on flat ground. The patient who participated in this clinical trial is paralyzed in the lower half of the body (injury at the 2nd and 3rd lumbar vertebrae) with motor and sensory paralysis, but could walk successfully with the assistance of the wearable Medical Robot after a short training period. Building on this achievement and our current development progress, we expect to manufacture a lighter, improved product with added functions by 2018 and to begin mass production in 2020.


Taking Advantage of Internet of Things Technologies for Future Development

We have research plans for integrating smart devices into the UI to address future challenges. Currently, robots for people with lower-body disabilities use crutches as wireless UIs for switching configurations, such as walking, sitting, climbing or descending steps, or normal mode. Embedding smart devices into this kind of UI can help users tune additional parameters, including stride, step duration, or the depth and width for sitting on a chair. Data related to walking patterns or normal activity range is also useful for treatment and rehabilitation: rehabilitation experts or doctors can configure more advanced parameters, such as forced walking time or joint movement adjustments, for ongoing treatment.
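
The split between user-tunable and clinician-only parameters could be modeled as follows. This Python sketch is purely hypothetical: the parameter names, ranges, and permission tiers are placeholders invented for illustration, since the robot's actual configuration scheme is not published.

```python
from dataclasses import dataclass, field

# Illustrative parameter ranges; treat every name and limit as a placeholder.
USER_LIMITS = {
    "stride_m": (0.2, 0.8),        # stride length
    "step_time_s": (0.8, 3.0),     # time to take one step
    "sit_depth_m": (0.3, 0.6),     # depth for sitting on a chair
}
CLINICIAN_LIMITS = {
    **USER_LIMITS,                  # clinicians can set everything users can,
    "forced_walk_time_s": (0.0, 600.0),  # plus treatment-only parameters
    "joint_range_scale": (0.5, 1.0),
}

@dataclass
class GaitConfig:
    params: dict = field(default_factory=dict)

    def set_param(self, name, value, clinician=False):
        """Validate and store a parameter sent from a smart-device UI."""
        limits = CLINICIAN_LIMITS if clinician else USER_LIMITS
        if name not in limits:
            raise PermissionError(f"{name} requires clinician access")
        lo, hi = limits[name]
        if not lo <= value <= hi:
            raise ValueError(f"{name} must be in [{lo}, {hi}]")
        self.params[name] = value
```

Range checks on the robot side keep a misbehaving or compromised UI from commanding unsafe motions, which matters once configuration moves from dedicated crutch buttons to general-purpose smart devices.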


We have started to develop the next-generation exoskeleton robot based on wireless technology to make gait analysis possible. When someone wears this robot, we can identify intention and walking status by collecting data from sensors between the sole of the foot and the ground. The technology that transmits this data over wireless ZigBee communication is already in place, and it can be expanded further using Internet of Things (IoT) technology. In other words, wirelessly acquired information can be sent to the robot so that it assists the wearer's movements. In addition, gathering relevant data can help users identify their personal range of activities and conditions by location, and that information can be fed back into the robot to enable more comprehensive service. If a patient wears this robot for rehabilitation, doctors can monitor the patient's and the robot's condition during rehabilitation and deliver real-time training or adjustments, improving the efficiency and effectiveness of treatment, a good example of data-driven technology in practice.
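
A simple way to see how foot-sole pressure data can reveal walking status and intention is the threshold sketch below. It is a minimal Python illustration, not the robot's actual algorithm: the phase names, the two-zone (heel/toe) sensor layout, and the pressure threshold are all assumptions made for the example.

```python
def gait_phase(heel_pressure, toe_pressure, threshold=5.0):
    """Classify foot contact from sole pressure in newtons.

    The 5 N threshold and two-zone layout are illustrative assumptions."""
    heel = heel_pressure > threshold
    toe = toe_pressure > threshold
    if heel and toe:
        return "midstance"
    if heel:
        return "heel_strike"
    if toe:
        return "toe_off"
    return "swing"

def detect_step_intent(phases):
    """A toe-off right after stance suggests the wearer intends to step,
    so the robot can begin assisting before the leg is fully in swing."""
    return (len(phases) >= 2
            and phases[-2] in ("midstance", "heel_strike")
            and phases[-1] == "toe_off")
```

In the described system, readings like these would be streamed from the sole sensors over ZigBee so the controller can react to the wearer's intention in real time.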


Author Information:

DongJin Hyun, PhD
Hyundai Motor Company
37, Cheoldobangmulgwan-ro
Uiwang-si, Gyeonggi-do 437-815
South Korea
Tel: +82 (031) 596 0920
mecjin@hyundai.com

Figure 1. Wearable Robot System Configuration
Figure 2. Real-time processing, offered by the CompactRIO platform, enabled Hyundai to address one of their greatest challenges: recognizing human intention.
Figure 3. LabVIEW Front Panel for Robot Control
Figure 4. LabVIEW Block Diagram for Robot Control
Figure 5. Hyundai Lower-Limb Exoskeletons
Figure 6. Concept of Modular Exoskeleton and Lower-Limb Exoskeleton
Figure 7. Life-Caring Exoskeleton in Use
Hyundai wearable robotics are designed to offer users a better quality of life.
The NI System on Module (SOM) provides Hyundai a powerful heterogeneous architecture in a small form factor to control their wearable robotics.
Hyundai used CompactRIO to quickly implement and adjust to rapidly changing control requirements.