LabVIEW Robotics KUKA youBot API

Overview

Mobile manipulation is a crucial technology for future cognitive service and manufacturing robotics. However, meeting the stringent requirements of both industry and academia presents many challenges in providing a streamlined hardware platform and robotics software framework. The recently launched KUKA youBot, an omnidirectional mobile manipulator platform, bridges the gap between education, basic research, and industrial application development. Meanwhile, the LabVIEW Robotics Module and the graphical system design approach provide scalability in algorithm design across multiple heterogeneous targets and complex architectures. This tutorial highlights the KUKA youBot API and demonstrates how you can design complex algorithms using the built-in functions.

Table of Contents

  1. Mobile Manipulator Platform
  2. LabVIEW KUKA youBot APIs
  3. Simulation and Hardware Abstraction
  4. Conclusion
  5. References

Mobile Manipulator Platform


Mobile manipulation, one of the key technologies for professional service and manufacturing robotics, requires seamless integration and synchronization of mobility and manipulation. Despite the importance of such technology in the domain, existing platforms serve primarily as proofs of concept with little to no potential to meet industrial needs. At the same time, the majority of these platforms rely on proprietary interfaces that hinder collaborative efforts among domain experts.


To enable and accelerate mobile manipulation technology, the KUKA youBot serves as a ubiquitous platform that bridges the gap between education, basic research, and industrial application development. The omnidirectional holonomic platform is based on a steel structure that is robust enough to carry a payload of 20 kg. It is driven by four KUKA omni-wheels, which are based on the Mecanum principle. The steel structure is covered by a plastic casing, which can be removed to provide access to the structure and to mounting holes that can be used to add additional sensors, e.g., laser scanners. The arm is made of cast magnesium and weighs 6.3 kg. It can lift 0.5 kg with its two-finger gripper. Each finger can be mounted in three different positions, increasing the gripping range to 70 mm, and is powered by an independent stepper motor, which enables fine grasps. The gripper can be detached from the flange, e.g., to use the maximum payload of the arm (0.7 kg), to mount a different end effector, or to add an adapter between flange and gripper for a (vision) sensor.

 

Figure 1. The Three "Islands" of Mobile Manipulation and the First Public Presentation of the KUKA youBot

All drives (4 platform drives, 5 arm drives, 2 stepper motors) can be accessed through EtherCAT, a real-time communication bus that is well established in the automation industry and is also used to control the latest KUKA industrial robots. The KUKA youBot drive protocol allows every user to access the drives at the lowest level (current) and at the highest rates possible (currently 1 kHz). Velocity and position control commands are also available at the joint level.
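The actual driver interface is provided through LabVIEW VIs, but the control pattern that the drive protocol enables can be sketched in Python. This is a minimal sketch only; the joint numbering and the send_joint_command helper are hypothetical stand-ins, not the youBot API.

    import time

    CONTROL_PERIOD_S = 0.001  # roughly the 1 kHz EtherCAT cycle described above

    def send_joint_command(joint_id, mode, value):
        """Hypothetical stand-in for writing one command over the EtherCAT bus.

        mode is "current", "velocity", or "position", matching the three
        command levels the drive protocol exposes per joint.
        """
        # A real system would write to the EtherCAT process image here.
        print(f"joint {joint_id}: {mode} = {value:.3f}")

    def run_current_control_loop(joint_id, current_setpoints_amps):
        """Stream low-level current setpoints at (approximately) the bus rate."""
        for setpoint in current_setpoints_amps:
            start = time.perf_counter()
            send_joint_command(joint_id, "current", setpoint)
            # Sleep off the remainder of the 1 ms cycle (best effort on a desktop OS).
            elapsed = time.perf_counter() - start
            time.sleep(max(0.0, CONTROL_PERIOD_S - elapsed))

    if __name__ == "__main__":
        # Ramp the current on arm joint 1 as a toy example.
        run_current_control_loop(joint_id=1, current_setpoints_amps=[i * 0.01 for i in range(100)])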


Establishing mobile manipulation as a powerful technology requires both a modular hardware platform that is suitable for education and industry and a software environment that is intuitive and scalable. Here, an integrated solution combining the KUKA youBot and LabVIEW Robotics is identified as the key enabler to accelerate robotics development.


LabVIEW KUKA youBot APIs


The LabVIEW KUKA youBot APIs were designed to follow the graphical system design approach, with the ability to switch from the simulated youBot to the real physical youBot. There are three major components of the KUKA youBot VIs, and we will discuss each of them in this section; a sketch of how they fit together follows Figure 3 below.

  • Initialization VIs - to initialize simulated or physical KUKA youBot
  • Arm VIs - to configure KUKA youBot manipulator
  • Platform VIs - to configure KUKA youBot platform

Figure 3. LabVIEW Robotics KUKA youBot APIs
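The three groups map onto a simple initialize, command, and shut-down structure. The following Python-style sketch mirrors that flow with hypothetical class and method names; it illustrates the program structure, not the LabVIEW API itself.

    class YouBotSession:
        """Hypothetical wrapper mirroring the three VI groups described above."""

        def __init__(self, simulated=True):
            # Initialization VIs: open either the simulated or the physical youBot.
            self.simulated = simulated
            print("Initialized", "simulated" if simulated else "physical", "youBot")

        def set_arm_joint_positions(self, positions_rad):
            # Arm VIs: command each of the five arm joints in radians.
            print("Arm joint setpoints [rad]:", positions_rad)

        def set_platform_velocity(self, vx, vy, wz):
            # Platform VIs: command the omnidirectional base in its steering frame.
            print(f"Platform velocity: vx={vx}, vy={vy}, wz={wz}")

        def close(self):
            # Release the session (simulated environment or EtherCAT connection).
            print("Session closed")

    if __name__ == "__main__":
        bot = YouBotSession(simulated=True)
        bot.set_arm_joint_positions([0.0, 0.5, -0.8, 0.3, 0.0])
        bot.set_platform_velocity(vx=0.2, vy=0.0, wz=0.1)
        bot.close()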

KUKA youBot Initialization APIs

The initialization VIs contain two instances: KUKA youBot and Simulated KUKA youBot. The design utilizes the hardware abstraction layer, discussed in the following section, to allow the user to maintain the same code base and architecture for both environments. This is a powerful capability that enables rapid prototyping in robotics design.
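The pattern behind the two instances can be sketched as a common interface with a simulated and a physical implementation selected at initialization time. The class and method names below are illustrative assumptions, not the actual VI names.

    from abc import ABC, abstractmethod

    class YouBotInterface(ABC):
        """Common interface; the concrete classes stand in for the two VI instances."""

        @abstractmethod
        def move_arm(self, joint_positions_rad):
            ...

    class SimulatedYouBot(YouBotInterface):
        def move_arm(self, joint_positions_rad):
            print("[sim] arm ->", joint_positions_rad)

    class PhysicalYouBot(YouBotInterface):
        def move_arm(self, joint_positions_rad):
            # A real implementation would push the setpoints over EtherCAT.
            print("[hw] arm ->", joint_positions_rad)

    def initialize_youbot(use_simulation: bool) -> YouBotInterface:
        """Mirrors the two initialization instances: the application code is unchanged."""
        return SimulatedYouBot() if use_simulation else PhysicalYouBot()

    # The code below never changes when the target is switched.
    robot = initialize_youbot(use_simulation=True)
    robot.move_arm([0.0, 0.5, -0.8, 0.3, 0.0])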

Figure 4. LabVIEW Block Diagram for Controlling Simulated KUKA youBot

KUKA youBot Arm APIs

Figure 5. KUKA youBot Arm APIs

The joints of the KUKA youBot arm are equipped with position encoders to measure joint angles. These are relative position encoders, which means they do not yield an absolute joint angle but an angular displacement relative to a reference position. These reference positions are also called home positions or rest positions, to indicate that the arm should start its movement from them.

Figure 6. Movement of the Manipulator to Calibrate Joints to the Home Position

The Calibrate Arm.VI calibrates the motors of the KUKA youBot arm joints and moves them to the home position as shown above. It is recommended to use this VI as part of the initialization process to ensure a safe starting position for the arm. The other VIs allow the user to send and read position commands in radians to command the position of each joint. The built-in Inverse and Forward Kinematics APIs in the Robotics Module are often used in conjunction with the KUKA youBot arm APIs. You can find these APIs in Robotics>>Robotic Arm>>Kinematics.
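The calibrate-then-command workflow can be sketched as follows. The joint limits, offsets, and driver class are illustrative assumptions for the sketch, not the youBot's actual values or API.

    import math

    # Illustrative joint limits in radians; the real youBot limits differ per joint.
    JOINT_LIMITS_RAD = [(-2.9, 2.9)] * 5

    class ArmDriver:
        """Hypothetical arm driver used to illustrate the calibrate/command flow."""

        def __init__(self):
            self.home_offsets = None  # unknown until calibration, since encoders are relative

        def calibrate(self):
            # Drive each joint to its reference position and latch the encoder reading,
            # mirroring the role Calibrate Arm.VI plays before normal operation.
            self.home_offsets = [0.0] * 5
            print("Arm calibrated; joints at home position")

        def set_joint_positions(self, positions_rad):
            if self.home_offsets is None:
                raise RuntimeError("Calibrate the arm before sending position commands")
            for i, (q, (lo, hi)) in enumerate(zip(positions_rad, JOINT_LIMITS_RAD), start=1):
                clipped = min(max(q, lo), hi)
                print(f"joint {i}: commanded {clipped:.3f} rad")

    arm = ArmDriver()
    arm.calibrate()                       # safe starting point, as recommended above
    arm.set_joint_positions([0.0, math.pi / 6, -math.pi / 4, math.pi / 8, 0.0])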

KUKA youBot Platform APIs

Figure 7. KUKA youBot Platform APIs

The KUKA youBot base is an omnidirectional mobile platform with four mecanum wheels. The figure below illustrates the attached base (coordinate) frame. Positive values for rotation about the z axis result in a counterclockwise movement, as indicated by the blue arrow. The wheel numbering for the mecanum wheels is also shown. The motor controllers of the four omnidirectional wheels can be accessed via Ethernet and EtherCAT. The Ethernet cable can be plugged either into the onboard PC or into an external computer; the plug is a standard network connector that fits any standard Ethernet port.

Figure 8. Overview of KUKA youBot Base

Create Platform Steering Frame.VI utilizes the configuration above and generates the steering frame for the KUKA youBot platform. Other APIs, such as Set Wheel Motor Speed.VI and Get Wheel Motor Speed.VI, give access to each of the four wheels of the youBot platform so you can achieve the desired steering behavior. These VIs are typically used in conjunction with Apply Velocity to Motors.VI, which can be found in Robotics>>Steering. These built-in functions in the LabVIEW Robotics Module allow the user to set and read steering frame velocities in all directions.
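Behind the steering frame is standard mecanum-wheel kinematics: a desired body twist (vx, vy, wz) maps to four wheel speeds. The sketch below uses the textbook inverse-kinematics formula; the wheel numbering, sign conventions, and geometry values are placeholders that would have to be matched to the actual youBot configuration shown in Figure 8.

    def mecanum_wheel_speeds(vx, vy, wz, wheel_radius=0.05, half_length=0.3, half_width=0.2):
        """Map a body twist (m/s, m/s, rad/s) to four wheel angular speeds (rad/s).

        Standard mecanum inverse kinematics; geometry values are placeholders,
        not the youBot's actual dimensions. Wheel order: front-left, front-right,
        rear-left, rear-right (adjust signs if the real numbering differs).
        """
        k = half_length + half_width
        w_fl = (vx - vy - k * wz) / wheel_radius
        w_fr = (vx + vy + k * wz) / wheel_radius
        w_rl = (vx + vy - k * wz) / wheel_radius
        w_rr = (vx - vy + k * wz) / wheel_radius
        return w_fl, w_fr, w_rl, w_rr

    # Pure sideways motion: all wheels spin, yet the base translates along y only.
    print(mecanum_wheel_speeds(vx=0.0, vy=0.3, wz=0.0))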


Utilizing all of the KUKA youBot APIs, the user can build and develop a variety of applications, from simple pick-and-place tasks to multi-agent collaborative work, as shown below. The LabVIEW Robotics Module can further extend the use of these APIs, and we will go through a couple of key technologies in the module in the following section.

Figure 9. LabVIEW Front Panel of a Simulated KUKA youBot Application

Figure 10. Multiple KUKA youBots for Collaborative Mapping Application


Simulation and Hardware Abstraction


One of the most common needs in robot development is simulation. Simulation allows developers to test control algorithms and code much faster than on actual hardware. In a real-world scenario, being able to validate code in simulation also helps lower costs and increase productivity. Nevertheless, the development time gained from testing in simulation can easily be lost if additional time is needed to translate the algorithm to execute on deployable hardware. Ideally, the exact same code can run either on the actual robot or in simulation, the only change being the specified target. This would give all the benefits of rapid testing in simulation without sacrificing time getting the code to execute on hardware.

The LabVIEW Robotics simulator imports the CAD model of the youBot into the simulation environment. Within the environment, the user can specify the positions of various components and insert additional simulated sensors and actuators, such as LIDAR or IR sensors. The LabVIEW simulator then simulates the KUKA youBot based on the Open Dynamics Engine™ for rigid body simulation.


At the same time, the LabVIEW Robotics Module introduced a hardware abstraction layer to accelerate rapid prototyping by validating algorithm designs in simulation first. Each I/O software interface has a general implementation, which can take the form of a real or simulated device. This enables us to test algorithms such as mapping, steering, and path planning in simulation and then use the exact same code on the hardware. Thanks to the combination of a hardware abstraction layer, simulated I/O, and retargetable code, there is a seamless transition between running the entire application on a desktop and deploying it on the robot.
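The value of the abstraction is that algorithm code is written once against an I/O interface rather than against a specific device. The sketch below uses a hypothetical range-sensor interface (the names are assumptions, not the LabVIEW API) to show a guarded-motion routine that runs unchanged against simulated or real I/O.

    from typing import Protocol
    import random

    class RangeSensor(Protocol):
        """Abstract I/O interface; real and simulated devices expose the same call."""
        def read_distance_m(self) -> float: ...

    class SimulatedLidar:
        def read_distance_m(self) -> float:
            return random.uniform(0.2, 3.0)   # synthetic returns standing in for the simulator

    class HardwareLidar:
        def read_distance_m(self) -> float:
            raise NotImplementedError("would read the physical sensor here")

    def guarded_forward_speed(sensor: RangeSensor, stop_distance_m: float = 0.5) -> float:
        """Algorithm written once against the interface: stop if an obstacle is close."""
        return 0.0 if sensor.read_distance_m() < stop_distance_m else 0.2

    # Switching targets only changes which implementation is constructed.
    print(guarded_forward_speed(SimulatedLidar()))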

Figure 11. KUKA youBot Simulation Model in LabVIEW Robotics Simulator

Figure 12. KUKA youBot with Simulated LIDAR Sensor


Conclusion


The KUKA youBot serves as an affordable mobile manipulator platform that can be used in research and education in various ways. The modularity of the KUKA youBot, which allows an additional kinematic arm, gripper, and sensors to be added, opens up new challenges for mobile manipulation research. At the same time, the LabVIEW Robotics KUKA youBot APIs provide physics-based simulation and interfaces for the KUKA youBot to further promote innovation and accelerate advanced robotics development. The hardware abstraction layer in LabVIEW bridges the gap between simulation and the real system and improves efficiency. Moving forward, incorporating advanced computing technology such as field-programmable gate arrays (FPGAs) is a natural progression. This would enable much faster I/O response times and motor control loops at 40 MHz with exceptional computing power. At the same time, a sophisticated software architecture with different types of computing targets (FPGA, Real-Time, and Windows) can cross-communicate, all using the same code base with deterministic processes.
