The Robotics and Autonomous Vehicles Summit unites the world's top roboticists, researchers, design engineers, and domain experts working on robotics applications. With in-depth technical sessions and corresponding live demonstrations at the Robotics Pavilion on the expo floor, learn how to apply the latest technology from real-time devices, FPGAs, and graphical and textual programming to design robotics systems faster than your peers.
Who Should Attend
Robotics Engineers, Mechatronics Engineers, Computer Scientists, Control Engineers, Systems Engineers, Embedded Design Engineers, Entrepreneurs, Engineering Professors, and Students
Keynote
Past, Present, and Future of Robotics Software
Because robots can be considered software that moves, software is critical to any robotic system. In the past, most researchers wrote their own robot middleware that required tremendous effort to develop and maintain. This created a significant barrier to entry and a fragmented field. Learn how you can overcome these challenges in your current and future robotic systems.
Technical Sessions
From Precision Motion Control to Mechatronics Systems Involving Humans
Learn how mechatronics researchers at the UC Berkeley Mechanical Systems Control (MSC) Laboratory are using NI technology to study precision motion control for wafer scanners in semiconductor manufacturing. Recent research topics include narrowband disturbance rejection, enhanced repetitive control, and iterative learning control. Also hear how these researchers are sending the outputs of inertial measurement units mounted on human body parts over a wireless network to help rehabilitation therapists.
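The MSC Lab session lists iterative learning control (ILC) among its research topics. The lab's actual controllers are not described in this abstract, but the core idea of ILC is simple: reuse the tracking error from one repetition of a motion to correct the feedforward input for the next repetition. A minimal sketch, with an invented first-order plant and gains chosen purely for illustration:

```python
import numpy as np

def simulate_plant(u, a=0.3, b=0.5):
    """Hypothetical first-order plant y[t+1] = a*y[t] + b*u[t], standing in for the real stage dynamics."""
    y = np.zeros(len(u) + 1)
    for t in range(len(u)):
        y[t + 1] = a * y[t] + b * u[t]
    return y[1:]

def ilc_trials(reference, trials=20, gain=0.8):
    """P-type ILC: u_{k+1} = u_k + gain * e_k, applied over repeated identical trials."""
    u = np.zeros_like(reference)
    max_errors = []
    for _ in range(trials):
        y = simulate_plant(u)
        e = reference - y
        max_errors.append(float(np.max(np.abs(e))))
        u = u + gain * e  # learn from this trial's tracking error
    return max_errors

ref = np.sin(np.linspace(0.0, 2.0 * np.pi, 100))
errs = ilc_trials(ref)
print(f"peak tracking error: trial 1 = {errs[0]:.3f}, trial 20 = {errs[-1]:.5f}")
```

With these particular plant constants and learning gain the update is a contraction, so the peak tracking error shrinks every trial; real ILC designs add filtering and robustness analysis that this toy sketch omits.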
A Sound-Based Navigation System
Discover how a custom automated navigation system used a light wave to trigger the emission of discrete pulses of sound at specific frequencies from the perimeter of an open field. The time to receipt of each sound pulse by a microphone inside the field identified the location of the microphone. LabVIEW controlled timing, sound-signal identification, and position calculation for an autonomous lawnmower.
Vision Guided Motion: Autonomous Optically Guided Tomato Harvester
Explore how to use machine vision and LabVIEW to autonomously guide a robotic arm to a target of arbitrary size and distance. LabVIEW matched optical templates to images from binocular cameras with parallel axes to drive stepper motors to rotate vertical and horizontal joints and extend a boom. The median displacement of the center of the target from the center of the cameras' fields of view was calculated by processing serial optical images of the target.
Real-Time Position Controller for a UAV Quadrotor
Learn about the design and implementation of a real-time control architecture for an unmanned aerial vehicle (UAV) quadrotor platform. Based on a CompactRIO controller, the architecture features a motion tracking system as a position and orientation sensor. The control system holds the aerial vehicle in hover and allows the flying robot to track a trajectory efficiently. Moreover, you can easily add UAV quadrotors by connecting extra CompactRIO controllers to the control network.
First Look at LabVIEW Robotics Simulators
Examine the broad range of features in the LabVIEW Robotics simulators. Also learn how to import your custom robot design from SolidWorks into the 3D physics-based simulator and seamlessly deploy algorithms to an embedded controller.
Create Haptic-Enabled Mobile Robotics With LabVIEW Robotics
Discover how haptic feedback from the steering wheel and haptic gas pedal affects operator performance during mobile robot teleoperation. The lack of environment perception reduces the effectiveness of the operator. To increase operator awareness, you can add haptic effects with joystick-like applicators. At this session, explore how to use LabVIEW Robotics to create a force feedback steering wheel and haptic gas pedal that reduce collisions during an obstacle avoidance task.
LabVIEW Hacker: Hacking the Real World
Watch NI engineers hack new sensors, actuators, and robots in this encore presentation of "LabVIEW Hacker." They spent the last year hunting for the cool devices students and hobbyists use to interact with the real world. They selected the best for live demonstrations and created tutorials and software packages you can use in your next project. Arrive early to get a seat at one of the most popular NIWeek sessions.
Exploring Fuzzy Logic for the Masses
Take an introductory trip through the basics of fuzzy logic at this session designed for the developer or engineer who is not a seasoned controls engineer. Examine the basic terminology and theory of a fuzzy logic system without getting into the math and learn how to use the Fuzzy System Designer and LabVIEW palette controls in code. Also view a simple demonstration of an elementary control system using fuzzy logic.
KUKA youBot and LabVIEW: Scalable Mobile Manipulation Platform
Mobile manipulation is a crucial technology for future cognitive service and manufacturing robotics. But providing a streamlined hardware platform and robotics software framework that meets stringent industry and academic requirements is challenging. At this session, explore the viability of using the KUKA youBot, an omnidirectional mobile manipulator platform, with the LabVIEW programming and simulation environment for a complex robot application.
FIRST 101: How to Get Involved
Everybody's doing it. Learn how your company can get involved in FIRST by donating time, talent, or dollars to the largest after-school robotics competition in the world.
A Data Acquisition Platform for Autonomous Ground Vehicles
Hear how the Virginia Tech Mechatronics Lab is establishing a database of rural scenes that autonomous ground vehicles encounter. Researchers created a sensor platform to capture both LIDAR and multispectral electro-optical data using a Windows OS. They supplemented this platform with an ARM target capable of capturing secondary data such as scene location, heading, and temperature. They used LabVIEW to communicate with the sensor platform and ARM target for data acquisition and database accumulation.
Reconfigurable Automated Parameter-Identifying Dynamometer (RAPID) Powered by LabVIEW MathScript and NI myDAQ
Learn how the Coordinated Robotics Lab at the University of California, San Diego developed an automated dynamometer to characterize DC motors for robotics applications. Researchers used NI myDAQ to read the sensors integrated into the dynamometer and used the LabVIEW MathScript language to process measured sensor data. Then they calculated the parameters that best fit the observed data to the mathematical model. They also deployed LabVIEW code to a real-time target for embedded control.
Robot Operating System and LabVIEW
The open-source Robot Operating System (ROS) is one of the most popular architectures for robotics research. See how you can use LabVIEW to control existing ROS-based systems and incorporate the power of LabVIEW Real-Time and LabVIEW FPGA hardware in a ROS architecture. Also watch a demonstration of ROS robot control using graphical system design.
A Low-Cost Vision System for Autonomous Mobile Robots
Explore how you can integrate a low-cost vision system, built using LabVIEW and a web camera, into an autonomous mobile robot so that it can follow a single-colored line on the floor. Learn how you can easily adapt the vision system for other tasks and improve debugging by streaming what the robot sees, or what the processed images look like, to a host computer via a Wi-Fi connection. The system provides a reference design for students and engineers alike.
Advanced Mobile Manipulation for Quality Control and Inspection
By combining the high degree of accuracy and repeatability of robots with complementary perception technology (machine vision and force sensing), the Loccioni Group developed several advanced robotic solutions for quality control and inspection. Learn about robotized quality control systems for visual and tactile inspections in industrial production lines, a mobile robot with manipulation and diagnostic capabilities, and a mobile robot for railway route and switch inspection and measurement.
Bipedal Robots Imitate Human-Like Walking Through LabVIEW Autonomous Feedback Control
Examine human-inspired control strategies to achieve flat-ground walking in an underactuated physical bipedal robot named AMBER. See how to use the LabVIEW Real-Time and LabVIEW FPGA modules to develop and verify in simulation the formal models and controllers that guarantee physically realizable robotic walking. Then learn how to implement an efficient voltage-based controller to create the robust and efficient human-like robotic walking AMBER achieves.
Complex Multidegree-of-Freedom Mobile Robots Using LabVIEW Robotics
Examine an NI-sponsored senior design project at Rose-Hulman Institute of Technology that integrated a 5-degree-of-freedom manipulator into an existing mobile robotics platform. Students used a variety of LabVIEW Robotics libraries to integrate additional sensors into the mobile platform for tasks such as PID-controlled line following and to drive the manipulator with an inverse kinematic model. The result was an educational reference architecture that demonstrated how LabVIEW graphical system design tools can control a complex, multidegree-of-freedom mobile platform.
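The Rose-Hulman students drove their 5-degree-of-freedom manipulator with an inverse kinematic model. The abstract does not give that model, but the flavor of the computation can be shown with the standard closed-form solution for a planar 2-link arm; the link lengths and target point below are made up for illustration:

```python
import math

def two_link_ik(x, y, l1=0.3, l2=0.25, elbow_up=True):
    """Closed-form IK for a planar 2-link arm (illustrative only, not the 5-DOF model).

    Returns joint angles (theta1, theta2) in radians that place the end
    effector at (x, y), or None if the point is out of reach.
    """
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # law of cosines for the elbow
    if not -1.0 <= c2 <= 1.0:
        return None  # target lies outside the reachable annulus
    theta2 = math.acos(c2)
    if elbow_up:
        theta2 = -theta2  # pick one of the two mirror-image solutions
    k1 = l1 + l2 * math.cos(theta2)
    k2 = l2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)
    return theta1, theta2

def forward(theta1, theta2, l1=0.3, l2=0.25):
    """Forward kinematics, used here to check the IK solution."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

sol = two_link_ik(0.35, 0.20)
if sol is not None:
    print("joint angles:", sol, "end effector:", forward(*sol))
```

A 5-DOF arm needs additional joints and an orientation constraint, but the same pattern applies: solve for joint angles from the target pose, then verify with forward kinematics before commanding the motors.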







