UC Berkeley Researchers Create a Virtual Lab for Cyber-Physical Systems Massive Open Online Course (MOOC) Based on LabVIEW

Sanjit A. Seshia, UC Berkeley

"We combined LabVIEW with auto-grading technology based on formal methods to enable lab-based online education. Our work with NI has helped spread cyber-physical systems education across the globe."

- Sanjit A. Seshia, UC Berkeley

The Challenge:

Providing courseware for a massive open online course (MOOC) on cyber-physical systems, including a realistic robotics simulator; a framework for developing and debugging controllers, using both low- and high-level programming methods, for both simulation and real hardware; and an autograder that can grade hundreds of student solutions and automatically provide feedback.

The Solution:

Using LabVIEW software to develop two components: a modular controller deployable on virtual or real hardware, with which students can develop platform-independent code using C or the LabVIEW Statechart Module; and a stand-alone, lightweight application with a convenient GUI to simulate, test, give automatic feedback, and grade, using an autograder developed as external C++ code based on state-of-the-art research in formal methods.


Alexandre Donzé - UC Berkeley
Gavit Juniwal - UC Berkeley
Sanjit A. Seshia - UC Berkeley


NI and UC Berkeley Introduce Students to Cyber-Physical Systems



Since 2008, the Department of Electrical Engineering and Computer Science at the University of California, Berkeley has offered EECS149: Introduction to Embedded Systems. This course shows students how to design and analyze computational systems that interact with physical processes, known as embedded systems or, more recently, cyber-physical systems (CPS). System applications include:

  • Medical devices
  • Consumer electronics
  • Traffic control and safety
  • Automotive systems
  • Process control
  • Energy management and conservation
  • Aircraft control systems
  • Robotics
  • Distributed robotics (telepresence and telemedicine)
  • Smart structures
  • Critical infrastructure control, including electric power, water resources, and communications systems


A major theme of this course is the interplay of practical design with models of systems, including both software components and physical dynamics, emphasizing how to build high-confidence systems with real-time and concurrent behaviors. To achieve this goal, the course is based on a unique blend of theoretical lectures on model-based design, practical lectures, and projects involving hands-on hardware use. Since the course debuted in 2008, NI has been a critical contributor to its success by providing engineering services, software, and hardware for the students to use in a variety of original and concrete CPS projects.



After the Fall 2013 semester, the team behind EECS149 decided to take on the ambitious project of offering a MOOC version of the class, hosted on the growing and popular online course platform edX. The first challenge was providing online students with a lab experience similar to that of the on-campus class, without assuming access to a physical lab or any specific NI hardware. Another challenge was grading. For the on-campus offering, instructors check that solutions are correct by inspecting and physically manipulating the devices or robotic systems programmed by the students. This is impossible in an online offering, and it would not scale to the hundreds, if not thousands, of students involved in a MOOC. Implementing a scalable, automatic grading mechanism is a typical problem in MOOCs regardless of topic, but it is even more challenging in the context of EECS149, where correct CPS design and analysis is required. The problem is closely related to automatic CPS testing and formal verification, which constitutes an active and largely open research topic.


From EECS149 to EECS149.1x

To provide a solid lab experience with autograding, the team decided to focus on a subset of the EECS149 campus offering. In the past few iterations of the course, students worked on their own projects as well as a common set of laboratory assignments. These lab assignments consisted of programming an iRobot Create to perform two tasks—navigating an environment while avoiding obstacles, and climbing hills and ramps when detected. Students program statecharts—first in C, and then with the LabVIEW Statechart Module—which exposes them to formal models (statecharts) through both a low-level textual and a high-level graphical programming method.



To test their controllers, students can use a realistic simulator powered by the LabVIEW Robotics Module. Once satisfied with the simulation results, a student can compile and upload the controller, with few modifications, directly onto the real robot to test it in a concrete environment. For the MOOC offering, it was natural to extend this simulation-based development approach into a virtual lab requiring no actual hardware. Although no virtual environment can replace experimenting with real hardware, the high fidelity and realism of the LabVIEW Robotics Module provides a solid alternative for students without access to the required hardware.


The first MOOC version of EECS149, dubbed EECS149.1x on edX, combined video lectures covering a subset of EECS149 material with virtual lab software, making it possible for students to work on the same obstacle-avoidance and hill-climbing robotics problems as on-campus students. The virtual lab software offers different environments, such as C and the LabVIEW Statechart Module, to program a platform-independent controller for iRobot Create, test and debug it through extensive realistic simulations, and, optionally, deploy and test it on a real robot (available for purchase, but not required). Additionally, through the virtual lab, instructors can collect students’ solutions and grade them against assignment design requirements.



Implementing the Virtual Lab With LabVIEW

To implement the course software, the team had to package the existing simulator into a virtual lab and implement the autograder technology. For the latter, the initial plan was to use Breach, a research toolbox initially created at the Verimag Laboratory (UJF-CNRS, Grenoble, France) and further developed in Dr. Seshia's research group at UC Berkeley. Breach implements simulation-based verification techniques against formal specifications written in Signal Temporal Logic (STL), a variant of temporal logic adapted to specifying formal requirements for CPS. The plan for the autograder was to specify lab exercise design requirements in STL and automatically verify them with Breach on simulation traces generated by the LabVIEW Robotics Module. However, this raised a number of questions and issues. First, we needed to decide whether to run simulations and autograding locally on student machines, or remotely on one or several dedicated servers. A back-of-the-envelope estimate of the computational cost of running tens of simulations for several hundred students quickly made it clear that simulations had to run on student machines. But if we used Breach for autograding, we would have had to install it on student machines, because Breach could not be deployed as a stand-alone application.


To circumvent this problem, we used the LabVIEW Application Builder to develop and deploy a stand-alone tool incorporating both the iRobot Create simulator and additional external C++ code, partly reusing some Breach code. The resulting application, named CyberSim, is sufficiently lightweight to run on most Windows machines, even with older hardware, and still able to implement the required features of the virtual lab. On the student side, CyberSim can load a controller written either using C or the LabVIEW Statechart Module, and use it to control and simulate an iRobot Create in different 3D customizable environments. Moreover, CyberSim provides an autograder that can run in either debug or grading mode. In debug mode, it performs a sequence of tests, looking for incorrect or dubious behaviors with respect to the lab assignment goals, and provides feedback and suggestions when it detects a possible bug.



In grading mode, it runs a similar suite of tests and returns a report archive containing a log file with test results and corresponding grades, as well as the controller code and the simulation traces computed with it, for potential further instructor examination. On the instructor side, CyberSim comes with a domain-specific language for specifying lab assignment test plans, along with a simple way to design new virtual test environments using the LabVIEW Robotics Module. Test plans essentially consist of sequences of simulations to run, STL properties to check for each simulation, and the textual feedback to provide depending on whether those properties are satisfied. Although writing test plans in CyberSim is relatively easy and intuitive, ensuring the plans are relevant and efficient for assessing the correctness of a given assignment is not. Also, because specification in formal languages such as STL is among the topics covered by EECS149, future offerings of EECS149 and EECS149.1x will encourage students to write their own test plans to test their designs, further integrating CyberSim into the learning process.


EECS149.1x: First Offering Results and Feedback

EECS149.1x was offered on edX beginning May 6, 2014, and ran for six weeks. NI provided free six-month licenses to all registered students so they could use the required software (LabVIEW, the LabVIEW Robotics Module, and the LabVIEW Statechart Module). NI also offered a discount on optional hardware for students desiring physical deployment targets. About 8,500 students registered, and an estimated 500 of them actually used the virtual lab software. Eventually, 342 students received a certificate of achievement for completing the course successfully. The ratio of active to registered students is typical for MOOCs. However, we believe that the involvement required for lab assignment programming is higher than average, making this, in fact, a good result.


We deployed CyberSim on hardware running all versions of Windows, from XP to the latest, 8.1. The team encountered only a few minor issues, which were usually fixed within a day or two with the help of NI engineers. A CyberSim survey showed generally positive feedback, with 94 percent of students finding it at least fairly usable, and 86 percent finding the autograder debugging tests useful. Students also provided positive feedback about LabVIEW. Although 59 percent of the students had no prior LabVIEW experience, 82 percent found LabVIEW equally capable as or superior to other platforms (most notably, C) for accomplishing the tasks covered in the course, and 83 percent were interested in additional LabVIEW training leading to a certification. Finally, more than 90 percent of the students using myRIO reported that their CyberSim solution worked with little or no code modification. This confirms that the virtual lab provides a good alternative to a live lab.


Although this first offering of EECS149.1x was a bold pioneering experiment in online CPS education, it was a successful one, which will undoubtedly be followed by future, improved online offerings. This course will also influence the on-campus course, and potentially, the way CPS courses are taught and designed in universities and companies all over the world.


Additional Resources: 

EECS149.1x on edX
Sanjit A. Seshia
Breach Toolbox



Author Information:

Alexandre Donzé
UC Berkeley


Author Information:

Sanjit A. Seshia
UC Berkeley
Tel: (510) 643-6968

Figure 1. This shows the CyberSim main configuration panel. Users can load a new environment or a statechart implementing a robot controller, and save simulation traces. The running mode for the autograder is also specified here.
Figure 2. The statechart display panel shows a view of the statechart executing as the simulation runs.
Figure 3. This shows CyberSim in debug mode. The Feedback and Grading tab displays information about the tests run and their results. Here, after running a simulation, the autograder detected an erratic behavior and points the user to a possible cause.
Figure 4. This shows editing the statechart with LabVIEW.
Figure 5. This is CyberSim in grading mode. Here, the autograder ran all tests, and the solution passed them all. The student can upload an archive containing logs and additional files on edX to validate his or her maximum grade.
Figure 6. Grading and debugging is based on test plans specified in files like this one.