[Editor's Note: This was originally posted on the Embedded Master]
Robotics Engineering seems to be gaining momentum as a formal engineering discipline. Worcester Polytechnic Institute just became the third U.S. university to offer a doctoral program in Robotics Engineering. The university also offers Bachelor's and Master's programs in Robotics Engineering. The interdisciplinary programs draw on faculty from the Computer Science, Electrical and Computer Engineering, and Mechanical Engineering departments. I fear, though, that there is ambiguity about the type of engineering that goes into building robots versus "smart" embedded systems.
When I worked on a hands-on robotics project, I noticed parallels between the issues designers must address regardless of whether they are working on robotic designs or embedded semi-autonomous subsystems. Moreover, relying on interdisciplinary skills is not unique to robotics – many embedded systems draw on the same sets of interdisciplinary skills.
From my own experience working with autonomous vehicles, I know that these systems can sense the world in multiple ways – for example, inertially and visually; they have a set of goals to accomplish; they have a means to move through, interact with, and affect the world around them; and they are "smart enough" to adjust their behavior as the environment changes. We never referred to these as robots, and I never thought to apply the word robot to them until I worked on this hands-on project.
There is no clearly established definition of what makes something a robot. I found a description of robots from the Robot Institute of America (1979) that says a robot is "a reprogrammable, multifunctional manipulator designed to move material, parts, tools, or specialized devices through various programmed motions for the performance of a variety of tasks." Our autonomous vehicles met that description: they were reprogrammable, and they could manipulate the system through six degrees of freedom to accomplish a variety of tasks. Despite this, I think it would still be difficult to get people to call them robots.
Additionally, there are many embedded subsystems, such as the braking systems or stability-control systems resident in many high-end automobiles, that might also fit this description – but we do not call them robots either. Even my clothes-washing machine can sense and change its behavior based on whether the cleaning cycle is proceeding according to a predicted plan; the system can compensate for many anomalous behaviors. These systems can sense the world in multiple ways, they make increasingly complex decisions as their designs continue to mature, they meaningfully interact with the physical world, and they adjust their behavior based on arbitrary changes in the environment.
It seems to me that the principles identified as fundamental to a robotics curriculum should be taught to all engineers and embedded developers – not just robotics majors. Do you think this is a valid concern? Are there any skills unique to robotics engineering that warrant a new specialized curriculum, rather than being folded into an embedded engineering curriculum?