[Editor's Note: This was originally posted on the Embedded Master]
Robotics Engineering seems to be gaining momentum as a formal engineering discipline. Worcester Polytechnic Institute just became the third U.S. university to offer a doctoral program in Robotics Engineering. The university also offers Bachelor’s and Master’s programs in Robotics Engineering. The interdisciplinary programs draw on faculty from the Computer Science, Electrical and Computer Engineering, and Mechanical Engineering departments. I fear, though, that there is ambiguity about the type of engineering that goes into building robots versus “smart” embedded systems.
When I worked on a hands-on robotics project, I noticed parallels between the issues designers have to address regardless of whether they are working on robotic designs or embedded semi-autonomous subsystems. Additionally, relying on interdisciplinary skills is not unique to robotics – many embedded systems draw on the same sets of skills.
From my own experience working with autonomous vehicles, I know that these systems can sense the world in multiple ways – for example, inertially and visually. They have a set of goals to accomplish, have the means to move through, interact with, and affect the world around them, and are “smart enough” to adjust their behavior as the environment changes. We never referred to these as robots, and I never thought to apply the word robot to them until I worked on this hands-on project.
There is no clearly established definition of what makes something a robot. I found a description from the Robot Institute of America (1979) that says a robot is “A reprogrammable, multifunctional manipulator designed to move material, parts, tools, or specialized devices through various programmed motions for the performance of a variety of tasks.” Our autonomous vehicles met that description: they were reprogrammable, and they could maneuver through six degrees of freedom to accomplish a variety of tasks. Despite this, I think it would still be difficult to get people to call them robots.
Additionally, it seems there are many embedded subsystems, such as the braking systems or stability-control systems resident in many high-end automobiles, that might also fit this description—but we do not call them robots either. Even my clothes-washing machine can sense and change its behavior based on how the cleaning cycle is or is not proceeding according to a predicted plan; the system can compensate for many anomalous behaviors. These systems can sense the world in multiple ways, they make increasingly complex decisions as their designs continue to mature, they meaningfully interact with the physical world, and they adjust their behavior based on arbitrary changes in the environment.
It seems to me that the principles identified as fundamental to a robotics curriculum should be taught to all engineers and embedded developers – not just robotics majors. Do you think this is a valid concern? Are there any skills that are unique to robotics engineering that warrant a new specialized curriculum versus being part of an embedded engineering curriculum?
Robotics and embedded are two overlapping disciplines with their own special considerations and should be treated as such. I would bet that the definition of “robot” varies all over the place. How about: involves unconstrained freedom of physical movement in real-world space? That definition would remove car brakes and washing machines from robotics.
You ask a good question regarding what distinguishes Robotics Engineering from Embedded Systems. One could also ask the same for Mechatronics. There is no clear demarcation. I strongly agree that all engineers should be taught broad principles, but this does not happen in most programs. A feature of the WPI program is that it balances the roles played by CS, ECE, and ME. A recurring theme is that there are no boundaries — great engineers use whatever tools from whatever disciplines are needed. The Unified Robotics curriculum, co-taught by faculty from the three departments, strongly reinforces this. Do students get it? A stroll through the lab — where one is equally likely to see free-body diagrams, Java class definitions, circuit diagrams, or differential equations — suggests that they do. Visitors welcome!
- M. G.
Director, Robotics Engineering, WPI
My M.S. Computer Science degree from the early 1990s specialized in Computer Graphics, Robotics, and Artificial Intelligence. While some aspects of the math involved showed up in my two Linear Algebra courses, a great deal of domain-specific knowledge was imparted in the two robotics courses as the practical application of that mathematics – which, of course, is what engineering is all about: the practical application of science to a given domain.
I would also agree with R. S. that the academic definition of a “robot” would exclude braking systems and washing machines. Artificial intelligence or fuzzy logic embedded in a smart control device does not equal a robot. Is an autonomous vehicle a robot? Yes, absolutely. I’ve watched the Boston Dynamics videos as “Big Dog” evolves ( http://www.bostondynamics.com/robot_bigdog.html ) and just marvel at the progress being made – poetry of algorithms in motion.
If the robot were merely a teaching instrument and only system controls/coding were required, Mr. Robert Cravotta’s comments would be understandable. However, the question this author raises also underscores the misunderstanding that arises from considering the differences between the disciplines.
For example, does a robotics engineer need to know embedded systems design in order to create a robot? Not in every case but more than likely, yes. On the other hand, does an embedded systems engineer need to know robotics systems design? The answer is no, unless dealing specifically with robotics.
From this, a grasp of physics and mechanical engineering is required in the former case but not necessarily in the latter! Embedded systems design can be purely “1’s and 0’s” without the need for motion of any kind. With rare exception, functional robotics requires a conversion from the “1’s and 0’s” to physically moving something.
Without that expanded background – calculating the requirements and specifying the mechanical attributes of a final system – the embedded systems engineer could be lost!
I say “Bravo!” to WPI and other institutions like them. Moving forward, a multi-disciplinary approach to problem solving will quicken design times and make for more robust products.
- K. M. (BSEEE, MSCS)
Marketing Manager, ADVANCED Motion Controls
Senior Member, IEEE & Robotics and Automation Society
Agreed…
The questions asked are the same ones I have had for some time.
Personally, I think engineering degree titles should be more general until Master’s or Doctorate programs are involved.
All engineers should have sound and reasonably detailed competence in physics, mechanics, electronics, and software before specializing in a specific application of this knowledge base.
And yes, they should still have a rounded education (history, language, and, yes, other sciences).
Too much specialization is bad for society. You end up with people who create without regard for (or concern about) the impact of their work on the world.
After a 50+ year flirtation with AI, robotics is coming back to the RIA definition of “A reprogrammable, multifunction manipulator designed to move material, tools, [etc.] …” Moving things around is what real robots do in factories, and they are starting to move out of factories into the field. Examples of modern robot designs are autonomous cruise missiles and autonomous forklifts in warehouses.
A robot is not just an embedded system such as an iPod or an appliance (examples of the two kinds of embedded system design). Robot design is not just software design, sensor design, electronic hardware design, servo motor design, or mechanical design. It is – at least – all of these, combined into a single system. No single aspect rules the system design.
Modern robot design is the system design of mobile robots for moving things around, including subsets of this definition such as non-mobile manipulators. It should be taught as such, with emphasis on the integration of each element of the system into an overall design, and on the trade-offs of improvements in one system element against another.
Teaching robotics as merely an application of machine tool design, software design, mechanical design, or servo motor design is a dead end. You wind up with sub-optimal robots at best, or – more commonly – fragile laboratory curiosities.
So, yes, robotics should be taught as a stand-alone robotic system design curriculum, IMHO.
I, too, like R. S.’s definition of robot, but even then there are grey areas: a “pick-n-place” machine or warehouse stacker is certainly a robot under almost anyone’s definition; a mail-sorting machine or conveyor system, maybe not so clearly a robot.
There is certainly a lot of overlap between robotics and some other forms of embedded system (I consider robotics a subset of embedded) but there’s lots of domain-specific knowledge as well. For robotics, that seems to be largely related to motion control. When you slide into conveyor systems, there may be less motion control but the time domain may become a bigger issue.
Or not; now we probably have to start talking about specific applications.
I would consider robotics a kind of specialization among embedded engineers, like being an internist is a specialization among doctors. As such, I’d probably want an emphasis on certain areas, but I don’t think there’s enough difference to warrant a completely separate curriculum. At least, not at the undergrad level.
“It seems there are many embedded subsystems, such as the braking systems or stability-control systems resident in many high-end automobiles, that fit the description of a robot—but we do not call them robots. Even my clothes-washing machine can sense and change its behavior based on how the cleaning cycle is or is not proceeding according to a predicted plan; the system can compensate for many anomalous behaviors.”
I think the confusion here may stem from the focus on software that the preceding paragraph seems to imply.
Robotics is a multi-disciplinary field that certainly involves software, but also involves a possibly even more significant measure of mechanical engineering.
Typically, a software practitioner in the field of robotics shares a skill-set overlap with a mechanical practitioner in the area of robotic kinematic theory. I believe that this skill-set overlap is the defining characteristic of practitioners in the field (regardless of their source discipline).
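To make that overlap concrete, here is a minimal sketch (in Python, with hypothetical link lengths) of the kind of kinematic computation both the software and the mechanical practitioner need to be fluent in – the forward kinematics of a two-link planar arm:

import math

def forward_kinematics(theta1, theta2, l1=0.5, l2=0.3):
    # Planar two-link arm: joint angles (radians) -> end-effector position (x, y).
    # Link lengths l1 and l2 are in meters; the defaults are arbitrary example values.
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# Example: both joints at 45 degrees
print(forward_kinematics(math.radians(45), math.radians(45)))

The mechanical engineer derives those equations from the arm’s geometry; the software engineer turns them (and their inverse) into motion-control code.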
From a practitioner’s point of view, I would be inclined to define robotics not as a distinct practice, but rather as a specialized set of skills (i.e., a specialization) that differentiates a particular practitioner from other practitioners of a more general discipline.
I think that if one views the mechanical engineering discipline as the lead discipline in the multi-disciplinary field of robotics, then it becomes much easier to define the skill set that identifies a software engineer as a suitable practitioner for the field.
I think that if a separate discipline of “Robotics Engineering” were to be defined, it would imply a course of study close to dual degrees in mechanical and software engineering.
Because the intersection of skills between the software and mechanical disciplines is relatively small in relation to the overall set of skills required in either of these more general courses of study, I find the concept of identifying a stand-alone discipline a bit of a stretch.
I would largely concur with R. S.’s definition of robotics, and would refer to the following paragraph from the definition of robotics in the Sci-Tech Encyclopedia, which nicely identifies a set of attributes for a robot:
“Robots produce mechanical motion that, in most cases, results in manipulation or locomotion. Mechanical characteristics for robotic mechanisms include degrees of freedom of movement, size and shape of the operating space, stiffness and strength of the structure, lifting capacity, velocity, and acceleration under load. Performance measures include repeatability and accuracy of positioning, speed, and freedom from vibration.”
Robotics as a subject would, I presume, include internet agents – “Bots”. These have purpose and intelligence but are not realtime or embedded.
Machine vision (and any similar conversion of sense data into higher forms of data) also would be a part of robotics technology, and again wouldn’t necessarily be realtime or embedded.
But then motion control is realtime and embedded.
I suppose there are two aspects to robotics: mind and body. The physical aspects use realtime embedded as a platform.
With regard to “mind”, in some cases decision-making might not be realtime – the robot might have to stop and think. And the intelligence might not be embedded; it could be running on a remote server, given wireless communications.
“Robotics as a subject would, I presume, include internet agents – ‘Bots’.”
Absolutely not. Robots are, by definition, mechanical systems that are very often (but not always) controlled by software.
“I suppose there are two aspects to robotics: mind and body.”
Yes, and this is exactly why an Internet agent is definitely not a genuine robot.
I would refer you to the following link:
http://www.answers.com/topic/robotics
Interesting.
R., would you consider a “bot” a “virtual” robot? What about simulations of robots – are they robots?
Had a look at the answers.com definition of robotics…culled this:
“Recently, however, the industry’s current working definition of a robot has come to be understood as any piece of equipment that has three or more degrees of movement or freedom”
The traditional definition of a robot was something like “artificial human”.
The original definition is perhaps not applicable to current industry because the technology is still at an early stage of development.
I prefer the term mechatronics for electromechanical systems.
“What about simulations of robots – are they robots?”
They are simulations of robots… unless, of course, we’re in the Matrix, in which case they are simulations of simulations of robots, and we’re just batteries.
No, we’re just batty.
Otherwise, we wouldn’t be in this business.
It’s an interesting discussion. Typically there are differences between robotics curricula, which focus on the system level, and embedded curricula, which deal with the microprocessor or lower-level code. However, the two have many overlapping aspects, and if students are armed with the proper tools and enough time, a combined curriculum is the way to go. We see this overlap happening in educational institutions across the globe, where professors and students have access to easy-to-use, common programming-language tools that help them collaborate across disciplines – from engineering to computer science – so that they receive a well-rounded understanding of robotics, mechatronics, embedded systems, and more. All this meshing of curricula is necessary today in order to give students the skills needed to become “holistic engineers,” which will ensure their success no matter what field of engineering they ultimately pursue.
T. G.
Education Marketing
The MathWorks
A coda to my earlier comment. Autonomous robots differ from other mechatronics applications in that they require navigation in an environment. This is a whole additional area of applied physics to absorb.
To be more precise, what is required for a field robot is pilotage – navigation within a port rather than at sea. Navigation at sea has no landmarks, so you need either GPS, LORAN, or a compass, a sextant, and a very accurate clock. Ports have landmarks and obstacles. Navigation in-port requires locating the landmarks and obstacles around you, and some of these change from day to day: the ship in that dock may not be there tomorrow, and the dock that is free today may be occupied tomorrow. And do not run into the ships that pass you.
What we want and need are autonomous field robots that can navigate in known but unpredictable, continuously changing environments. This is a key area of R&D. Many schemes have been used to identify landmarks, such as bar codes, flashing lights, radio beacons, etc. Of more interest are newer and improved means of machine vision for object recognition to give the range, bearing and pose (RBP) of objects surrounding the robot. If you know the RBP of all objects around the robot, you can write a program to navigate to and around them.
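As an illustration, here is a minimal two-dimensional sketch (in Python; the coordinates are made up, and “pose” is simplified to a single heading angle) of the geometry involved: given the robot’s pose and an object’s position and heading, the object’s range, bearing, and relative pose follow directly.

import math

def range_bearing_pose(robot_x, robot_y, robot_heading, obj_x, obj_y, obj_heading):
    # Range, bearing, and relative pose of an object as seen from the robot.
    # All angles are in radians; bearing is measured from the robot's heading,
    # positive counter-clockwise. "Pose" is reduced here to a 2-D heading.
    dx, dy = obj_x - robot_x, obj_y - robot_y
    rng = math.hypot(dx, dy)                      # range: straight-line distance
    bearing = math.atan2(dy, dx) - robot_heading  # bearing relative to our heading
    bearing = math.atan2(math.sin(bearing), math.cos(bearing))     # wrap to [-pi, pi]
    rel_pose = obj_heading - robot_heading        # relative orientation of the object
    rel_pose = math.atan2(math.sin(rel_pose), math.cos(rel_pose))  # wrap to [-pi, pi]
    return rng, bearing, rel_pose

# Example: robot at the origin facing east; a docked ship 50 m away, facing north
print(range_bearing_pose(0.0, 0.0, 0.0, 30.0, 40.0, math.pi / 2))

With the RBP of every surrounding object in hand, the navigation routine becomes a matter of steering through the free space between them.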
This additional problem of autonomous field robot navigation is what differentiates field robots from factory machine tool robots. It is not a derivative of embedded machine control, but an additional area of knowledge and expertise. It is the secret sauce of the next generation of robots, field robots.