[Editor's Note: This was originally posted on the Embedded Master]
A few weeks ago, I asked whether robotics engineering is different enough from embedded engineering to warrant being treated as a separate discipline. I asked this question because I think a general engineering curriculum is valuable, especially at the bachelor's level. Engineers should at least have exposure to and experience with a broad set of skills and tools, because you never know which one will be the right fit for a given problem, and it seems a shame to confine multidiscipline skills and tools to a narrow topic when they really apply to any embedded system that deals with real-world interfaces and control.
I offered three examples in the original post: my clothes-washing machine, and the braking and stability-control systems resident in many high-end automobiles. Differential braking systems are fairly sophisticated, and they involve much more than just a software control system: they must account for the complex interactions of friction, steering, engine force, and inertia in order to accomplish their coordinated task correctly. The same is doubly true of stability-control systems that work to prevent vehicles from skidding and rolling over. The humble washing machine is a modern marvel. I have watched mine get into an unbalanced condition; instead of walking and banging around like its predecessor, it performs a set of gyrations that actually corrects the imbalance and allows it to work at higher efficiency even with heavy, bulky loads. Each of these examples is a collection of interconnected embedded systems working together to affect the overall system in some mechanical fashion.
In thinking about this question, I remembered spending time with the Velodyne brothers, David and Bruce Hall, during the early DARPA Grand Challenges and their TeamDAD autonomous automobile. I felt a kinship with these gentlemen, as I had also worked on autonomous vehicles almost 15 years earlier. In addition to their innovative LIDAR sensor system, I remember their answer to how they got involved in autonomous automobiles when their background was premium subwoofers: they use motion-feedback technology in their subwoofers, and it is not a large leap from those concepts to motion-control technology for robots.
In a similar fashion, none of the examples above acts like a robot that exhibits obvious motion, and yet they all use motion feedback to accomplish their primary tasks. The relevance here is that robots likewise use a collection of interconnected embedded systems working together to achieve mechanical degrees of freedom of motion in the real world; in multiple online conversations I have observed a bias that treats such visible motion as a necessity for an autonomous system to be considered a robot. None of the examples is limited to just embedded software: they are all firmly entrenched in multidiscipline mechanical control and manipulation.
Is there a fundamental difference in the skills required to build visibly moving robots versus autonomous embedded control subsystems? My experience says they are the same set of skills.
Having worked in the industry (Apogee Robotics, which took over the HP AGV work, and Continental Divide Robotics), I'll add my two cents. I was also part of the development group at CRI that built the X-MP and Y-MP supercomputers.
Robotics has to be a separate discipline. Too much of it requires a basic understanding of ANALOG design and of how the DIGITAL world reacts to ANALOG stimuli. My understanding of analog (I also played with tube-type systems alongside my digital experience) was invaluable when working with what was, at Cray speeds, effectively an analog circuit. Understanding rise and fall times during travel along tuned circuits was critical data, which I supplied to my boss, Dr. Chen.
Robotics has the same problems and I apply my favorite saying to the issue:
NATURE is ANALOG, MAN created DIGITAL.
That basic understanding, IMHO, defines the whole ROBOTICS industry.
Great topic.
In response to ‘o.’:
I can only speak for “electrical engineering”, but every engineering specialization has this same dynamic: The message from those with more experience is that the generation of engineers with a “digital” solution to every problem does not really understand analog. Think of audio, wireless, high-speed circuit design, etc.
This fact actually contradicts the argument that robotics is unique in that regard. If there is an argument for a special curriculum for robotics, it may be for a course introducing the theory of ‘intelligent systems’, which is graduate level today. But even that topic applies to a much broader range of problems than just robotics.
The field of robotics is a superset of multiple disciplines, mainly involving mechanics, electronics, and, of course, embedded systems. Computer-aided engineering involving system-level simulation, such as Model-Based Design, has proven effective at developing embedded systems because it can mesh multiple domains. As robot autonomy increases, we cross a threshold beyond embedded control and signal-processing algorithms into embedded learning algorithms.

With total computing power continuing to grow through the use of faster, multicore processors ganged together in high-speed networks, we can expect robots to incorporate more sophisticated embedded learning algorithms to supplement their embedded control and signal processing. Many processes will happen automatically at lower processing levels, while adaptation and reaction to operational situations will consume greater amounts of processing power. In the future, practical robotics will extend embedded system design into embedded intelligence design. At the same time, mechanical and electrical sophistication will continue to evolve, with increasingly distributed control and signal-processing systems incorporated into multiple systems-within-systems. Complex multi-input system-level simulation will be an integral part of the development of these embedded learning, control, and signal-processing algorithms, validating their capacity to adapt and operate over a vast range of operating conditions.
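Here is the system-level-simulation idea in miniature (a toy sketch, independent of any particular tool; the plant model, gains, and masses below are all made-up placeholders): a first-order vehicle model exercised by a simple proportional controller across a sweep of load conditions, checking that behavior holds up over the operating range.

    #include <stdio.h>

    /* Toy model-in-the-loop test: a first-order speed plant
       dv/dt = (u*F_max - b*v) / m driven by a proportional controller,
       swept across load masses to check behavior over a range of
       operating conditions. */
    int main(void)
    {
        const double dt = 0.01, t_end = 10.0;   /* s */
        const double kp = 2.0, setpoint = 5.0;  /* m/s */
        const double masses[] = { 500.0, 1000.0, 2000.0 };  /* kg */
        const int n = sizeof(masses) / sizeof(masses[0]);

        for (int i = 0; i < n; i++) {
            double m = masses[i], v = 0.0;
            for (double t = 0.0; t < t_end; t += dt) {
                double u = kp * (setpoint - v);        /* controller     */
                if (u > 1.0) u = 1.0;                  /* actuator limit */
                if (u < 0.0) u = 0.0;
                double force = u * 4000.0 - 50.0 * v;  /* plant model    */
                v += (force / m) * dt;                 /* Euler step     */
            }
            printf("mass %6.0f kg -> final speed %.2f m/s\n", m, v);
        }
        return 0;
    }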
T. L.
Industry Marketing Manager
MathWorks
Tony is right about robotics being multidisciplinary. Hence, the answer is that the electrical engineer cannot solve/create everything. The washing machine example by Robert was not designed by an electrical engineer alone; obviously, mechanical engineers were involved. I don't see any real difference between robots that move about, like a car that navigates itself, and the washing machine. One may be intrinsically more complex, but implementing either requires engineers from multiple disciplines. Products that meet functional and cost goals are ones that achieve the "right" mix of mechanical controls and feedback, analog signal processing, and digital logic.
It would be nice to have a washing machine that could take itself to the dirty-clothes hamper, put itself in an easy-to-load position, take itself to where it can fill itself with water and detergent, do the wash in an out-of-the-way location, take itself to the dryer for easy transfer, and put itself away when done. This washing machine is more like a "robot" because it navigates around the house.
As another designer with experience, who has also done both, I see the needs of robotics as very similar to those of good embedded systems. Whether you are working on a robotic system where a 50 Hz control loop is sufficient, or on a digital audio package where you need good fidelity out to 30 kHz or so, you are still dealing with the same basic concerns. You still cannot afford memory leaks, you do not want infinite loops, and you have a threshold where close is good enough.

In all cases the software guy should understand enough about the problem to avoid adding to it. The systems guy may need more skills for an AUV than for an iPod, but the software skills are much easier to adapt.
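To make that concrete, this is the kind of fixed-rate skeleton I mean (a minimal sketch: clock_nanosleep() is POSIX, and the 20 ms period and the read/compute/update hooks are placeholders):

    #include <time.h>

    #define PERIOD_NS 20000000L   /* 20 ms -> 50 Hz, placeholder rate */

    /* Placeholder hooks -- the real work goes here. */
    extern void read_sensors(void);
    extern void compute_control(void);
    extern void update_actuators(void);

    void control_loop(void)
    {
        struct timespec next;
        clock_gettime(CLOCK_MONOTONIC, &next);

        for (;;) {
            read_sensors();
            compute_control();
            update_actuators();

            /* Sleep until the next absolute deadline rather than for a
               relative interval -- this keeps the rate fixed and keeps
               timing error from accumulating. */
            next.tv_nsec += PERIOD_NS;
            if (next.tv_nsec >= 1000000000L) {
                next.tv_nsec -= 1000000000L;
                next.tv_sec += 1;
            }
            clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
        }
    }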
– S.
Whether you see the problem as a nail or a screw depends on whether you have a hammer or a screwdriver in your hand. And each of the blind men had a different idea of the elephant depending on what part they were holding.
Factory robots are very close to classic autonomous embedded control systems in design. They are autonomous in the sense of automatic, and they are based on control system theory for the motion servos that move the arm(s) around.
Field robots add another level of design to this problem. Field robots work in open environments such as offices, homes, and the outdoors. Open environments are unpredictable but familiar: things and people move around, appearing and disappearing. Things to be picked up must be found, since they also move and can be in random orientations.

Open environments have a bounded unpredictability; this is the meaning of unpredictable but familiar. Field robots work in these environments by definition, and as a result they have a much larger design space than factory robots.
First, the robot has to search for and find things: landmarks for localization, objects to navigate toward or around, and objects to be found and picked up. Knowledge learned in searching will be chronically incomplete and dynamic. The results of the searches will affect the choice of navigation algorithms, and the act of navigating will generate newer information that may require changing the plan, e.g., an obstacle to steer around or a blocked passage requiring backtracking.

Field robotics subsumes factory robotics and automatic machines, adding a heuristic layer of dynamic control that deals with incomplete, dynamically changing information. It is this additional layer of dynamic, time-sensitive perception and heuristic control that makes the new field robotics different from classical embedded control-systems design.
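A sketch of what that heuristic layer looks like above the classic control loop (illustrative only; the Path type and all of the sensing, planning, and motion calls are hypothetical):

    #include <stdbool.h>

    /* Hypothetical world-model and planner interfaces. */
    typedef struct Path Path;
    extern void  sense_and_update_map(void);        /* fold new observations in */
    extern Path *plan_path(void);                   /* plan on the partial map  */
    extern void  free_path(Path *p);
    extern bool  next_step_blocked(const Path *p);  /* has the world changed?   */
    extern bool  at_goal(void);
    extern void  follow_one_step(const Path *p);

    void field_navigate(void)
    {
        Path *path = plan_path();
        while (!at_goal()) {
            sense_and_update_map();
            if (next_step_blocked(path)) {
                /* Knowledge is incomplete and dynamic: replan from whatever
                   the map says now, backtracking if that is what it takes. */
                free_path(path);
                path = plan_path();
            }
            follow_one_step(path);
        }
        free_path(path);
    }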
The difference between the design of autonomous embedded systems and visibly moving robots is the scope of the task. Designing an autonomous embedded system where all of the problems are easily handled by simple actions in a single control loop, such as cruise control, is a single component of the design of a visibly moving robot.
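A single-loop controller in the cruise-control mold really can be this small (a minimal proportional-integral sketch; the gains and the read_speed()/set_throttle() hooks are placeholders, and anti-windup is omitted):

    /* One control loop, one variable: hold vehicle speed at a setpoint. */
    extern double read_speed(void);        /* m/s from the wheel sensor */
    extern void   set_throttle(double u);  /* commanded 0.0 .. 1.0      */

    void cruise_step(double setpoint, double dt)
    {
        static double integral = 0.0;
        const double kp = 0.5, ki = 0.1;   /* placeholder gains */

        double error = setpoint - read_speed();
        integral += error * dt;

        double u = kp * error + ki * integral;
        if (u < 0.0) u = 0.0;              /* clamp to actuator range */
        if (u > 1.0) u = 1.0;

        set_throttle(u);
    }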
Detecting humans in an office environment, where they are the only expected source of heat and movement, is MUCH different from detecting them on a dirt road where there could also be wild animals, farm animals, and kids mixed together. Driving on carpet between cubicles assumes the floor will be flat and level. Skidding to a stop on a slanted, icy surface and flipping your 26,000-pound robot is probably not an issue between cubes. (Unless maybe it's Margarita Friday.)
While the scope of the tasks is extremely different, the basics of embedded design remain the same. Inertial nav systems, video systems, GPS systems, and distance-ranging systems are all embedded systems that must be tied together very close to real time in order to do a task as simple as keeping a vehicle in a lane. You did that task on your way to work while you were drinking coffee, texting, and listening to the radio. The difference is that even when you learned to drive at age 16, you already had 16 years of training in walking upright without falling over, detecting, classifying, and avoiding obstacles, planning and following a route, and reading a map.
If you can't figure out how fast the incoming data from your range finder (or any of your embedded sensors) needs to arrive in order to determine where to start your next turn, then you probably need to go back to first principles.
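Those first principles fit in a few lines (a back-of-the-envelope calculation; the speed and tolerance numbers are made up):

    #include <stdio.h>

    /* If the vehicle moves at v m/s and the next turn must be initiated
       within tol meters of the right spot, consecutive range samples can
       be no farther apart than tol meters of travel. */
    int main(void)
    {
        double v   = 10.0;   /* m/s, placeholder speed             */
        double tol = 0.25;   /* m, placeholder turn-entry accuracy */

        double min_rate_hz = v / tol;   /* = 40 Hz for these numbers */
        printf("range finder must update at >= %.0f Hz\n", min_rate_hz);
        return 0;
    }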
I work in robotics and I hire engineers to work on robots. I heard a while back that schools were starting to give degrees in robotics, and after I thought about it I realized that I don't need robotics engineers. I need embedded engineers, and mechanical engineers, …, and guys with great fabrication skills. A new hire would learn more in a month of working on robots in my lab than they would in a year of school (I know what's in the curriculum). The problems I need solved with my robots aren't the same as the problems a commercial company would have with their industrial robots. No degree program can replicate what new grads would learn from working with the old farts (like me) for a while. What I and the rest of the industry need are intelligent, energetic engineers who are passionate about robots.
J. S.
Chief Engineer
The Neurosciences Institute
I also hire engineers to work on robots. I've seen a lot of four-year-degree engineers come out of school with robotics training under their belt who don't want to pay attention to details in extremely complex embedded systems. I agree with Jim: I need intelligent, energetic engineers who are passionate about robots, not robotics engineers.
Hi,
Embedded systems in general are designed to accomplish one or more dedicated tasks: they take certain inputs, process them, and produce outputs. Likewise, a robot can be thought of as a system built to accomplish certain tasks: it acts on inputs (or AI), processes them, and produces outputs, mostly motion. So I think a good knowledge of motion control plus embedded systems would be the skill set for developing robotic systems.
I think the key thing is that a robot (I'm talking about a true industrial robot) is a system of motors, joints, and masses. So one should at least understand the dynamics of physical bodies.
It’s also a controls issue, so one needs to be pretty thoroughly steeped in the concepts of modern control theory, stability, and all that.
Finally, as Scott suggests, this is a “hard real-time” problem. Things absolutely _MUST_ be done at a fixed rate, with low jitter.
Now, as for "other-than-robot" applications like differential braking on a car: the problem is much the same, except that the car is a much harder system, with more variables to control.
The answer is a definite and unambiguous maybe.
I'm only being a little facetious here; industrial robots and UAVs are a subset of embedded systems. In terms of actual embedded skills, I don't think there is a lot of difference (and I have done both visibly moving robotic systems and "other" control systems).

But there is a whole level of ancillary skills which are needed for either kind of system, and they may vary quite a bit. As Jack said, if you have an industrial robot, you have moving masses and you really need to understand the physics of that motion. In an HVAC system, on the other hand, you don't have much in the way of moving parts (a compressor and maybe some actuators), but you had better understand the heat cycle you're working with.

In short, the fundamental differences aren't on the embedded side but rather in the hardware you are working with.
Basically a robot could be an array of embedded subsystems (size of 1 to n). The only difference may be in a complex robot if it has a “brain” controlling the array of the subsystems. So the skill needed for this “brain” would be in the realm of mathematicians developing algorithms to manage the robot’s activity.
“Basically a robot could be an array of embedded subsystems (size of 1 to n). The only difference may be in a complex robot if it has a “brain” controlling the array of the subsystems. ”
I don't think it works that way, A. Physically, the various parts of a robot can't be treated in isolation. Newton's laws dictate that for every force and torque on one body, there's an equal and opposite reaction on the other. A controller has to take those reaction forces into account.
I think it’s probably possible to build a robot that doesn’t have (much) crosstalk between channels. You could have, for example, an x-y-z table, so the various degrees of freedom are all independent. You could make it a point to move along only one axis at a time.
You _COULD_ do that, but you won’t sell any of the robots. The customers want smooth motions from place to place, all axes working together.
Isn't Newton amazing? Why does the controller have to know about Newton?
I think that, fundamentally, embedded system design as practiced these days is just a process of integrating all the ancillaries, whether at a small scale or a large one, which speaks to the latter half of the discussion question.

But as for robots, I think much more knowledge is needed in terms of motion and its dynamics: the dynamics of robotics and of control systems, which is much more academic and needs expertise.

I am not taking anyone's side; this is just my point of view.
"Isn't Newton amazing? Why does the controller have to know about Newton?"
Because the motion imparted to a given joint depends on what’s going on in the other joints. Every motion of any joint imparts reaction forces and torques on all the others. If you want the system to follow a smooth trajectory from one state to another, you have to anticipate and counter those reactions.
In the world of rigid-body motion, we call these reactions "cross-coupling" between axes. An industrial robot is a collection of (hopefully) rigid bodies, all connected together in complicated ways, so the cross-couplings are that much more profound.
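To put a number on it: for a planar two-link arm, the inertia matrix has a nonzero off-diagonal term, so commanding an acceleration at one joint demands torque at the other as well (a textbook point-mass form; the masses and lengths below are placeholders, and the Coriolis and gravity terms are omitted for brevity):

    #include <math.h>

    /* Planar two-link arm with point masses m1, m2 at the link ends and
       link lengths l1, l2 (placeholder values). The inertia matrix M(q)
       is NOT diagonal; the off-diagonal term is exactly the
       cross-coupling between joints. */
    void inertia_matrix(double q2, double M[2][2])
    {
        const double m1 = 1.0, m2 = 1.0, l1 = 0.5, l2 = 0.5;
        double c2 = cos(q2);

        M[0][0] = (m1 + m2) * l1 * l1 + m2 * l2 * l2
                  + 2.0 * m2 * l1 * l2 * c2;
        M[0][1] = m2 * l2 * l2 + m2 * l1 * l2 * c2;  /* coupling term */
        M[1][0] = M[0][1];
        M[1][1] = m2 * l2 * l2;
    }

    /* tau = M(q) * qddot, with Coriolis and gravity omitted: an
       acceleration commanded at joint 1 requires torque at joint 2 too. */
    void required_torque(double q2, const double qdd[2], double tau[2])
    {
        double M[2][2];
        inertia_matrix(q2, M);
        tau[0] = M[0][0] * qdd[0] + M[0][1] * qdd[1];
        tau[1] = M[1][0] * qdd[0] + M[1][1] * qdd[1];
    }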
Certainly most, if not all, motion control is hard real-time, not to mention fault-tolerant if not fault-proof. If I have a bench instrument giving me temperature and humidity readings and the watchdog timer reboots it and I lose 3 seconds' worth of data, it is not (usually) tragic. Yet it, too, is an embedded system. But if I'm riding along on my Segway scooter and the wheels go from 10 mph to zero in 10 ms, I won't be a happy camper.
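Which is why motion controllers typically pair the watchdog with an immediate safe-state reaction rather than a leisurely reboot (a sketch; the 10 ms deadline, miss budget, and function names are placeholders, and fixed-rate pacing is omitted for brevity):

    #include <stdint.h>
    #include <time.h>

    #define PERIOD_NS  10000000L  /* 10 ms deadline, placeholder */
    #define MAX_MISSES 2          /* overruns tolerated in a row */

    extern void motion_step(void);      /* the real control work     */
    extern void enter_safe_state(void); /* e.g. ramp the wheels down */

    static int64_t now_ns(void)
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return (int64_t)ts.tv_sec * 1000000000LL + ts.tv_nsec;
    }

    void monitored_loop(void)
    {
        int misses = 0;
        for (;;) {
            int64_t start = now_ns();
            motion_step();
            if (now_ns() - start > PERIOD_NS) {
                /* An overrun here is not like losing three seconds of
                   bench data: degrade gracefully, don't just reboot. */
                if (++misses >= MAX_MISSES) {
                    enter_safe_state();
                    return;
                }
            } else {
                misses = 0;
            }
        }
    }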