The History of Industrial Robots, From Single Taskmaster to Self-Teacher

  • Robots’ origin story stretches back to the Renaissance—but really started to evolve in the post-WWII era alongside computing advances.
  • Industrial robotics has come a long way, transforming factories into safer, more efficient workplaces for global manufacturing.
  • Robotics today is informed by data and cloud computing, and will usher in a new era of sustainability for the manufacturing industry.

The 15th and 16th centuries aren’t commonly known for robotics, but the Renaissance era’s artists, scientists, and inventors, like Leonardo da Vinci, first imagined the automated mechanisms that would eventually lead to robots.

The first true industrial robot appeared nearly half a millennium later, built in the 1930s from the model construction system Meccano. The industry took a quantum leap forward when the 1960s ushered in the computer era, and since then, every decade has brought exponential advances alongside other technologies.

The number of industrial robots in use in America grew from around 200 in 1970 to around 4,000 just ten years later; by 2015, the worldwide total had jumped to 1.6 million. Today, there are estimated to be close to three million industrial robots in active service worldwide. After humble beginnings, industrial robotics has become one of the most impactful advances in the history of making stuff, an indispensable part of society and the global economy.

When the automotive industry took off in the mid-20th century, it brought industrial robotics with it.

Manufacturing Before Robots

The early industrial revolution introduced the mass manufacture of products for mass consumption. The factories of the time used extreme heating and cooling technologies, very heavy machinery, and huge contraptions that belched oil and smoke, putting human workers directly in harm’s way.

People standing on dank production lines affixed, welded, and hammered—a slow process further hindered by the inherent limitations in human strength and endurance. This repetitive workflow could dull workers’ focus and cause fatal accidents, as could the machinery itself during catastrophic breakdowns. Pollution destroyed lungs; grinding labor destroyed spirits; and the work that would be relegated to robots in the modern world is still referred to by the epithet “dirty, dangerous, or dull.”

When the auto manufacturing industry went into hyperdrive in the post-WWII period, it did so in conjunction with the rise of computing, making industrial robots natural partners in industry. Suddenly, a computer could prescribe the steps a robot took—the literal movements it made as it worked—making every action identical and every object uniform, and allowing the robot to be reprogrammed to accommodate the tiniest change.

1930s: The Birth of Industrial Robots

In the 1930s, Canadian/Australian civil engineer (and roboticist before that title existed) Bill Taylor built the first pick-and-place robotic crane, which he called Gargantua. As tall as the average household ceiling, Gargantua was driven by a single electric motor, built entirely out of the kids’ construction product Meccano, and programmed with punched paper tape to stack blocks in preset configurations. Although Gargantua itself was never commercialized, Taylor’s design ideas contributed to the coming revolution of industrial robotic manufacturing.

It still took a few decades until the widespread adoption of the computer gave manufacturing and robotics a shot in the arm. In 1961, a Connecticut company called Unimation launched its computer-controlled Unimate 1900-series robot; one of its first major customers was General Motors. Almost overnight, the automotive manufacturing business was turned on its head. The Unimate 1900 had such an impact that it blasted into the popular consciousness, appearing at both a Chicago trade show and The Tonight Show Starring Johnny Carson.

By 1966, with 450 1900-series robots in use across the US, Unimation looked overseas, appointing the Finnish company Nokia as its European manufacturer. The next iteration of Unimation’s technology, a welding robot, arrived in 1969 and enabled the assembly of an unprecedented 100 cars per hour, doubling the previous rate.

Victor Scheinman invented the Stanford Arm, the first all-electric, computer-controlled robotic arm, in 1969 while at Stanford University. Photo by Gildardo Sánchez via Creative Commons 2.0.

Around the same time, robotics grew in a new dimension thanks to the academic Victor Scheinman. Working at Stanford University’s AI lab, Scheinman created the world’s first viable all-electric, computer-controlled multi-axis robotic arm, which gave robotics a far wider envelope of movement. The Stanford Arm was lightweight, anthropomorphic, and multi-programmable, able to perform a wider variety of dangerous or unpleasant assembly-line jobs in place of human operators. Commercially available in 1971, the Stanford Arm revolutionized industrial robots.

1980s and 1990s: Computer-Controlled Robots Become the Standard

Gargantua, Unimation, and the Stanford Arm were so successful as concepts that until the mid ’90s, changes were incremental rather than sweeping. During the ’80s, technologies like industrial lasers improved quickly, making better sensors and rudimentary machine vision systems possible. It was generally accepted that industrial robots represented the future of manufacturing.

Computer control had also become standard in the desktop era, and the next advance was the ability to control a robot from a PC, thanks to 1994’s MRC (multi-robot control) system. The PC era also brought a steep reduction in microprocessor prices, putting computer-controlled robotics in the hands of even more industries and players.

Better sensors, cheaper computing power, and a higher degree of programmability combined to move robots out of the simple repetitive task phase and give them what many thought of as rudimentary intelligence.

2010s: Human/Robot Collaboration Arrives

During the 2010s, the robotics industry experienced another revolution that put robots where they’d long been imagined: working alongside humans.

Collaborative robots—or “cobots”—are machines that work in close proximity to or directly with human operators. In years past, robots had to be strictly quarantined away from people to minimize the chances of these large, heavy industrial machines causing harm by going haywire or breaking down.

Woman working at Skoda Auto with what has been called the first cobot, the Kuka LBR iiwa lightweight robotic arm. Courtesy of Kuka.

Better, safer, and lighter materials, along with much-improved sensor technology that can tell a robot where it and its human partners are, ushered in an age of human-scale robots designed to work alongside people.

Another important innovation in the field, software-driven automation, also enabled this type of close human/robot collaboration. With the help of software, automation can handle ever-finer details of a workflow, and algorithms that account for different circumstances in a process can correct a robot’s actions in real time to meet established outcomes.

For example, whereas a computer-controlled robot will make the exact same series of movements and actions in sequence with split-second regularity, a cobot waiting for a person to hand over an object for processing behaves differently: it uses its sensors and programming to understand that its human partner has stopped to take a drink or bathroom break, and it reaches for the object only when it’s offered.
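The difference is easy to sketch in code. The short Python example below is purely illustrative: the sensor class, function names, and timings are invented stand-ins for whatever a real cobot’s vision and proximity APIs would provide.

    import random
    import time

    class HandoverSensor:
        """Stands in for a real proximity or vision sensor; reports
        whether a part has been offered in the handover zone."""
        def object_present(self) -> bool:
            return random.random() < 0.3  # simulated detection

    def fixed_sequence_robot(steps):
        # Traditional industrial robot: runs the same steps on a
        # fixed clock, regardless of what is happening around it.
        for step in steps:
            print(f"executing: {step}")
            time.sleep(0.1)

    def cobot_handover(sensor, poll_interval=0.1, timeout=5.0):
        # Cobot: polls its sensor and reaches for the part only
        # when the human partner actually offers it.
        waited = 0.0
        while waited < timeout:
            if sensor.object_present():
                print("part detected: reaching to take it")
                return True
            time.sleep(poll_interval)
            waited += poll_interval
        print("no part offered yet: holding position safely")
        return False

    if __name__ == "__main__":
        fixed_sequence_robot(["move to station", "weld seam", "retract"])
        cobot_handover(HandoverSensor())

The point is that the cobot’s next action is gated on what its sensors report rather than on a fixed clock, which is what makes sharing a workspace with people practical.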

And after a decade of development, cobots have taken their place in manufacturing; by one estimate, there are around 40,000 in operation around the world, representing a market worth around $1 billion in 2021 and forecast to grow to $11 billion by 2030.

2020s: Robots of Today

Industrial robots today don’t look much different than they have for the past 50 years: they’re still basically articulated lengths of rigid material with a tool on the end and joints along the length allowing multiple axes of movement. Up close, though, the sensors are better, the effectors are more precise, and the materials are often far more user-friendly.

What differs today is the stuff you can’t see. Data, cloud computing, and Internet of Things (IoT) technologies mean every input, action, and output (from a whole factory, a single device, or part of a device) can record information about its performance and report on it.

All that data gives manufacturing businesses much more control over quality, maintenance, and productivity, and it can be shared with other departments, partners, suppliers, customers, and stakeholders across the street or across the world. It forms smart factory infrastructure: a global manufacturing brain that can improve the way things are made on a macroeconomic basis, helping to save money, time, and the planet’s environment.
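As a rough illustration of what “recording and reporting” might look like in practice, here is a minimal Python sketch; the field names, robot ID, and reporting function are all hypothetical, and a real deployment would publish such payloads to a cloud broker (over MQTT, for example) rather than print them.

    import json
    import time
    from dataclasses import dataclass, asdict

    @dataclass
    class JointTelemetry:
        robot_id: str
        joint: int
        temperature_c: float  # motor temperature in degrees Celsius
        cycle_count: int      # cycles completed since last service
        timestamp: float      # seconds since the epoch

    def to_report(reading):
        # Serialize one sensor reading for upstream reporting; a real
        # smart factory would send this to a cloud data platform.
        return json.dumps(asdict(reading))

    if __name__ == "__main__":
        reading = JointTelemetry(robot_id="arm-07", joint=3,
                                 temperature_c=41.2, cycle_count=182044,
                                 timestamp=time.time())
        print(to_report(reading))

Aggregated across thousands of joints and stations, readings like these are what feed the quality, maintenance, and productivity insights described above.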

It’s a future that grows closer every day. Over the past decade, annual installations of industrial robots roughly tripled, peaking at around 422,000 units in 2018 before growth slowed, a dip exacerbated by the pandemic in 2020. Automotive and electronics are still the biggest users, accounting for just over half of new installations in 2020. The hotbeds of the industry are Japan, China, Germany, and the US, but the densest adopter is South Korea, which has 930 robots per 10,000 manufacturing industry employees—seven times the global average.

This autonomous mobile robot (AMR) has a robotic arm attached. AMRs can be good entry points for companies beginning with robotics. Courtesy of Kuka.

What’s Stopping More Widespread Industrial Robot Adoption?

History may remember this time in manufacturing as an industry pressing down the gas pedal while keeping one foot hovering over the brake. That hesitancy stems from economic factors that can still trip up new adopters, blocking more widespread adoption of industrial robots.

  • Unpredictable After-Market Costs

Provisioning and programming a robot for a specific process is simple enough, but if a business pivots and new assembly-line or manufacturing functions are needed, it takes time and money to reprogram the robot, safety-test new effectors, and put it back into service.

  • A Lack of Uniform Standards

There are a lot of players in the robotics industry, and they all have their own software and compliance frameworks—many of which do not play nice with each other. Failing to properly arrange a standards environment before deploying a robot fleet can prove costly later.

  • Costly Maintenance

Despite best efforts, machinery still breaks down. Technologies like cloud-based performance reporting make the maintenance and replacement cycle much easier to plan for, but taking an essential piece of equipment offline can create a bottleneck that slows down everything else. It might not be just a single robot that stops for repairs; it might be the entire factory’s output.

  • Future-Proofing Is Difficult or Impossible

Human beings have limitations but are purpose-built for adaptability. Making robots similarly flexible can be a costly exercise. If the needs of a manufacturing process change—often due to unpredictable market forces—it might be as simple as changing a few effectors, but it might also mean a wide-scale reprogramming effort, maybe an entirely new robotic fleet.

Flexible-use cobots can help a smart factory introduce more customization into its products.

Industrial Robotics Is the Future of Manufacturing

The history of industrial robotics is at its core the story of humans’ drive to make more stuff faster and better. But this is a new, uncertain time in history: The worst pandemic disease in living memory has disrupted supply chains everywhere, and there’s a global reckoning with the damage that unchecked manufacturing and consumption have wrought.

That puts the new science of industrial manufacturing robots in a rare position. Though heavy manufacturing has traditionally had a negative impact on human lives and environments, there is now a chance to redesign how things are made to help sustain and heal. Robots, computing, and data are the tools needed to build a better, more sustainable future.

This article has been updated. It was originally published in October 2018.

About the Author

After growing up knowing he wanted to change the world, Drew Turney realized it was easier to write about other people changing it instead. He writes about technology, cinema, science, books, and more.
