Sean Higgins, Tribune News Service
Read enough headlines about artificial intelligence and you can be excused for thinking that we’re headed for a dystopian future ruled by AI-powered robot overlords. Mass unemployment, including people being forced to train their robot replacements to collect a last paycheck, is a common theme in the latest dire forecasts.

Don’t panic. History is instructive here: we have been through these technology-will-destroy-jobs cycles before, and familiar patterns emerge. New technologies have transformed the economy in disruptive but ultimately positive ways. Some jobs do go away, but they tend to be the ones involving manual labor and drudgery. What’s more, new technology creates new opportunities, even whole new industries.
AI is a catch-all term. It generally refers to computer programs, such as ChatGPT, that can seemingly think and create. These programs are complex tools that synthesize data from existing sources. AI doesn’t actually think; it imitates what is in its training data, with current versions doing so at a higher level than was previously possible. AI can be a valuable tool for automating tedious, repetitive tasks.

Those advances have led to fears that AI will do to intellectual professions what earlier automation did to factory production: eliminate jobs. News outlets will employ AI programs instead of reporters. Films will feature AI-generated actors, scripts and so on. Limiting the future use of AI was a significant issue in the recent Hollywood strikes by the Writers Guild of America and SAG-AFTRA, the actors’ union.
Manufacturing is a good case study in how new technology transforms work. About 18 million people worked in manufacturing jobs in the United States in January 1980, according to the Labor Department. Since then, factories have become heavily automated, and today the number of manufacturing jobs is 12.7 million. Despite that transition, the unemployment rate is 4.2%, more than two points below the January 1980 rate of 6.5%, and the U.S. economy produces more than ever. Over time, factory workers went from tasks like spray-painting cars on the assembly line to supervising the machines that did the painting, which was safer and more productive; or they found jobs elsewhere that automation made possible. Meanwhile, cars became cheaper to buy.
The transition was hard for some people, but we came through it with more jobs overall. That scenario likely holds for 21st-century AI. A World Economic Forum study found that AI and related technology will create 11 million jobs while displacing 9 million, for a net gain of 2 million. It will also open up new opportunities. Small-business owners who create those awkward homemade ads you see on TV will have new tools to develop fancier, more professional-looking awkward ads. Another historical point to remember is that new technology rarely lives up to its early hype. AI is in a gold-rush stage, with corporations racing to invest in it based on extravagant promises about AI’s potential capabilities. New waves of technology often come with promises that they will be able to perform miracles, promises that should be viewed skeptically.
In the aughts, for example, we were told that embryonic stem cells would soon allow paraplegics to walk again. Some touted 3-D printing as readily giving everyone the replicator technology from “Star Trek,” rather than more Minecraft fridge ornaments. And so on. Even transformative technology, like the internet, changes things in unexpected ways. Remember when the internet was supposed to make it easier to preserve and access reliable information, rather than drowning us in a sea of trivial and often unreliable data?
Since AI can only imitate rather than think, companies that rely exclusively on it instead of actual human minds do so at their own peril. The derisive term “slop” has already become attached to AI-generated art, thanks to its habit of getting details like human anatomy wildly wrong. AI writing programs have also shown the curious habit of “hallucinating” research that does not exist. The FDA, for example, used an AI program to speed up drug approvals, but it sometimes cited studies that didn’t exist. That failure forced the FDA to assign more staffers to vet the AI’s output and weed out conclusions based on false data.

Yes, AI technology will continue to improve, but the sci-fi future of thinking machines is still a long way off. Until then, AI is just another tool, and tools will always need human minds and hands to operate them.