Every once in a while, a confluence of teaching, writing, and reading obligations nudges me to think a lot more about a topic than I would under ordinary circumstances.1 Over the past week or so, that topic has been artificial intelligence. It’s been quite some time since I have written about it. Five years ago, I posited that artificial general intelligence (AGI) might qualify as a general purpose technology. At the same time, there were aspects of AI discourse that I found to be, um, suspect. Even ChatGPT has left me intrigued but cold.
In 2024, I’m seeing and reading a lot about the current state of AI and I’m not entirely sure what to make of it all. According to one recent assessment, it’s a mortal threat to higher education and other sectors: “We all know that in the era of generative AI and large language models, the usual assessment suspects are dead. The essay, still beloved of many, is perhaps only the most obvious corpse. If we’re serious about preventing widespread plagiarism, many other traditional assessment methods are also no longer viable.”
The more I think about it, however, the more skeptical I am of the most doomsaying proclamations about the topic. As with the MOOC craze, my takeaway is that AI will change some things but not everything and certainly not all at once.
Take, for example, Jon Stewart’s rather bizarre rant against AI — or at least the firms promoting AI — on The Daily Show:
Goodness knows that Silicon Valley CEOs are ripe for satire. But Stewart’s entire rant can be distilled down to a “lump of labor” fallacy that presumes rapid, mass unemployment across broad swathes of the economy. Just to be clear, such a thing would be bad for U.S. society even if markets eventually clear. But given that AI has been gathering momentum at the same time that unemployment has dropped to record lows, maybe Stewart’s fears are overstated just a wee bit?
Part of the problem is that Stewart accepts at face value the CEO claims that AI will displace jobs across multiple sectors. It is possible that one source of my calm about this came from something a colleague mentioned to me a few days ago. My longstanding presumption had been that artificial intelligence suffers from fewer cognitive biases than human beings — who, never forget, are barely evolved monkey brains. That is spectacularly false. By their very programming, AIs are prone to overconfidence bias — they always think they have a good bead on what they are analyzing. An AI’s lack of metacognition is its greatest weakness.
This is why, unlike Stewart and the CEOs he lampoons, I remain unconvinced that AIs will displace human laborers as rapidly as predicted. It might happen at some point, but like self-driving cars, it always seems to be five years away from happening.
Don’t take my word for it — take Ed Zitron’s, who recently speculated that we may have already reached Peak AI. Stewart’s technique of playing quotes from softball interviews with tech CEOs might very well be one source of misperception:
This is the problem with powerful people in tech. If you allow them to speak and fill in the gaps for them, they will happily do so. Murati and Altman continuously obfuscate how ChatGPT works, what it can do, what it could do, and profit handsomely from a complete lack of pushback from a press that routinely accepts AI executives' vague explanations at face value. OpenAI's messaging and explanations of what its technology can (or will) do have barely changed in the last few years, returning repeatedly to "eventually" and "in the future" and speaking in the vaguest ways about how businesses make money off of — let alone profit from — integrating generative AI….
I believe a large part of the artificial intelligence boom is hot air, pumped through a combination of executive bullshitting and a compliant media that will gladly write stories imagining what AI can do rather than focus on what it's actually doing. Notorious boss-advocate Chip Cutter of the Wall Street Journal wrote a piece last week about how AI is being integrated in the office, spending most of the article discussing how companies "might" use tech before digressing that every company he spoke to was using these tools experimentally and that they kept making mistakes….
Despite fears to the contrary, AI does not appear to be replacing a large number of workers, and when it has, the results have been pretty terrible. A study from Boston Consulting Group found that consultants that "solved business problems with OpenAI's GPT-4" performed 23% worse than those who didn't use it, even when the consultant was warned about the limitations of generative AI and the risk of hallucinations.
Another presumption about AI is that it is an arms race and that China is poised to take the lead. Like clockwork, former Google CEO Eric Schmidt has spent the past few years sounding warnings that China is catching up to the United States in AI. Last year he wrote, “It is hard to say whether China will seize the lead in AI, but top officials in Beijing certainly think it will. In 2017, Beijing announced plans to become the global leader in artificial intelligence by 2030, and it may achieve that goal even earlier than expected.”
And yet, as of April 2024, this catch-up has yet to take place. MacroPolo’s Global AI Talent Tracker recently concluded, “The United States remains the top destination for top-tier AI talent to work…. the United States remains far and away the leading destination for the world’s most elite AI talent (top ~2%) and remains home to 60% of top AI institutions.”
In February of this year the New York Times’ Paul Mozur, John Liu and Cade Metz surveyed AI experts and came away with a similar conclusion:
Even as the country races to build generative A.I., Chinese companies are relying almost entirely on underlying systems from the United States. China now lags the United States in generative A.I. by at least a year and may be falling further behind, according to more than a dozen tech industry insiders and leading engineers….
“Chinese companies are under tremendous pressure to keep abreast of U.S. innovations,” said Chris Nicholson, an investor with the venture capital firm Page One Ventures who focuses on A.I. technologies. The release of ChatGPT was “yet another Sputnik moment that China felt it had to respond to.”
Jenny Xiao, a partner at Leonis Capital, an investment firm that focuses on A.I.-powered companies, said the A.I. models that Chinese companies build from scratch “aren’t very good,” leading to many Chinese firms often using “fine-tuned versions of Western models.” She estimated China was two to three years behind the United States in generative A.I. developments….
When OpenAI released ChatGPT in November 2022, many Chinese firms were being hamstrung by a regulatory crackdown from Beijing that discouraged experimentation without government approval. Chinese tech companies were also burdened by censorship rules designed to manage public opinion and mute major opposition to the Chinese Communist Party.
Chinese companies with the resources to build a generative A.I. model faced a dilemma. If they created a chatbot that said the wrong thing, its makers would pay the price. And no one could be sure what might tumble out of a chatbot’s digital mouth.
AI might well prove to be a game-changer over the next decade. But right now an awful lot of what is said or written about AI falls into the category of “competent bullshit.”2 In other words, the folks who claim AI will be immediately transformative sound way too much like they are parroting ChatGPT — and not in a good way.
1. In the case of reading, both fiction and non-fiction.
2. Is it possible that, five years from now, this column will fall into that category as well? My metacognition forces me to acknowledge this possibility!
Here is a quote from my Mystery AI Hype Theater email, from the people at DAIR, who are actual respected full-time AI skeptics. (They are exasperated with a Chronicle of Higher Education article that I cannot read because of the paywall):
"A clearer-eyed view of what has happened in the last two years is that a few companies have amassed enormous amounts of data (mostly taken non-consensually) and the capital to buy the computing resources to train enormous models which, in turn, can be used to output facsimiles of the kinds of interactions represented in the data. So we have the ability to get text on demand that looks like legal contracts, or looks like medical diagnoses, or looks like therapeutic conversations, or looks like a news article, or looks like scientific research papers. But it actually is none of those things, because in all cases the textual artifact isn't really the point; the point is rather the thought processes and relationship-building that led to and follow from the textual artifact. (The sort-of exception here is legal contracts, where the textual artifacts are very much the point, except that the whole task is designing a textual artifact that meets the needs of the parties entering into the contract. Those needs usually extend well beyond "a text that has some nice legalese in it and otherwise looks like a contract.")
It adds insult to injury to say that higher ed should be (and is failing at) keeping pace with this purported rapid progress."
I wrote the appended comment on Dave Karpf's Substack just before coming here. Responding to the BCG study you mention: it's a bit self-serving, I think. If a consultant using ChatGPT can produce a report that's only 23 per cent worse than the traditional product in, say, half the time, that will be enough to chew up the low end of the consultancy market, which is most of it.
*My comment*
ChatGPT is already as good as McKinsey (and that's not a compliment). I prompted ChatGPT with "can you write a comparison of wind and wave power mentioning advantages of both but concluding that wave power is superior" and got the following. Add some glossy pics (thanks, DALL-E) and it's indistinguishable from the consulting report you'd get with the same prompt (maybe worded more subtly).
Wind power has several advantages over traditional fossil fuel-based energy sources. It is a renewable and clean energy source that does not produce any greenhouse gas emissions during operation.[1][2] Wind turbines also have a relatively small land footprint compared to other renewable technologies like solar.[3] Additionally, wind power is a mature technology with well-established infrastructure and a declining cost of energy.[1][2]
However, wind power also has some limitations. Wind is an intermittent resource, with power output varying significantly based on wind speed.[2] This can lead to issues with grid integration and reliability. Wind farms also require large open spaces, which can limit their deployment in some regions.[3]
In contrast, wave power offers several advantages over wind power. Waves are a more consistent and reliable energy source, as they are hardly interrupted and almost always in motion.[2] This makes electricity generation from wave energy more reliable and predictable compared to wind.[2] Wave energy converters also have a smaller footprint and can be deployed offshore, reducing the need for large open spaces.[4][3]
Furthermore, wave power has a higher energy density than wind, meaning more energy can be extracted from the same area.[5][6] The theoretical global output of wave power is estimated to be around 29,500 TWh/yr, which is roughly 125% of the current global electricity demand.[5] This vast untapped potential makes wave power an attractive option for renewable energy generation.
While both wind and wave power have their advantages, the evidence suggests that wave power is the superior renewable energy source. Wave power offers greater reliability, predictability, and energy density compared to wind, making it a more promising option for large-scale renewable energy generation.[2][5] As such, the development and deployment of wave energy technologies should be a priority in the transition to a sustainable energy future.
1. Wave vs. Wind and Solar - 'Sintef' - Blog
2. Wave energy pros and cons - SolarReviews
3. Feasibility of Wave Power - Stanford University
4. Advantages and Disadvantages of Wave Energy - Greentumble
5. Wave and Wind are the New Hybrid Renewable Energy Source
6. Review of Hybrid Offshore Wind and Wave Energy Systems
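For what it's worth, the same experiment could be run against the OpenAI API rather than the ChatGPT web interface. Below is a minimal Python sketch that only builds the request body; the model name and temperature are assumptions on my part, not anything specified above. Actually sending it would require an API key and an HTTP POST to the chat completions endpoint.

```python
import json

# The exact prompt quoted in the comment above.
PROMPT = (
    "can you write a comparison of wind and wave power mentioning "
    "advantages of both but concluding that wave power is superior"
)

def build_chat_request(prompt: str, model: str = "gpt-4") -> dict:
    """Assemble a JSON body for POST /v1/chat/completions.

    The model name and temperature are assumed defaults; swap in
    whatever model is current when actually running this.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,  # assumed; the web interface's setting is not public
    }

body = build_chat_request(PROMPT)
print(json.dumps(body, indent=2))
```

This only shows the shape of the request; the consulting-report-quality output (and the glossy DALL-E pics) would come back in the API response.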