You wouldn’t know it from the breathless coverage of new AI tools like ChatGPT, but the use of AI in the media is old news. While many media companies have been using AI-generated content for years, only the closest media watchers knew they might be reading news generated by a bot. A new report digs into the reality of how AI is — and has been — used by media companies, how AI’s limits necessarily constrain its use, and where new media models may be able to put this technology to profitable use.
News media and AI are old pals
A recent report “AI and the media. Now.” from Di5rupt and FIPP collects media leaders’ thoughts on how AI is impacting global publishers. In exploring media companies’ relationship with AI through the years, it’s clear that this symbiosis began earlier than even those in the know may have realized. According to the report, the earliest use of automated content “dates back to the early 1960s, when computer systems were first used to generate news articles.” More recently, news organizations have been using AI tools to churn out data-driven content — like sports recaps, stock price updates, or weather reports — often creating their own solutions to do so.
From the Associated Press to Axel Springer to China Daily, automated content has been on the editorial calendar for years. These tools have largely been used to free up human resources for reporting, analysis, and other in-depth work. Meanwhile, other AI tools lighten the load for journalists by handling legwork: transcribing interview recordings, analyzing data, scanning for copyright infringement, or even helping with audience segmentation. But all of these uses mask the continued limitations of AI, which are many.
The limits of AI for the media
While publicly available tools like ChatGPT may have put AI-generated content squarely in the limelight, the truth is that not much has changed about AI’s capabilities or how media organizations are using it. As the report puts it, “Essentially think of ChatGPT as a search engine on steroids. You ask it a question and it skims the data that it has stored to construct a response for you.” So, the report cautions, any company planning on replacing writers with machines must first consider the limits of AI.
- It cannot actually write breaking news (yet!) as it was only trained on data through 2021.
- Not only can AI make mistakes, but it has no real way of fact-checking its sources.
- It does not cite its sources — and is often found to be plagiarizing.
- It has its own unconscious bias.
- AI is only as good as the data it has access to, so it is often biased, wrong, or lacking important context.
- Analysis is not AI’s strong suit, at least when it comes to ChatGPT and its ilk. The report says, “It presents what it sees as fact and leaves the reader to make their own mind up.” This is why it works well for basic, data-driven content creation and not much else.
Ironically, the algorithms rely on journalists to create new content — or maybe we should call it data — to train on, ultimately making AI dependent on humans, at least in its current state. In fact, The AP struck a deal with OpenAI, the company behind ChatGPT, to let OpenAI’s large language models train on The AP’s content. These limitations mean AI will likely continue to be relegated to simple content creation tasks that free up humans for more important work.
The potential for AI-driven news
ChatGPT’s limits aside, some startups have already launched AI-only news sites. Despite these sites being subject to the limitations above, the report thinks that the trend may be more than just a novelty.
John Barnes, Chief Digital Officer, William Reed Business Media, explains what these new media outlets could look like: “Model around advertising and affiliate revenue and gobble up traffic by being quick and comprehensive, or find highly-targeted subscription and B2B channels where speed of information matters much more than grammatical flair or particularly special insight — results, events, and data-driven types of products.”
However, when one stops to consider the kind of content where speed matters more than insight, one will likely arrive back where AI and media first started their relationship: at financial, sports, and other data-driven news. As these sites proliferate, they will face the same challenge as every other media organization: differentiation. And when you don’t have original, investigative reporting — or even celebrity profiles — to set your publication apart, it all comes down to who can be the most accurate while maintaining speed.
Ultimately, AI may help the next Woodward and Bernstein dig through piles of data to find a story or transcribe the Watergate tapes, but it will never be able to meet Deep Throat in the parking garage.