Journo Resources Fellow

June 10, 2024

“Journalism has always been a business of scarce resources,” says David Caswell. “There was limited space on the page, limited time to do reporting, and you couldn’t do a custom story for every audience segment. So, I think that journalists and media companies are looking at AI as a way to get past those.”

David is a former executive product manager at BBC News and BBC News Lab. Now he’s turned to focus on AI, recently founding StoryFlow, a consultancy applying large language models (LLMs) and generative AI to news production. He believes that news outlets both large and small will be compelled to adopt artificial intelligence to remain competitive.

The figures agree: a global survey of 105 newsrooms from the London School of Economics’ JournalismAI project found that more than 75 per cent had already started to use AI tools. Respondents also noted that the low cost of generative AI tools, as well as a low requirement for tech skills, make them accessible across the industry.


“You don’t need experience in data science or to be able to code,” agrees David. “All you need is to sit down in front of some easy-to-learn tools and be curious and engaged for an hour. Most of these are inexpensive — ChatGPT premium is US$20 a month. So, there’s no headstart for big organisations. Tiny newsrooms can do what, a year ago, would have taken a lot of resources.”

When it comes to AI in the news, it’s clear the horse has already bolted — what matters more is what we do next. How can we harness its possibilities (and mitigate its challenges) with more than just scepticism?

So, How Can Journalists Use AI?

First things first, it’s important to understand what AI even looks like in a newsroom context. As Eleanor Warnock and Tim Smith from Sifted point out in an editorial about the tools their publication uses, “some of the most helpful tools have been around for five years already.”

Or, to put it another way, if you’ve ever used Grammarly to check your work, or transcribed an interview with Otter.ai, then you’re already using artificial intelligence to do your work. However, you’ve likely heard more about tools like ChatGPT, Bard AI, or Adobe Firefly during the past 12 months — tools where you input a prompt and receive content in return.

“Teach journalists and editors in your newsroom how to write a good prompt,” urges Mike Reilley, the founder of Journalists’ Toolbox AI, which helps reporters incorporate AI into their workflow ethically and effectively.

“Almost any job description you see has prompt writing as a required skill,” he adds, saying it can be a useful tool to help journalists edit their stories. The team at Sifted also uses it for background research, which they claim can be up to 15 minutes quicker than using Google.

Innovative AI Projects Already In Newsrooms

AI-based tools are appearing all the time. Here are just two highlighted at the latest Newsrewired conference — you can check out their future events here.

Sophina: Sophia Smith Galer, who’s amassed more than half a million followers on TikTok, built Sophina after finding that while 57 per cent of journalists she worked with promoted their work on Instagram, just 30 per cent did so on TikTok. The main barriers they reported included a lack of time, video skills and confidence on-screen.

“When I was thinking of this AI tool that I had in my head, I knew it could solve the first two problems,” she says. “I transcribed over 100 of my TikTok videos that had performed well, combined it with a knowledge base, and trained an AI tool.” The result is an ethical tool that takes your news story and spits out a TikTok-ready script that doesn’t sound like it’s been written by ChatGPT.

The News Creator Tool: Jody Doherty-Cove is the head of editorial AI at Newsquest and believes AI-powered tools have the power “to win back time for newsrooms and get them back out into their communities”. The in-house tool drafts articles based on “trusted information” such as a press release or details of a sporting event, along with information such as a word count. The reporter will then be able to review and edit the suggested copy, which can be easily published thanks to an integration with the publisher’s CMS.

“What we’re doing is alleviating the burden of some of the mundane but important things we need to do,” he explains, “the back end of the papers, the nibs, the community news, by doing those quicker and giving more time back to those traditional reporters.”

For Sonya Barlow, founder of AI-enabled tech app LMF Network and a BBC presenter, it’s about thinking creatively. Speaking at a recent Journo Resources event, she explained how she uses large language models to hone her pitches: she feeds a programme multiple stories from an outlet she wants to work for, asks for a summary of the house style, and then asks whether her pitch or copy matches it.

“It tells me what I could be doing better,” she adds. “[Another example] is when I wrote an article recommending using London restaurants for solo diners, I asked ChatGPT or Bard for their top ten suggestions. And I didn’t use those suggestions, because I wanted to make sure that what I was saying wasn’t already covered on the internet.”

Sonya also suggests using AI-powered SEO tools like SEMrush to understand what audiences are looking for, while Mike even recommends tools to repurpose content. “Vidiofy uses AI to create social media videos from articles — all you have to do is plug in the web address,” he explains.

Is AI Going To Replace Journalists?

But, with such a wide range of uses, it was only a matter of time before fears about job losses bubbled to the surface. In July 2023, Germany’s biggest tabloid, Bild, fired one of the first warning shots, announcing that it was laying off a third of its staff and migrating their functions to AI applications.

Mathias Döpfner, CEO of Axel Springer, the company that owns Bild, wrote in an internal memo to employees: “Artificial intelligence has the potential to make independent journalism better than it ever was — or simply replace it.”

However, David is keen to stress that “automation has the potential to replace tasks, not jobs”, because “these things don’t operate themselves”. But even in a world where AI frees up journalists’ time for greater creativity and autonomy, it doesn’t come without risks.


Sophia Smith Galer (L) and Sonya Barlow (R)

Top of the list is “hallucinations”, where a language model generates false or nonsensical information. “Hallucinations are very hard to find in editing because these language models are built to sound plausible,” David explains. “If you don’t really understand the source material that is driving output, it’s very easy to introduce significant error without knowing it.” In other words, sufficiently fact-checking entirely AI-generated articles could take as long as researching them in the first place.

Then there’s the risk that AI could perpetuate biases, inherited from the data models are trained on, producing skewed results and exhibiting prejudice against marginalised groups.

We’ve already seen these unintended consequences with The New York Times’ Moderator software. Launched in 2017 using a machine-learning system called Perspective, it aimed to make moderation more efficient by predicting how a human might review comments under an article.

However, researchers at the University of Washington found that Moderator was more likely to rate African-American English as toxic. In the words of The New York Times, as the world contains racism and sexism, the patterns in data will likely contain racial and gender bias.


And then there’s the writing itself. Even a brief experiment with prompts on ChatGPT will quickly confirm that nuanced copy is not a strength of language models, which cannot make inferences or generate commentary based on a wider societal understanding.

“They don’t look at nuance or pick up on the absence of things, which is a big skill in journalism. If you were just to use these tools blindly, you would really degrade the quality of your journalism,” David says.

“I think [AI tools] shift our role as journalists into curation,” adds Mike. “Think about a tool like Headline Hero that writes article headlines. If you’re using it, you still need to know what is a good and bad headline so you can tweak that headline to make it better.”

“Follow what the Associated Press calls an 80:20 rule. If AI is going to do 80 per cent of your work, you need to have 20 per cent human input on the back-end for quality control and editing.”

Moving Towards A Regulated Future?

But, even if you are committed to the highest standards of AI in journalism, that doesn’t mean everyone else is. In July 2023, Hold The Front Page reported on the launch of a new local news website, The Bournemouth Observer, which was staffed by fake journalist profiles and refused to answer questions about whether its stories were generated using AI.

Earlier in the year, Russia Today tweeted a fake image of an explosion near the Pentagon in Washington, DC, causing a brief dip in the US stock market as the story was picked up by other outlets. Experts later told NBC that the image had the tell-tale signs of an AI-generated forgery. And, as AI tools improve at pace, detecting such misinformation will only become more challenging.

It’s a worry shared by some of the world’s biggest publishers. In August 2023, Getty Images, The Associated Press, AFP, and others signed an open letter calling for “unified regulation and practices”, amid fears AI could “threaten the sustainability of the media ecosystem” by eroding readers’ trust.

While a growing number of newsrooms, from The Guardian and the BBC to WIRED and The Telegraph, have already published policies on the use of the tech in their newsrooms, there are currently no public sector policies. However, regulatory frameworks are emerging.

Useful Prompts To Try Yourself

Summarise This Paper: Academic or research papers can be long — try feeding it into a large language model and asking it to summarise the key points.

Tell Me Popular Recommendations: As Sonya says, AI is great for telling you what’s already out there. Ask it for popular recommendations so you can avoid them and find something unique.

Ask If You’ve Nailed The Tone: Feed the AI examples of articles from your target publication and get it to tell you the house style — and if your copy or pitch matches up.
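For anyone comfortable with a little scripting, the three prompts above can be wrapped as reusable templates. This is a minimal sketch in Python; the function names and exact wording are our own illustrative choices, not a published tool, and the returned text is simply what you would paste into ChatGPT or a similar chatbot.

```python
# Hypothetical helpers that build the three prompt patterns described above.
# Only the prompt text matters; adapt the wording to your own beat.

def summarise_paper_prompt(paper_text: str, max_points: int = 5) -> str:
    """Ask the model to condense a long research paper into key points."""
    return (
        f"Summarise the following paper in {max_points} bullet points, "
        "keeping any caveats the authors flag:\n\n" + paper_text
    )

def popular_recommendations_prompt(topic: str) -> str:
    """Ask what is already widely covered, so you can avoid repeating it."""
    return (
        f"List the ten most commonly recommended {topic}. "
        "I want to know what is already well covered online."
    )

def house_style_prompt(sample_articles: list, draft: str) -> str:
    """Feed example articles, then ask whether a draft matches the style."""
    samples = "\n\n---\n\n".join(sample_articles)
    return (
        "Here are articles from a publication I want to pitch:\n\n"
        f"{samples}\n\n"
        "Summarise the house style, then say whether this draft matches it "
        f"and what I could do better:\n\n{draft}"
    )
```

Keeping prompts as plain functions like this makes it easy to tweak the wording once and reuse it across stories, rather than retyping the instruction each time.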

In the US, the Writers Guild of America won its first major union contract setting out the uses of AI. Namely, that AI is not a “writer” and that companies cannot force writers to use it. It also says companies must disclose when AI is used and prohibits the use of writers’ materials to train AI.

The European Union, meanwhile, proposed the Artificial Intelligence Act in 2021; it looks set to become the world’s first comprehensive regulatory law on AI. Key areas include making it mandatory to flag AI-generated content and forcing developers to publish summaries of how they train their models.

Exactly how AI journalism might look in a few years remains uncertain — but it has the potential to profoundly influence how we create, distribute, and consume content. And in that, there’s opportunity.

David concludes: “The number one thing that journalists can do to prepare themselves for the world that’s coming is to engage with the tools and understand them. Learn what they can do, what the risks are, how to mitigate the risks […] because it’s not going away. This is the way our world is going to work from now on.”

Editor’s Note: All the pictures in this piece apart from headshots have been generated by Adobe Firefly — it felt thematic. Some were a dream to conjure up. Others felt like we were bashing keywords in to create a dystopian nightmare of endless forks when all we wanted was a picture of a solo diner.