Read Daily News or Be Left Behind?


By now the horror stories of news publishers churning out error-ridden, AI-generated content are well known. Other uses of AI have helped journalists make headlines for the right reasons — like The New York Times using computer vision on satellite imagery to count bomb craters in Gaza.

But beyond the most high-profile wins — and faceplants — understated but impactful applications of generative artificial intelligence are quietly transforming newsrooms worldwide. Generative AI is presenting new opportunities for augmenting the reporting process and storytelling itself. From helping with copy editing to surfacing insights buried in vast troves of data, the latest breed of AI can equip journalists with powerful tools to elevate their craft, and in some cases already is. And as the technology evolves, its potential to bolster accuracy, efficiency, and depth of coverage promises to reshape journalism and the news business.

A 2023 survey of 105 news organizations across 46 countries found that the majority see potential upside for journalism in generative AI tools like ChatGPT. In the survey by JournalismAI, a global initiative to keep news organizations informed about AI, almost three-quarters of respondents said such AI applications present new opportunities for the field. And 85% of respondents — including journalists, technologists, and newsroom managers — have experimented with using AI for tasks including creating images and generating story summaries.

Semafor, which launched six weeks before ChatGPT, is one of those experimenting newsrooms. The media startup has two AI tools in use — an internal editing bot and MISO. The latter helps find the stories that power Signals, a breaking news feed curated by journalists and generative AI to highlight news from publications across the globe. Semafor’s approach to AI is to be clear-eyed about its current capabilities and limit its uses to those areas, said Gina Chua, Semafor’s executive editor.

“Essentially, you’ve got an English major that can do a lot of stuff,” Chua said.

Beyond those tools, she’s experimenting with what’s known as retrieval-augmented generation, or RAG. With this technique, an AI chatbot first retrieves relevant passages from a supplied library of documents — in this case, one curated by Chua — and grounds its answers in that material rather than in its training data alone. This approach helps combat so-called hallucinations, or occurrences in which generative AI models make things up.

Chua built a RAG tool using the Department of Justice’s definition of a hate crime and gave the chatbot certain scenarios, like a white man attacking an Asian woman. The chatbot could label the scenario a hate crime or not and explain the reasoning behind that call. She built a second tool around the Trans Journalists Association’s style and coverage guidelines, then asked the AI to review published stories and assess the level of misinformation about trans issues in them. Both have worked well, and Chua wants to explore building RAG models that could help Semafor’s journalists navigate how to report on sensitive subjects or quickly get up to speed on a story.
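The retrieval step described above can be sketched in a few lines. This is a minimal, illustrative toy: retrieval here scores documents by word overlap with the query, whereas a production system would use vector embeddings and pass the assembled prompt to a hosted language model. The sample library below is invented for illustration, not Chua's actual data.

```python
import re

def tokenize(text):
    """Lowercase a string and split it into a set of words."""
    return set(re.findall(r"[a-z']+", text.lower()))

def retrieve(query, documents, k=2):
    """Return the k documents sharing the most words with the query."""
    q = tokenize(query)
    ranked = sorted(documents, key=lambda d: len(q & tokenize(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, documents, k=2):
    """Stuff the retrieved passages into the prompt so the model
    answers from the supplied library instead of its training data."""
    context = "\n".join(retrieve(query, documents, k))
    return ("Answer using only the context below.\n"
            f"Context:\n{context}\n\n"
            f"Question: {query}")

# Illustrative stand-in for a curated document library.
library = [
    "A hate crime is a criminal offense motivated by bias against "
    "a race, religion, disability, sexual orientation, or ethnicity.",
    "Style guides recommend referring to people by their stated pronouns.",
    "Semafor launched six weeks before ChatGPT.",
]

prompt = build_prompt("Is this attack a hate crime?", library, k=1)
# The prompt now carries the hate-crime definition as grounding context.
```

The point of the pattern is that the model's answer is constrained by the retrieved passages, which is why Chua could swap in a different library — DOJ definitions in one tool, a style guide in another — without retraining anything.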

Chua built these chatbots in her spare time with easy-to-use tools. She said journalists of all stripes need to experiment with AI — and not dismiss it for falling short of human performance at certain tasks.

“It’s probably not going to be as good as a human in some things,” she said. “It’s a mistake to say, ‘I want this to be a human.’ The trick is to say, ‘I want this to be as good a tool as it can be, and how do I complement it, and how does it complement me?’”

Chatbots driving subscriptions

The most common way people interact with generative AI is through chatbots like OpenAI’s ChatGPT or Google’s Gemini. Newsrooms are also building chatbots for their readers, including Skift, a business news website that covers the travel industry.

ChatGPT had been out for only a couple of weeks when Skift CEO Rafat Ali told an audience of travel professionals that “this is going to be huge, you need to start working on this now,” recalled Jason Clampet, Skift’s co-founder and president.

Clampet took that advice to heart and Skift engineers soon built an AI assistant called “Ask Skift.” The chatbot, trained on Skift’s 30,000 news stories and reports, can answer users’ travel questions and even suggest existing Skift stories for further reading.

“It’s a great way for us to be able to figure out how to cover stories,” Clampet said. “Another news site might just have to write about something as a knowledgeable observer. We’re able to be like, ‘Oh, we see what the issue is here.’”

Ask Skift now answers thousands of questions every week. The chatbot works much like Skift’s paywall, allowing users to ask three questions for free before prompting them to become paid subscribers. The Ask Skift paywall is already leading to some $365 annual subscriptions, Clampet said.
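The metering described above — three free questions, then a subscription prompt — can be sketched as a simple per-user counter. The three-question limit comes from the article; the class, names, and in-memory storage are illustrative assumptions, not Skift's implementation.

```python
FREE_QUESTIONS = 3  # free-question limit reported in the article

class Meter:
    """Toy metered paywall: anonymous users get a fixed number of
    free questions before being asked to subscribe."""

    def __init__(self, limit=FREE_QUESTIONS):
        self.limit = limit
        self.counts = {}  # user_id -> questions asked so far

    def allow(self, user_id, is_subscriber=False):
        """Subscribers always pass; others burn down the free quota."""
        if is_subscriber:
            return True
        used = self.counts.get(user_id, 0)
        if used >= self.limit:
            return False  # time to show the subscription prompt
        self.counts[user_id] = used + 1
        return True

meter = Meter()
for _ in range(3):
    meter.allow("visitor-123")   # three free questions pass
blocked = not meter.allow("visitor-123")  # the fourth is gated
```

The design mirrors a conventional article paywall: the product itself is the free sample, and the gate converts engaged users at the moment of highest intent.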

He said Skift monitors readers’ questions to identify trends and potential story ideas. This leads to new coverage, like an article about why travel costs have risen so much.

“Almost in the same way that somebody might do research on Google Trends to figure out what people are up to and looking for,” Clampet said. “This is a way for us to see this is what people have been asking for and here’s how we go further on the subject.”

Ask Skift was just Skift’s first generative AI experiment — the engineering team also released an app that lets users ask questions on Slack, the widely used office chat platform. With the technology getting easier to use every day, Clampet said, more ideas are being considered.

“Most of it is just kind of trial and error and figuring out how to make it a little better each time,” he said.

‘A rush to get ahead’

High-profile blunders at Sports Illustrated are just the tip of the iceberg. Close to a dozen other outlets have published AI-generated articles that contained errors, a problem documented extensively by the tech site Futurism.

The missteps seem to stem from companies rushing to be first in deploying generative AI capabilities without thorough vetting processes in place, said Felix M. Simon, a communications researcher and doctoral student at the Oxford Internet Institute.

“What we can see from the outside is, in all cases, it was a rush to get ahead and to implement AI as quickly as possible,” he said.

Involving journalists during production and then before publication is essential, Simon said. The need for human oversight is another reason that journalists will have to learn their way around AI tools.

“There will be a learning curve,” Simon said. “We’ll have to get used to working with these systems if we want to work with them in the first place and identify their strengths, but also their weaknesses.”

‘AI is a tool. It’s only a tool’

While human oversight remains crucial, Simon cautioned against complacency regarding AI’s potential for job displacement. AI won’t be why journalists get replaced in the near term — deeper systemic issues, including cost-cutting pressures and a lack of sustainable business models for media, led to mass layoffs of journalists long before generative AI.

But as AI technology advances, it could require less human help and, even at its current abilities, reduce the need for some staffing. Management could also use AI to justify further layoffs, Simon said.

Others are worried that relying on giant tech companies could exacerbate the news media’s already significant struggles. Rodney Gibbs, the senior director for strategy and innovation at The Atlanta Journal-Constitution, wrote in Nieman Lab late last year that the conversation about ChatGPT has echoes of the “pivot to video” that sent many newsrooms astray in digital media’s Facebook era.

It’s a fear that Nikita Roy, who heads the AI Journalism Lab at the Craig Newmark Graduate School of Journalism, has heard — but doesn’t put much stock in. Newsrooms aren’t reliant on tech companies to attract audiences this time, she said. Instead, newsrooms are customers buying a product, giving news outlets more power in the relationship.

Importantly, AI tools are easy enough to use that even the smallest newsrooms can benefit, Roy said. And used correctly, AI could benefit a perennially besieged industry.

“AI is a tool,” she said. “It’s only a tool. But it’s a tool to help us do more work and reach more audiences, and those are the two things we need to do.”
