Research / Insights on current and emerging industry topics
What journalists really think about AI use in newsrooms
As AI becomes standard across UK newsrooms, journalists report growing reliance on automation alongside deep anxiety about trust, accuracy, and originality, which highlights the need for clearer standards, stronger training, and responsible integration.
December 9, 2025 | By Rande Price, Research VP – DCN
AI’s influence on journalism is no longer theoretical; it’s unfolding inside newsrooms right now. A new Reuters Institute study of 1,004 UK journalists captures this shift. It reveals how they use AI, how newsrooms structure that use, and what they fear or expect as adoption grows. The findings point to real momentum, uneven strategies, and rising concerns that will shape what comes next.
AI use grows across core newsroom tasks
UK newsrooms now treat AI as standard equipment. More than half of UK journalists use AI at least weekly, and more than a quarter use it daily. Language-processing tasks dominate: 49% of journalists use AI for transcription or captioning at least monthly, and many rely on it weekly or daily. Translation follows at 33%, and grammar checking or copy-editing at 30%.
At the same time, journalists are starting to turn to AI for core reporting work. Twenty-two percent use AI for story research at least monthly. Sixteen percent use it for idea generation or headline drafting. Twelve percent use it for fact-checking or source assessment, and 10% say they generate first drafts of articles with AI tools. This data reveals AI’s shift from the margins of newsroom work into its editorial center.
AI habits vary by newsroom role and reporting focus
The report also highlights important differences in adoption patterns. Journalists under 30 lead weekly use at 42%, compared with 29% among those aged 50 and older. Men use AI more often than women, though the gap narrows at lower frequency levels. Seniority matters as well. Journalists with strategic authority use AI at significantly higher rates than those with limited editorial responsibility.
Beat assignments also influence use. Business reporters show the highest levels of weekly use at 43%, compared with only 21% among lifestyle reporters. Data-heavy and time-sensitive beats push more experimentation.
Format demands also drive adoption. Journalists who work across many formats turn to AI more often: 62% of those working in at least five formats use AI monthly or more, compared with 48% of those working in only one. As newsrooms stretch reporters across text, audio, video, social posts, and visuals, journalists increasingly use AI to fill gaps and work more efficiently.
Journalists hold deep concerns about ethics and quality
Even as usage grows, journalists express deep unease about AI’s impact on journalism. Sixty percent say they are extremely concerned about negative effects on public trust. Fifty-seven percent express the same level of concern about harm to accuracy, and 54% worry about the impact on originality.
Concern spans almost all demographics. Journalists with higher AI knowledge tend to express stronger concern. Daily AI users express less anxiety, likely because familiarity breeds confidence. Still, the level of concern across the profession underscores the need for clearer standards and newsroom transparency.
Responsible AI adoption in newsrooms
The report points to several considerations for responsible and effective AI adoption:
- Create clear protocols. Many newsrooms still lack guidance on transparency, bias, fairness, and appropriate use. Staff need clarity about when and how to use AI and how to validate its output.
- Invest in training. Only 32% of journalists say their newsroom provides AI training. As more reporters adopt AI tools, they may need structured support and instruction in verification, accountability, and oversight.
- Keep humans in control. Many journalists say AI increases low-level work such as data cleaning and output checking. Newsrooms should examine workflows carefully to ensure AI reduces, rather than adds to, workload.
- Protect trust. Audience trust remains the biggest concern. Newsrooms need clear transparency practices and should explain how and why they use AI in reporting.
- Match tools to newsroom goals. Different organizations have different missions, capacities, and audiences. A large broadcaster may build in-house tools, while a small digital outlet may rely on third-party services. Responsible adoption means aligning these choices with editorial values and available resources.
AI adoption continues to expand across UK newsrooms, with journalists relying on it for tasks ranging from transcription and research to headline drafting. Yet many organizations are still in the early stages of true integration. Moving forward in the most positive direction will require clear guidance, sustained training, and a commitment to transparency. Newsrooms that prioritize ethics and trust as they adopt AI will be best positioned to capture its benefits while minimizing its risks as the technology evolves.