A recent comprehensive study conducted by Richard Fletcher and Rasmus Kleis Nielsen, published by the Reuters Institute for the Study of Journalism, delves into the public’s awareness, usage, and opinions of generative AI, particularly within the realm of news media. This study, which spans six countries—Argentina, Denmark, France, Japan, the UK, and the USA—reveals a complex landscape of mixed reactions toward the integration of AI in journalism.
One of the study’s key findings is the varying levels of awareness and usage of generative AI tools among the public. ChatGPT emerges as the most recognized generative AI tool, with around 50% of respondents aware of its existence. Despite this high level of awareness, frequent usage remains surprisingly low. Across the surveyed countries, only 1–7% of respondents report using such tools daily, and a significant portion have interacted with them only once or twice. This indicates that while generative AI tools are widely known, they are not yet embedded in many people’s routine internet use.
The study also uncovers diverse expectations regarding the impact of generative AI on individuals’ lives and broader society. Younger respondents, in particular, are significantly more likely than older individuals to expect generative AI to have a major impact on their daily lives. This generational difference underscores varying degrees of optimism about, and adaptability to, new technologies.
Interestingly, there is a mixed outlook on whether generative AI will improve or worsen life. In four of the six surveyed countries, a notable plurality believes that generative AI will enhance their lives. However, there is more skepticism about its broader societal impact. While people are generally optimistic about AI’s potential to benefit science, healthcare, and daily activities, they are pessimistic about its effects on the cost of living, job security, and the quality of news.
When it comes to journalism, respondents believe that news organizations already employ AI for various editorial tasks. These include editing spelling and grammar (43%), writing headlines (29%), and even drafting entire articles (27%). Moreover, a significant portion of respondents (32%) believe that a human editor reviews AI-generated content before it is published, suggesting a perceived blend of human and machine collaboration in current journalistic practice.
Despite the integration of AI in news production, there remains a clear preference for news content produced by human journalists. Respondents express more comfort with AI-generated content in softer news areas, such as fashion and sports, as opposed to hard news topics like international affairs and politics. This preference suggests that while AI can augment certain aspects of journalism, it has yet to gain full trust in areas requiring nuanced analysis and critical thinking.
Transparency in the use of AI in journalism is a significant concern among respondents. The majority advocate clear labeling of AI’s role in news production, with only 5% believing that no disclosure is necessary. Approximately one-third to half of respondents say that AI’s involvement in tasks such as editing and writing should be explicitly labeled. This demand for transparency is crucial to maintaining public trust and upholding ethical standards in news media.
The study by Fletcher and Nielsen offers a nuanced understanding of public sentiment toward generative AI in journalism. While there is considerable awareness and cautious optimism about the potential benefits of AI, concerns about trustworthiness, transparency, and societal impact persist. These findings highlight the importance for media organizations to balance technological innovation with human oversight and clear communication to build and maintain public trust.
As the integration of AI in journalism continues to evolve, these insights provide valuable guidance for media professionals. By addressing the public’s concerns and expectations, the industry can harness the advantages of AI while safeguarding the core values of trustworthy and transparent journalism.