I've been running an experiment. I wanted to see if AI could generate opinion articles that, while written by AI, capture my personality and perspectives. My AI Daily News site was initially just a way for me to aggregate news stories about AI into something I could digest in the morning before starting work.
Later I thought I would provide it with a range of my prior writing and get it to prepare an 'Opinion' with my name on the byline. Would it produce something plausibly by me, presenting my views, but on the news of the day?
Sadly, I think there has been a fundamental change from the early days of OpenAI models, when the results were creative, unpredictable and entertaining. Now they have been trained in such a way that they produce the same bland writing style regardless of the instructions you provide.
I got into the habit of waking up each morning, reading the 'opinion' and coaching Claude to rework it. Why? Because it would write opinions that conflicted with my own documented views in fundamental ways. It would include the terms I have used, but it had not internalized the concepts. So each day I would need to correct it.
Multiple news organisations have banned the use of AI in journalism, and now I have first-hand experience of why. It isn't just opinion pieces, however; the stories it writes also have opinions injected beyond the facts. At least with the stories I have always linked to the original source material at the bottom, which for most stories means at least two sources.
I am not doing journalism by any measure. Journalism means doing the research, doing the interviews, cross-referencing, and creating a cohesive angle for the article. Journalism isn't unbiased, in that it is influenced by the point of view of the writer, but journalistic integrity still means something.
Does this mean therefore that AI can't play a part in journalism?
My experience with AI in software has parallels. AI will happily generate code which passes 'tests' by altering the tests themselves, in effect changing the conditions of success to please the user. AI does work in software development, but only when you have a framework which prevents this kind of gaming.
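One such discipline can be sketched in a few lines: record a fingerprint of the test suite before handing a task to the AI, and refuse to trust a "green" run if the tests themselves were edited. This is a minimal illustration, not a real tool; the file paths and function names are hypothetical.

```python
# Minimal sketch: detect when an AI "fixed" failing tests by editing the
# tests, i.e. by changing the conditions of success. Names are illustrative.
import hashlib
from pathlib import Path

def fingerprint(test_file: Path) -> str:
    # Hash the test file's bytes; any edit to it changes the digest.
    return hashlib.sha256(test_file.read_bytes()).hexdigest()

def results_trustworthy(test_file: Path, baseline: str) -> bool:
    # A passing run only counts if the success conditions are unchanged.
    return fingerprint(test_file) == baseline
```

In practice the same idea shows up as protected branches, read-only test directories, or CI that diffs the test files against the reviewed baseline before accepting a result.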
People have lost trust in journalism, partly as a result of AI slop, the kind of text which doesn't differentiate between fact and fantasy. There is also the angst of journalists who, fearing for their jobs, resist AI and minimize its utility. There is a temptation to cite ethics as a reason not to use AI when the real motivation is fear of being replaced.
The answer, I think, will be to apply the same disciplines to AI that apply to human journalists: checking facts and resisting the temptation to opine, while still creating compelling, entertaining and informative articles.
In my software development, AI has become a partner, but not a replacement. It still needs me to apply that discipline to get good results. Just like software, journalism could benefit from AI, but only with stringent disciplines around how it functions.
AI journalism needs to be more than just a way of ripping off the work of actual journalists; it needs to engage with the real world and be held to the same standards of accuracy. How AI will impact jobs is a larger issue, but it should not be confused with the utility of AI.