The artificial intelligence platform ChatGPT shows a significant and systemic left-wing bias, according to a new study by the University of East Anglia (UEA).
A team of researchers from the UK and Brazil developed a rigorous new method to check for political bias.
Published today in the journal Public Choice, the findings show that ChatGPT’s responses favour the Democrats in the US, the Labour Party in the UK, and, in Brazil, President Lula da Silva of the Workers’ Party.
Concerns about an inbuilt political bias in ChatGPT have been raised previously, but this is the first large-scale study to use a consistent, evidence-based analysis.
Lead author Dr Fabio Motoki, of Norwich Business School at the University of East Anglia, said: “With the growing use by the public of AI-powered systems to find out facts and create new content, it is important that the output of popular platforms such as ChatGPT is as impartial as possible.
“The presence of political bias can influence user views and has potential implications for political and electoral processes.
“Our findings reinforce concerns that AI systems could replicate, or even amplify, existing challenges