ChatGPT Displays a 'Significant, Systematic' Liberal Bias – 'Danger of Influencing Election Results,' Say Researchers

An artificial intelligence program that’s taken the world by storm could contain a dangerous quirk.

The program ChatGPT has a liberal bias, according to a new study from academics at the University of East Anglia in the United Kingdom.

Text generated by the program is prone to “significant and systematic political bias toward the Democrats in the U.S., [Luiz Inácio Lula da Silva] in Brazil, and the Labour Party in the U.K.,” the study found.

Users of the program have pointed out that similar prompts for differing political figures are likely to produce wildly different answers from ChatGPT.

One author of the study is even worried that ChatGPT’s bias could influence elections.

“There’s a danger of eroding public trust or maybe even influencing election results,” University of East Anglia lecturer Fabio Motoki, one of the study’s authors, told The Washington Post of the program’s biases.

OpenAI, the developer of the program, said earlier this year that a human-directed review process was underway to address potentially biased political content from ChatGPT.

Human reviewers of ChatGPT are directed not to favor any political group in guiding the program’s answers, according to OpenAI.

“Biases that nevertheless may emerge from the [review process] are bugs, not features,” the laboratory stated in a February blog post.

In a March blog post, Google acknowledged that its own AI chatbot, Bard, can also exhibit biases.

The company confirmed that the program could be susceptible to the same biases that affect many humans.

“Because they learn from a wide range of information that reflects real-world biases and stereotypes, those sometimes show up in their outputs,” the company wrote.

Motoki reiterated some of the dangers posed by biased artificial intelligence in remarks provided to Cyber News.

“The presence of political bias can influence user views and has potential implications for political and electoral processes,” he said.

“Our findings reinforce concerns that AI systems could replicate, or even amplify, existing challenges posed by the internet and social media.”