ChatGPT’s Strong Left-Wing Political Bias Unmasked by New Study

Source: SciTechDaily
The artificial intelligence platform ChatGPT shows a significant and systemic left-wing bias, favoring the Democrats in the US, the UK's Labour Party, and Brazil's President Lula da Silva, according to a new study by the University of East Anglia.

The study highlights the importance of neutrality in AI systems, since biased outputs could influence user perspectives and broader political dynamics.

The team of researchers in the UK and Brazil developed a rigorous new method to check for political bias. The findings show that ChatGPT's responses favor the Democrats in the US, the Labour Party in the UK, and, in Brazil, President Lula da Silva of the Workers' Party. Concerns about an inbuilt political bias in ChatGPT have been raised before, but this is the first large-scale study to use a consistent, evidence-based analysis.

“Our findings reinforce concerns that AI systems could replicate, or even amplify, existing challenges posed by the Internet and social media.”

The researchers developed an innovative method to test ChatGPT's political neutrality. To overcome difficulties caused by the inherent randomness of the 'large language models' that power AI platforms such as ChatGPT, each question was asked 100 times and the different responses were collected. These multiple responses were then put through a 1,000-repetition 'bootstrap' (a statistical resampling procedure) to further increase the reliability of the inferences drawn from the generated text.
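The repeat-and-bootstrap step described above can be sketched in a few lines of Python. This is only an illustration of the general technique, not the study's actual code: the binary "agreed with a given political position" scoring and the 63-of-100 split are invented for the example.

```python
import random
import statistics

random.seed(42)

# Hypothetical data: one question asked 100 times, each response scored 1 if it
# agreed with a given political position and 0 otherwise. Repeating the question
# averages out the inherent randomness of the model's answers.
responses = [1] * 63 + [0] * 37  # e.g. 63 of 100 responses agreed (illustrative)

def bootstrap_means(sample, n_reps=1000):
    """Resample with replacement n_reps times; return the mean of each resample."""
    means = []
    for _ in range(n_reps):
        resample = random.choices(sample, k=len(sample))
        means.append(sum(resample) / len(resample))
    return means

means = sorted(bootstrap_means(responses))

# 95% percentile confidence interval for the agreement rate
lo, hi = means[int(0.025 * len(means))], means[int(0.975 * len(means))]
print(f"point estimate: {statistics.mean(responses):.2f}, 95% CI: [{lo:.2f}, {hi:.2f}]")
```

The bootstrap's appeal here is that it needs no distributional assumptions about the model's responses: the spread of the 1,000 resampled means directly quantifies how much the observed agreement rate could vary by chance.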

The unique new analysis tool created by the project would be freely available and relatively simple for members of the public to use, thereby “democratizing oversight,” said Dr. Motoki. As well as checking for political bias, the tool can be used to measure other types of bias in ChatGPT's responses.

While the research project did not set out to determine the reasons for the political bias, the findings did point toward two potential sources.
