University of Washington


With just a few messages, biased AI chatbots swayed people’s political views

UW researchers, including Jillian Fisher and Katharina Reinecke, recruited self-identified Democrats and Republicans to make political decisions with help from three versions of ChatGPT: a base model, one with a liberal bias and one with a conservative bias. Participants from both parties were more likely to lean in the direction of the biased chatbot they talked with than were participants who interacted with the base model.

KUOW, KNKX and GeekWire published related stories.




© 2025 University of Washington | Seattle, WA