AI tools used by English councils downplay women’s health issues, study finds
Exclusive: LSE research finds risk of gender bias in care decisions made based on AI summaries of case notes

Artificial intelligence tools used by more than half of England's councils are downplaying women's physical and mental health issues and risk creating gender bias in care decisions, research has found.

The study found that when Google's AI tool Gemma was used to generate and summarise the same case notes, language such as "disabled", "unable" and "complex" appeared significantly more often in descriptions of men than of women.