Digital Ageism: Challenges and Opportunities in Artificial Intelligence for Older Adults

Charlene H. Chu, Rune Nyrup, Kathleen Leslie, Jiamin Shi, Andria Bianchi, Alexandra Lyn, Molly McNicholl, Shehroz Khan, Samira Rahimi, Amanda Grenier

    Research output: Contribution to journal › Review article › peer-review

    44 Citations (Scopus)

    Abstract

    Artificial intelligence (AI) and machine learning are changing our world through their impact on sectors including health care, education, employment, finance, and law. AI systems are developed using data that reflect the implicit and explicit biases of society, and there are significant concerns about how the predictive models in AI systems amplify inequity, privilege, and power in society. The widespread applications of AI have led to mainstream discourse about how AI systems are perpetuating racism, sexism, and classism; yet, concerns about ageism have been largely absent in the AI bias literature. Given the globally aging population and proliferation of AI, there is a need to critically examine the presence of age-related bias in AI systems. This forum article discusses ageism in AI systems and introduces a conceptual model that outlines intersecting pathways of technology development that can produce and reinforce digital ageism in AI systems. We also describe the broader ethical and legal implications and considerations for future directions in digital ageism research to advance knowledge in the field and deepen our understanding of how ageism in AI is fostered by broader cycles of injustice.

    Original language: English
    Pages (from-to): 947-955
    Number of pages: 9
    Journal: The Gerontologist
    Volume: 62
    Issue number: 7
    DOIs
    Publication status: Published - 1 Sep. 2022

    Keywords

    • Bias
    • Gerontology
    • Machine learning
    • Technology
