Job description
PhD project

Background. Older adults (60+ years) often have lower digital skills (Bhattacharjee et al., 2020). They are less accurate than younger adults in discriminating manipulated images from real ones (Nightingale et al., 2022), and they are more vulnerable to fake news (Moore & Hancock, 2022) and online scams (Fraud on the Elderly, 2013). Several social and cognitive factors affect older adults' vulnerability, including (i) being too trusting, (ii) social isolation, (iii) psychological vulnerability and (iv) risk taking (Shao et al., 2019). Interventions have been developed to help older adults detect fake news and manipulated images (Moore & Hancock, 2022; Nightingale et al., 2022). Recent technological developments, such as AI, deepfakes, and immersive virtual and augmented realities, may leave older adults more vulnerable than younger adults to deception and manipulation using these technologies.

Methodology. The PhD will consist of multiple studies, each examining a specific technological context.

The first study will focus on disinformation and AI-generated images. It will investigate the importance of cognitive factors (including risk taking, sensory deficits and experience with the technology) and social factors (e.g. ethnicity, social networks, social isolation, trust) in accepting disinformation and AI-generated images. It will also study the efficacy of interventions (Nightingale et al., 2022) that are intended to help people detect disinformation and image manipulation.

The second study will focus on the detection of manipulated video, covering both AI-generated "deepfakes" and simpler manipulations, such as false subtitles. The study will investigate participants' familiarity with these manipulations and the factors that predict their ability to detect them (including those examined in the first study), and will test the efficacy of Nightingale et al.'s (2022) intervention in helping participants detect them.
The third study will focus on participants' attitudes to immersive technologies and manipulated content. Do older people perceive immersive technology as 'real', and what would they consider 'fake' or 'manipulated' in such virtual or augmented spaces?

Supervisors

From Psychology: Dr Lara Warmelink (Director of Studies), Dr Sophie Nightingale, Prof Trevor Crawford
From Health Research: Dr Faraz Ahmed

Candidate requirements

Candidates should have (a) a good undergraduate degree in Psychology, Computer Science, Communication or a cognate discipline, and (b) experience with data collection and/or data analysis. Desirable: experience of working with older adults.

For more information, please contact Dr Lara Warmelink at l.warmelink@lancaster.ac.uk, Department of Psychology, Lancaster University.