King's College London NLP
Projects
    EMBRACE: Enhanced Maternal and Baby Results with AI-supported Care and Empowerment
    Lead of the Digital and AI Innovation Theme. A major £35M research programme led by Prof. Josip Car, funded by Inkfish (2025-2031).
    AI-enabled High-Stakes Assessment
    Developing an AI-enabled high-stakes assessment tool in collaboration with AQA. Funded by the EPSRC (2025-2028).
    Personalised AI Feedback for Secondary School Science
    Developing a personalised assessment and feedback tool for secondary school science. Funded by the Department for Education, UK (Dec 2024 – Apr 2025).
    Elandi: Trustworthy Generative AI for Learning
    Trustworthy generative AI for affordable personalised learning and development. Funded by Innovate UK under the Accelerating Trustworthy AI: Phase 2 Collaborative R&D call, led by AI for Global Goals (2024-2025).
    Event-Centric Framework for Natural Language Understanding
    The five-year UKRI-funded Turing AI Fellowship awarded to Yulan He aims to develop a machine reading comprehension model in which a computer continuously builds and updates a graph of eventualities as reading progresses.
    New Language Modelling
    Lin Gui and Yulan He have been awarded a prestigious EPSRC New Horizons grant for a high-risk research project with potentially transformative impact. The project aims to develop a new language modelling method that allows a more faithful and explainable approximation of the input text.
    Automated Scoring System for GCSE Science Exams
    Funded by AQA, the project aims to develop an automated scoring system for assessing students’ answers to descriptive questions in GCSE Biology and Chemistry. The system is expected to predict marks and generate rationales explaining the model’s decisions.
    Character-Centric Narrative Understanding
    The EPSRC ICASE project, jointly funded by Huawei London Research Centre, aims to develop new AI algorithms for automatic understanding of narratives in novels.
    Model Interpretability
    In our EPSRC-funded project, “Twenty20Insight”, we investigate explainable AI (XAI) approaches that can provide interpretations which are both faithful to model decisions and better understood by humans.
    PANACEA: PANdemic Ai Claim vEracity Assessment
    Led by Yulan He, the EPSRC-funded PANACEA project developed novel supervised and unsupervised methods for assessing the veracity of claims unverified at the time of posting, by integrating information from multiple sources and building a knowledge network that enables cross-verification.
© 2026 Copyright: KCL NLP Group