Robotics60: Presenters
Professor Yang Gao
AI Robotics for Sustainable Space Exploration
Professor Yang Gao is a Professor of Robotics and heads the Centre for Robotics Research within the Department of Engineering at King's College London. She is involved in the design and development of real-world space missions as well as industry applications. Yang's work has been applied to several non-space sectors, including nuclear, utilities and agriculture, through technology transfer and spin-offs.
Prior to joining King's in August 2023, Yang spent nearly 19 years at the University of Surrey where she founded and led the multi-award winning STAR LAB (Space Technology for Autonomous & Robotic systems Laboratory). Before that, she was an awardee of the prestigious Singapore Millennium Foundation Postdoctoral Fellowship and worked on intelligent and autonomous vehicles.
Website: https://nmes.kcl.ac.uk/yang.gao/
Prof Dimitrios Kanoulas
Legged Robots - Advancements over the past 60 Years
Prof Dimitrios Kanoulas has been Professor in Robotics & AI in the Department of Computer Science at University College London (UCL) since September 2019. He works on the cognitive side of legged robots.
In his talk, Dimitrios will give an overview of the advancements in legged robot navigation, starting from Prof Meredith Thring’s work at QMUL and continuing to today’s “madness”.
Website: https://dkanou.github.io/
Dr Hareesh Godaba
Soft material technologies for bioinspired robots
Dr Hareesh Godaba is a Lecturer in the Department of Engineering and Design and a PI in the Centre for Robotics and Sensing Technologies (CROSS-Tech) at the University of Sussex. Until November 2020, he worked as a postdoctoral researcher at the Centre for Advanced Robotics @ Queen Mary (ARQ), Queen Mary University of London (QMUL), on the National Centre for Nuclear Robotics (NCNR) project, developing sensing and actuation technologies for extreme environments.
In his talk, Hareesh will share his experiences in investigating and developing fundamental soft material technologies. He will discuss the development of electrically activated soft artificial muscles with intrinsic force and displacement sensing capabilities, and show his most recent work on realising high-resolution, large-area tactile sensing based on luminescent materials to enable intuitive physical human-robot collaboration.
Website: www.hareeshgodaba.com
Dr Oya Celiktutan
Novel Approaches for Socially Acceptable Robots to Interact with Humans
Dr Oya Celiktutan is a Reader at the Centre for Robotics Research in the Department of Engineering and leads the Social AI & Robotics Laboratory. She received a BSc degree in Electronics Engineering from Uludag University, and MSc and PhD degrees in Electrical and Electronics Engineering from Bogazici University, Turkey. During her doctoral studies, she was a visiting researcher at the National Institute of Applied Sciences of Lyon, France. After completing her PhD, she moved to the United Kingdom and worked on several projects as a postdoctoral researcher at Queen Mary University of London, the University of Cambridge, and Imperial College London. In 2018, she joined King’s College London. Dr Celiktutan’s research focuses on machine learning for perception, human behaviour understanding, human-robot interaction, robot navigation, and manipulation. Her work has been supported by EPSRC, The Royal Society, and the EU Horizon programme, as well as through industrial collaborations. She received the EPSRC New Investigator Award in 2020. Her team’s research has been recognised with several awards, including the Best Paper Award at IEEE RO-MAN 2022, the NVIDIA CCS Best Student Paper Award Runner-Up at IEEE FG 2021, and the First Place Award and Honourable Mention Award at the ICCV UDIVA Challenge 2021.
As robots develop autonomy and integrate into daily environments to assist and collaborate with humans, they should not only perform tasks successfully but also adhere to social norms and expectations. In her talk, Oya will give an overview of definitions and key concepts as well as the challenges involved in building socially acceptable robots. She will draw from her ongoing research to present examples, including how robots can navigate crowded environments with social awareness, how they can explain their actions to users, and how they can learn to imitate human behaviours.
Website: https://nms.kcl.ac.uk/oya.celiktutan/
Dr Ildar Farkhatdinov
Human Augmentation and Healthcare Applications of Interactive Robotics
Dr Ildar Farkhatdinov is a Senior Lecturer in Healthcare Engineering (Robotics and Mechatronics) at King's College London. He is an internationally leading expert in assistive robotics and human-machine interaction, with applications to rehabilitation, neuroscience and immersive environments. He is a principal investigator of several projects on wearable robotics, mobility assistance and haptic interfaces. Several of his research works were recognised as best papers or best paper finalists at leading robotics conferences. Before joining King's, he was an academic at Queen Mary University of London (2016-24) and a postdoctoral research associate in the Human Robotics group of the Department of Bioengineering, Imperial College London (2013-16). He earned a PhD in Robotics in 2013 (Sorbonne University, UPMC, France), an MSc in Mechanical Engineering in 2008 (KoreaTech, South Korea) and a BSc in Automation and Control in 2006 (Moscow University, Russia).
In his talk, Ildar will review research conducted by the HAIR (Human Augmentation and Interactive Robotics) group at the Centre for Advanced Robotics @ Queen Mary and its importance for people from the point of view of both research and education.
Website: https://www.kcl.ac.uk/people/ildar-farkhatdinov
Dr Letizia Gionfrida
Empowering Movement: Vision-Based Detection, Simulation, and Assistance for Mobility Enhancement
Dr Letizia Gionfrida is a Lecturer in Computer Vision in the Department of Informatics at King's College London, where she is also affiliated with the School of Biomedical Engineering and Imaging Sciences. At King's, she leads the Vision in Human Robotics Lab, which focuses on developing intelligent algorithms to assist individuals with mobility disorders. Since joining King’s, she has contributed to several research projects. Her work includes a research fellowship from the Royal Academy of Engineering to develop adaptive control for soft exoskeletons in hand rehabilitation for post-stroke patients. She is leading and participating in projects on vision-enhanced control for affordable prostheses (funded by the EPSRC IAA), vision algorithms in hand exoskeletons for rheumatoid arthritis patients (funded by the NIH-R21) and pose estimation for post-stroke rehabilitation (funded by the Private Physiotherapy Educational Foundation). Prior to joining King's, she was a postdoctoral research fellow at Harvard University, where she worked with Professors Howe and Walsh in the field of computer vision for wearable robotics. She continues to maintain an honorary affiliation with Harvard University. She earned her PhD in Bioengineering from Imperial College London, supervised by Professor Anil Bharath, working on hand pose estimation. During her doctoral studies, she co-founded Arthronica Ltd., a company developing a diagnostic platform for immune-mediated inflammatory disorders using pose estimation from monocular cameras. In March 2022, Arthronica was acquired by Procedure Health Limited. She also serves as Associate Co-Chair of the IEEE Robotics & Automation Society's Technical Committee on Computer and Robot Vision, Associate Editor of the IEEE RAS International Conference on Robotics and Automation, and is an active member of Black in Robotics.
In her talk, Letizia will explore the intersection of vision-based analysis, physics-based modelling, and assistive technology for aiding and rehabilitating individuals with mobility challenges. Letizia will discuss algorithms developed in her lab that analyse kinematics and kinetics to infer human movement patterns. She will present her lab's work on physics-based musculoskeletal simulations derived from imitation learning, showcasing how video-based human motion data can be translated into accurate biomechanical models to enable the design of adaptable and efficient controllers for wearable robots. Lastly, she will introduce her lab's work on adaptable controllers for soft exoskeleton technologies that offer personalised support to enhance mobility and overall quality of life.
Website: https://sites.google.com/view/gionfrida/home
Prof Christos Bergeles
Micro-surgical robotics for interventions in the operating theatre of the future
Prof Christos Bergeles received his Ph.D. degree in Robotics from ETH Zurich, Switzerland, in 2011. He was a postdoctoral research fellow at Boston Children’s Hospital, Harvard Medical School, Massachusetts, and the Hamlyn Centre for Robotic Surgery, Imperial College, United Kingdom. As a Professor at King’s College London, he directs the “Robotics and Vision in Medicine Lab” whose mission is to develop micro-surgical robots that deliver regenerative therapies deep inside the human body.
This talk will introduce SoftReach, a therapy delivery robot that aims to revolutionize the treatment of neurodegenerative diseases through image-guided, minimally invasive technology. SoftReach comprises a sensorised, tip-growing flexible robotic endoscope, equipped with a μl-sized tissue-construct payload, that will be guided through the spine up to the brain ventricles. There, the robot will deliver promising cell therapies, gene therapies, or brain-computer interfaces, with a primary interest in targeting adult hippocampal neurogenesis as a novel pharmacological approach.
Website: https://www.kcl.ac.uk/people/christos-bergeles-1
Prof Yiannis Demiris
Plenary Talk - Robots at life’s slopes: challenges in assisting people
Yiannis Demiris is a Professor in Human-Centred Robotics at Imperial College London, where he holds a Royal Academy of Engineering Chair in Emerging Technologies (Personal Assistive Robotics) and directs the Personal Robotics Laboratory. His research interests revolve around the design, implementation and evaluation of interactive, human-in-the-loop systems. Of particular interest are fundamental algorithms for human state perception, multi-scale user modelling, and adaptive cognitive control architectures for determining how intelligent robots can generate personalised assistance to humans in order to improve their physical, cognitive and social wellbeing.
Humans are remarkable systems, possessing impressive perceptual, emotional, cognitive and motor capabilities. Yet, sooner or later, we all experience slopes in our lives, whether due to disabilities, accidents, or life’s inevitable ageing processes. In this plenary talk, Yiannis will present theoretical and experimental work supporting his long-held belief that human-centred robotics can provide a vital supporting hand to assist us in our times of need. He will provide research examples of robots assisting in activities of daily living (ADL), including dressing, bathing, feeding, and mobility. Yiannis will also touch upon human-specific technical challenges for roboticists, including privacy, trust and the need for explainable, proactive robotic systems.
Websites: https://profiles.imperial.ac.uk/y.demiris; https://www.imperial.ac.uk/personal-robotics