2026 11th International Conference on Control and Robotics Engineering (ICCRE 2026)
Kyoto, Japan | May 8-10, 2026
ICCRE 2026 Invited Speakers
Assoc. Prof. Yidi Li
Taiyuan University of Technology, China
Biography: Yidi Li is currently an Associate Professor at the School of Computer Science and Technology and Director of the Multimodal Intelligent Human-Robot Interaction Laboratory, Taiyuan University of Technology, China. She received her PhD in Computer Technology from Peking University, China, and is currently engaged in postdoctoral research at Osaka University, Japan. Her research has achieved significant results in the integration of computer vision and computer audition. She was named a 2023 ACM China Rising Star (Taiyuan Chapter) by the Association for Computing Machinery, and received the IEEE SWC Best Paper Award and the IEEE Distinguished Service Award. She serves as Publicity Chair of CW 2025, as well as Session Chair of CWSN 2024, ICIC 2025, and ACAIT 2025. Her research interests focus on multimodal embodied perception and human-robot interaction technology, aiming to enhance robots' robust environmental perception and autonomous decision-making capabilities.
Title of Speech: Synergizing Sound and Sight: Robust Audio-Visual Learning for Robotic Multimodal Perception
Abstract: Perception is the foundation of embodied intelligence, serving as the bridge between the physical world and machine decision-making. However, in complex and non-ideal environments, vision-only approaches are limited by restricted field of view, occlusion, and sensitivity to lighting conditions. Audio-visual fusion leverages the complementarity and redundancy of multimodal information, significantly improving robustness in challenging scenarios. This talk presents a novel audio-visual learning paradigm structured as “low-level alignment - mid-level representation - high-level robustness.” We first explore learning from association to bridge the semantic gap between heterogeneous modalities via unified modeling and contrastive learning. We then study learning in space, using bird’s-eye-view representation and point–voxel interaction to achieve spatial consistency and fine-grained geometric modeling. Finally, we address learning under challenges through multimodal distillation and vision-guided decoupled inference, enabling robust perception under limited resources and incomplete modalities. This work provides a unified perspective integrating association, spatial understanding, and robustness for building reliable multimodal perception systems for embodied intelligence.
Dr. Wenyu Liang
Institute for Infocomm Research, A*STAR, Singapore
Biography: LIANG Wenyu is currently a Senior Scientist at the Institute for Infocomm Research, A*STAR, Singapore, and an Adjunct Assistant Professor in the Department of Electrical and Computer Engineering, National University of Singapore. He has authored or co-authored over 100 publications, including 2 books, 2 book chapters, and more than 40 journal papers. He has also received many innovation-related awards, including the AAAI 2021 Best Demonstration Award, KUKA Innovation Award 2021 Finalist, and the IES Prestigious Engineering Achievement Awards 2016: Young Creators. His research interests mainly include robotics, intelligent systems, dexterous manipulation, motion control, and force control.
Title of Speech: Giving Robots the Sense of Touch for Intelligent Robotic Manipulation
Abstract: The sense of touch enables humans to perform delicate interaction tasks with objects of varying geometry, texture, and stiffness, even in vision-denied or occluded environments. With the advancement of robotics and automation technologies, robots have become key enablers in modern industry: they accelerate processes, improve accuracy, reduce costs, and minimize human exposure to hazardous environments. Yet most industrial robots today still rely heavily on vision or pre-programmed trajectories, lacking the sense of touch that humans naturally employ. To achieve similar capabilities in dexterous robotic manipulation, tactile information is essential for safe, accurate, and adaptive interaction with objects and surfaces. This talk first gives an introduction to tactile sensing and then presents various applications of tactile feedback in robotic manipulation.