At the College of Design and Engineering, our research in Advanced Robotics develops the mathematical foundations, computational frameworks, and software and hardware platforms that enable robots to perceive, reason about, and act on the physical world, and to communicate and interact with humans.
The focus is on human-robot collaborative systems, in which robotics serves as the ultimate tool for humans. Advances in sensors and actuators, microelectronics, and computing enable a data-driven approach to robotics and will create a new generation of robots that learn and adapt to human needs.
At CDE, we bring together experts from disciplines that span the Departments of Biomedical Engineering, Electrical & Computer Engineering, and Mechanical Engineering.
To advance robotics science and engineering, and its impact on our daily lives, a common platform for interdisciplinary research was realised through the Advanced Robotics Centre (ARC), jointly supported with the NUS School of Computing. The ARC partners with government agencies and industry to create a clear link between scientific contributions and translational research towards real-world applications.
Currently, there are six main research themes:
- Embodied Perception
- Human-Robot Interfaces
- Model Learning and Planning
- Planning and Control
- Intelligent Mechanics
- Electronics and Communications
These research efforts target four application drivers: Productivity, Defence, Healthcare, and Marine and Offshore. In these unstructured, human environments, robotics can provide immense benefits.
Embodied Perception
Embodied perception aims to connect sensing with planning and control, so that perceptual understanding of human intentions and environments is situated in the context of effective human-robot interaction and collaboration. Research topics span multiple levels of the perception hierarchy, from low-level sensor design and active sensing to the extraction of high-level semantic features such as speaker identification and intention recognition.
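To make the idea of active, intention-aware sensing concrete, the sketch below is a minimal illustration (the intention labels, sensing actions, and observation probabilities are invented for the example and are not the centre's systems): the robot maintains a belief over a human's intention and greedily picks the sensing action expected to reduce the entropy of that belief the most.

```python
import math

# Hypothetical intention labels and sensing actions (illustrative only).
INTENTIONS = ["handover", "point", "idle"]
ACTIONS = ["look_at_hands", "look_at_face", "listen"]

# Toy observation model: p("cue" | intention, action); p("no_cue") is the complement.
OBS_MODEL = {
    "look_at_hands": {"handover": 0.8, "point": 0.6, "idle": 0.1},
    "look_at_face":  {"handover": 0.5, "point": 0.7, "idle": 0.2},
    "listen":        {"handover": 0.4, "point": 0.3, "idle": 0.1},
}


def entropy(belief):
    return -sum(p * math.log(p) for p in belief.values() if p > 0)


def posterior(belief, action, observed_cue):
    """Bayes update of the intention belief after one sensing action."""
    post = {}
    for intent, prior in belief.items():
        p_cue = OBS_MODEL[action][intent]
        post[intent] = prior * (p_cue if observed_cue else 1.0 - p_cue)
    z = sum(post.values())
    return {k: v / z for k, v in post.items()}


def expected_entropy(belief, action):
    """Expected posterior entropy if the robot executes the given sensing action."""
    p_cue = sum(belief[i] * OBS_MODEL[action][i] for i in belief)
    h = 0.0
    for cue, p_obs in ((True, p_cue), (False, 1.0 - p_cue)):
        if p_obs > 0:
            h += p_obs * entropy(posterior(belief, action, cue))
    return h


belief = {i: 1.0 / len(INTENTIONS) for i in INTENTIONS}
best = min(ACTIONS, key=lambda a: expected_entropy(belief, a))
print("next sensing action:", best)
```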
Human-Robot Interfaces
Human-robot interfaces address novel interaction modalities, including immersive, salient displays for humans. We combine spoken dialogue management with other interaction modalities, such as gestures and facial expressions, to create an integrated system that deciphers user intentions more effectively than any single modality alone.
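As one simple illustration of combining modalities, the sketch below shows late fusion: per-modality classifiers (speech, gesture, facial expression) each output a distribution over user intents, and the interface merges them in log space before choosing the most likely intent. The intent labels, scores, and weights are invented for the example; this is not the centre's interface stack.

```python
import math

# Hypothetical intent labels; each modality is assumed to output a
# probability distribution over these intents (illustrative values only).
INTENTS = ["fetch_tool", "stop", "continue"]

speech  = {"fetch_tool": 0.55, "stop": 0.10, "continue": 0.35}
gesture = {"fetch_tool": 0.60, "stop": 0.25, "continue": 0.15}
face    = {"fetch_tool": 0.30, "stop": 0.20, "continue": 0.50}


def fuse(*modalities, weights=None):
    """Late fusion: weighted sum of log-probabilities, then renormalise."""
    weights = weights or [1.0] * len(modalities)
    scores = {
        intent: sum(w * math.log(max(m[intent], 1e-9))
                    for w, m in zip(weights, modalities))
        for intent in INTENTS
    }
    # Convert log scores back to a normalised distribution (softmax).
    top = max(scores.values())
    exp = {k: math.exp(v - top) for k, v in scores.items()}
    z = sum(exp.values())
    return {k: v / z for k, v in exp.items()}


fused = fuse(speech, gesture, face, weights=[1.0, 0.8, 0.5])
print("fused intent estimate:", max(fused, key=fused.get), fused)
```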
Model Learning and Planning
Research in model learning and planning allows a collaborative robot to learn and plan on the fly, as human intentions change with the progress of a joint task and the ambient context. We investigate principled probabilistic frameworks that seamlessly integrate planning and learning.
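One way to read "integrating planning and learning" is an online loop in which the robot updates a probabilistic model of its human partner after each observation and replans against the updated model. The sketch below is illustrative only: the assembly task, intention labels, and reward values are invented, and this is not the centre's actual framework. It keeps Dirichlet-style counts of which part the human tends to reach for and recomputes the robot's best supporting action after every observation.

```python
# Hypothetical joint-assembly task: the robot learns which part the human
# tends to reach for next and plans which part to pre-fetch.

INTENTIONS = ["reach_bolt", "reach_panel", "reach_screwdriver"]
ROBOT_ACTIONS = ["prefetch_bolt", "prefetch_panel", "prefetch_screwdriver", "wait"]

# Reward for each robot action under each true human intention (illustrative).
REWARD = {
    "prefetch_bolt":        {"reach_bolt": 1.0, "reach_panel": -0.5, "reach_screwdriver": -0.5},
    "prefetch_panel":       {"reach_bolt": -0.5, "reach_panel": 1.0, "reach_screwdriver": -0.5},
    "prefetch_screwdriver": {"reach_bolt": -0.5, "reach_panel": -0.5, "reach_screwdriver": 1.0},
    "wait":                 {"reach_bolt": 0.0, "reach_panel": 0.0, "reach_screwdriver": 0.0},
}

# Dirichlet-style pseudo-counts: the learned model of the human partner.
counts = {i: 1.0 for i in INTENTIONS}


def update_model(observed_intention):
    """Learning step: incorporate one observed human choice into the model."""
    counts[observed_intention] += 1.0


def plan():
    """Planning step: pick the robot action with the highest expected reward
    under the current learned distribution over human intentions."""
    total = sum(counts.values())
    belief = {i: c / total for i, c in counts.items()}
    return max(ROBOT_ACTIONS,
               key=lambda a: sum(belief[i] * REWARD[a][i] for i in INTENTIONS))


# Interleaved learning and planning as observations arrive.
for obs in ["reach_panel", "reach_panel", "reach_bolt", "reach_panel"]:
    update_model(obs)
    print(f"after observing {obs!r}, robot plans: {plan()}")
```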
Planning and Control
Planning and control achieve different levels of autonomy in a hierarchical fashion, with humans in the loop and an optimal decomposition of tasks between the robotic system and its human partners. This decomposition is obtained by analysing data sources at different levels of granularity, converting raw data into meaningful information for effective planning and execution.
Intelligent Mechanics
Inherently safe physical interaction with humans requires a radically new design of robotic manipulators: relatively soft structures governed by intelligent controllers. These controllers exploit the inherent dynamics of the robotic structure (“intelligent mechanics”) to achieve safe interaction with humans while improving performance in motion and force control.
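A standard way to exploit compliance for safe interaction is impedance control, where the controller renders a virtual spring-damper between the robot and its target rather than rigidly tracking position. The snippet below is a minimal one-dimensional sketch of that idea (all gains, masses, and the external push are invented for illustration, and the centre's controllers may differ); it simply shows the joint yielding when a human pushes on it and settling back afterwards.

```python
# Minimal 1-D impedance control sketch: the commanded force renders a
# virtual spring-damper toward the target, so the joint yields to contact.
# Gains, mass, and the external push are illustrative values only.

K = 50.0    # virtual stiffness [N/m]
D = 10.0    # virtual damping [N*s/m]
M = 2.0     # effective mass of the link [kg]
DT = 0.01   # integration step [s]

x_target = 0.5          # desired position [m]
x, v = 0.0, 0.0         # current position and velocity


def human_push(t):
    """Hypothetical external force: a human pushes against the link for 1 s."""
    return -20.0 if 2.0 <= t <= 3.0 else 0.0


for step in range(500):
    t = step * DT
    # Impedance law: behave like a spring-damper attached to the target.
    f_cmd = K * (x_target - x) - D * v
    # The total force includes the (unmodelled) human contact force.
    a = (f_cmd + human_push(t)) / M
    v += a * DT
    x += v * DT
    if step % 100 == 0:
        print(f"t={t:4.1f}s  x={x:5.3f} m  contact={human_push(t):6.1f} N")
```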
Electronics and Communications
Electronics and communications provide the underlying power and nerve centre of the robotic system, encompassing signal processing and conversion, communications, and embedded computing. The research covers both the hardware and the software of the cognitive robotic system.