Trajectory Tracking Control of Hybrid Systems with Impulse Effects
This project focuses on advancing knowledge in the control of hybrid systems to achieve highly versatile bipedal robotic walking through reliable global-position tracking control. In this work, Lyapunov theory will be extended to general hybrid systems with state-triggered jumps, creating a broad class of controllers that provably solve the trajectory-tracking problem. The project will synthesize a control framework that exploits this controller design to achieve high versatility together with provable stability, agility, and energy efficiency in bipedal robotic walking. This work will lay a foundation for next-generation legged robots capable of safe and reliable real-world operation.
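To make the notion of a hybrid system with state-triggered jumps concrete, the sketch below simulates the canonical textbook example: a bouncing ball whose velocity is impulsively reset whenever the state hits a guard condition, much like a walking robot's velocity changes instantaneously at foot impact. This is an illustrative toy model only, not the project's controller or robot dynamics.

```python
# Toy hybrid system with impulse effects: continuous flow x' = v, v' = -g,
# plus a state-triggered jump (reset map v -> -e*v) on the guard x <= 0.
def simulate_bouncing_ball(h0=1.0, v0=0.0, e=0.8, g=9.81, dt=1e-4, t_end=3.0):
    """Euler-integrate the flow; apply the reset map at each impact event.

    Returns the number of impacts and the apex height after each one.
    """
    x, v, t = h0, v0, 0.0
    impacts = 0
    peak_heights = [h0]
    while t < t_end:
        # Continuous phase (flow)
        x += v * dt
        v -= g * dt
        # Discrete phase: state-triggered jump when the guard is reached
        if x <= 0.0 and v < 0.0:
            x = 0.0
            v = -e * v                            # impulsive velocity reset
            impacts += 1
            peak_heights.append(v * v / (2 * g))  # apex of the next flight
        t += dt
    return impacts, peak_heights
```

Because each impact dissipates energy, successive apex heights shrink by a factor of roughly e²; proving stability of tracking controllers for such systems is nontrivial precisely because the state evolves under both continuous flows and these discrete resets.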
Sponsor: National Science Foundation.
Human Biomechanical Modeling
In this research, we focus on analytically and numerically revealing the fundamental principles of the physical interaction between a human and an assistive device. The research outcomes will inform the design and control of adaptive robotic exoskeletons that enhance the energy efficiency of human locomotion and enable reliable human-intent inference.
Sponsor: U.S. Army.
5G-enabled Robot Teleoperation
The goal of this project is to combine 5G technology with legged locomotion to enable reliable teleoperation during time-critical missions such as search and rescue and emergency response. Teleoperation incorporates human experience and expertise into robotic systems while keeping operators in safe, remote locations. To teleoperate a robot during a time-critical mission, the human operator needs high-resolution, real-time information about the field to make timely decisions; moreover, substantial computational power is required for online motion planning that guarantees stable, agile robot operation. Through a 5G network, the operator can access reliable, high-resolution, real-time video streaming and issue feasible, timely commands for the robot to act on. With edge computing, the robot translates the operator's high-level commands into joint-level actions in real time.
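The edge-side step of that pipeline, translating a high-level operator command into joint-level targets within a real-time budget, can be sketched as follows. Everything here is a hypothetical illustration: the function names, the toy sinusoidal "gait phase to joint angle" mapping, and the latency budget are assumptions for exposition, not the project's actual planner.

```python
import math
import time

def high_level_to_joint_targets(v_forward, phase, step_freq=1.5, hip_amp=0.3):
    """Map a desired forward velocity and gait phase to hip/knee setpoints
    for a simplified planar biped (toy sinusoidal gait, illustrative only)."""
    amp = hip_amp * min(abs(v_forward), 1.0)    # scale stride with speed
    hip_left = amp * math.sin(2 * math.pi * step_freq * phase)
    hip_right = -hip_left                       # legs move out of phase
    knee_left = max(0.0, 0.5 * hip_left)        # bend knee only during swing
    knee_right = max(0.0, 0.5 * hip_right)
    return {"hip_l": hip_left, "hip_r": hip_right,
            "knee_l": knee_left, "knee_r": knee_right}

def control_tick(v_cmd, phase, budget_s=0.002):
    """One real-time control tick: compute joint targets and check that the
    computation fits inside the assumed 2 ms latency budget."""
    t0 = time.perf_counter()
    targets = high_level_to_joint_targets(v_cmd, phase)
    elapsed = time.perf_counter() - t0
    return targets, elapsed <= budget_s
```

In a real system the mapping would be produced by an online motion planner running on edge servers, with the 5G link carrying the operator's commands downlink and the video/state feedback uplink; the sketch only shows why a per-tick latency budget matters for the command-to-action translation.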
Sponsors: Verizon 5G Lab and Ericsson.