I am an Assistant Professor in the Electrical and Computer Engineering Department at UCSD. I am also affiliated with the Center for Machine Intelligence, Computing & Security and the Center for Energy Research. My research interests broadly lie in machine learning, dynamical systems, and control. My lab focuses on various aspects of creating intelligent systems, with an emphasis on principled learning and control algorithms for sustainable energy and power systems.
I received my Ph.D. (2020) in Electrical and Computer Engineering and M.S. (2020) in Statistics from the University of Washington, advised by Prof. Baosen Zhang, on the topic of machine learning and control for energy systems. Prior to joining UCSD, I spent a year as a postdoctoral fellow at Caltech, advised by Prof. Adam Wierman and Prof. Anima Anandkumar, and completed research internships at Google DeepMind, JD Research, and Doosan Gridtech.
Recent Talks
Demand Response Model Identification and Behavior Forecast with OptNet
Robust Online Voltage Control with Mistake Guarantee
Learning and Control for Energy Systems @UCSD MAE Department Seminar.
2023/09: Our lab and collaborators have two papers accepted for presentation at NeurIPS 2023!
SustainGym: Reinforcement Learning Environments for Sustainability Applications (A suite of environments designed to test the performance of RL algorithms on realistic sustainability tasks, including EVChargingEnv, ElectricityMarketEnv, DatacenterEnv, CogenEnv, BuildingEnv)
2023/09: Invited speaker at the CDC 2023 Workshop on Physics-Informed Learning for Control and Optimization. This workshop aims to provide insight into recent advances in the field of physics-informed machine learning for control and optimization, and to sketch some of the open challenges and opportunities. Please consider joining the workshop on December 12th, 2023 @CDC 2023, Singapore.
2023/09: Invited speaker at the Sixth AES Workshop organized by the National Renewable Energy Laboratory (NREL). I presented our recent line of work on stability-constrained RL for voltage control. Check out the presentation slides here.
2023/08: We released a new real-world multi-zone building dataset, BEAR-Data (1 building, 80+ zones, with detailed zonal temperature and 17 control command data streams spanning about 8 months), for smart building research. The accompanying paper is accepted to ACM BuildSys 2023.
2023/08: Our paper Predicting Strategic Energy Storage Behaviors is accepted for publication in the IEEE Transactions on Smart Grid. Congrats Yuexin!
2023/08: Our paper Stability Constrained Reinforcement Learning for Decentralized Real-Time Voltage Control is accepted for publication in the IEEE Transactions on Control of Network Systems. Congrats Jie!
2023/07: Our paper on Bridging Transient and Steady-State Performance in Voltage Control: A Reinforcement Learning Approach With Safe Gradient Flow is accepted for publication in the IEEE Control Systems Letters and will also be presented at IEEE CDC 2023. Congrats Jie!
2023/07: Our lab and collaborators have four papers accepted for presentation at IEEE CDC 2023!
2023/07: Yufan presented our work on Combining Data and Physics Knowledge for Demand Response Forecast in Energy Systems at the 2023 IEEE Power and Energy System General Meeting!
2023/06: Our paper on Operator Learning for Nonlinear Adaptive Control is accepted and presented at the L4DC 2023. Congrats Luke! Check out our project website.
2023/05: Invited speaker at the ACC 2023 Workshop on Physics-Informed System Identification.
2023/05: Invited speaker at the AFOSR Workshop on Intersection of Deep Learning and Computational Nonlinear Control.
2023/05: Honored to receive the Hellman Fellowship.
2023/04: Invited speaker at the UCSD Control Systems & Dynamics Seminar.
2023/02: Invited speaker at ITA 2023 session in machine learning and control.
2023/01: Honored to receive the Jacobs School Early Career Faculty Development Award for our collaborative work with Prof. Patricia Hidalgo-Gonzalez on reinforcement learning for microgrid control.
I co-organized the Control Meets Learning online seminar series in 2020 - 2021, which features many interesting talks on the intersection of control and learning and future outlooks. Please check out this website for the recordings.
2022/10: Invited talks at Allerton 2022 and INFORMS 2022 on "Stability Constrained Reinforcement Learning for Real-Time Voltage Control". Check out our slides here.
2022/10: Invited talk at the Pacific Northwest National Laboratory about Learning and Control for Sustainable Energy Systems.
2022/06: Our work on "Demand Response Model Identification and Behavior Forecast with OptNet: a Gradient-based Approach" is presented at the ACM International Conference on Future Energy Systems (ACM e-Energy).
2022/06: Our paper Robust online voltage control with an unknown grid topology is presented at the ACM International Conference on Future Energy Systems (ACM e-Energy) and selected as Best Paper Finalist.
Prospective Students and Postdocs
We are looking for highly motivated and self-driven Ph.D. students and postdoctoral candidates with a strong mathematical background and a foundation in machine learning, control, and energy systems. Our lab has 2 openings for fully funded Ph.D. students starting in Fall 2024 (application deadline: 2023/12/20). There are also positions for master's and undergraduate research. If you are currently at UCSD, please fill out the Application Form first, then send me an email with your resume and UCSD transcript once you have completed it. For all other applicants: if you are interested in visiting opportunities, please fill out the Visiting Scholar Form and send me an email.
Both theoretical and empirical research are carried out in the group, and students who can build bridges between the two, as well as between different disciplines, will be a good fit here.