313 Academic Research Building
265 South 37th Street
Philadelphia, PA 19104
Research Interests: statistics, optimization, reinforcement learning, machine learning theory, information theory, statistical machine learning
Links: Personal Website, Google Scholar
For more information, please visit my Personal Website.
I’m looking for highly motivated postdocs and Ph.D. students with a strong mathematical background and an interest in machine learning theory, statistics, and optimization.
Ph.D. in Electrical Engineering, Stanford University, 2015 (Advisor: Andrea J. Goldsmith)
M.S. in Statistics, Stanford University, 2013
M.S. in Electrical and Computer Engineering, University of Texas at Austin, 2010
B.E. in Electrical Engineering / Microelectronics, Tsinghua University, 2008
Associate Professor of Statistics and Data Science,
Associate Professor of Electrical and Systems Engineering (secondary appointment),
The Wharton School, University of Pennsylvania, 2022-present
Assistant Professor of Electrical and Computer Engineering,
Associated Faculty Member of Computer Science and of Applied and Computational Mathematics,
Princeton University, 2017-2021
Postdoctoral Researcher, Department of Statistics
Stanford University, 2015-2017 (Advisor: Emmanuel J. Candès)
Changxiao Cai, H. Vincent Poor, Yuxin Chen (2022), Uncertainty Quantification for Nonconvex Tensor Completion: Confidence Intervals, Heteroscedasticity and Optimality, accepted to IEEE Transactions on Information Theory.
Gen Li, Yuejie Chi, Yuting Wei, Yuxin Chen, Minimax-Optimal Multi-Agent RL in Zero-Sum Markov Games With a Generative Model.
Yuling Yan, Gen Li, Yuxin Chen, Jianqing Fan (Under Review), Model-Based Reinforcement Learning Is Minimax-Optimal for Offline Zero-Sum Markov Games.
Gen Li, Laixi Shi, Yuxin Chen, Yuejie Chi, Yuting Wei (Under Review), Settling the Sample Complexity of Model-Based Offline Reinforcement Learning.
Laixi Shi, Gen Li, Yuting Wei, Yuxin Chen, Yuejie Chi (2022), Pessimistic Q-Learning for Offline Reinforcement Learning: Towards Optimal Sample Complexity, International Conference on Machine Learning (ICML).
Shicong Cen, Chen Cheng, Yuxin Chen, Yuting Wei, Yuejie Chi (2022), Fast global convergence of natural policy gradient methods with entropy regularization, Operations Research, 70 (4), pp. 2563-2578.
Changxiao Cai, Gen Li, H. Vincent Poor, Yuxin Chen (2022), Nonconvex Low-Rank Tensor Completion from Noisy Data, Operations Research, 70 (2), pp. 1219-1237.
Gen Li, Yuting Wei, Yuejie Chi, Yuantao Gu, Yuxin Chen (2022), Sample complexity of asynchronous Q-learning: sharper analysis and variance reduction, IEEE Transactions on Information Theory, 68 (1), pp. 448-473.
Yuxin Chen, Jianqing Fan, Bingyan Wang, Yuling Yan (2021), Convex and Nonconvex Optimization Are Both Minimax-Optimal for Noisy Blind Deconvolution under Random Designs, Journal of the American Statistical Association (in press).
Gen Li, Yuxin Chen, Yuejie Chi, Yuantao Gu, Yuting Wei (2021), Sample-Efficient Reinforcement Learning Is Feasible for Linearly Realizable MDPs with Limited Revisiting, Neural Information Processing Systems (NeurIPS).
Independent Study allows students to pursue academic interests not available in regularly offered courses. Students must consult with their academic advisor to formulate a project directly related to the student’s research interests. All independent study courses are subject to the approval of the AMCS Graduate Group Chair.
Study under the direction of a faculty member.
The goal of this course is to introduce the Python programming language within the context of the closely related areas of statistics and data science. Students will develop a solid grasp of Python programming basics as they are exposed to the entire data science workflow, starting from interacting with SQL databases to query and retrieve data, through data wrangling, reshaping, summarizing, and analyzing, and ultimately reporting their results. Competency in Python is a critical skill for students interested in data science. Prerequisites: No prior programming experience is expected, but statistics through the level of multiple regression is required. This requirement may be fulfilled with undergraduate courses such as Stat 1020 or Stat 1120.
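As a rough illustration of the workflow this course walks through, here is a minimal Python sketch (illustrative only, not course material) that queries a SQL database with pandas, wrangles the result, and reports a summary. The database file, table, and column names are hypothetical.

```python
# A minimal sketch of the query -> wrangle -> summarize -> report workflow.
# The database file, table, and column names below are hypothetical.
import sqlite3
import pandas as pd

# 1. Interact with a SQL database to query and retrieve data.
conn = sqlite3.connect("grades.db")                        # hypothetical file
df = pd.read_sql_query("SELECT major, gpa FROM records", conn)
conn.close()

# 2. Wrangle and summarize: drop missing values, then aggregate by group.
df = df.dropna()
summary = df.groupby("major")["gpa"].agg(["mean", "count"])

# 3. Report the results.
print(summary.to_string())
```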
Convex optimization has become a real pillar of modern data science and has transformed algorithm designs. A wide spectrum of problems in statistics, machine learning, and engineering can be formulated as optimization tasks that exhibit favorable convexity properties, which admit standardized and efficient solutions. This course aims to introduce the elements of convex optimization, concentrating on modeling aspects and algorithms that are useful in data science applications. Topics include convex sets, convex functions, linear and quadratic programs, semidefinite programming, optimality conditions and duality theory. We will visit important applications in statistics and machine learning to demonstrate the wide applicability of convex optimization. We will also cover effective optimization algorithms like gradient descent and Newton's method. Prerequisites: Basic linear algebra (Math 3120, 3130, 3140 or equivalent), basic calculus (Math 2400 or equivalent), basic probability (STAT 4300 or equivalent), and knowledge of a programming language like MATLAB or Python to conduct simulation exercises.
Convex optimization has become a real pillar of modern data science and has transformed algorithm designs. A wide spectrum of problems in statistics, machine learning, and engineering can be formulated as optimization tasks that exhibit favorable convexity properties, which admit standardized and efficient solutions. This course aims to introduce the elements of convex optimization, concentrating on modeling aspects and algorithms that are useful in data science applications. Topics include convex sets, convex functions, linear and quadratic programs, semidefinite programming, optimality conditions and duality theory. We will visit important applications in statistics and machine learning to demonstrate the wide applicability of convex optimization. We will also cover effective optimization algorithms like gradient descent and Newton's method. Prerequisites: Basic linear algebra, basic calculus, basic probability, and knowledge of a programming language like MATLAB or Python to conduct simulation exercises.
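To make the two algorithms named above concrete, here is a minimal sketch (an illustrative example under assumed data, not course code) of gradient descent and Newton's method applied to a strongly convex quadratic f(x) = 0.5 xᵀAx - bᵀx; the matrix A and vector b are arbitrary choices.

```python
import numpy as np

# Objective: f(x) = 0.5 * x' A x - b' x, a strongly convex quadratic.
# A and b are arbitrary illustrative choices, not course data.
A = np.array([[3.0, 1.0], [1.0, 2.0]])    # symmetric positive definite
b = np.array([1.0, 1.0])

def grad(x):
    return A @ x - b                      # gradient of f at x

# Gradient descent with step size 1/L, where L is the largest eigenvalue of A.
L = np.linalg.eigvalsh(A).max()
x = np.zeros(2)
for _ in range(200):
    x -= (1.0 / L) * grad(x)

# Newton's method: for a quadratic objective, a single Newton step from any
# starting point lands exactly on the minimizer x* = A^{-1} b.
x0 = np.zeros(2)
x_newton = x0 - np.linalg.solve(A, grad(x0))

print(x, x_newton)  # both approximate the solution of A x = b
```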
This seminar will be taken by doctoral candidates after the completion of most of their coursework. Topics vary from year to year and are chosen from advanced probability, statistical inference, robust methods, and decision theory, with principal emphasis on applications.
Written permission of instructor and the department course coordinator required to enroll.