PhD Defense: Dynamics-Inspired Garment Reconstruction and Synthesis for Simulation-Based Virtual Try-On
Junbang Liang
Remote
Friday, May 14, 2021, 3:30-5:30 pm
Abstract
E-commerce has been growing at a rapid pace in recent years, and people are now more likely to shop online than to visit physical stores. Digital try-on systems, as one way to improve the user experience and popularize online garment shopping, have drawn the attention of many researchers. However, the technology is still far from practical and easy-to-use enough to replace physical try-on, mostly due to the gap between the digital and the real worlds in modeling and demonstrating garment fitting. Estimating the hidden parameters of garments plays an important role in closing this gap; examples include accurate reconstruction of human shapes and sizes through consumer devices, faithful estimation of garment materials via learning and optimization, user-friendly recovery of dressed garments, and fast, realistic rendering of animated try-on results. Although previous methods have made some progress on these under-constrained problems, learning-based approaches have shown tremendous potential for notable impact. I propose to address these key open research issues by adopting machine learning and optimization techniques.

To accurately reconstruct human shapes and sizes, I propose a learning-based, shape-aware human body mesh reconstruction for both pose and shape estimation that is supervised directly on shape parameters. To estimate garment materials from video, I design a differentiable cloth simulation algorithm that can optimize its input variables to fit the data, thereby inferring physical parameters from observations and reaching desired control goals. To take advantage of joint learning and optimization, I further propose a joint estimation framework targeting the human body and apparel through closed-loop iterative optimization. By extracting temporal information about both the body and the garment, it can also recover the fabric materials of a garment from a single RGB video. To render realistic try-on results at near real-time speed, I design a time-domain parallelization algorithm that maximizes overall performance in distributed systems with minimal communication overhead. I further propose a semi-supervised learning framework to directly predict fit-accurate cloth draping on a wide range of body shapes.
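The differentiable-simulation idea above can be sketched with a toy example. This is not the author's algorithm: the "cloth" here is a single damped spring, and the simulator gradient is approximated by finite differences (the actual work differentiates the simulator analytically). All names and constants below are illustrative assumptions; the point is only that a physical parameter can be recovered by gradient descent through a simulator.

```python
def simulate(k, c=0.5, dt=0.01, steps=100):
    """Semi-implicit Euler simulation of a damped spring x'' = -k*x - c*x'.

    Returns the final position, standing in for an 'observation' of
    simulated dynamics (in the thesis, a rendered cloth sequence).
    """
    x, v = 1.0, 0.0
    for _ in range(steps):
        a = -k * x - c * v
        v += a * dt
        x += v * dt
    return x

def fit_stiffness(observed, k0=3.0, lr=5.0, iters=200, eps=1e-4):
    """Recover the stiffness k by gradient descent on a squared fitting loss.

    The gradient of the loss w.r.t. k is approximated with a central
    finite difference through the whole simulation.
    """
    k = k0
    for _ in range(iters):
        loss_plus = (simulate(k + eps) - observed) ** 2
        loss_minus = (simulate(k - eps) - observed) ** 2
        grad = (loss_plus - loss_minus) / (2 * eps)
        k -= lr * grad
    return k

observed = simulate(5.0)        # "observation" generated with ground-truth k = 5
k_est = fit_stiffness(observed)  # gradient descent recovers k close to 5
```

A real differentiable cloth simulator plays the role of `simulate` here, with garment material parameters in place of `k` and video observations in place of `observed`.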

In summary, my proposed frameworks focus on improving the efficiency, scalability, and capability of cloth simulation, and enable accurate hidden parameter estimation by exploiting cloth simulation for supervised learning and gradient-based feedback control.

Examining Committee:

Chair: Dr. Dinesh Manocha
Dean's Representative: Dr. Min Wu
Members: Dr. Ming Lin
         Dr. Soheil Feizi
         Dr. Tom Goldstein

Bio
Junbang Liang is a Ph.D. student at the University of Maryland's Department of Computer Science, working under the supervision of Prof. Ming Lin. His research interests are physics-based cloth simulation and inverse problems for virtual reality applications.


This talk is organized by Tom Hurst