Abstract:
This paper addresses static 3D cloth draping on virtual human bodies. We propose a two-stream deep network that extracts features from both body and garment shapes to drape a template cloth onto virtual 3D bodies. Our network mimics physics-based simulation (PBS) at roughly two orders of magnitude lower computational cost. To produce plausible, collision-aware results, we train the network with PBS-inspired loss terms. Two further loss functions penalize the difference between the predicted cloth curvature and that of PBS, improving the detail of the draped garment. We study, quantitatively and qualitatively, a mean-curvature-normal loss and a novel detail-preserving loss. The new curvature loss computes local covariance matrices of 3D points and compares the Rayleigh quotients of the prediction and the PBS output. It recovers more detail and outperforms the loss based on mean curvature normal vectors of 3D triangulated meshes. We validate our framework on four garment types draped over bodies of varying shape and pose, and show that it outperforms a recent data-driven method.
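The abstract does not give the exact form of the curvature loss, but one common way to turn local covariance matrices into a per-point curvature proxy is the surface-variation Rayleigh quotient (smallest eigenvalue of the local covariance over its trace). The sketch below, in NumPy, illustrates that idea under this assumption; the function names, the brute-force neighbor search, and the L1 comparison are illustrative choices, not the paper's implementation.

```python
import numpy as np

def local_covariance(points, idx, k=8):
    """Covariance of the k nearest neighbors of point `idx` (brute force)."""
    d = np.linalg.norm(points - points[idx], axis=1)
    nbrs = points[np.argsort(d)[:k]]
    centered = nbrs - nbrs.mean(axis=0)
    return centered.T @ centered / k

def curvature_proxy(points, k=8):
    """Per-point Rayleigh-quotient proxy: smallest eigenvalue of the
    local covariance divided by its trace ("surface variation")."""
    out = np.empty(len(points))
    for i in range(len(points)):
        C = local_covariance(points, i, k)
        w = np.linalg.eigvalsh(C)           # eigenvalues in ascending order
        out[i] = w[0] / max(w.sum(), 1e-12)  # 0 for a locally flat patch
    return out

def curvature_loss(pred_pts, pbs_pts, k=8):
    """L1 difference between predicted and PBS curvature proxies
    (assumes the two point sets are in vertex correspondence)."""
    return np.abs(curvature_proxy(pred_pts, k) - curvature_proxy(pbs_pts, k)).mean()
```

On a perfectly flat patch the smallest covariance eigenvalue vanishes, so the proxy is zero; folds and wrinkles raise it, which is why matching this quantity between prediction and PBS encourages the network to reproduce fine drape detail.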