Authors
Listed:
- Jong-Hyun Kim
- Sun-Jeong Kim
- Jung Lee
Abstract
We propose an anisotropic constrained-boundary convolutional neural network (hereafter, AnisoCBConvNet) that can stably produce high-quality meshes without oscillation by applying super-resolution to low-resolution cloth meshes. As a training set for the network, we use pairs of low-resolution (LR) cloth simulation data and data obtained by running the same simulation on a high-resolution (HR) cloth whose quad mesh resolution is increased from that of the LR cloth. The actual data used for training are 2D geometry images converted from the 3D meshes. The proposed AnisoCBConvNet trains an image synthesizer that converts LR geometry images into HR geometry images. In particular, by controlling the weights anisotropically near the boundary, the surface-wrinkling problem caused by oscillation is alleviated. When the HR geometry image produced by AnisoCBConvNet is converted back into an HR cloth mesh, details such as wrinkles are expressed better than in the input cloth mesh. In addition, our method mitigates the noise problem of existing geometry-image approaches. We tested AnisoCBConvNet-based super-resolution in various simulation scenarios and confirmed stable and efficient performance in most results. Our method makes it possible to efficiently produce CG VFX based on high-quality cloth simulation for games and movies.
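As a rough illustration of the anisotropic boundary weighting described above, the sketch below builds a per-pixel weight mask for a geometry image whose falloff widths differ along the two image axes, and uses it in a weighted reconstruction loss between a predicted HR geometry image and the ground truth. This is a minimal sketch under stated assumptions: the ramp shapes, the margins margin_u and margin_v, the floor value, and the choice of an L2 loss are all illustrative, not the paper's exact formulation.

    # Sketch (PyTorch): anisotropically boundary-weighted loss for
    # geometry-image super-resolution. Hypothetical weighting scheme,
    # not the authors' exact method.
    import torch
    import torch.nn.functional as F

    def boundary_weight_mask(h, w, margin_u=8, margin_v=4, floor=0.2):
        """Per-pixel weights falling from 1.0 (interior) to `floor` at the
        geometry-image boundary, with independent ramp widths along the
        two axes so the damping is anisotropic."""
        v = torch.arange(h, dtype=torch.float32)
        u = torch.arange(w, dtype=torch.float32)
        # Distance to the nearest boundary, normalized per axis.
        ramp_v = (torch.minimum(v, (h - 1) - v) / margin_v).clamp(max=1.0)
        ramp_u = (torch.minimum(u, (w - 1) - u) / margin_u).clamp(max=1.0)
        mask = torch.outer(ramp_v, ramp_u)  # (h, w), anisotropic falloff
        return floor + (1.0 - floor) * mask

    def weighted_geometry_loss(pred, target):
        """pred, target: (B, 3, H, W) geometry images (xyz per pixel)."""
        _, _, h, w = pred.shape
        wmask = boundary_weight_mask(h, w).to(pred.device)
        per_pixel = F.mse_loss(pred, target, reduction="none")  # (B,3,H,W)
        return (per_pixel * wmask).mean()

In training, pred would come from the image synthesizer applied to an LR geometry image (e.g., loss = weighted_geometry_loss(synthesizer(lr_img), hr_img)); down-weighting the boundary region in this way suppresses gradients where oscillation artifacts concentrate.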
Suggested Citation
Jong-Hyun Kim & Sun-Jeong Kim & Jung Lee, 2022.
"Geometry image super-resolution with AnisoCBConvNet architecture for efficient cloth modeling,"
PLOS ONE, Public Library of Science, vol. 17(8), pages 1-21, August.
Handle:
RePEc:plo:pone00:0272433
DOI: 10.1371/journal.pone.0272433