Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor | Institute of Textiles and Clothing | en_US |
dc.creator | Zhu, Shuaiyin | - |
dc.identifier.uri | https://theses.lib.polyu.edu.hk/handle/200/7034 | - |
dc.language | English | en_US |
dc.publisher | Hong Kong Polytechnic University | - |
dc.rights | All rights reserved | en_US |
dc.title | An efficient method for accurate human model customisation based on two-view monocular photos | en_US |
dcterms.abstract | Computer-aided design and computer-aided manufacture (CAD/CAM) applications are widely used in the fashion industry; they are important tools that help the industry improve productivity and design efficiency. Over the past two decades, research on 3D clothing CAD has attracted much attention. Some studies simulate the traditional trial-fitting process by virtually 'sewing' 2D patterns around a 3D human model so as to examine the clothing fit and evaluate the design. Other studies adopt a quite different approach, in which 3D garments are designed directly on a 3D human model and then flattened to give 2D patterns. An accurate 3D human model is fundamental to both approaches. Scan-based construction methods and deformation-based reconstruction methods are the two major approaches to customising human models. Both have pros and cons: for clothing-specific applications, they suffer from either restricted accessibility or limited accuracy. This project aims to develop new modelling methods for 3D human model customisation. The resulting human model must be accurate and realistically detailed, fulfilling the specific requirements of the clothing industry. Two intelligent modelling methods are developed in this study. One customises a 3D body model from two orthogonal monocular images in which the subject is dressed in tight-fit clothing. The other customises models for people dressed in arbitrary clothing, for instance loose-fit clothing. The first method realises realistic human model customisation in five steps. First, a human body shape representation involving both 2D and 3D body features is defined. Second, 2D and 3D body features are extracted from a large database of real subject scans, and the relationship between the 2D and 3D features is learnt from the extracted data. Third, a robust method is proposed to extract 2D features from monocular images for shape customisation. Fourth, 3D shapes are predicted for individual customers based on the image-extracted 2D features. Lastly, model customisation is realised by deforming a high-resolution template model according to the predicted 3D shape. The second method differs from the first in that some of the customer's body shape information cannot be obtained from the images, because the 2D shape information is covered by clothing. To solve this problem, it is proposed to predict the customer's overall 2D shape (profile) from only a limited set of 2D features. The predicted 2D profile is then used to construct a 3D shape for model customisation. To this end, a new 3D shape representation model is constructed to facilitate modelling various body shapes using scanned data, and a 2D profile model is developed to characterise the spatial relationships between various local 2D features and the global 2D features extractable from scans and/or images. The second method can thus customise human models based on orthogonal-view images in which subjects are dressed in loose-fit clothing. | en_US |
dcterms.abstract | A systematic experimental evaluation was carried out to verify the effectiveness of the proposed methods. A total of 30 subjects, 15 male and 15 female, with various body shapes were recruited to have their photos taken for body model customisation. The results were evaluated both objectively and subjectively. In the objective evaluation, the customised models were compared with scanned models and with models synthesised by two commercial software programs, in terms of key girth measurements and areas. The results show that the proposed methods can accurately reproduce various human models, with a level of accuracy comparable to that of scanning. In the subjective evaluation, a questionnaire was designed for respondents to rate the resemblance between the subject's image, the customised models, the scanned model, and the models synthesised by the two commercial software programs. The subjective results show that the proposed methods outperform the other methods in generating realistic and accurate models. The proposed methods can be used in virtual try-on and fit assessment applications, thus assisting online garment sales. The technology can also be used in mannequin development, to customise mannequins/fit forms made exactly to the shape of a retailer's or manufacturer's fit model. This reduces product development time and costs by allowing fit evaluation to be carried out at offshore manufacturing sites, without repeated shipping of sample garments to an approval site. | en_US |
dcterms.extent | xviii, 210 p. : ill. (some col.) ; 30 cm. | en_US |
dcterms.isPartOf | PolyU Electronic Theses | en_US |
dcterms.issued | 2013 | en_US |
dcterms.educationalLevel | All Master | en_US |
dcterms.educationalLevel | M.Phil. | en_US |
dcterms.LCSH | Computer-aided design. | en_US |
dcterms.LCSH | Three-dimensional imaging. | en_US |
dcterms.LCSH | Image reconstruction. | en_US |
dcterms.LCSH | Hong Kong Polytechnic University -- Dissertations | en_US |
dcterms.accessRights | open access | en_US |
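The abstract's second step, learning the relationship between 2D image features and 3D body measurements from a database of scans, can be illustrated with a simple least-squares fit. This is only a sketch of the general idea under assumed data: the feature choice (frontal silhouette waist width predicting waist girth), the sample values, and all function names are hypothetical and are not taken from the thesis, which learns a richer 2D/3D feature relationship.

```python
# Illustrative sketch: learn a linear mapping from one 2D image feature
# (hypothetical: frontal waist width, cm) to one 3D body measurement
# (hypothetical: scanned waist girth, cm). Data values are invented.

def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = a*x + b for a single feature."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Centred sums give the closed-form slope and intercept.
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = mean_y - a * mean_x
    return a, b

# Hypothetical training pairs from a scan database:
# (frontal waist width in cm, scanned waist girth in cm).
widths = [26.0, 28.5, 30.0, 33.0, 35.5]
girths = [68.0, 74.0, 78.0, 86.0, 92.0]

a, b = fit_linear(widths, girths)

def predict_girth(width_cm):
    """Predict a 3D girth from a 2D feature extracted from an image."""
    return a * width_cm + b
```

In the thesis's setting, many such 2D features (extracted from the two orthogonal photos) would jointly predict many 3D shape parameters, and the prediction would then drive the deformation of a template model; the one-variable fit above only shows the learn-then-predict structure.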
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
b26160638.pdf | For All Users | 11.34 MB | Adobe PDF | View/Open |