BeautyREC: Robust, efficient, and component-specific makeup transfer

Q Yan, C Guo, J Zhao, Y Dai… - Proceedings of the IEEE/CVF Conference on Computer Vision and …, 2023 - openaccess.thecvf.com
Abstract
In this work, we propose a Robust, Efficient, and Component-specific makeup transfer method (abbreviated as BeautyREC). In a departure from prior methods that leverage global attention, simply concatenate features, or implicitly manipulate features in latent space, we propose a component-specific correspondence to directly transfer the makeup style of a reference image to the corresponding components (e.g., skin, lips, eyes) of a source image, enabling elaborate and accurate local makeup transfer. As an auxiliary, the long-range visual dependencies of the Transformer are introduced for effective global makeup transfer. Instead of the commonly used cycle structure, which is complex and unstable, we employ a content consistency loss coupled with a content encoder to implement efficient single-path makeup transfer. The key insights of this study are modeling component-specific correspondence for local makeup transfer, capturing long-range dependencies for global makeup transfer, and enabling efficient makeup transfer via a single-path structure. We also contribute BeautyFace, a makeup transfer dataset that supplements existing datasets. This dataset contains 3,000 faces, covering more diverse makeup styles, face poses, and races, and each face has an annotated parsing map. Extensive experiments demonstrate the effectiveness of our method against state-of-the-art methods. Besides, our method is appealing as it has only 1M parameters, yet it outperforms the state-of-the-art methods (BeautyGAN: 8.43M, PSGAN: 12.62M, SCGAN: 15.30M, CPM: 9.24M, SSAT: 10.48M).
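The abstract does not include implementation details, so the following is only a minimal PyTorch-style sketch of what a component-specific correspondence could look like, assuming face-parsing labels for skin, lips, and eyes and feature maps from a shared encoder. The function name, the label values in `component_ids`, and the masked cross-attention formulation are illustrative assumptions, not the authors' code or API.

```python
# Hypothetical sketch (not the authors' implementation): component-specific
# correspondence realized as cross-attention restricted to matching
# face-parsing regions (skin, lips, eyes) between source and reference.
import torch
import torch.nn.functional as F


def component_correspondence(src_feat, ref_feat, src_mask, ref_mask,
                             component_ids=(1, 2, 3)):
    """Transfer reference makeup style to source features per facial component.

    src_feat, ref_feat: (B, C, H, W) feature maps from a shared encoder.
    src_mask, ref_mask: (B, H, W) integer parsing maps
                        (assumed labels: 1 = skin, 2 = lips, 3 = eyes).
    component_ids: parsing labels treated as makeup components (an assumption).
    """
    B, C, H, W = src_feat.shape
    out = src_feat.clone()
    src = src_feat.flatten(2).transpose(1, 2)   # (B, HW, C)
    ref = ref_feat.flatten(2).transpose(1, 2)   # (B, HW, C)
    for cid in component_ids:
        s_sel = src_mask.flatten(1) == cid      # (B, HW) source-region mask
        r_sel = ref_mask.flatten(1) == cid      # (B, HW) reference-region mask
        for b in range(B):
            if s_sel[b].sum() == 0 or r_sel[b].sum() == 0:
                continue                        # component missing in one image
            q = src[b, s_sel[b]]                # (Ns, C) queries from source region
            k = ref[b, r_sel[b]]                # (Nr, C) keys/values from reference region
            attn = F.softmax(q @ k.t() / C ** 0.5, dim=-1)
            styled = attn @ k                   # (Ns, C) reference-styled features
            flat = out[b].flatten(1).transpose(0, 1).contiguous()  # (HW, C)
            flat[s_sel[b]] = styled             # overwrite only this component
            out[b] = flat.transpose(0, 1).reshape(C, H, W)
    return out
```

In this reading, each component of the source attends only to the same component of the reference, so lip features borrow style only from reference lips, and so on; a decoder would then map the restyled features back to an image. How BeautyREC actually combines this local pathway with the Transformer-based global branch and the content consistency loss is described in the full paper, not here.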