Image Style Transferring Based on StarGAN and Class Encoder
Author: Xinzheng Xu, Jianying Chang, Shifei Ding
Abstract:

Image style transfer has become part of everyday life and is widely used in practical scenarios such as artistic image generation, photo-to-cartoon conversion, image colorization, filter processing, and occlusion removal, so it carries significant research and application value. StarGAN is a generative adversarial network framework used in recent years for multi-domain image style transfer; it extracts features through simple down-sampling and then generates images through up-sampling. However, the background color information and the detailed facial features of characters in the generated images differ considerably from those of the input images. In this paper, the network structure of StarGAN is improved, and a UE-StarGAN model for image style transfer is proposed by introducing U-Net and an edge-promoting adversarial loss function. In addition, a class encoder is introduced into the generator of UE-StarGAN, and an image style transfer model that fuses the class encoder is designed to realize image style transfer with a small sample size. The experimental results show that the model extracts more detailed features and has advantages when samples are scarce. The images produced by the proposed model improve in both qualitative and quantitative analyses, which verifies its effectiveness.
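To make the described architecture concrete, the sketch below shows one plausible way to combine a U-Net-style generator (down-sampling encoder, up-sampling decoder with skip connections) with a class encoder that conditions generation on a target domain label. This is a minimal illustration assuming PyTorch; the class names (ClassEncoder, UNetGenerator), layer sizes, and the choice to inject the class code at the bottleneck are hypothetical and are not taken from the paper.

import torch
import torch.nn as nn

class ClassEncoder(nn.Module):
    """Hypothetical class encoder: maps a one-hot domain label to a latent style code."""
    def __init__(self, num_domains, code_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_domains, 128), nn.ReLU(inplace=True),
            nn.Linear(128, code_dim),
        )

    def forward(self, label_onehot):          # (B, num_domains) -> (B, code_dim)
        return self.net(label_onehot)

class UNetGenerator(nn.Module):
    """Hypothetical U-Net-style generator: the class code is concatenated at the
    bottleneck, and skip connections carry fine detail from encoder to decoder."""
    def __init__(self, code_dim=64):
        super().__init__()
        self.down1 = nn.Sequential(nn.Conv2d(3, 64, 4, 2, 1), nn.ReLU(inplace=True))
        self.down2 = nn.Sequential(nn.Conv2d(64, 128, 4, 2, 1), nn.ReLU(inplace=True))
        self.up1 = nn.Sequential(nn.ConvTranspose2d(128 + code_dim, 64, 4, 2, 1),
                                 nn.ReLU(inplace=True))
        self.up2 = nn.ConvTranspose2d(64 + 64, 3, 4, 2, 1)   # extra 64 channels come from the skip

    def forward(self, x, class_code):
        d1 = self.down1(x)                                   # (B, 64, H/2, W/2)
        d2 = self.down2(d1)                                  # (B, 128, H/4, W/4)
        code = class_code[:, :, None, None].expand(-1, -1, d2.size(2), d2.size(3))
        u1 = self.up1(torch.cat([d2, code], dim=1))          # inject class code at the bottleneck
        out = self.up2(torch.cat([u1, d1], dim=1))           # skip connection preserves detail
        return torch.tanh(out)

# Usage with hypothetical shapes: translate a 128x128 image to domain 2 of 5.
encoder, generator = ClassEncoder(num_domains=5), UNetGenerator()
label = torch.nn.functional.one_hot(torch.tensor([2]), num_classes=5).float()
fake = generator(torch.randn(1, 3, 128, 128), encoder(label))

The skip connections are what a U-Net modification adds over a plain encoder-decoder, which is consistent with the abstract's claim that the change helps preserve background color and facial detail.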

Get Citation

Xinzheng Xu, Jianying Chang, Shifei Ding. Image Style Transferring Based on StarGAN and Class Encoder. International Journal of Software and Informatics, 2022, 12(2): 245-258.

History
  • Received: June 01, 2021
  • Revised: July 16, 2021
  • Accepted: August 07, 2021
  • Online: June 24, 2022
  • Published: