Liu Yu, Ding Yang, Fatimah binti Khalid, Li Xin, Mas Rina binti Mustaffa, Azreen bin Azman. Unsupervised Font Generation Network Integrating Content and Style Representation[J]. Journal of Computer-Aided Design & Computer Graphics, 2025, 37(5): 865-876. DOI: 10.3724/SP.J.1089.2023-00397

Unsupervised Font Generation Network Integrating Content and Style Representation

  • Generating Chinese fonts with a large number of characters is a challenging task. Existing methods rely mainly on large amounts of paired data for supervised learning, but collecting such data is labor-intensive and difficult to scale to fonts in new styles. To help font designers improve the efficiency of developing computer Chinese font libraries, an unsupervised font generation network that separates content and style representations is proposed. First, dense semantic correspondences are established between style and content representations within the same domain to guide the decoder toward high-quality outputs. Then, deformable convolutions are introduced into the skip connections, and the model is made to focus more on the structural characteristics of fonts by learning the interdependence between the offsets and the channels. Finally, a multi-scale style discriminator is designed to evaluate the style consistency of generated images at different scales. The generation results of five font generation methods, including FUNIT, MX-Font, and DG-Font, are demonstrated and analyzed on public datasets. Experimental results show that the proposed method outperforms the others in terms of L1, RMSE, and user-study evaluation. A minimal illustrative sketch of two of the components described here is given below.
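The following is a minimal PyTorch sketch, not the authors' implementation, of two of the components described in the abstract: a deformable-convolution skip connection whose offset branch is modulated by a channel gate, and a multi-scale style discriminator. The class names, channel widths, the squeeze-and-excitation style gate, and the use of torchvision.ops.DeformConv2d are illustrative assumptions; the paper's actual architecture may differ.

import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.ops import DeformConv2d


class DeformableSkip(nn.Module):
    """Skip connection that warps encoder features with learned offsets.

    A squeeze-and-excitation style channel gate (an assumption, standing in for the
    offset-channel interdependence described in the abstract) lets the offset
    prediction depend on channel statistics, emphasizing stroke structure.
    """

    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        pad = kernel_size // 2
        # 2 offsets (x, y) per kernel sampling location.
        self.offset_conv = nn.Conv2d(channels, 2 * kernel_size * kernel_size,
                                     kernel_size, padding=pad)
        # Channel gate over the input features (illustrative choice).
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // 4, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // 4, channels, 1),
            nn.Sigmoid(),
        )
        self.deform = DeformConv2d(channels, channels, kernel_size, padding=pad)

    def forward(self, enc_feat: torch.Tensor) -> torch.Tensor:
        gated = enc_feat * self.gate(enc_feat)   # channel-dependent emphasis
        offset = self.offset_conv(gated)         # per-location sampling offsets
        return self.deform(enc_feat, offset)     # deformed skip feature


class MultiScaleStyleDiscriminator(nn.Module):
    """Applies a PatchGAN-style critic to the image at several scales."""

    def __init__(self, in_channels: int = 1, num_scales: int = 3):
        super().__init__()

        def patch_critic():
            return nn.Sequential(
                nn.Conv2d(in_channels, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
                nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
                nn.Conv2d(128, 1, 4, padding=1),  # patch-level realism map
            )

        self.critics = nn.ModuleList(patch_critic() for _ in range(num_scales))

    def forward(self, img: torch.Tensor):
        outputs = []
        for critic in self.critics:
            outputs.append(critic(img))
            img = F.avg_pool2d(img, 2)            # move to the next, coarser scale
        return outputs


if __name__ == "__main__":
    x = torch.randn(2, 64, 32, 32)
    print(DeformableSkip(64)(x).shape)            # torch.Size([2, 64, 32, 32])
    fake = torch.randn(2, 1, 128, 128)            # a grayscale glyph image (assumed size)
    print([o.shape for o in MultiScaleStyleDiscriminator()(fake)])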
