Chen Kaijian, Li Erqiang, Zhou Yang. Non-Stationary Texture Synthesis Controlled by Relative Coordinate Guidance[J]. Journal of Computer-Aided Design & Computer Graphics, 2023, 35(2): 284-292. DOI: 10.3724/SP.J.1089.2023.19362

Non-Stationary Texture Synthesis Controlled by Relative Coordinate Guidance

Abstract: State-of-the-art exemplar-based texture synthesis methods encounter issues when dealing with challenging non-stationary textures. Most methods cannot preserve the global structure contained in the source texture, while others cannot produce textures of arbitrary size due to limitations of their network architecture. In this paper, we propose a non-stationary texture synthesis method that uses relative coordinates as guidance. Specifically, the proposed method is built upon the pyramidal generative adversarial network SinGAN, with two key contributions. First, by introducing relative coordinate maps as guidance, the synthesis results preserve the global structure of the input well. Second, data augmentation by rotating texture blocks during training significantly improves the quality of local texture details. Experiments show that our method can generate new samples of any size and aspect ratio while maintaining the global structure and fine local details of the source texture. Moreover, users can choose to expand any part of the source texture, achieving better controllability of the synthesis.
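The abstract describes conditioning a SinGAN-style generator on relative coordinate maps so that the global structure of the exemplar is reproduced at arbitrary output sizes. The sketch below illustrates one plausible form such a guidance map could take: a two-channel tensor of normalized coordinates concatenated with the generator input. The function name, normalization to [-1, 1], and channel layout are illustrative assumptions, not the paper's exact implementation.

```python
import torch

def relative_coordinate_map(height, width, device="cpu"):
    """Build a 2-channel map of relative (y, x) coordinates in [-1, 1].

    Hypothetical helper: the paper uses relative coordinate maps as
    guidance; the normalization range and channel order used here are
    assumptions for illustration only.
    """
    ys = torch.linspace(-1.0, 1.0, steps=height, device=device)
    xs = torch.linspace(-1.0, 1.0, steps=width, device=device)
    grid_y, grid_x = torch.meshgrid(ys, xs, indexing="ij")
    # Shape (1, 2, H, W): ready to concatenate channel-wise with the
    # noise or upsampled image fed to a SinGAN-style generator scale.
    return torch.stack((grid_y, grid_x), dim=0).unsqueeze(0)

# Example: guidance for a target canvas twice as wide as it is tall.
coords = relative_coordinate_map(128, 256)
# A generator conditioned on such a map might consume it as:
# g_in = torch.cat([noise_or_prev_output, coords], dim=1)
```

Because the map is defined over the target canvas rather than the exemplar, it can be generated for any output size or aspect ratio, which is consistent with the abstract's claim of arbitrary-size synthesis while preserving global structure.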
