Combining Style Transfer in a Frame with Random Rotation Using Convolutional Neural Networks

Document Type: Computer Article

Authors

1 MSc Student, Department of Electrical and Computer Engineering, Semnan University, Semnan, Iran

2 Associate Professor, Department of Electrical and Computer Engineering, Semnan University, Semnan, Iran

Abstract

Style transfer is a research area that has attracted significant attention. It transfers the style of one image onto the content of another. Extensive research has sought to accelerate processing and to produce attractive, high-quality results. One application of this technology is generating designs for printing in industries such as tile and carpet manufacturing, where creating such intricate patterns by hand would be difficult. This article proposes a simple method that transfers multiple randomly rotated styles, yielding more distinctive designs than transfer with a single style and improving both the diversity and the quality of the generated images. The method was evaluated through a survey of 25 participants who compared it against existing methods, and it received the highest approval rating. The outcome of this research makes the creation of artworks more efficient, offering a viable alternative to human-designed patterns.
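The core idea of combining randomly rotated styles can be illustrated with a minimal sketch. The snippet below is an assumption about one plausible realization, not the authors' exact pipeline: it rotates a style feature map (as extracted by a CNN such as VGG) by random multiples of 90 degrees and averages the resulting Gram matrices into a single style target, so the optimization sees several rotated variants of the style at once. The function names `gram_matrix` and `rotated_style_target` are illustrative.

```python
import numpy as np

def gram_matrix(features):
    """Normalized Gram matrix of a (C, H, W) feature map -> (C, C)."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def rotated_style_target(style_features, n_rotations=4, seed=0):
    """Average the Gram matrices of randomly rotated copies of the style
    feature map, giving one combined style target for the transfer loss."""
    rng = np.random.default_rng(seed)
    grams = []
    for _ in range(n_rotations):
        k = int(rng.integers(0, 4))            # random multiple of 90 degrees
        rotated = np.rot90(style_features, k=k, axes=(1, 2))
        grams.append(gram_matrix(rotated))
    return np.mean(grams, axis=0)

# Example: a fake 8-channel, 5x5 style feature map
style = np.random.default_rng(1).normal(size=(8, 5, 5))
target = rotated_style_target(style)
print(target.shape)  # (8, 8), symmetric like any Gram matrix
```

Because the Gram matrix discards spatial layout, each rotated copy still produces a C×C matrix, so the average is well defined even when rotation swaps the height and width of the feature map.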

