Automated Generation of Computer-Generated Art Using Neural Style Transfer

By Orisys Academy on 18th January 2024

Problem Statement

Creating visually appealing computer-generated art often requires artistic skills and time-consuming manual processes. Neural style transfer provides a potential solution by automating the generation of art inspired by specific styles.

Abstract

This project focuses on automated art generation using neural style transfer. The system will employ deep learning techniques to transfer the style of reference artworks onto user-provided content, creating unique and aesthetically pleasing computer-generated art.
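
A minimal sketch of the optimization-based approach (in the spirit of Gatys et al.) is shown below, assuming PyTorch and torchvision are available. The layer indices, loss weights, and optimizer choice are illustrative assumptions rather than the project's final design, which could equally use a faster feed-forward stylization network.

```python
# Minimal sketch of optimization-based neural style transfer.
# Assumes PyTorch + a recent torchvision; layer choices and weights are illustrative.
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

def load_image(path, size=512):
    # Resize and normalize with the statistics VGG was trained on.
    tf = transforms.Compose([
        transforms.Resize((size, size)),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])
    return tf(Image.open(path).convert("RGB")).unsqueeze(0).to(device)

def gram_matrix(feat):
    # Style is captured by channel-wise feature correlations (Gram matrix).
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

# Pretrained VGG19 as a frozen feature extractor.
vgg = models.vgg19(weights=models.VGG19_Weights.DEFAULT).features.to(device).eval()
for p in vgg.parameters():
    p.requires_grad_(False)

CONTENT_LAYERS = {21}               # conv4_2
STYLE_LAYERS = {0, 5, 10, 19, 28}   # conv1_1 .. conv5_1

def extract(x):
    # Collect content features and style Gram matrices at the chosen layers.
    content, style = {}, {}
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in CONTENT_LAYERS:
            content[i] = x
        if i in STYLE_LAYERS:
            style[i] = gram_matrix(x)
    return content, style

def stylize(content_path, style_path, steps=300, style_weight=1e6):
    content_img = load_image(content_path)
    style_img = load_image(style_path)
    with torch.no_grad():
        target_content, _ = extract(content_img)
        _, target_style = extract(style_img)

    # Optimize the pixels of a copy of the content image directly.
    output = content_img.clone().requires_grad_(True)
    optimizer = torch.optim.Adam([output], lr=0.02)

    for _ in range(steps):
        optimizer.zero_grad()
        content, style = extract(output)
        c_loss = sum(F.mse_loss(content[i], target_content[i]) for i in CONTENT_LAYERS)
        s_loss = sum(F.mse_loss(style[i], target_style[i]) for i in STYLE_LAYERS)
        (c_loss + style_weight * s_loss).backward()
        optimizer.step()

    # Result is still in VGG-normalized space; un-normalize before saving.
    return output.detach()
```

In this formulation the output pixels are optimized so that deep VGG features stay close to the content image while the Gram matrices of the features match those of the style image; a feed-forward variant would instead train a network once per style and stylize new images in a single pass.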

Outcome

An automated system that produces computer-generated art via neural style transfer, giving users a tool for creative expression and artistic exploration.

Reference

Neural Style Transfer (NST) is a class of algorithms that transforms the appearance of an image or video scene with the help of a neural network. NST is used in image and video editing software to stylize media based on a general model, unlike traditional hand-crafted methods. This has made NST a trending topic in the entertainment industry, allowing professional editors and media producers to create content faster while offering the general public recreational use. The referenced work critically presents the current progress in Neural Style Transfer for both still images and video, examines the different architectures used, and compares their advantages and limitations. Existing literature reviews focus either on Neural Style Transfer for images or on Generative Adversarial Networks (GANs) that generate video; to the authors' knowledge, theirs is the only survey covering both image and video style transfer, with particular attention to mobile devices, where potential usage is high. The work also reviews the challenges of applying video neural style transfer in real time on mobile devices and presents research gaps with future research directions. NST, a fascinating deep learning application, has considerable research and application potential in the coming years.

  1. Y. Jing, Y. Yang, Z. Feng, J. Ye, Y. Yu, and M. Song, ‘‘Neural style transfer: A review,’’ IEEE Trans. Vis. Comput. Graphics, vol. 26, no. 11, pp. 3365–3385, Nov. 2020.
  2. H. Li, A Literature Review of Neural Style Transfer. Princeton, NJ, USA: Princeton Univ., Tech. Rep., 2019.
  3. J. Li, Q. Wang, H. Chen, J. An, and S. Li, ‘‘A review on neural style transfer,’’ J. Phys., Conf. Ser., vol. 1651, Nov. 2020, Art. no. 012156.
  4. I. J. Goodfellow, J. Pouget-Abadie, M. Mirza, B. Xu, D. Warde-Farley, S. Ozair, A. Courville, and Y. Bengio, ‘‘Generative adversarial networks,’’ 2014, arXiv:1406.2661. [Online]. Available: http://arxiv.org/abs/1406.2661
  5. T. Karras, S. Laine, and T. Aila, ‘‘A style-based generator architecture for generative adversarial networks,’’ 2018, arXiv:1812.04948. [Online]. Available: http://arxiv.org/abs/1812.04948