Proceedings Paper

Towards Machine-Learning Assisted Asset Generation for Games: A Study on Pixel Art Sprite Sheets

Publisher

IEEE
DOI: 10.1109/SBGames.2019.00032

Keywords

Deep Learning; Generative Adversarial Networks; Asset Generation; Procedural Content Generation; Qualitative and Quantitative Analyses; Character Designers Evaluation

Funding

  1. CAPES/FUNCAP
  2. CNPq [88887.176617/2018-00, 439067/2018-9]


Game development is simultaneously a technical and an artistic challenge. The past two decades have brought many improvements to general-purpose game engines, considerably reducing the effort of developing new games. However, the amount of artistic work per title has grown continuously over the same period, as a result of increased audience expectations. The cost of asset making rises further depending on the aesthetics chosen by the design team and on the availability of professionals who understand the nuances of the chosen visual language. In this paper, we investigate deep-learning-based asset generation as a way to reduce the costs of the asset-making pipeline, a major concern for game development teams. More specifically, we tackle the challenge of generating pixel art sprites from line art sketches using state-of-the-art image-to-image translation techniques. We set this work within the pipeline of Trajes Fatais: Suits of Fate, a 2D pixel-art fighting game inspired by the late-nineties classics of the fighting genre. The results show that our deep-learning asset generation technique produces sprites that look similar to those created by the artists' team. Moreover, through qualitative and quantitative analyses, as well as an evaluation by character designers, we demonstrate the similarity of the generated results to the ground truth.
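The abstract describes a conditional image-to-image translation setup: a generator maps a line art sketch to a finished pixel art sprite, and a discriminator judges (sketch, sprite) pairs. The following is a minimal, hypothetical sketch of that kind of architecture in PyTorch; the class names, layer sizes, and loss weights are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical pix2pix-style sketch-to-sprite setup (illustrative only;
# the paper's actual architecture and hyperparameters are not reproduced here).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SketchToSpriteGenerator(nn.Module):
    """Tiny encoder-decoder: 1-channel line art sketch -> 3-channel sprite."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, kernel_size=4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, sketch):
        return self.decoder(self.encoder(sketch))

class PatchDiscriminator(nn.Module):
    """PatchGAN-style critic: scores (sketch, sprite) pairs per patch."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1 + 3, 32, kernel_size=4, stride=2, padding=1),
            nn.LeakyReLU(0.2),
            nn.Conv2d(32, 1, kernel_size=4, stride=2, padding=1),  # patch logits
        )

    def forward(self, sketch, sprite):
        return self.net(torch.cat([sketch, sprite], dim=1))

def generator_loss(disc, sketch, fake, real, l1_weight=100.0):
    """Adversarial loss plus L1 reconstruction, as in pix2pix-style training."""
    pred = disc(sketch, fake)
    adv = F.binary_cross_entropy_with_logits(pred, torch.ones_like(pred))
    recon = F.l1_loss(fake, real)
    return adv + l1_weight * recon
```

A forward pass on a batch of 64x64 sketches would yield 3-channel 64x64 sprites, with the generator trained against both the discriminator's patch scores and an L1 term that pulls outputs toward the artists' ground-truth sprites.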

