Controlling strokes in fast neural style transfer using content transforms
Fast style transfer methods have recently gained popularity in art-related applications as they make a generalized real-time stylization of images practicable. However, they are mostly limited to one-shot stylizations concerning the interactive adjustment of style elements. In particular, expressive control over stroke sizes or stroke orientations remains an open challenge. To this end, we propose a novel stroke-adjustable fast style transfer network that enables simultaneous control over stroke size and intensity, and allows a wider range of expressive editing than current approaches by utilizing the scale-variance of convolutional neural networks. Furthermore, we introduce a network-agnostic approach for style-element editing by applying reversible input transformations that can adjust strokes in the stylized output. In this way, stroke orientations can be adjusted, and warping-based effects, such as swirls or waves, can be applied to stylistic elements. To demonstrate the real-world applicability of our approach, we present StyleTune, a mobile app for interactive editing of neural style transfers at multiple levels of control. Our app allows stroke adjustments on a global and local level. It furthermore implements an on-device patch-based upsampling step that enables users to achieve results with high output fidelity and resolutions of more than 20 megapixels. Our approach allows users to art-direct their creations and achieve results that are not possible with current style transfer applications.
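The network-agnostic editing described in the abstract rests on a simple principle: apply an invertible transform to the content image, stylize the transformed image, then apply the inverse transform to the stylized output. A minimal sketch of this idea, assuming a hypothetical `stylize` placeholder standing in for any feed-forward style transfer network, with a 90-degree rotation as the reversible transform:

```python
import numpy as np

def stylize(img):
    # Hypothetical placeholder for a feed-forward style transfer
    # network; here the identity, standing in for any stylizer.
    return img

def stylize_with_orientation(img, k=1):
    """Adjust stroke orientation via a reversible input transform:
    rotate the content by k * 90 degrees, stylize, rotate back."""
    rotated = np.rot90(img, k)    # forward (invertible) transform
    styled = stylize(rotated)     # stylization on the transformed input
    return np.rot90(styled, -k)   # inverse transform on the output

content = np.arange(12).reshape(3, 4)
out = stylize_with_orientation(content, k=1)
```

Because convolutional stylizers are not rotation-equivariant, the strokes produced on the rotated input keep their rotated orientation after the inverse transform; other invertible warps (e.g., swirls) work analogously.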
Author details: Max Reimann, Benito Buchheim, Amir Semmo, Jürgen Döllner, Matthias Trapp
DOI: https://doi.org/10.1007/s00371-022-02518-x
ISSN: 0178-2789
ISSN: 1432-2315
Title of parent work (English): The Visual Computer
Publisher: Springer
Place of publishing: New York
Publication type: Article
Language: English
Date of first publication: 2022/06/08
Publication year: 2022
Release date: 2024/01/22
Volume: 38
Issue: 12
Number of pages: 15
First page: 4019
Last page: 4033
Funding institution: Projekt DEAL; German Federal Ministry of Education and Research (BMBF); [01IS18092, 01IS19006]
Organizational units: An-Institute / Hasso-Plattner-Institut für Digital Engineering gGmbH
DDC classification: 0 Computer science, information and general works / 00 Computer science, knowledge and systems / 000 Computer science, information and general works
Peer review: Peer-reviewed
Publishing method: Open Access / Hybrid Open Access
License: CC BY 4.0 International (Attribution)