Controlling strokes in fast neural style transfer using content transforms
Fast style transfer methods have recently gained popularity in art-related applications, as they make generalized real-time stylization of images practicable. However, they are mostly limited to one-shot stylizations with respect to the interactive adjustment of style elements. In particular, expressive control over stroke sizes and stroke orientations remains an open challenge. To this end, we propose a novel stroke-adjustable fast style transfer network that enables simultaneous control over stroke size and intensity, and allows a wider range of expressive editing than current approaches by utilizing the scale-variance of convolutional neural networks. Furthermore, we introduce a network-agnostic approach for style-element editing that applies reversible input transformations to adjust strokes in the stylized output. In this way, stroke orientations can be adjusted, and warping-based effects such as swirls or waves can be applied to stylistic elements. To demonstrate the real-world applicability of our approach, we present StyleTune, a mobile app for interactive editing of neural style transfers at multiple levels of control. Our app allows stroke adjustments on a global and a local level. It furthermore implements an on-device patch-based upsampling step that enables users to achieve results with high output fidelity and resolutions of more than 20 megapixels. Our approach allows users to art-direct their creations and achieve results that are not possible with current style transfer applications.
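The network-agnostic editing idea summarized above can be sketched as follows. This is a minimal illustration, not the paper's implementation: `stylize_with_transform` is a hypothetical helper, the rotation stands in for any invertible content transform (e.g., a warp field for swirls), and an identity function stands in for the stylization network.

```python
import numpy as np

def stylize_with_transform(image, stylize, k=1):
    """Apply a reversible content transform (here a 90-degree rotation),
    stylize the transformed image, then invert the transform on the
    result, so that only the style elements (e.g., stroke orientations)
    are affected while the content geometry is restored."""
    transformed = np.rot90(image, k=k, axes=(0, 1))   # forward transform
    stylized = stylize(transformed)                   # any fast style transfer network
    return np.rot90(stylized, k=-k, axes=(0, 1))      # inverse transform

# Placeholder "network": identity stylization, for demonstration only.
img = np.arange(24, dtype=np.float32).reshape(4, 6)
out = stylize_with_transform(img, stylize=lambda x: x, k=1)
```

Because the inverse transform is applied after stylization, the content returns to its original orientation while the strokes painted by the (scale- and rotation-variant) network keep the transformed orientation.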
Authors: | Max Reimann, Benito Buchheim, Amir Semmo, Jürgen Döllner, Matthias Trapp |
DOI: | https://doi.org/10.1007/s00371-022-02518-x |
ISSN: | 0178-2789 |
ISSN: | 1432-2315 |
Title of the parent work (English): | The Visual Computer |
Publisher: | Springer |
Place of publication: | New York |
Publication type: | Scientific article |
Language: | English |
Date of first publication: | June 8, 2022 |
Year of publication: | 2022 |
Date of release: | January 22, 2024 |
Volume: | 38 |
Issue: | 12 |
Number of pages: | 15 |
First page: | 4019 |
Last page: | 4033 |
Funding institution: | Projekt DEAL; German Federal Ministry of Education and Research (BMBF); [01IS18092, 01IS19006] |
Organizational units: | Affiliated institutes / Hasso-Plattner-Institut für Digital Engineering gGmbH |
DDC classification: | 0 Computer science, information, general works / 00 Computer science, knowledge, systems / 000 Computer science, information, general works |
Peer review: | Refereed |
Publication route: | Open Access / Hybrid Open Access |
License: | CC BY 4.0 International (Attribution) |