Locally controllable neural style transfer on mobile devices
Mobile expressive rendering has gained increasing popularity among users seeking casual creativity through image stylization, and supports the development of mobile artists as a new user group. In particular, neural style transfer has advanced as a core technology to emulate the characteristics of manifold artistic styles. However, when it comes to creative expression, the technology still faces inherent limitations in providing low-level controls for localized image stylization. In this work, we first propose a problem characterization of interactive style transfer representing a trade-off between visual quality, run-time performance, and user control. We then present MaeSTrO, a mobile app for orchestration of neural style transfer techniques using iterative, multi-style generative and adaptive neural networks that can be locally controlled by on-screen painting metaphors. To this end, we enhance state-of-the-art neural style transfer techniques with mask-based loss terms that can be interactively parameterized via a generalized user interface to facilitate a creative and localized editing process. We report on a usability study and an online survey that demonstrate the ability of our app to transfer styles at improved semantic plausibility.
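The abstract mentions mask-based loss terms for localized stylization but includes no code. As a purely illustrative sketch, the idea can be expressed as a Gram-matrix style loss (in the common formulation of Gatys et al.) weighted by a user-painted mask, so that style statistics are only enforced where the mask is non-zero. Function names, shapes, and the exact weighting are assumptions, not the paper's implementation:

```python
import numpy as np

def gram_matrix(features):
    # features: (C, H, W) feature activations from one network layer.
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    # Normalized Gram matrix capturing channel-wise feature correlations.
    return f @ f.T / (c * h * w)

def masked_style_loss(gen_feats, style_feats, mask):
    """Hypothetical mask-weighted style loss (illustrative sketch only).

    gen_feats, style_feats: (C, H, W) activations of the generated and
    style images at one layer; mask: (H, W) user-painted weights in [0, 1].
    Regions where the mask is zero contribute nothing to the loss, so the
    optimizer leaves them unstylized.
    """
    masked_gen = gen_feats * mask[None, :, :]
    masked_style = style_feats * mask[None, :, :]
    g_gen = gram_matrix(masked_gen)
    g_style = gram_matrix(masked_style)
    return float(np.mean((g_gen - g_style) ** 2))
```

In an iterative optimizer, this term would be summed over layers and added to the content loss; a zero mask reduces it to no stylization at all, which matches the localized-editing behavior described in the abstract.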
| Field | Value |
|---|---|
| Authors | Max Reimann, Mandy Klingbeil, Sebastian Pasewaldt, Amir Semmo, Matthias Trapp, Jürgen Roland Friedrich Döllner |
| DOI | https://doi.org/10.1007/s00371-019-01654-1 |
| ISSN (print) | 0178-2789 |
| ISSN (online) | 1432-2315 |
| Title of parent work | The Visual Computer |
| Publisher | Springer |
| Place of publishing | New York |
| Publication type | Article |
| Language | English |
| Year of first publication | 2019 |
| Publication year | 2019 |
| Release date | 2020/10/20 |
| Tags | Expressive rendering; Interactive control; Mobile devices; Neural networks; Non-photorealistic rendering; Style transfer |
| Volume | 35 |
| Issue | 11 |
| Number of pages | 17 |
| First page | 1531 |
| Last page | 1547 |
| Funding institution | Federal Ministry of Education and Research (BMBF), Germany [01IS15041] |
| Organizational units | Digital Engineering Faculty / Hasso-Plattner-Institut für Digital Engineering GmbH |
| DDC classification | 000 Computer science, information & general works |
| Peer review | Peer-reviewed |