
Locally controllable neural style transfer on mobile devices

Mobile expressive rendering has gained increasing popularity among users seeking casual creativity through image stylization, and supports the development of mobile artists as a new user group. In particular, neural style transfer has advanced as a core technology to emulate the characteristics of manifold artistic styles. However, when it comes to creative expression, the technology still faces inherent limitations in providing low-level controls for localized image stylization. In this work, we first propose a problem characterization of interactive style transfer representing a trade-off between visual quality, run-time performance, and user control. We then present MaeSTrO, a mobile app for the orchestration of neural style transfer techniques using iterative, multi-style generative and adaptive neural networks that can be locally controlled by on-screen painting metaphors. To this end, we enhance state-of-the-art neural style transfer techniques with mask-based loss terms that can be interactively parameterized by a generalized user interface to facilitate a creative and localized editing process. We report on a usability study and an online survey that demonstrate the ability of our app to transfer styles with improved semantic plausibility.
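The "mask-based loss terms" mentioned in the abstract can be illustrated with a minimal sketch: a style loss whose Gram-matrix statistics are computed only inside a user-painted mask, so that optimization affects the selected region rather than the whole image. This is not the authors' implementation; the function names (`masked_gram`, `masked_style_loss`) and the assumption of VGG-style feature maps in PyTorch are purely illustrative.

```python
# Illustrative sketch of a mask-weighted style loss (not the MaeSTrO code).
# Assumes: feature maps of shape (1, C, H, W) from a VGG-like extractor,
# and a user-painted soft mask of shape (1, 1, H0, W0) with values in [0, 1].
import torch
import torch.nn.functional as F


def masked_gram(feat: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    """Gram matrix of a feature map, restricted to a soft spatial mask."""
    _, c, h, w = feat.shape
    # Resize the user mask to this layer's feature resolution.
    m = F.interpolate(mask, size=(h, w), mode="bilinear", align_corners=False)
    f = (feat * m).reshape(c, h * w)        # suppress features outside the mask
    gram = f @ f.t()                        # (C, C) second-order statistics
    return gram / (m.sum() * c + 1e-8)      # normalize by masked area


def masked_style_loss(stylized_feats, style_feats, mask):
    """Sum of Gram-matrix distances over layers, evaluated inside the mask only."""
    loss = torch.tensor(0.0)
    for f_o, f_s in zip(stylized_feats, style_feats):
        loss = loss + F.mse_loss(masked_gram(f_o, mask), masked_gram(f_s, mask))
    return loss
```

In an iterative setup, a per-style mask of this kind could weight each style's loss term, which is one plausible way to realize the localized, brush-driven control the paper describes.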

Metadata
Authors: Max Reimann, Mandy Klingbeil, Sebastian Pasewaldt, Amir Semmo, Matthias Trapp, Jürgen Roland Friedrich Döllner
DOI: https://doi.org/10.1007/s00371-019-01654-1
ISSN: 0178-2789
ISSN: 1432-2315
Title of parent work (English): The Visual Computer
Publisher: Springer
Place of publication: New York
Publication type: Scientific article
Language: English
Year of first publication: 2019
Year of publication: 2019
Release date: 20.10.2020
Free keywords / tags: Expressive rendering; Interactive control; Mobile devices; Neural networks; Non-photorealistic rendering; Style transfer
Volume: 35
Issue: 11
Number of pages: 17
First page: 1531
Last page: 1547
Funding institution: Federal Ministry of Education and Research (BMBF), Germany [01IS15041]
Organizational units: Digital Engineering Fakultät / Hasso-Plattner-Institut für Digital Engineering GmbH
DDC classification: 0 Computer science, information & general works / 00 Computer science, knowledge & systems / 000 Computer science, information & general works
Peer review: Refereed