Introduction to Ergodic rates for Markov chains and processes

  • The present lecture notes aim to provide an introduction to the ergodic behaviour of Markov processes and address graduate students, post-graduate students, and interested readers. Different tools and methods for the study of upper bounds on uniform and weak ergodic rates of Markov processes are introduced; a schematic example of such a rate bound is sketched below. These techniques are then applied to study limit theorems for functionals of Markov processes. The lecture course originates from two mini-courses held at the University of Potsdam, Technical University of Berlin, and Humboldt University in spring 2013, and at Ritsumeikan University in summer 2013. Alexei Kulik, Doctor of Sciences, is a leading researcher at the Institute of Mathematics of the Ukrainian National Academy of Sciences.
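    For orientation, here is a minimal illustrative sketch of what a uniform ergodic rate bound typically looks like (the notation and the geometric form of the rate are assumptions for illustration, not taken from the notes). For a Markov process $X_t$ with transition probabilities $P_t(x,\cdot)$ and invariant measure $\pi$, a uniform (geometric) ergodic rate is a bound of the form

    \[
    \sup_{x} \bigl\| P_t(x,\cdot) - \pi \bigr\|_{TV} \le C\, e^{-\lambda t}, \qquad t \ge 0,
    \]

    for some constants $C < \infty$ and $\lambda > 0$, where $\|\cdot\|_{TV}$ denotes the total variation norm. Weak ergodic rates replace total variation by a weaker (e.g. coupling or Wasserstein-type) distance and allow a general decreasing rate function $r(t) \to 0$ in place of $C e^{-\lambda t}$.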

Download full text files

  • lpam02.pdf (English, 1196 KB)

    SHA-1: 6f5876086ba6523cba88037a9cdc100722e5d055


Metadata
Author details: Alexei Michajlovič Kulik
URN: urn:nbn:de:kobv:517-opus4-79360
ISBN: 978-3-86956-338-1
ISSN: 2199-4951
ISSN: 2199-496X
Subtitle (English): with applications to limit theorems
Publication series (Volume number): Lectures in pure and applied mathematics (2)
Publisher: Universitätsverlag Potsdam
Place of publishing: Potsdam
Editor(s): Sylvie Roelly
Publication type: Monograph/Edited Volume
Language: English
Year of first publication: 2015
Publication year: 2015
Publishing institution: Universität Potsdam
Release date: 2015/09/16
Tags: Markov processes; ergodic rates; long-time behaviour; convergence rate
Number of pages: ix, 122
Organizational units: Mathematisch-Naturwissenschaftliche Fakultät / Institut für Mathematik
DDC classification: 5 Natural sciences and mathematics / 51 Mathematics / 510 Mathematics
Publishing method: Universitätsverlag Potsdam
License: CC BY-SA 4.0 International (Attribution-ShareAlike)