Metric entropy 2

I am reading the article “ENTROPY THEORY OF GEODESIC FLOWS”.

Now we focus on the upper semi-continuity of the metric entropy map. The object we investigate is $(X,T,\mu)$, where $\mu$ is a $T$-invariant measure.

The motivation for this kind of problem comes from the calculus of variations: one wants to establish the existence of an object in a certain moduli space at which some quantity attains a critical value (a maximum or a minimum). The simplest examples may be the isoperimetric inequality and the Dirichlet principle for the Laplace equation. To establish such an existence result, a classical approach is to prove upper semi-continuity and boundedness of the energy functional associated with the problem. In our case, the semi-continuity concerns the regularity of the entropy map:

$E: M(X,T) \to \mathbb{R}, \qquad \mu \mapsto h_{\mu}(T).$

We define the entropy at infinity:

$\sup_{(\mu_n)}\limsup_{n \to \infty}h_{\mu_n}(T),$

where $(\mu_n)_{n=1}^{\infty}$ ranges over all sequences of measures converging to $0$ in the vague sense, i.e. $\lim_{n \to \infty} \mu_{n}(A)=0$ for every compact $A\subset X$.
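A hedged illustration (my own example, not from the article): on the full shift $(\mathbb{N}^{\mathbb{N}},\sigma)$ over a countable alphabet, let $\mu_N$ be the uniform Bernoulli measure on the symbols $\{N,\dots,2N-1\}$. Every fixed cylinder set has $\mu_N$-measure zero once $N$ is large, so $\mu_N \to 0$ in the vague sense, while

```latex
% entropy of the uniform Bernoulli measure on N symbols
h_{\mu_N}(\sigma) = \log N \longrightarrow \infty \qquad (N \to \infty),
```

so the entropy at infinity of this system is infinite.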

Compact case

We first say something about the compact case. Here we can take finite partitions with smaller and smaller cubes, which can be understood as a sequence of finer and finer scales. An example illustrating the difference is $(\mathbb{N}^{\mathbb{N}},\sigma)$, the shift map on a countable alphabet, which is not compact.

Because of this, there is a good asymptotic notion, h-expansiveness, together with its generalization, asymptotic h-expansiveness, on a compact metric space $X$; for such systems the corresponding entropy map has been proved to be upper semi-continuous.

In particular, $C^{\infty}$ diffeomorphisms on a compact manifold are asymptotically h-expansive.

A natural question, which I do not understand very well:

Why is it natural to assume the measure to be a probability measure in the non-compact space?

Non-compact case

$(X,d)$ metric space

$T: X\longrightarrow X$ is a continuous map.

$d_n(x,y)=\sup_{0\leq k\leq n-1}d(T^kx,T^ky)$, then $d_{n}$ is still a metric.
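A minimal computational sketch of this dynamical (Bowen) metric, using the doubling map on the circle as a stand-in system (the map and the sample points are my own illustration, not from the article):

```python
def doubling(x):
    # stand-in system: the doubling map x -> 2x (mod 1) on the circle [0, 1)
    return (2.0 * x) % 1.0

def circle_dist(x, y):
    # distance on the circle R/Z
    d = abs(x - y) % 1.0
    return min(d, 1.0 - d)

def bowen_dist(T, d, x, y, n):
    # d_n(x, y) = sup_{0 <= k <= n-1} d(T^k x, T^k y)
    best = 0.0
    for _ in range(n):
        best = max(best, d(x, y))
        x, y = T(x), T(y)
    return best
```

Two points that are $d$-close can be far apart in $d_n$: here $d_1(0.1, 0.1001) = 10^{-4}$ while $d_{10}(0.1, 0.1001) \approx 0.05$, since the doubling map separates nearby orbits exponentially fast.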

It is easy to see that $\frac{1}{n}h_{\mu}(T^n)=h_{\mu}(T)$. This identity can be proved via the characterization of entropy by $\delta$-separated sets and $\delta$-covering sets.
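One standard route to this identity, via partitions rather than separated sets (a textbook computation, sketched here for completeness): for any finite partition $\mathcal{P}$,

```latex
h_{\mu}\left(T^{n}, \bigvee_{i=0}^{n-1} T^{-i}\mathcal{P}\right)
  = \lim_{m \to \infty} \frac{1}{m}
    H_{\mu}\left(\bigvee_{i=0}^{nm-1} T^{-i}\mathcal{P}\right)
  = n \, h_{\mu}(T, \mathcal{P}),
```

and taking the supremum over all finite partitions $\mathcal{P}$ yields $h_{\mu}(T^{n}) = n\,h_{\mu}(T)$.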

Katok's theorem:

Let $X$ be a compact metric space. For every ergodic measure $\mu$ and every $\delta \in (0,1)$, the following formula holds:

$$h_{\mu}(T) = \lim_{\epsilon \to 0} \limsup_{n \to \infty} \frac{1}{n} \log N_{\mu}(n,\epsilon,\delta)$$

where $h_{\mu}(T)$ is the measure-theoretic entropy of $\mu$ and $N_{\mu}(n,\epsilon,\delta)$ is the minimal number of $(n,\epsilon)$-dynamical balls needed to cover a set of $\mu$-measure at least $1-\delta$.
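As a sanity check on Katok's formula (my own toy computation, not from the article): for the doubling map $T(x)=2x \bmod 1$ with Lebesgue measure, an $(n,\epsilon)$-Bowen ball is essentially an interval of length $2\epsilon/2^{n-1}$, so $N_{\mu}(n,\epsilon,\delta)$ can be counted directly, and $\frac{1}{n}\log N_{\mu}(n,\epsilon,\delta)$ approaches $h_{\mu}(T)=\log 2$:

```python
import math

def katok_count_doubling(n, eps, delta):
    # For T(x) = 2x mod 1, d_n(x, y) = max_{0<=k<=n-1} |2^k (x - y)| (mod 1),
    # so the (n, eps)-Bowen ball around x is an interval of length
    # 2 * eps / 2**(n - 1).  Covering a set of Lebesgue measure 1 - delta
    # therefore takes about (1 - delta) / ball_length such balls.
    ball_length = 2.0 * eps / 2 ** (n - 1)
    return math.ceil((1.0 - delta) / ball_length)

# (1/n) log N_mu(n, eps, delta) should tend to log 2 ~ 0.693 as n grows
rates = [math.log(katok_count_doubling(n, 0.1, 0.5)) / n for n in (10, 20, 40)]
```

In this toy case the limit in $\epsilon$ is not even needed, which is the spirit of the simplified entropy formula below.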

Riquelme proved that the same formula holds for Lipschitz maps on a topological manifold.

Let $M(X,T)$ be the moduli space of $T$-invariant probability measures.

Let $M_{e}(X,T)$ be the moduli space of ergodic $T$-invariant probability measures.

Simplified entropy formula:

A system $(X,d,T)$ satisfies the simplified entropy formula if for all sufficiently small $\epsilon > 0$, all $\delta \in (0,1)$, and all $\mu \in M_{e}(X,T)$, we have:

$$h_{\mu}(T) = \limsup_{n \to \infty} \frac{1}{n} \log(N_{\mu}(n,\epsilon,\delta))$$

Simplified entropy inequality:

If $\epsilon > 0$ is sufficiently small, $\mu \in M_{e}(X,T)$, and $\delta \in (0,1)$, then:

$$h_{\mu}(T) \leq \limsup_{n \to \infty} \frac{1}{n} \log(N_{\mu}(n,\epsilon,\delta))$$

Weak entropy dense:

$M_{e}(X,T)$ is weak entropy dense in $M(X,T)$ if for all $\lambda > 0$ and all $\mu \in M(X,T)$, there exists a sequence $(\mu_{n}) \subset M_{e}(X,T)$ such that:

  1. $\mu_n\to \mu$ weakly.
  2. $h_{\mu_n}(T)>h_{\mu}(T)-\lambda$ for all $n$.