Problem 1. Let $X_1,X_2,\ldots$ be a sequence of i.i.d. real random variables. The entropy power inequality (EPI) easily implies that for any unit vector $(a_1,\ldots,a_n)$ Shannon entropy satisfies the inequality
\[
\textrm{Ent}\left(\sum_{i=1}^n a_i X_i \right) \geq \textrm{Ent}\left( X_1 \right).
\]
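Indeed, recalling that $\textrm{Ent}(aX) = \textrm{Ent}(X) + \log|a|$ for $a \neq 0$, and applying the EPI $e^{2\textrm{Ent}(U+V)} \geq e^{2\textrm{Ent}(U)} + e^{2\textrm{Ent}(V)}$ (valid for independent $U,V$) inductively, we get
\[
e^{2\textrm{Ent}\left(\sum_{i=1}^n a_i X_i\right)} \geq \sum_{i=1}^n e^{2\textrm{Ent}(a_i X_i)} = \sum_{i=1}^n a_i^2\, e^{2\textrm{Ent}(X_1)} = e^{2\textrm{Ent}(X_1)},
\]
where the summands with $a_i = 0$ are simply omitted.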
Moreover, due to the celebrated work of Artstein, Ball, Barthe and Naor (see [ABBN]), we have
\[
\textrm{Ent}\left( \frac{1}{\sqrt{n}}\sum_{i=1}^n X_i \right) \leq \textrm{Ent}\left( \frac{1}{\sqrt{n+1}}\sum_{i=1}^{n+1} X_i \right).
\]
It is therefore natural to ask whether the following general inequality holds true:
\[
(a_1^2,\ldots, a_n^2) \prec (b_1^2,\ldots, b_n^2) \quad \implies \quad \textrm{Ent}\left( \sum_{i=1}^n b_i X_i \right) \leq \textrm{Ent}\left( \sum_{i=1}^n a_i X_i \right).
\]
Here $\prec$ means the standard Schur majorization of sequences. Note that both inequalities above are special cases of this statement, since $(a_1^2,\ldots,a_n^2) \prec (1,0,\ldots,0)$ and $\left(\frac{1}{n+1},\ldots,\frac{1}{n+1}\right) \prec \left(\frac{1}{n},\ldots,\frac{1}{n},0\right)$. Unfortunately, it has been shown in [BNT] (Proposition 2) that this is not the case in general. However, the above is conjectured to be true for log-concave random variables (see also Theorem 7 in [ENT]). In particular, even the following question seems to be open. \\
Question. Is it true that for i.i.d. real log-concave random variables $X,Y$ the function
\[
[0,1] \ni \lambda \mapsto \textrm{Ent}\left( \sqrt{\lambda} X + \sqrt{1-\lambda} Y \right)
\]
is non-decreasing? Or, at least, does it attain its global maximum at $\lambda=1/2$? \\
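The following is a minimal numerical sketch of this question (one instance, not a proof): we take $X, Y$ to be i.i.d. standard Laplace random variables, a symmetric log-concave choice made purely for illustration (the Laplace law is moreover a Gaussian mixture, the class studied in [ENT]), compute the density of $\sqrt{\lambda}X + \sqrt{1-\lambda}Y$ by grid convolution, and integrate $-f\log f$ numerically; by symmetry in $\lambda \leftrightarrow 1-\lambda$ it suffices to scan $[0,1/2]$. The grid parameters are ad hoc.
\begin{verbatim}
import numpy as np

# Numerical sketch (not a proof): X, Y i.i.d. standard Laplace,
# density f(x) = exp(-|x|)/2, a symmetric log-concave law.
grid = np.linspace(-20, 20, 4001)
dx = grid[1] - grid[0]

def entropy_of_mix(lam):
    # density of sqrt(1-lam)*Y
    b = np.sqrt(1 - lam)
    fb = np.exp(-np.abs(grid) / b) / (2 * b)
    if lam == 0:
        f = fb
    else:
        # density of sqrt(lam)*X, then of the sum via convolution
        a = np.sqrt(lam)
        fa = np.exp(-np.abs(grid) / a) / (2 * a)
        f = np.convolve(fa, fb, mode="same") * dx
    f = f / (f.sum() * dx)          # renormalise discretisation error
    m = f > 1e-300
    return -np.sum(f[m] * np.log(f[m])) * dx

# Ent(X) = 1 + log 2 at lam = 0; scan towards lam = 1/2.
for lam in [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]:
    print("lambda =", lam, " entropy ~", round(entropy_of_mix(lam), 5))
\end{verbatim}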
Another special case of the above general Schur concavity property leads to the following question (see [ENT], Question 8). \\
Question. Let $U_1,\ldots, U_n$ be i.i.d. random variables distributed uniformly on $[-1,1]$. Consider the function
\[
S^{n-1} \ni (a_1,\ldots,a_n) \quad \longmapsto \quad \textrm{Ent}\left( \sum_{i=1}^n a_i U_i \right).
\]
Is the maximum of this function attained at the vector $(1/\sqrt{n},\ldots,1/\sqrt{n})$? \\
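As a sanity check for $n=2$: for $a \geq b > 0$ with $a^2+b^2=1$, the density of $aU_1 + bU_2$ is the trapezoid equal to $\frac{1}{2a}$ on $[-(a-b),a-b]$ and decaying linearly to $0$ on $\pm[a-b,a+b]$, and a direct computation gives
\[
\textrm{Ent}(aU_1 + bU_2) = \log(2a) + \frac{b}{2a}.
\]
Since $\frac{d}{da}\left( \log(2a) + \frac{\sqrt{1-a^2}}{2a} \right) = \frac{1}{a} - \frac{1}{2a^2\sqrt{1-a^2}} \leq 0$ on $[\frac{1}{\sqrt{2}},1)$ (because $2a\sqrt{1-a^2} \leq a^2 + (1-a^2) = 1$), the maximum is indeed attained at $a=b=\frac{1}{\sqrt{2}}$, so for $n=2$ the answer is affirmative. \\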
Problem 2. It is well known that among all random variables with variance one, Shannon entropy is maximized by the standard Gaussian random variable. The following question concerns a simple reverse EPI of the above flavour (see Question 11 and Proposition 9 in [ENT]). \\
Question. Let $X,Y$ be i.i.d. real random variables with variance one and let $G$ be a standard Gaussian random variable. Is it true that $\textrm{Ent}(X+Y) \leq \textrm{Ent}(X+G)$? \\
In general, the answer is negative if $X$ and $Y$ are assumed to be independent but not necessarily identically distributed.
\\
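A minimal numerical sketch of one instance of this question (illustration only; the choice of law and the grid parameters are ours): take $X, Y$ i.i.d. uniform on $[-\sqrt{3},\sqrt{3}]$, which have variance one, and compare the two entropies by grid convolution.
\begin{verbatim}
import numpy as np

# One instance of Problem 2: X, Y i.i.d. uniform on [-sqrt(3), sqrt(3)]
# (variance one), G standard Gaussian; densities on a common grid.
x = np.linspace(-15, 15, 6001)
dx = x[1] - x[0]
s = np.sqrt(3.0)
unif = np.where(np.abs(x) <= s, 1.0 / (2.0 * s), 0.0)   # density of X
gauss = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)      # density of G

def entropy(f):
    f = f / (f.sum() * dx)          # renormalise discretisation error
    m = f > 1e-300
    return -np.sum(f[m] * np.log(f[m])) * dx

f_xy = np.convolve(unif, unif, mode="same") * dx        # density of X+Y
f_xg = np.convolve(unif, gauss, mode="same") * dx       # density of X+G

print("Ent(X+Y) ~", entropy(f_xy))  # exact: 1/2 + log(2*sqrt(3)) ~ 1.7425
print("Ent(X+G) ~", entropy(f_xg))
\end{verbatim}
\\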
Problem 3. There has been a great deal of work on reverse entropy power inequalities for log-concave vectors (see e.g. [BM], [BM1]). The following question is still open.\\
Question. Is it true that for a symmetric log-concave random vector $(X,Y)$ we have
\[
e^{\textrm{Ent}(X+Y)} \leq e^{\textrm{Ent}(X)} + e^{\textrm{Ent}(Y)}?\]
Equivalently, for every such vector $(X,Y)$, is the function $t \mapsto e^{\textrm{Ent}(X+tY)}$ convex on the real line?
Equivalently, linearising, is it true that for a symmetric log-concave random vector $(X,Y)$ and every $\lambda \in [0,1]$ we have
\[
\textrm{Ent}(\lambda X+ (1-\lambda) Y) \leq \max\{\textrm{Ent}(X),\textrm{Ent}(Y)\}?\]
\\
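Both reformulations rest on the homogeneity $\textrm{Ent}(aZ) = \textrm{Ent}(Z) + \log a$, $a > 0$. For instance, to see that the linearised form gives back the exponential one, write $X+Y = \lambda \cdot \frac{X}{\lambda} + (1-\lambda)\cdot\frac{Y}{1-\lambda}$ and note that $(\frac{X}{\lambda}, \frac{Y}{1-\lambda})$ is again symmetric and log-concave, whence
\[
\textrm{Ent}(X+Y) \leq \max\left\{ \textrm{Ent}(X) - \log\lambda,\ \textrm{Ent}(Y) - \log(1-\lambda) \right\};
\]
the choice $\lambda = e^{\textrm{Ent}(X)}/(e^{\textrm{Ent}(X)} + e^{\textrm{Ent}(Y)})$ balances the two terms and yields exactly $e^{\textrm{Ent}(X+Y)} \leq e^{\textrm{Ent}(X)} + e^{\textrm{Ent}(Y)}$. \\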
This question is also equivalent to the statement conjectured in [BNT] saying that for a symmetric log-concave random vector $(X_1,\ldots,X_n)$ the function $v=(v_1,\ldots,v_n) \mapsto e^{\textrm{Ent}(v_1X_1+\ldots+v_nX_n)}$ defined on $\mathbb{R}^n$ satisfies the triangle inequality. It was shown in that paper that the function $v=(v_1,\ldots,v_n) \mapsto e^{\frac{1}{5}\textrm{Ent}(v_1X_1+\ldots+v_nX_n)}$ satisfies the triangle inequality.\\
Moreover, to answer the above question in the affirmative, it is enough to prove that for a symmetric log-concave random vector $(X,Y)$ the function $t \mapsto \textrm{Ent}(X+e^tY)$ is convex on the real line.
References.
[ABBN] S. Artstein, K. M. Ball, F. Barthe, A. Naor, Solution of Shannon's problem on the monotonicity of entropy, J. Amer. Math. Soc. 17 (2004), no. 4, 975--982.
[BNT] K. M. Ball, P. Nayar, T. Tkocz, A reverse entropy power inequality for log-concave random vectors, Studia Math. 235 (2016), 17--30.
[BM] S. G. Bobkov, M. Madiman, Reverse Brunn--Minkowski and reverse entropy power inequalities for convex measures, J. Funct. Anal. 262 (2012), no. 7, 3309--3339.
[BM1] S. G. Bobkov, M. Madiman, On the problem of reversibility of the entropy power inequality, in: Limit Theorems in Probability, Statistics and Number Theory, Springer Proc. Math. Stat. 42, Springer, Heidelberg, 2013, 61--74.
[ENT] A. Eskenazis, P. Nayar, T. Tkocz, Gaussian mixtures: entropy and geometric inequalities, preprint, 2016.