Correntropy is a statistical quantity that captures nonlinear similarity between an indexed set of random variables. It is defined as $V(i, j) = E[\kappa(X_i, X_j)]$, where the $X_i$ are the random variables and $\kappa(\cdot, \cdot)$ is a symmetric positive semi-definite kernel [1]. By reproducing kernel Hilbert space (RKHS) theory, any symmetric positive semi-definite kernel induces an RKHS; that is, it can be used as an inner product in an extended space [2]. It is easy to show that correntropy itself is symmetric and positive semi-definite, and hence it defines the so-called correntropy RKHS.
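As a concrete illustration, the definition $V(i, j) = E[\kappa(X_i, X_j)]$ can be estimated from paired samples by a simple empirical mean of the kernel. The sketch below is a minimal example, not from the source; it assumes the common choice of a Gaussian kernel, and the function names are illustrative.

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    # Gaussian kernel kappa(a, b): symmetric and positive semi-definite.
    return np.exp(-(a - b) ** 2 / (2.0 * sigma ** 2))

def correntropy(x, y, sigma=1.0):
    # Sample estimator of V(i, j) = E[kappa(X_i, X_j)]:
    # average the kernel over paired realizations of the two variables.
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return float(np.mean(gaussian_kernel(x, y, sigma)))
```

Because the Gaussian kernel equals 1 at zero distance, the estimate approaches 1 as the two sample vectors become identical, and it is symmetric in its arguments, mirroring the symmetry of $V(i, j)$.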
Each element of the correntropy RKHS is a functional that maps the index of a random variable to a real value. By the representer theorem, any functional in the space can be represented as $g(j) = \sum_i \alpha_i E[\kappa(X_i, X_j)]$, a linear combination of the mapped random variables. When the set of random variables that constitutes the space is rich enough, one can approximate expectations by estimating the $\alpha_i$'s. Unfortunately, this has little practical value, because the richness has to come from the joint distribution of pairs of random variables. It remains an interesting subject to ponder.
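The representer expansion $g(j) = \sum_i \alpha_i V(i, j)$ can be made concrete by estimating the correntropy matrix from data and applying a coefficient vector. The sketch below is an assumption-laden illustration, not from the source: the random-walk process, the Gaussian kernel, and all variable names are hypothetical choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: M indexed random variables X_1..X_M, each observed N times.
# Row n of `samples` is one joint realization; column i corresponds to X_i.
M, N, sigma = 4, 500, 1.0
samples = rng.normal(size=(N, M)).cumsum(axis=1)  # e.g. a random-walk process

# Estimate V(i, j) = E[kappa(X_i, X_j)] by a sample mean of the Gaussian kernel
# over all pairs of indices, using broadcasting to form pairwise differences.
diff = samples[:, :, None] - samples[:, None, :]      # shape (N, M, M)
V = np.exp(-diff ** 2 / (2.0 * sigma ** 2)).mean(axis=0)

# A functional in the correntropy RKHS: g(j) = sum_i alpha_i * V(i, j).
alpha = rng.normal(size=M)
g = V @ alpha  # g[j] evaluates the functional at each index j
```

Note that $V$ is symmetric with unit diagonal (the kernel of a variable with itself is 1), consistent with its role as a Gram matrix for the correntropy RKHS.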