# Mixture of point processes

Suppose you mix two Gaussian random variables evenly, that is, if one samples from the mixture, with probability 1/2 the sample comes from the first Gaussian, and with probability 1/2 from the second. It is evident that the mixture of Gaussians is not a Gaussian. (Do not confuse this with adding two Gaussian random variables, which produces another Gaussian random variable.)
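As a quick numerical sketch (assuming NumPy; the components $N(-1,1)$ and $N(1,1)$ are an illustrative choice), the even mixture and the sum of two Gaussians can share the same mean and variance, yet only the sum is Gaussian; the mixture's excess kurtosis is negative:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Mixture: flip a fair coin, then sample from N(-1, 1) or N(+1, 1).
means = rng.choice([-1.0, 1.0], size=n)
mixture = means + rng.standard_normal(n)

# Sum: adding two independent N(0, 1) variables gives N(0, 2).
gsum = rng.standard_normal(n) + rng.standard_normal(n)

# Both have mean 0 and variance 2, but only the sum is Gaussian:
# the mixture is flatter (negative excess kurtosis, theoretically -0.5).
def excess_kurtosis(x):
    z = (x - x.mean()) / x.std()
    return np.mean(z**4) - 3.0

print(excess_kurtosis(mixture))  # noticeably below 0
print(excess_kurtosis(gsum))     # close to 0
```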

Similarly, a mixture of inhomogeneous Poisson processes results in a non-Poisson point process. The figure below illustrates the difference between a mixture of two Poisson processes (B) and a Poisson process with the same marginal intensity (rate) function (A). The colored bars indicate the rate over the real line (e.g., time); in this case, each bar has a constant rate over a fixed interval. Four realizations from each of the processes A and B are represented by rows of vertical ticks.
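A simulation sketch of the contrast between A and B (assuming NumPy; the two rates and the unit interval are hypothetical choices): both processes have the same mean event count, but the mixture is over-dispersed, so it cannot be Poisson:

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 20_000
r1, r2, T = 5.0, 15.0, 1.0  # two constant rates and the interval length

# Process A: a single Poisson process at the marginal (average) rate.
counts_A = rng.poisson((r1 + r2) / 2 * T, size=n_trials)

# Process B: a mixture -- each realization first picks one of the two rates.
chosen = rng.choice([r1, r2], size=n_trials)
counts_B = rng.poisson(chosen * T)

# Same mean count, but for B the variance exceeds the mean
# (law of total variance: 10 + 25 = 35), so B is not Poisson.
print(counts_A.mean(), counts_A.var())  # ~10, ~10
print(counts_B.mean(), counts_B.var())  # ~10, ~35
```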

Several special cases of mixed Poisson processes have been studied [1]; however, they are mostly limited to modeling over-dispersed homogeneous processes. In theoretical neuroscience, it is necessary to mix arbitrary (inhomogeneous) point processes. For example, to maximize the mutual information between the input spike trains and the output spike train of a neuron model, the entropy of a mixture of point processes is needed [3].

In general, a regular point process on the real line can be completely described by the conditional intensity function $\lambda(t \mid H_t)$, where $H_t$ is the full spiking history up to time $t$ [2]. Let us take the discrete limit to form regular point processes. Let $p_i$ be the probability of a spike (an event) at the $i$-th bin of size $\Delta$, that is,

$$p_i = \Pr(x_i = 1 \mid x_{1:i-1}) \simeq \lambda(t_i \mid H_{t_i})\,\Delta,$$

where $x_{1:i-1}$ are the 0-1 responses in all the previous bins. The likelihood of observing $x_i = 1$ or $x_i = 0$, given the history, is simply

$$P(x_i \mid x_{1:i-1}) = p_i^{x_i} (1 - p_i)^{1 - x_i}.$$

In the limit of small $\Delta$, this approximation converges to a regular point process. A fun fact is that a mixture of Bernoulli random variables is Bernoulli again, since it's the only distribution for 0-1-valued random variables. Specifically, for a family of Bernoulli random variables with probability of 1 being $p_\theta$, indexed by $\theta$, and a mixing distribution $\pi(\theta)$, the probability of observing one symbol $x = 1$ or $x = 0$ is

$$P(x) = \int p_\theta^{x} (1 - p_\theta)^{1 - x}\, \pi(\theta)\, \mathrm{d}\theta = \bar{p}^{\,x} (1 - \bar{p})^{1 - x},$$

where $\bar{p} = \int p_\theta\, \pi(\theta)\, \mathrm{d}\theta$ is the average probability.
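As a small numerical check with a hypothetical three-component mixing distribution, the mixture probabilities of $x = 1$ and $x = 0$ coincide with those of a single Bernoulli with the average probability $\bar{p}$:

```python
# Hypothetical mixing distribution over three Bernoulli components.
probs   = [0.1, 0.5, 0.9]   # p_theta for each component
weights = [0.2, 0.3, 0.5]   # pi(theta)

# Marginal probability of x = 1 under the mixture...
p_mix_1 = sum(w * p for w, p in zip(weights, probs))
# ...and of x = 0.
p_mix_0 = sum(w * (1 - p) for w, p in zip(weights, probs))

# The mixture is exactly Bernoulli(p_bar) with p_bar = E[p_theta].
p_bar = sum(w * p for w, p in zip(weights, probs))
print(p_mix_1, p_bar)        # equal
print(p_mix_0, 1 - p_bar)    # equal
```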

Suppose we mix $\lambda_\theta(t \mid H_t)$ with $\pi(\theta)$. Then, similarly, for the binned point process representation, the above implies that

$$P(x_i \mid x_{1:i-1}) = \bar{p}_i^{\,x_i} (1 - \bar{p}_i)^{1 - x_i},$$

where $\bar{p}_i = \int p_{\theta,i}\, \pi(\theta \mid x_{1:i-1})\, \mathrm{d}\theta$ is the marginal rate. Moreover, due to the causal dependence between the $x_i$'s, we can chain the expansion and get the marginal probability of observing $x_{1:n}$,

$$P(x_{1:n}) = \prod_{i=1}^{n} \bar{p}_i^{\,x_i} (1 - \bar{p}_i)^{1 - x_i}.$$

Therefore, in the limit $\Delta \to 0$, the mixture point process is represented by the conditional intensity function

$$\bar{\lambda}(t \mid H_t) = \int \lambda_\theta(t \mid H_t)\, \pi(\theta \mid H_t)\, \mathrm{d}\theta.$$
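The chained expansion can be checked numerically. Below is a minimal sketch (assuming NumPy) with two hypothetical components having constant per-bin spike probabilities: the marginal path probability obtained by mixing whole-path likelihoods matches the chain of history-dependent marginal rates $\bar{p}_i$, with the posterior over $\theta$ updated bin by bin:

```python
import numpy as np

# Two hypothetical components: constant spike probabilities per bin.
p_theta = np.array([0.1, 0.4])
prior   = np.array([0.5, 0.5])  # mixing distribution pi(theta)

x = [1, 0, 1, 1, 0]  # an observed binary spike train

# (a) Marginal likelihood by mixing the whole-path likelihoods.
path_lik = np.array([np.prod([p**xi * (1 - p)**(1 - xi) for xi in x])
                     for p in p_theta])
marginal_a = float(prior @ path_lik)

# (b) Chain rule with the history-dependent marginal rate
#     p_bar_i = E[p_theta | x_{1:i-1}], updating the posterior each bin.
post = prior.copy()
marginal_b = 1.0
for xi in x:
    p_bar = float(post @ p_theta)                # marginal rate for this bin
    marginal_b *= p_bar if xi else (1 - p_bar)   # Bernoulli(p_bar) likelihood
    lik = p_theta**xi * (1 - p_theta)**(1 - xi)  # posterior update
    post = post * lik / float(post @ lik)

print(marginal_a, marginal_b)  # equal
```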

**Conclusion: The conditional intensity function of a mixture of point processes is given by the expected conditional intensity function over the mixing distribution, conditioned on the event history.**

**References**

- [1] J. Grandell. Mixed Poisson Processes. Chapman & Hall / CRC Press, 1997.
- [2] D. J. Daley, D. Vere-Jones. An Introduction to the Theory of Point Processes. Springer.
- [3] Taro Toyoizumi, Jean-Pascal Pfister, Kazuyuki Aihara, Wulfram Gerstner. Generalized Bienenstock–Cooper–Munro rule for spiking neurons that maximizes information transmission. PNAS, 2005. doi:10.1073/pnas.0500495102

Interesting! I don’t understand how you have N(0,1) and N(0,-1). Does the second one have negative variance? I tried googling that and ended up in mostly esoteric stuff that I don’t know about. Your chart looks to me more like evenly mixing N(1,1) with N(-1,1).

Also, do you mind explaining what is on the horizontal axis in the second figure?

Thanks for your interest and for pointing out the errors. Each vertical tick in the second figure represents an event, such as an action potential, in time. I tried adding more explanation to the text.