Entropy, uncertainty and related concepts in the Brownian motion process

In this paper, we first obtain the amount of irregularity, or uncertainty, of the Brownian motion process. The results show that this uncertainty increases with time. In the second step, the entropy of (B(t1), B(t2), ..., B(tn)), a Gaussian vector taken from the Brownian motion process, is introduced. In a special case, it is shown that the joint entropy of two Brownian motion values increases with increasing s and t. In the third step, the conditional entropy of Brownian motion is computed, and from it the entropy of the Brownian bridge is obtained. The mutual information between two values of the process is introduced as a measure of how much knowing one of them reduces uncertainty about the other. Finally, the entropy of the Wiener integral is introduced.


Introduction
The botanist R. Brown described the motion of a pollen particle suspended in fluid in 1828. He observed that the particle moved in an irregular, random fashion. In 1905, A. Einstein argued that the movement is due to bombardment of the particle by the molecules of the fluid, and he obtained the equations of Brownian motion. In 1900, L. Bachelier had already used Brownian motion as a model for the movement of stock prices in his mathematical theory of speculation. The mathematical foundation for Brownian motion as a stochastic process was laid by N. Wiener in 1931, and this process is therefore also called the Wiener process [3]. Brownian motion is the macroscopic picture emerging from a particle moving randomly in d-dimensional space without making very big jumps. On the microscopic level, at every time step the particle receives a random displacement, caused for example by other particles hitting it or by an external force. It is surprisingly difficult to construct an example of a continuous function that is differentiable at no point; the first example of a continuous, nowhere differentiable function was given by Weierstrass in 1872. It can be shown [2], [3] that almost all trajectories of Brownian motion are continuous functions of t and yet are differentiable at no t. Moreover, Brownian motion has infinite variation [2], [3], so it seems that this process is exceptional in terms of irregularity. For these reasons, I want to describe this process by entropy.
A key measure of information in the theory is entropy, which is usually expressed as the average number of bits needed for storage or communication. Intuitively, entropy quantifies the uncertainty involved when encountering a random variable. For example, a fair coin flip (2 equally likely outcomes) has less entropy than a roll of a die (6 equally likely outcomes).
It is easy to recognize that uncertainty plays an important role in human affairs. For example, making everyday decisions in ordinary life is inseparable from uncertainty. In decision making, we are uncertain about the future. We choose a particular action, from among a set of conceived actions, on the basis of our anticipation of the consequences of the individual actions. Our anticipation of future events is, of course, inevitably subject to uncertainty. However, uncertainty in ordinary life is not confined to the future alone, but may pertain to the past and present as well. We are uncertain about past events because we usually do not have complete and consistent records of the past. We are uncertain about many historical events, crime-related events, geological events, events that caused various disasters, and a myriad of other kinds of events, including many in our personal lives. We are uncertain about present affairs because we lack relevant information. A typical example is diagnostic uncertainty in medicine or engineering. As is well known, a physician (or an engineer) is often not able to make a definite diagnosis of a patient (or a machine) in spite of knowing the outcomes of all presumably relevant medical (or engineering) tests and other pertinent information.

Throughout this paper we use logarithms to base 2, so entropy is measured in bits. Entropy is a measure of the average uncertainty in a random variable: it is the number of bits required, on average, to describe the random variable.

General concepts of differential entropy
In this section I'll introduce some of the essential ideas and quantities from information theory. The material reviewed here is standard; a good, thorough reference is the text by Cover and Thomas [1].

General definitions

Definition 2.1. The differential entropy H(X) of a continuous random variable X with density f is defined as

H(X) = −∫_S f(x) log f(x) dx,

where S is the support set of the random variable. Remember that we use the convention 0 log 0 = 0.
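As an illustration (my own sketch, not part of the original text), the differential entropy of a normal density can be computed in two ways: from the closed form (1/2) log(2πeσ²), and directly from the defining integral −∫ f log f dx via a midpoint Riemann sum. The truncation width of 12 standard deviations is an assumption chosen so that the neglected tails are negligible.

```python
import math

def gaussian_entropy(sigma2):
    """Closed-form differential entropy of N(0, sigma2): 0.5*log2(2*pi*e*sigma2) bits."""
    return 0.5 * math.log2(2 * math.pi * math.e * sigma2)

def numerical_entropy(sigma2, n=200000, width=12.0):
    """Approximate -\u222b f(x) log2 f(x) dx for N(0, sigma2) by a midpoint Riemann sum
    over [-width*sigma, width*sigma]; the tails beyond that are negligible."""
    sigma = math.sqrt(sigma2)
    dx = 2 * width * sigma / n
    total = 0.0
    for i in range(n):
        x = -width * sigma + (i + 0.5) * dx
        f = math.exp(-x * x / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)
        total -= f * math.log2(f) * dx
    return total

print(gaussian_entropy(1.0))    # ≈ 2.047 bits
print(numerical_entropy(1.0))   # agrees with the closed form
```

The agreement of the two values is a useful sanity check on the convention 0 log 0 = 0: the integrand vanishes in the far tails, so truncating the integral costs essentially nothing.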
A standard Brownian motion is a stochastic process B(t), t ≥ 0, with the following properties:

(1) B(0) = 0.

(2) For any 0 ≤ s < t, the increment B(t) − B(s) is normally distributed with mean 0 and variance t − s. This implies (taking s = 0) that B(t) has the N(0, t) distribution.

(3) For any 0 ≤ t1 < t2 < ··· < tn, the increments B(t2) − B(t1), ..., B(tn) − B(tn−1) are independent.

(4) Almost all sample paths of B(t, ω) are continuous functions.
The figure below shows one sample path of this process.
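For readers who want to reproduce such a figure, here is a minimal Python sketch (my own; the time horizon, step count, and seed are arbitrary choices) that simulates one sample path by summing independent N(0, dt) increments, in line with the properties listed above:

```python
import math
import random

def brownian_path(T=1.0, n=1000, seed=0):
    """Simulate one sample path of standard Brownian motion on [0, T]
    by cumulatively summing n independent N(0, dt) increments."""
    rng = random.Random(seed)
    dt = T / n
    path = [0.0]  # property (1): B(0) = 0
    for _ in range(n):
        # property (2): each increment is N(0, dt); property (3): increments independent
        path.append(path[-1] + rng.gauss(0.0, math.sqrt(dt)))
    return path

path = brownian_path()
print(len(path), path[0])  # 1001 points, starting at B(0) = 0
```

Plotting `path` against the time grid with any plotting library yields a jagged trajectory of the kind shown in Figure 1; refining n makes the irregularity more visible, not less.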
International Scientific Publications and Consulting Services

From (2.8) we have

H(B(t)) = (1/2) log(2πe t).

In the table below, you see some of these values. We see that the uncertainty, or irregularity, increases with time. The general trend of this entropy can be seen in the figure below.
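The entries of such a table can be generated with a few lines of Python (a sketch of mine, using the Gaussian entropy formula for B(t) ~ N(0, t) with logarithms to base 2, as fixed in the introduction; the listed values of t are arbitrary):

```python
import math

def bm_entropy(t):
    """Differential entropy of B(t) ~ N(0, t): 0.5*log2(2*pi*e*t) bits."""
    return 0.5 * math.log2(2 * math.pi * math.e * t)

# A small table of entropy values for increasing t
for t in [1, 2, 5, 10]:
    print(t, round(bm_entropy(t), 4))
```

Since the formula grows like (1/2) log t, the entropy increases without bound but ever more slowly, which matches the concave trend visible in Figure 2.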

Let 0 < t1 < t2 < ··· < tn. Then from (2.9) we can write

H(B(t1), ..., B(tn)) = (1/2) log((2πe)^n t1 (t2 − t1) ··· (tn − tn−1)),

since the covariance matrix K with entries K_ij = min(ti, tj) has determinant t1 (t2 − t1) ··· (tn − tn−1). As a result, if we put n = 2 we get, for 0 < s < t,

H(B(s), B(t)) = (1/2) log((2πe)² s (t − s)).

The following figure shows the general trend of this joint entropy. As expected, the entropy increases with increasing s and t.
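This joint entropy can be evaluated for arbitrary sampling times with a short script (my own sketch, assuming the standard multivariate Gaussian entropy formula and the covariance K_ij = min(ti, tj)):

```python
import math

def bm_joint_entropy(times):
    """Joint differential entropy (in bits) of (B(t1), ..., B(tn))
    for 0 < t1 < ... < tn. The covariance matrix K_ij = min(ti, tj)
    has determinant t1*(t2 - t1)*...*(tn - t_{n-1}), so
    H = 0.5 * log2((2*pi*e)^n * det K)."""
    det = times[0]
    for a, b in zip(times, times[1:]):
        det *= (b - a)
    return 0.5 * math.log2((2 * math.pi * math.e) ** len(times) * det)

print(bm_joint_entropy([1.0, 3.0]))  # the n = 2 case with s = 1, t = 3
```

Note that adding a further sampling time always adds entropy, since each extra factor in the determinant corresponds to a fresh independent increment.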
Using Relation (3.17), we have

H(B(s) | B(t)) = (1/2) log(2πe s(t − s)/t), 0 < s < t,

and then from (2.8) we can write

I(B(s); B(t)) = H(B(s)) − H(B(s) | B(t)) = (1/2) log(t/(t − s)).

The following figure shows this mutual information. Intuitively, mutual information measures the information that B(s) and B(t) share: it measures how much knowing one of these values reduces uncertainty about the other. A symmetric measure of uncertainty, the uncertainty coefficient, is defined as the following redundancy measure:
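The mutual information above is easy to evaluate; the sketch below (mine, assuming the standard bivariate Gaussian identity I = −(1/2) log(1 − ρ²) together with ρ² = Cov(B(s), B(t))²/(s·t) = s/t) computes it in bits:

```python
import math

def bm_mutual_information(s, t):
    """Mutual information I(B(s); B(t)) in bits for 0 < s < t.
    Since Cov(B(s), B(t)) = s, the squared correlation is rho^2 = s/t, and
    for a bivariate Gaussian I = -0.5*log2(1 - rho^2) = 0.5*log2(t/(t - s))."""
    assert 0 < s < t
    return 0.5 * math.log2(t / (t - s))

print(bm_mutual_information(3.0, 6.0))  # 0.5 bits
```

As s approaches t the two values become almost perfectly correlated and the mutual information diverges, consistent with the behavior seen in Figure 10 for t = 6.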

R = I(B(s); B(t)) / (H(B(s)) + H(B(t))), (3.23)

which attains a minimum of zero when the two processes are independent, and a maximum when one of them completely determines the other.

It can be shown [2] that the Wiener integral ∫_a^b f(t) dB(t) is a Gaussian random variable with mean 0 and variance ∥f∥² = ∫_a^b f(t)² dt. Therefore, from (2.8) it can be concluded that

H(∫_a^b f(t) dB(t)) = (1/2) log(2πe ∫_a^b f(t)² dt). (4.25)
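As a numerical illustration (my own sketch; the variance integral ∥f∥² is approximated by a midpoint Riemann sum, and the integrand and interval in the example are arbitrary choices), the entropy of a Wiener integral can be computed for any square-integrable f:

```python
import math

def wiener_integral_entropy(f, a, b, n=100000):
    """Entropy (in bits) of the Wiener integral of f over [a, b]: the integral is
    Gaussian with mean 0 and variance ||f||^2 = \u222b_a^b f(t)^2 dt, so the entropy
    is 0.5*log2(2*pi*e*||f||^2). The variance is approximated by a midpoint sum."""
    dt = (b - a) / n
    var = sum(f(a + (i + 0.5) * dt) ** 2 for i in range(n)) * dt
    return 0.5 * math.log2(2 * math.pi * math.e * var)

# With f(t) = 1 on [0, 1] the Wiener integral is just B(1) ~ N(0, 1).
print(wiener_integral_entropy(lambda t: 1.0, 0.0, 1.0))  # ≈ 2.047 bits
```

The constant-integrand case is a convenient check: it reduces (4.25) to the entropy of B(1) obtained earlier from (2.8).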

Conclusion
If g is a continuous function of finite variation, then its quadratic variation is zero. Since the quadratic variation of Brownian motion over [0, t] is t, not zero, Brownian motion must have infinite variation. So, as expected, the overall effect of this process tends toward increasing entropy, or disorder. This occurs because the process has infinite variation on every interval, no matter how small. The results showed that this uncertainty increases with time. The main aim of this paper was to bring entropy into the study of the Brownian motion process, in order to display some of its interesting features. I hope that, in the future, statistical inference about the Brownian motion process can be carried out using these concepts.

Figure 1: One sample path of Brownian motion

Figure 2: The entropy of Brownian motion

Figure 3: The bivariate entropy of Brownian motion

Figure 6: One sample path of the Brownian bridge

Figure 10: Mutual information of Brownian motion for t = 6

Table 1: The entropy value of Brownian motion for some t