Entropy is one of the most important concepts in physics. In my previous articles, I focused on what entropy is and its relationship with time. As you will remember, entropy is a measure of disorder, and it tends to increase over time.
In this article, I will explain how entropy is expressed on quantum computers and its relationship with quantum information. In the last part, we will learn what another important concept, entanglement entropy, means to us.
Let's consider a system. The more information we have about this system, the lower its entropy. The opposite is also true: the less information we have about the system, the higher its entropy. So entropy is a measure of our ignorance about a system.
First, let's express this definition in classical mathematical terms. Then, let's write its equivalent in quantum mechanics.
Let's consider two sets, A and B, each containing one bit of information. Let the bit have N possible configurations in set A and n possible configurations in set B. In this case, we can write down the probability of finding the bit in any one configuration of A or B.
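Assuming every configuration is equally likely (which is what the counting above implies), these probabilities are simply the inverse of the number of configurations:

P_A = \frac{1}{N}, \qquad P_B = \frac{1}{n}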
If set B gets smaller, the entropy decreases, because the bit is confined to fewer configurations. Two limiting cases follow:
- If we have all the information about our system, then the entropy is zero.
- If we don't know anything about our system, then the entropy is at its maximum.
If we generalize this to classical bits, the entropy of a system with N equally likely configurations is S = \log_2 N. Of course, if our information about the system is given by a probability distribution, then we need an entropy formula defined over the probabilities. This is the Shannon entropy, expressed as follows:

S = -\sum_i P_i \log_2 P_i

The minus sign is there because the logarithm of a probability less than 1 is negative; it makes the entropy come out positive. So, we can write the information carried by an outcome of probability P_i as I_i = -\log_2 P_i, and the entropy is simply the average information over the distribution.
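As a quick check of the two limiting cases above, here is a minimal Python sketch (my own illustration, using numpy; the example distributions are hypothetical):

```python
import numpy as np

def shannon_entropy(probs):
    """Shannon entropy S = -sum_i P_i * log2(P_i), in bits."""
    probs = np.asarray(probs, dtype=float)
    probs = probs[probs > 0]            # 0 * log(0) contributes nothing
    return -np.sum(probs * np.log2(probs))

# Full knowledge: one configuration is certain -> S = 0.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0

# Total ignorance: N = 4 equally likely configurations -> S = log2(4) = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```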
The logic is similar to the classical expression; however, in quantum mechanics we formulate it using the density matrix. Suppose the system is prepared in the state vector |\psi_i\rangle with probability P_i. The density matrix is then

\rho = \sum_i P_i |\psi_i\rangle\langle\psi_i|

and the entropy expression in quantum mechanics, the von Neumann entropy, is

S = -\mathrm{Tr}(\rho \log_2 \rho)
- Let a system be prepared in a pure state, so that we know with 100% certainty in which configuration it was prepared. In this case, the entropy of the system is zero, because we already have all the knowledge.
- The other limiting case is when the system is prepared in different states with equal probabilities. In this case, the entropy of our system is maximal. Both limits are verified in the sketch below.
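Here is a minimal numpy sketch of the von Neumann entropy (my own illustration, not from the source video) that checks both limiting cases:

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy S = -Tr(rho log2 rho), in bits.

    For a Hermitian density matrix this equals the Shannon entropy
    of its eigenvalue spectrum.
    """
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > 1e-12]           # drop zero eigenvalues (0 * log 0 = 0)
    return -np.sum(vals * np.log2(vals))

# Pure state |0><0|: we have all the knowledge -> S = 0.
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])
print(von_neumann_entropy(pure))    # 0.0

# Maximally mixed state I/2: equal probabilities -> S = log2(2) = 1 bit.
mixed = np.eye(2) / 2
print(von_neumann_entropy(mixed))   # 1.0
```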
Let's evaluate this through an example. Consider the maximally entangled two-qubit (Bell) state below:

|\psi\rangle = \frac{1}{\sqrt{2}}\big(|00\rangle + |11\rangle\big)
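Tracing out subsystem B from this state gives the reduced density matrix of subsystem A:

\rho_A = \mathrm{Tr}_B\, |\psi\rangle\langle\psi| = \frac{1}{2}\big(|0\rangle\langle 0| + |1\rangle\langle 1|\big) = \frac{I}{2}, \qquad S(\rho_A) = \log_2 2 = 1 \text{ bit}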
The entropy of this entangled state is zero, because we know with 100% certainty that our state vector is prepared in exactly this way. However, since the subsystems that make up our entangled system are each in their possible states with equal probabilities, their entropy is maximal.
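The same calculation in code (again my own sketch; the partial trace is done by reshaping the 4x4 density matrix into qubit indices):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho log2 rho), computed from the eigenvalue spectrum, in bits."""
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > 1e-12]
    return -np.sum(vals * np.log2(vals))

# Bell state |psi> = (|00> + |11>) / sqrt(2) in the computational basis.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())                      # 4x4 density matrix of the pair

# Partial trace over qubit B: view rho as rho[a, b, a', b'] and sum over b = b'.
rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

print(von_neumann_entropy(rho))     # 0.0 -> the entangled pair as a whole is pure
print(von_neumann_entropy(rho_A))   # 1.0 -> each qubit alone is maximally mixed
```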
In summary, for entanglement entropy, the entropy of the system as a whole is zero, but the entropies of the subsystems that make it up are at their maximum value.
Reference
https://www.youtube.com/watch?v=TGrK3oDOq1k