
Is ENTROPY Really a "Measure of Disorder"? Physics of Entropy EXPLAINED and MADE EASY

This is how I personally wrapped my head around the idea of entropy! I found the statistical mechanics explanation much easier to grasp than the original thermodynamics one.

Hey everyone, I'm back with another video, and this one has been highly requested! I really enjoy making videos about thermodynamics, because making them helps me wrap my head around the topic. I'll also be hosting another poll to see which area of physics you'd like me to cover in a future video.

Before we go any further, please check out the little document I've written up with five questions you can attempt after watching this video. They should help you get an even deeper insight into what entropy really means and what it represents. Check it out here: https://drive.google.com/drive/folders/1AKTYC6GcpsI7ZnKQEWGIjk2-g6P_R5hR?usp=sharing

In this video, we're talking about entropy. More specifically, we're talking about the definition of entropy that deals with systems on the microscopic level - looking at the particles a system is made of, rather than just the pressure, volume, or temperature of the entire system. Believe it or not, these are two different approaches in physics, each with its own merits. While the original (classical thermodynamics) definition of entropy looked at the system as a whole, we didn't get a deeper understanding of entropy until we looked at the small scale and introduced this statistical mechanics definition.

We start by considering an abstract system: some number of particles in a box. These particles can occupy specific energy levels (meaning each particle can carry a specific amount of energy). Given these restrictions, plus two measurements we can make on the system - (a) how many particles it contains, and (b) what its total energy is - we can work out all the possible ways the particles can be arranged among their energy levels. This is really important, because each possible arrangement is known as a microstate (sometimes written as "micro state") of the system, and the entropy of the system depends directly on the number of possible microstates.
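If you like to tinker, here's a quick brute-force sketch of this counting idea in Python (my own toy setup, not from the video): it assumes distinguishable particles and equally spaced energy levels 0, 1, 2, and so on, and simply enumerates every assignment of levels to particles whose energies add up to the measured total.

from itertools import product

def count_microstates(n_particles, total_energy):
    # Levels 0..total_energy suffice: no particle can hold more than the total
    levels = range(total_energy + 1)
    # Each tuple assigns an energy level to each particle (a candidate microstate)
    return sum(1 for state in product(levels, repeat=n_particles)
               if sum(state) == total_energy)

print(count_microstates(3, 4))  # 15 ways for 3 particles to share 4 units of energy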

The statistical mechanics definition of entropy tells us that a system's entropy is equal to the Boltzmann constant multiplied by the natural logarithm of the total number of possible microstates: S = k_B ln(Omega). This is explained much better in the video, so stop reading this description lol. If you're not confident with logarithms, check out this page on what ln means: https://en.wikipedia.org/wiki/Natural_logarithm
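As a minimal sketch of that formula (my own illustration; the Boltzmann constant is the exact SI value):

import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in SI units)

def boltzmann_entropy(num_microstates):
    # S = k_B * ln(Omega), where Omega is the number of possible microstates
    return K_B * math.log(num_microstates)

print(boltzmann_entropy(15))  # ~3.74e-23 J/K for the 15-microstate toy system above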

This also brings us to the common description of entropy as a "measure of disorder". If a system has lots of possible microstates, its particles can be arranged in many different ways, and (since ln is an increasing function) its entropy is larger. Such systems are called "disordered" because of the sheer number of ways they could be arranged, while systems with fewer possible microstates are more "ordered" and have smaller entropies. Hence, entropy is a "measure of disorder": the more possible microstates, the larger the entropy.
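To put toy numbers on that (again, my own illustration): for distinguishable particles on equally spaced levels, the brute-force count above has the closed-form "stars and bars" value C(E + N - 1, N - 1), so we can watch the entropy climb as more energy - and hence more possible arrangements - becomes available.

import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def omega(n_particles, total_energy):
    # Stars-and-bars count of ordered ways to split the energy among the particles
    return math.comb(total_energy + n_particles - 1, n_particles - 1)

for e in (4, 8, 16):
    w = omega(3, e)
    print(f"E={e}: Omega={w}, S={K_B * math.log(w):.2e} J/K")
# Omega grows (15, 45, 153), and the entropy rises with ln(Omega)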

One assumption we make when calculating a system's entropy is the assumption of equal a priori probability, also known as the fundamental assumption of statistical thermodynamics. It states that the system is equally likely to be found in any one of its possible microstates - a good assumption for a system that is isolated from external influences and in thermal equilibrium throughout. Read up about it here: https://en.wikipedia.org/wiki/A_priori_probability
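Here's one way to see why the assumption matters (my own illustration, using the more general Gibbs formula S = -k_B * sum(p_i * ln(p_i)), which the video doesn't go into): if all Omega microstates are equally likely, each p_i = 1/Omega and the Gibbs expression collapses to the k_B ln(Omega) formula above; any unequal distribution over the same microstates gives less entropy.

import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def gibbs_entropy(probs):
    # S = -k_B * sum(p * ln(p)), skipping zero-probability states
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

omega = 15
equal = [1 / omega] * omega              # equal a priori probabilities
print(gibbs_entropy(equal))              # same as K_B * ln(15), ~3.74e-23 J/K

skewed = [0.5] + [0.5 / (omega - 1)] * (omega - 1)  # one microstate favoured
print(gibbs_entropy(skewed))             # smaller: probability concentrated in one state means less entropy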

Thank you all for watching this video, and please do check out the little document I've written (linked above) and attempt the questions in it. Keep an eye out for the video I'm going to make walking through all the solutions too.

If you want to check out some music I'm making, head over to my second channel, Parth G's Shenanigans. Follow me on Instagram @parthvlogs. I'll see you really soon!
Merch - https://parth-gs-merch-stand.creator-spring.com/

Video information
Channel: Parth G
Published: 18 August 2020, 21:00:15
Duration: 00:11:13