
The subject matter of this document is statistical mechanics: the study of how macroscopic behavior emerges from microscopic interactions in systems with very many parts.
Our goal is to provide a unified introduction to statistical mechanics built around the concept of entropy. A precise definition of entropy pervades statistical mechanics and other scientific subjects, and is useful in its own right. While many students may have heard the word entropy before, it is rarely explained in full detail or with rigorous mathematics, leaving students confused about many of its implications. Moreover, when students learn the laws of thermodynamics, the laws describing the macroscopic consequences of statistical mechanics, such as "change in internal energy = heat flow in + work done on the system", the concepts of internal energy, heat, and work are often left at the mercy of a student's vague, intuitive understanding.
In this document, we will see that formulating statistical mechanics with a focus on entropy provides a more unified and symmetric understanding of many of the laws of thermodynamics. Indeed, some laws of thermodynamics that appear confusing and unrelated at first glance can all be seen to follow from the same treatment of entropy in statistical mechanics.
At this point in your life, you may have heard the word entropy, but chances are it was given a vague, non-committal definition. The goal of this section is to introduce a more explicit concept of entropy from an abstract standpoint before considering its experimental and observational signatures.
1.1 Quantifying the amount of information in the answer to a question
Entropy, while useful in physics, also has applications in computer science and information theory. This section will explore the concept of information entropy as an abstract object.
We will first consider entropy not as a concept related to heat in objects, but as a purely axiomatic quantification of the information we receive when we hear the answer to a question. For a concrete example, imagine that someone flips a coin without revealing which side landed upright, and we ask: "What was the outcome of the coin flip? Heads or tails?" When they then tell us the answer, how much "information" do we gain by learning the outcome? In other words, we are faced with determining how much information is received when we hear the answer to a question whose outcomes follow a probability distribution. This question was asked by Claude Shannon in 1948.
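As a preview of where this leads (the criteria in the next paragraph are what single out this form), Shannon's answer assigns -log2(p) bits of information to an outcome of probability p, so that the expected information in the answer, called the entropy, is the probability-weighted average of this quantity. Here is a minimal Python sketch, with function names and example distributions chosen purely for illustration:

```python
import math

def self_information(p: float) -> float:
    """Information, in bits, gained by learning that an outcome
    of probability p occurred: -log2(p)."""
    return -math.log2(p)

def entropy(dist: dict[str, float]) -> float:
    """Expected information (Shannon entropy, in bits) of the answer
    to a question whose outcomes follow the given distribution."""
    return sum(p * self_information(p) for p in dist.values() if p > 0)

# A fair coin: hearing the answer conveys exactly 1 bit.
print(entropy({"heads": 0.5, "tails": 0.5}))    # 1.0

# A heavily biased coin: the answer is usually unsurprising,
# so on average it conveys far less information.
print(entropy({"heads": 0.99, "tails": 0.01}))  # ~0.081
```

Note that the entropy depends only on the probabilities, never on whether the outcomes happen to be labeled heads or tails.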
After pondering this question for a long time, you might come up with a few criteria that any reasonable measure must obey, such as: the measure should depend only on the probabilities of the outcomes, not on their labels; an answer that was certain in advance should carry no information; less probable outcomes should carry more information when they occur; the measure should vary continuously with the probabilities; and the information gained from the answers to two independent questions should be the sum of the information gained from each.
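Up to an overall constant (equivalently, a choice of logarithm base), I(p) = -log2(p) is the continuous function satisfying these requirements, which is why it appears in the preview above. Below is a quick numerical check of the additivity, monotonicity, and certainty criteria; the example probabilities, a fair coin and a fair die, are ours:

```python
import math

def info(p: float) -> float:
    """Self-information in bits: -log2(p)."""
    return -math.log2(p)

# Additivity: for two independent questions (a fair coin flip and a
# fair die roll), the information in the joint answer equals the sum
# of the informations in the separate answers.
p_coin, p_die = 1 / 2, 1 / 6
print(info(p_coin * p_die))        # log2(12), about 3.585 bits
print(info(p_coin) + info(p_die))  # 1 + log2(6), the same value

# Monotonicity: rarer outcomes carry more information.
assert info(0.01) > info(0.5)

# An outcome that was certain in advance carries no information.
assert info(1.0) == 0.0
```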