The foundational controversy of thermodynamics (TD) is the monotonic behavior of the state functions of macro-systems. Intuitively, this should not be the case, as these systems are composed of micro-particles governed by deterministic, time-reversible, quasi-periodic laws of motion. Here is my view:
First of all, let me re-write the populist title into something slightly more scientifically precise. Let me re-name the issue at hand as follows: “Causes of randomization and the introduction of monotonic processes in TD macro-states.”
Attempts in both the mathematical and philosophical literature to explain the emergence of monotonic processes in TD macro-states (e.g. the H-function, the entropy function) from the quasi-periodic deterministic equations governing their apparent constituents divide into numerous camps. Some of the more influential ones are:
- interventionists: who claim that systems are never truly closed, and that environmental perturbations (external fields, etc.) are the cause of randomization;
- coarse grainers: who partition systems into small cells and follow the evolution of values averaged over these cells;
- master equationists: who take the time parameter to the infinite limit and thereby obtain monotonic counterparts of the quasi-periodic functions;
- limitationists: who claim that it is impossible, in any meaningful sense, to reverse the time direction of micro-states;
- quantum mechanists: who claim that the non-commutativity of observables of micro-states is sufficient to introduce randomization, and also that the classical phase-space representation of micro-states is at best a poor match for their true QM equivalents, because QM degrees of freedom cannot be faithfully translated into classical ones.
These and other approaches often attempt to find an *exact* match for TD descriptions in the language of statistical mechanics (SM). It appears to me that, unless a new fundamental law of nature is at play (D. Albert), an *exact* translation may be a logical impossibility. In their quest for a precise imaging of one theory onto another, the theories on offer often pay no regard to the scale of the respective inter-theory translations.
To be fair, most (though not all) of those opining on the topic seem to be in agreement that the answer to the question at hand is part conceptual and part mathematical.
I support the viewpoint expressed by Craig Callender (Professor of Philosophy at UCSD, previously at the LSE) that one only needs to reproduce the image of one theory in the language of another (TD in the language of SM, in our case) “at the appropriate scale”. In our case, an adequate correspondence of TD descriptions in the language of SM does not need to be an *exact match*; it only needs to translate the respective guiding equations without the loss of any essential information.
I intend to provide a conceptual framework and a point of view on the series of equations known as the Bogoliubov–Born–Green–Kirkwood–Yvon (BBGKY) hierarchy. The key foundational feature of these equations is that they truncate information about the system as they move from the mechanical description of the micro-state, valid at times between zero and the mean interaction time between particles, to the kinetic and TD stages of the system's evolution at later times.
**BBGKY Hierarchy** More specifically, the procedure takes the quasi-periodic *mechanical* equations of motion (Newtonian or QM) that describe the system's evolution over the period between zero and the mean interaction time and, by way of neglecting, approximating, or averaging over certain information, outputs a set of *kinetic* equations that describe the system's evolution between the mean interaction time and the mean free time. Stage two transforms these kinetic equations into the next set of approximations, which govern the system from the mean free time (the time at which local ensembles of micro-states begin to correlate with local thermodynamic quantities) up to a relaxation time τ, when the boundary conditions become insignificant and the system-wide distribution function becomes Gibbsian.
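For orientation, here is a schematic, standard textbook form of the hierarchy (the notation is introduced here only for illustration, and is not specific to my argument). It makes the structure of the truncation explicit: the equation of motion for the $s$-particle distribution function $f_s$ always involves the next function $f_{s+1}$, so the chain cannot be closed without discarding some information about the system:

$$
\frac{\partial f_s}{\partial t} \;=\; \{H_s,\, f_s\} \;+\; (N-s)\sum_{i=1}^{s}\int \frac{\partial \phi(|\mathbf{q}_i-\mathbf{q}_{s+1}|)}{\partial \mathbf{q}_i}\cdot\frac{\partial f_{s+1}}{\partial \mathbf{p}_i}\, d\mathbf{q}_{s+1}\, d\mathbf{p}_{s+1},
\qquad s = 1, \dots, N-1,
$$

where $H_s$ is the $s$-particle Hamiltonian, $\{\cdot,\cdot\}$ is the Poisson bracket, and $\phi$ is the pair potential. The separation of time scales assumed above can likewise be written compactly as

$$
\tau_{\text{interaction}} \;\ll\; \tau_{\text{free}} \;\ll\; \tau,
$$

with $\tau$ the relaxation time at which the system-wide distribution becomes Gibbsian.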
Aside from a set of technical approximating assumptions that carry no conceptual load (e.g. that the particles are hard spheres, that they carry no angular momentum to be accounted for, or that the density function depends only on the distance between particles), the fundamentally significant assumption in reconstructing the BBGKY chain of equations concerns the weakening of correlations between particles as we move from two-particle interactions to higher-numbered particle interactions. Through this process of approximation we truncate tiny bits of information about the system.
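To make the weakening-of-correlations assumption concrete, one standard way of expressing it (a textbook cluster expansion, offered here only as an illustrative sketch, where $x_i$ stands for the pair $(\mathbf{q}_i, \mathbf{p}_i)$) writes the higher distribution functions as products of lower ones plus correlation terms:

$$
f_2(x_1, x_2) \;=\; f_1(x_1)\, f_1(x_2) \;+\; g_2(x_1, x_2),
$$

$$
f_3(x_1, x_2, x_3) \;=\; f_1(x_1)\, f_1(x_2)\, f_1(x_3) \;+\; \big[\,f_1\, g_2 \ \text{terms}\,\big] \;+\; g_3(x_1, x_2, x_3).
$$

The truncation then consists of neglecting the higher-order correlations $g_3$ (and, at the kinetic stage, the pre-collision part of $g_2$: the familiar molecular-chaos assumption). The tiny bits of information discarded at each stage are precisely these correlation terms.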
The question is this: at what scale can we afford ourselves the luxury of these approximations? My paper intends to provide a viewpoint.