## Book Volume 1

##### Abstract

Basic concepts are defined: what thermodynamics aims to describe, what a system is, which state functions characterize it, and what a process is.

#### Entropy in Classical Thermodynamics: The Importance of Reversibility

Page: 8-17 (10)

Author: Alberto Gianinetti

##### Abstract

The concept of entropy is introduced within the context of classical thermodynamics, where it was first developed to answer the question: what happens when heat spontaneously goes from hot bodies to cold ones? To analyse this change, the transfer process must have some peculiar features that make it reversible. First of all, to be studied, systems and processes must be conceptualised as occurring at equilibrium, or in conditions close to equilibrium. A first description of equilibrium is provided.

##### Abstract

The idea of transferring energy from/to a system is expanded to include work in addition to heat. Work is an alternative form by which the energy of a system can be transferred.

##### Abstract

Once the concept of work has been introduced, the concepts of reversibility, equilibrium, and entropy become clearer and can be better defined. This was the task undertaken by classical thermodynamics.

##### Abstract

It is further highlighted that the concept of work represents a more general way to see what entropy is; that is, the idea of work offers a better framework for understanding entropy. A fundamental observation of classical thermodynamics is that a system is at equilibrium when no work can be done by it. The capability of a system to do work is inversely linked to its stability, which can then be considered equivalent to the inertness of the system. Thus, equilibration, stability, and inertness are all aspects of the same feature of a system, and this feature is measured in terms of entropy. An equilibrated, stable, and inert system has the highest value of entropy it can reach, and any departure from these conditions results in a decrease of entropy. Work availability and entropy are therefore inversely linked: the maximal entropy is attained when no work can be done.

##### Abstract

The systems we can directly see are composed of huge numbers of particles, so their properties are obtained as statistical averages of the effects of those particles. This builds a conceptual bridge between the macroscopic world, wherein we observe systems with their overall properties, and the microscopic world, where particles with their own properties dominate the scene. Statistical mechanics shows that the former world is determined by what happens in the latter, and this approach provides a better, finer understanding of what is going on at the macroscopic level and why.

##### Abstract

Since the features of macroscopic processes derive from the microscopic behaviour of the system’s constituent particles, it is shown that temperature, which appears to us as a simple property of macroscopic bodies, is in fact a complex effect of the motion of the body’s particles.
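This link can be made concrete with the kinetic-theory relation ⟨E_kin⟩ = (3/2) k_B T: temperature is a statistical summary of particle motion. The sketch below is a minimal illustration of that relation; the nitrogen mass used is an assumed illustrative value, not taken from the text.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def mean_kinetic_energy(temperature_k):
    """Average translational kinetic energy per particle: (3/2) k_B T."""
    return 1.5 * K_B * temperature_k

def rms_speed(temperature_k, particle_mass_kg):
    """Root-mean-square speed from (1/2) m v_rms^2 = (3/2) k_B T."""
    return math.sqrt(3.0 * K_B * temperature_k / particle_mass_kg)

# A nitrogen molecule (m ~ 4.65e-26 kg, illustrative value) at 300 K
# moves at roughly 500 m/s: the macroscopic "temperature" we measure
# is this microscopic agitation, averaged over huge particle numbers.
v = rms_speed(300.0, 4.65e-26)
e = mean_kinetic_energy(300.0)
```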

##### Abstract

Some processes happen spontaneously. What appears, at a macroscopic level, as a tendency of nature is an effect of the complex statistical behaviour of the microscopic particles: their overall net effect emerges at the macroscopic level as a spontaneous force that determines whether and how a system can spontaneously change, and whether, and in which direction, a process is thereby started.

##### Abstract

The stability of a system is determined by the overall behaviour of the system’s particles. In turn, this behaviour is established by the natural distributions the particles themselves spontaneously tend to assume: they tend to spread uniformly across space, as the most probable outcome, and they also tend to share their energies according to a complex, non-uniform function that is nevertheless probabilistically equilibrated.

##### Abstract

The microscopic approach of statistical mechanics has developed a series of formal expressions that, depending on the different features of the system and/or process involved, allow for calculating the value of entropy from the microscopic state of the system. This value is maximal when the particles attain the most probable distribution through space and the most equilibrated sharing of energy between them. At the macroscopic level, this means that the system is at equilibrium, a stable condition wherein no net statistical force emerges from the overall behaviour of the particles. If no force is available then no work can be done and the system is inert. This provides the bridge between the probabilistic equilibration that occurs at the microscopic level and the classical observation that, at a macroscopic level, a system is at equilibrium when no work can be done by it.
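One widely used form of these expressions is the Gibbs entropy, S = −k_B Σ p_i ln p_i, which reduces to Boltzmann’s S = k_B ln W when all W microstates are equally probable. A minimal sketch (the four-state distributions are assumed illustrative examples) showing that the uniform, most probable distribution carries the larger entropy:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """S = -k_B * sum(p_i * ln p_i) over microstate probabilities."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0.0)

uniform = [0.25] * 4           # equal weights: equals k_B * ln(4)
skewed = [0.7, 0.1, 0.1, 0.1]  # concentrated: lower entropy

s_uniform = gibbs_entropy(uniform)
s_skewed = gibbs_entropy(skewed)
# s_uniform > s_skewed: maximal entropy at the most even distribution
```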

##### Abstract

A useful definition of entropy is “a function of the system’s equilibration, stability, and inertness”, and the tendency toward an overall increase of entropy, which is set forth by the second law of thermodynamics, should be understood as “the tendency to the most probable state”, that is, to a state having the highest equilibration, stability, and inertness the system can reach. The tendency to entropy increase is driven by the probabilistic distributions of matter and energy and is actualized by particle motion.

##### Abstract

The probabilistic distributions of matter and energy give the direction toward which processes spontaneously tend to go. However, every existing process is presently in force because of some kinetic barrier or restraint that operated in the past: were it not for kinetic restraints (and constraints that were eventually removed by past processes), all the ongoing processes, and their causal antecedents, would already have occurred billions of years ago. The role of constraints and restraints is therefore fundamental in establishing the actual direction a real process can take, as well as its velocity.

##### Abstract

According to the second law of thermodynamics, every spontaneous change, or process, is associated with an increase in entropy. Although the probabilistic distributions of particles and energy give the possible direction of a process, its occurrence is enabled by the motional energy of the particles. Particles, however, are subject to constraints on their motion that slow down the attainment of some possible changes and thereby reduce their probability of occurrence, especially if alternative pathways to increase entropy are possible and can be accessed faster. Kinetic restraints are therefore key determinants of which processes are activated among the different possible ones.

#### Spreading & Sharing is a Common Outcome of a Physical Function Levelling Down Available Energy

Page: 107-111 (5)

Author: Alberto Gianinetti

##### Abstract

Entropy has been defined as a probabilistic function of energy spreading and sharing, and most often this description provides a straightforward way to conceptualize entropy. It is shown that, more generally, the spreading and sharing of energy is a common outcome of a physical function levelling down available energy. The latter, as formulated by a mathematical term called the “Boltzmann factor”, originates from the equilibration of forces at the microscopic level and is effected by the net levelling force that results as the statistical outcome of all the microscopic forces and always pushes the system towards dynamic equilibrium. This net levelling force is the reason work can be done at a macroscopic level, and its derivation from the microscopic world explains why it is linked to equilibration and therefore to entropy increase.
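The Boltzmann factor assigns each microstate of energy E_i a relative weight proportional to exp(−E_i / k_B T); normalizing by the sum of the weights (the partition function) gives occupation probabilities. The two-level system below is an assumed illustrative choice, showing how the factor levels down the weight of higher-energy states:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_weights(energies_j, temperature_k):
    """Occupation probabilities p_i proportional to exp(-E_i / k_B T),
    normalized by the partition function z."""
    factors = [math.exp(-e / (K_B * temperature_k)) for e in energies_j]
    z = sum(factors)
    return [f / z for f in factors]

# Two levels separated by exactly one k_B*T at 300 K (illustrative):
t = 300.0
levels = [0.0, K_B * t]
p_ground, p_excited = boltzmann_weights(levels, t)
# The excited level is exp(-1) times less populated than the ground
# level: higher available energy, lower statistical weight.
ratio = p_excited / p_ground
```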

#### Changes of Entropy: The Fundamental Equation and the Chemical Potential

Page: 112-130 (19)

Author: Alberto Gianinetti

##### Abstract

Any change in the physical world follows from a net force levelling down available energy. Different features of a system, or process, determine on which properties of the system the levelling force eventually operates. Examples are the equilibration of the pressure of a gas, of the temperature of bodies, of the concentration of solutions, of the position of a body with respect to a fluid in the presence of a gravitational field, and so on. All these phenomena occur because of the levelling force and are associated with an entropy increase, which can be calculated through specific relationships.
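One such specific relationship, for the equilibration of pressure, is ΔS = nR ln(V₂/V₁) for the isothermal expansion of an ideal gas. A minimal sketch (the one-mole doubling scenario is an assumed example):

```python
import math

R = 8.314462618  # molar gas constant, J/(mol*K)

def entropy_of_isothermal_expansion(n_mol, v_initial, v_final):
    """Delta-S = n R ln(V2/V1) for an ideal gas at constant temperature."""
    return n_mol * R * math.log(v_final / v_initial)

# One mole of ideal gas doubling its volume gains R*ln(2) of entropy:
# the gas spreads into the newly available space, raising entropy.
delta_s = entropy_of_isothermal_expansion(1.0, 1.0, 2.0)
```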

##### Abstract

Entropy is maximal at equilibrium. According to the fundamental equation, this demands equilibration for every specific interaction term, namely thermal, mechanical, diffusive, and others. Relevant examples are illustrated for a number of important processes.

##### Abstract

Some phenomena, though representing instances of entropy change, appear to defy the description of entropy as a function of energy spreading and sharing. Despite its great utility, such a description is sometimes difficult to apply, because a function levelling down available energy does not always act exactly as a spreading and sharing function. The concept of a physical function levelling down available energy is therefore preferable for understanding entropy and the second law of thermodynamics, because it has a more general value.

##### Abstract

A few particular phenomena are quite difficult to frame within the fundamental equation; nonetheless, they can be interpreted in the light of the general idea of statistical mechanics that any system, and any overall change, tends to the most probable state, i.e., a state that is microscopically equilibrated and thus macroscopically stable.

##### Abstract

Entropy quantification can be performed under the assumption that both the position of a particle in space and its level of energy correspond to one among many enumerable states, even if their number is enormous. This means that, if absolute values of entropy are to be computed, neither energy nor space should be continuous variables, even though entropy changes can be calculated in any case. Remarkably, quantum theory says exactly that: at a very short scale, both energy and space seem to behave like discrete quantities rather than continuous ones. So, a general string theory, which represents the evolution of quantum theory, appears to be the natural, preferable theoretical framework for the definition of entropy.

##### Abstract

As a probabilistic law, the second law of thermodynamics needs to be conceptualized in terms of the probabilities of events occurring at the microscopic level, which determine the probability of occurrence of macroscopic phenomena. For the best comprehension of this approach, it is necessary to distinguish between “probabilities”, which are subjective predictions of an expected outcome, and “frequencies”, which are objective observations of that outcome. This distinction helps unravel some ambiguities in the interpretation of the second law of thermodynamics.

#### Outlines of a Verbal Account of the Thermodynamic Entropy for a Pedagogical Approach

Page: 158-189 (32)

Author: Alberto Gianinetti

##### Abstract

Starting from the observation of spontaneous phenomena, it can be envisioned that, with time, every isolated system tends to settle into the most equilibrated, stable, and inert condition, which, in the very long term, is the most probable state of the system. This can be shown to be a universal law, the second law of thermodynamics, defined as “the tendency to the most probable state”. It is then intuitive that “a function that measures the equilibration, stability, and inertness of a system” is what the second law maximizes. This function is called entropy.
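The “tendency to the most probable state” can be made tangible by counting microstates: for N particles free to occupy either half of a box, the number of arrangements with k particles on the left is the binomial coefficient C(N, k), which peaks sharply at the even split. A small sketch under that assumed two-box model:

```python
from math import comb

def multiplicity(n_particles, n_left):
    """Number of microstates with n_left of n_particles in the left half."""
    return comb(n_particles, n_left)

N = 100
counts = [multiplicity(N, k) for k in range(N + 1)]

# The even split (k = 50) has by far the most microstates, so it is
# the most probable macrostate; "all particles on one side" is so
# improbable (1 in 2**100) that it is never observed in practice.
most_probable = counts.index(max(counts))
all_left_odds = counts[0] / sum(counts)
```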

## Introduction

The second law of thermodynamics is an example of the fundamental laws that govern our universe and is relevant to every branch of science exploring the physical world. This reference summarizes knowledge and concepts about the second law of thermodynamics and entropy. A verbal explanation of chemical thermodynamics is presented by the author, making this text easy to understand for chemistry students, researchers, non-experts, and educators.