Based on the provided sources, non-ergodic systems affect the definition, calculation, and evolution of entropy in the following ways:

1. Breakdown of Standard Equilibrium Assumptions

In standard statistical mechanics, a system is ergodic if it visits all accessible microstates over time, or equivalently if the time average of an observable equals its ensemble average[1],[2]. This “mixing” ensures that a system maximizes its entropy by spreading across the entire available phase space[3].
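For an ergodic process, the equality of time and ensemble averages can be checked numerically. The sketch below (a toy two-state Markov chain, not from the sources) compares the occupancy of state 1 along one long trajectory with the analytically known stationary probability:

```python
import random

# Toy two-state Markov chain: P(0->1) = p, P(1->0) = q.
# Its stationary (ensemble) probability of being in state 1 is p/(p+q).
p, q = 0.3, 0.2
stationary = p / (p + q)  # ensemble average occupancy of state 1

random.seed(0)
state, visits, n = 0, 0, 200_000
for _ in range(n):
    if state == 0:
        state = 1 if random.random() < p else 0
    else:
        state = 0 if random.random() < q else 1
    visits += state

time_average = visits / n  # fraction of steps spent in state 1
print(round(stationary, 3), round(time_average, 3))
```

Because the chain can reach every state from every other state, a single long trajectory samples the whole (here two-point) state space, and the two averages agree.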

In non-ergodic systems, this assumption fails:

Restricted Phase Space: The system does not visit all possible microstates. Instead, the phase space is effectively partitioned into separate regions (or “basins”) that do not communicate with one another[4].

Effective Symmetry Breaking: This partitioning often results from high energy barriers or phase transitions, causing the system to remain in a specific region for a time scale much longer than the observation time. This is termed “effective ergodicity breaking”[4]. Consequently, the system does not reach a global maximum entropy state but may settle into a local maximum within its restricted partition[4],[5].
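A minimal sketch of such a partitioned process (an illustrative assumption, not a model from the sources): at t=0 a hidden “basin” is chosen once, a coin of bias 0.1 or 0.9, and never revisited. The ensemble average is 0.5, but no individual trajectory's time average ever approaches it:

```python
import random

# Non-ergodic toy process: the basin (coin bias) is frozen at t=0 and
# the trajectory never crosses into the other basin.
random.seed(1)
biases = [0.1, 0.9]
ensemble_mean = sum(biases) / len(biases)  # 0.5

def trajectory_time_average(n=100_000):
    bias = random.choice(biases)  # one-time "symmetry-breaking" event
    flips = sum(random.random() < bias for _ in range(n))
    return flips / n

averages = [trajectory_time_average() for _ in range(5)]
print(ensemble_mean, [round(a, 2) for a in averages])
```

Each time average settles near 0.1 or 0.9, the local value of its own basin, never near the global ensemble mean of 0.5.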

2. Entropy Calculation via Ergodic Decomposition

Because a non-ergodic system cannot be described by a single global probability distribution that evolves uniformly, its entropy is calculated differently. Information theory handles non-ergodic processes through ergodic decomposition:

Weighted Averages: The entropy rate of a stationary non-ergodic source is not a simple value but is defined as the integral (or weighted sum) of the entropy rates of its individual ergodic components[6],[7].
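The weighted-sum definition can be made concrete with a hypothetical two-component source: with weight w_i the process is an IID Bernoulli(p_i) coin. The decomposed entropy rate is the weighted sum of the component rates, which differs from the naive entropy of the mixed one-symbol marginal:

```python
import math

def h(p):
    """Binary entropy in bits of a Bernoulli(p) source."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Hypothetical stationary non-ergodic source with two ergodic components.
weights = [0.5, 0.5]
ps = [0.1, 0.9]

# Ergodic decomposition: entropy rate = weighted sum of component rates.
rate = sum(w * h(p) for w, p in zip(weights, ps))

# Entropy of the mixed one-symbol marginal (always >= the true rate).
marginal = h(sum(w * p for w, p in zip(weights, ps)))

print(round(rate, 3), round(marginal, 3))  # 0.469 vs 1.0
```

The gap (0.469 vs 1.0 bits here) is exactly the uncertainty about *which* ergodic component the system is in, which is resolved once and then frozen, so it does not contribute to the per-symbol rate.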

Sample Entropy Convergence: For a non-ergodic process, the sample entropy (entropy calculated from a specific observed sequence) does not converge to the global system entropy. Instead, it converges to the entropy rate of the specific ergodic component in which the system happens to be trapped[8].
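This convergence behavior can be simulated. In the sketch below (a hypothetical mixture of Bernoulli(0.2) and Bernoulli(0.5) components), the plug-in entropy estimated from one observed sequence matches the rate of the trapped component, not the decomposed rate of the whole source:

```python
import math
import random

def h(p):
    """Binary entropy in bits of a Bernoulli(p) source."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

random.seed(2)
components = [0.2, 0.5]
decomposed_rate = 0.5 * h(0.2) + 0.5 * h(0.5)  # global entropy rate

bias = random.choice(components)  # the component this realization is trapped in
n = 200_000
ones = sum(random.random() < bias for _ in range(n))
sample_entropy = h(ones / n)      # plug-in estimate from the one sequence we saw

print(round(sample_entropy, 3), round(h(bias), 3), round(decomposed_rate, 3))
```

Whichever component is drawn, the sample entropy lands on that component's rate (h(0.2) ≈ 0.722 or h(0.5) = 1.0) rather than on the weighted average (≈ 0.861).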

3. Path Dependence and History (The “Fourth Law”)

In complex, evolving systems (such as biological evolution), non-ergodicity implies that history matters.

Expanding State Space: In these systems, the space of possible states (e.g., genetic combinations) grows exponentially, far faster than the system can explore it. The probability of the system returning to any previously visited state is therefore arbitrarily close to zero[9].
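A back-of-the-envelope calculation (with assumed, purely illustrative numbers) makes the scale of this mismatch concrete:

```python
# Illustrative arithmetic: a genome of L two-state sites has 2**L
# configurations. Even sampling at an absurd rate for the age of the
# universe, the explorable fraction is effectively zero, so a return
# to any particular past state is overwhelmingly unlikely.
L = 200                       # sites (hypothetical)
states = 2 ** L               # possible configurations (~1.6e60)
rate = 10 ** 9                # states sampled per second (generous)
age_universe_s = 4.35e17      # ~13.8 billion years in seconds
visited = rate * age_universe_s
fraction = visited / states
print(f"{states:.3e} states, fraction explorable ~ {fraction:.1e}")
```

Even under these generous assumptions, the fraction of the space that could ever be visited is on the order of 1e-34, which is why such systems are non-ergodic in any practical sense.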

Freezing of Accidents: Because the system cannot explore all possibilities to find the globally optimal (highest-entropy or highest-fitness) state, it can become “trapped” at a local optimum. In evolutionary terms, features that are functional but not necessarily optimal can become fixed in a population. This prevents the system from reaching a global equilibrium[5].
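This trapping can be sketched with a toy rugged-landscape model (my own illustration, not a model from the sources): a random fitness value for each 10-bit genotype, and a greedy one-bit hill climb that halts at the first local peak it reaches, which need not be the global one:

```python
import random

# Toy "freezing of accidents": greedy adaptive walk on a random fitness
# landscape over 10-bit genotypes. The walk stops at the first local
# peak, so the endpoint depends on the historical starting accident.
random.seed(3)
N = 10
fitness = {g: random.random() for g in range(2 ** N)}

def neighbors(g):
    return [g ^ (1 << i) for i in range(N)]  # all one-bit mutants

g = random.randrange(2 ** N)  # historical accident: the starting genotype
while True:
    best = max(neighbors(g), key=fitness.get)
    if fitness[best] <= fitness[g]:
        break                 # local peak reached: the accident is frozen
    g = best

global_peak = max(fitness, key=fitness.get)
is_local_peak = fitness[g] >= max(fitness[n] for n in neighbors(g))
print(is_local_peak, g == global_peak)
```

The walk is guaranteed to end on a local peak; whether that peak is the global optimum depends entirely on where the walk happened to start, which is the path dependence described above.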

Irreducibility: This non-ergodic history creates information that is “singular” and algorithmically irreducible, distinguishing it from standard thermodynamic ensembles[10].

4. Theoretical Distinctions

There is historical nuance in how non-ergodicity relates to entropy:

Boltzmann vs. Ehrenfest: Historical confusion surrounds the definition of ergodicity. Boltzmann used the term Ergoden to refer to what is now called a microcanonical ensemble (a collection of configurations), whereas the modern requirement that a single trajectory visit every point in phase space (attributed to Ehrenfest) is mathematically impossible to satisfy for many systems[11],[12].

Generalization: To handle non-ergodic systems in thermodynamics, researchers have proposed generalizations, such as the “Fourth Law of Thermodynamics,” which attempts to define a metric for steepest entropy ascent even in non-equilibrium, non-ergodic contexts[13].

In summary, in non-ergodic systems, entropy does not simply maximize globally; it becomes a path-dependent property determined by the specific sub-region of phase space the system occupies, requiring mathematical decomposition to be accurately measured[4],[7].