Based on the provided sources, the answer depends heavily on whether one is discussing thermodynamic entropy or informational (Shannon) entropy, and the extent to which one accepts the theoretical unification of the two.

The sources present a divided perspective:

1. The Thermodynamic View: Entropy is Inseparable from Energy

In the realm of classical and statistical thermodynamics, the sources overwhelmingly argue that entropy cannot exist unconnected from energy. It is defined as a property of the distribution or quality of energy.

Entropy as Energy Dispersal: Harvey Leff and Frank Lambert argue that entropy and energy are “inextricably linked”[1]. Leff states that entropy has significance only for systems that store internal energy; classical mechanics (point particles with no internal energy) has no entropy[2]. Entropy is described as a measure of the dispersal or spreading of energy over accessible microstates[3][4].
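The standard statistical-mechanical expression makes this linkage concrete (a textbook formula, offered here as an illustration rather than a quotation from these sources):

$$ S = k_B \ln W $$

where W is the number of microstates accessible to the system and k_B ≈ 1.381 × 10⁻²³ J/K is Boltzmann’s constant. Since W counts the ways internal energy can be distributed, a system with no internal energy leaves S nothing to measure.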

Entropy as “Bound” Energy: T. Sherman and Nicholas Georgescu-Roegen define entropy as an index of “bound energy” (or unavailable energy)—energy that has been degraded and cannot perform work[5]. In this view, total energy consists of “free energy” (available for work) and “bound energy” (connected to entropy).
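This usage echoes the standard Helmholtz decomposition (stated here as background, not as a quotation from Sherman or Georgescu-Roegen):

$$ U = F + TS $$

where U is the total internal energy, F is the free energy available for work, and the TS term is the “bound” portion, which at a given temperature grows in direct proportion to entropy.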

Dimensionality: Critics of the information-theoretic view, such as Libb Thims and Jeffrey Wicken, emphasize that thermodynamic entropy has physical units of energy divided by temperature (J/K)[8][9]. They argue that separating entropy from these energy-based units creates a “category error” or semantic confusion[10][11].

2. The Informational View: Entropy as “Missing Information”

Arieh Ben-Naim and others argue that entropy can be conceptualized independently of energy transfer, primarily by viewing it as Shannon’s measure of missing information (uncertainty).

Independence from Energy Changes: Ben-Naim points to processes like the mixing of ideal gases or expansion into a vacuum (Joule expansion). In these isolated-system processes, entropy increases, yet there is no change in internal energy, no work done (W=0), and no heat transferred (Q=0)[12][13]. He argues this proves that changes in entropy do not necessarily involve energy interactions or temperature[14].
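A worked version of the free-expansion case makes this explicit (the standard ideal-gas result, consistent with the example Ben-Naim cites): for n moles of an ideal gas expanding from volume V1 to V2 in an isolated container,

$$ \Delta S = nR \ln\frac{V_2}{V_1}, \qquad \Delta U = 0, \quad Q = 0, \quad W = 0 $$

Doubling the volume yields ΔS = nR ln 2 > 0 even though no energy enters, leaves, or changes form.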

Dimensionless Entropy: Ben-Naim advocates for redefining temperature to have units of energy, thereby rendering entropy a dimensionless quantity (measurable in bits), identical to Shannon’s information measure[15][16]. In this framework, entropy is purely a measure of the probability distribution of states, regardless of whether those states are energy levels or coin flips.
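A minimal sketch of the calculation Ben-Naim has in mind: the same dimensionless formula applies whether the probabilities describe energy-level populations or coin flips (illustrative code, not drawn from any of the sources):

```python
import math

def shannon_entropy_bits(probabilities):
    """Shannon's measure H = -sum(p * log2(p)), in bits.

    Zero-probability outcomes are skipped and contribute nothing.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin: maximal uncertainty over two outcomes -> 1 bit.
print(shannon_entropy_bits([0.5, 0.5]))   # 1.0

# A hypothetical two-level system with populations 0.9 and 0.1:
# the same formula, with no energy units anywhere.
print(shannon_entropy_bits([0.9, 0.1]))   # ~0.469
```

The formula never asks what the outcomes physically are, which is precisely the sense in which the informational view detaches entropy from energy.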

Syntactic vs. Semantic: C. Joslyn distinguishes between syntactic entropy (statistical, content-free, applicable to any probability distribution) and semantic entropy (thermodynamic, physically measured)[17][18]. Syntactic entropy exists mathematically without connection to energy, but it does not necessarily obey the Second Law or predict physical observables unless interpreted thermodynamically.

3. The Physical Reconnection (Landauer’s Principle)

Even within the information-theoretic framework, several sources argue that “information is physical,” meaning that processing information (changing entropy) inevitably carries an energy cost.

Landauer’s Principle: Sources like Wolpert, Prokopenko, and Bennett cite Landauer’s principle, which states that logically irreversible operations (like erasing a bit of information) must dissipate a minimum amount of energy (kT ln 2) as heat[19][20]. This suggests that while abstract “information” might seem unconnected to energy, any physical instantiation or manipulation of that entropy requires work and energy exchange[21][22].
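For scale, the Landauer bound at room temperature is tiny but nonzero (a back-of-the-envelope sketch; T = 300 K is an assumed example temperature, not a figure from the sources):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 300.0           # assumed room temperature, K

# Minimum heat dissipated per bit erased, per Landauer's principle.
e_min = K_B * T * math.log(2)
print(f"{e_min:.3e} J per bit")  # ~2.871e-21 J
```

However a bit is physically stored, erasing it must dissipate at least this much heat.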

Maxwell’s Demon: The resolution of the Maxwell’s Demon paradox relies on the cost of information. The “demon” cannot reduce the entropy of a gas (separating hot and cold molecules) without paying an energy cost to acquire or erase information, thus satisfying the Second Law[23][24].

Summary

• **Thermodynamically:** No. Entropy is the measure of how energy is distributed. Without energy (and microstates for that energy to occupy), thermodynamic entropy is zero or undefined[2][25].

• **Informationally:** Yes (mathematically). One can calculate the Shannon entropy of a deck of cards or a string of binary digits without reference to energy[26]; see the sketch after this list.

• **Physically:** Debated. Authors like Ben-Naim argue that entropy is fundamentally information and that its energy units are a historical artifact[13]. Conversely, authors like Thims and Leff argue that stripping entropy of its energetic context renders it a “vacuous” concept inapplicable to physical reality[27][28].
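The deck-of-cards calculation mentioned above, as a sketch: with all 52! orderings equally likely, the missing information is log2(52!) bits, computable without a single energy term (illustrative code, not from the sources):

```python
import math

# Shannon entropy of a uniformly shuffled 52-card deck:
# H = log2(52!) bits of missing information about the ordering.
h_deck = math.log2(math.factorial(52))
print(f"{h_deck:.1f} bits")  # ~225.6 bits
```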