## Abstract

This commentary on thermodynamics in solid mechanics aims to provide an overview of the main concepts of thermodynamic processes as they apply to, and may be exploited for, studies in nonlinear solid mechanics. We give a descriptive commentary on the (physical) interpretation of these concepts, and relate these where appropriate to the behaviour of solids under thermo-mechanical conditions. The motivation is firstly that students of solid mechanics have often had less exposure to thermodynamics than those in other branches of science and engineering, yet there is great value in analytical formulations of material behaviour derived from the principles of thermodynamics. It also sets the contributions in this Theme Issue in context. Along with the deliberately descriptive treatment of thermodynamics, we do outline the main mathematical statements that define the subject, knowing that full details are provided by the authors in their corresponding contributions to this issue. The commentary ends on a lighter note. In order to aid understanding and to stimulate discussion of thermodynamics in solid mechanics, we have invented a number of very basic and completely fictitious materials. These have strange and extreme behaviours that illustrate certain thermodynamic concepts, such as entropy, in isolation from the complexities of real material behaviour.

## 1. Introduction

At one time, thermodynamics was criticized for lacking the richness of continuum mechanics as a basis for the study of the deformational response of solids. However, its value as a foundation for nonlinear solid mechanics has been established over the past 30 years, following the pioneering work of Coleman & Gurtin (1967), Lubliner (1972) and Truesdell (1984). Advances in the development of plasticity and damage models have been particularly fruitful, with major fundamental contributions from Ortiz (1985), Simo & Ju (1987), Simo & Miehe (1992), Hansen & Schreyer (1994) and Lemaitré (1996), for example. This Theme Issue as a whole aims, in part, to highlight the many avenues yet to be fully exploited by this versatile theory.

Thermodynamics is systems analysis. With that straightforward perspective, its value in nonlinear solid mechanics becomes clearer. In other words, when we are dealing with dynamical processes that themselves involve time evolution of strain (and stress), heat generation, conduction and dissipation to neighbouring domains, an analysis of the relevant systems becomes a powerful tool. Nonetheless, there are not as many applications of the theory as there might be. This is perhaps because the physical interpretation of its basic principles when applied to the deformation of solids is still a source of misunderstanding.

This introduction to the Theme Issue ‘Thermodynamics in solid mechanics’ is not intended to reproduce the formal mathematical foundations of the subject—the reader would be better served by studying the excellent texts of, say, Coleman & Noll (1963) and Maugin (1992). Rather, we review some of the major concepts that are important in applying thermodynamics to problems in solid mechanics. The objective is to provide a commentary on the (physical) meaning of these concepts as may be appropriate in the deformational response (and degradation) of quasi-brittle and/or ductile materials. It is hoped that further research will be encouraged by this treatment, and that new lines of enquiry may be opened, even unlocked, through the application of thermodynamics.

The papers that follow are independent but with the underlying theme that they address aspects of thermodynamics in solid mechanics. They have been arranged to provide a flow from theoretical considerations of thermodynamic stability (Petryk 2005), generalized constitutive developments and continuum mechanics (Schreyer & Maudlin 2005), through to thermodynamic aspects of particular materials such as generalized granular media (Suiker & de Borst 2005), concrete (Lemarchard *et al*. 2005; Baker & de Borst 2005), and steels (Vakili-Tahami *et al*. 2005).

## 2. Equilibrium thermodynamics (thermostatics)

All analysts are accustomed to the idea of equilibrium. Thermodynamics also uses equilibrium as a starting point. But, we should first define the notion of thermodynamic equilibrium, and then note what happens in non-equilibrium thermodynamics.

A finite system is one which can be described by a set of independent state variables, and so any characteristic of the system becomes a function of that state. We will introduce certain important state variables in this commentary.

A system is in thermodynamic equilibrium if the state variables associated with the system do not change with time. In classical terms, that means that the temperature of the internal system equals that of the exterior. Thermodynamics thus concerns the exchange of heat (and energy) from the system to the exterior. Thermostatics compares systems in thermodynamic equilibrium, and considers the transition from one state of equilibrium to another. Thermodynamics concerns the study of phenomena that occur outside the state of equilibrium.

These have obvious analogues in basic mechanics. A structure in static equilibrium can move to another equilibrium state under a change of forces. The movement from one state to the next is a dynamic process. Similarly, a system in thermostatic equilibrium can move to another state of thermostatic equilibrium under a change in thermodynamic forces and conditions. The transformation between states is a thermodynamic process.

We will begin by discussing the thermodynamic principles that arise from thermodynamic processes between two states that are in thermodynamic equilibrium. Fundamental to thermodynamic processes are the concepts of work and heat. But we shall observe that heat in nonlinear material phenomena is a more general quantity than might be required for heat transfer-type studies.

While we limit the analytical treatment to the conceptual, many of the details have been taken from the excellent exposés by Maugin (1992) and Stabler (2000).

### (a) Energy

Analysts are conversant with the notion of energy and the application of energy principles. Here we need to be quite specific in defining the types of energy available in a system, and how they may be used. Importantly, we must distinguish between internal energy and free energy. In any nonlinear dynamic system there will be a potential energy of the applied actions, *V*, and kinetic energy of the dynamical system, *T*: everything else is internal energy, *E*. That is, we write the total energy of a system as

*Π*=*E*+*T*+*V*. (2.1)

Internal energy typically includes strain energy absorbed by the deformed solid, ‘plastic work’ under strain hardening, heat generated, and sound released by cracking. The quantity depends on the boundaries set for the system.

Heat is defined as the transfer of energy across the boundary of a system due to a temperature difference, but this does not mean that it is only a thermal quantity. If temperature induces damage in a solid, for example, then ‘heat’ would be generated. However, this does not mean that damage necessarily makes the system get hot! On the other hand, in plasticity we are accustomed to the notion of heat generated during large plastic flow, as in necking of a steel bar, where the material does actually heat up.

The change in total energy associated with the transformation from one thermodynamic state to another is just the exchange of heat, *Q*, and work, *W*, between the system and the exterior. This is a simple energy balance, and is in fact a statement of the first law of thermodynamics. That is, taking *Q* and *W* as positive when received by the system, the energy change between two thermodynamic states is

*Π*_{2}−*Π*_{1}=*Q*+*W*, (2.2)

and should be independent of the path taken. We can also consider the change in energy due to an infinitesimal transformation as d*Π*=d*Q*+d*W*, where d*Q* and d*W* are the contributions of elementary heat and work, respectively.
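
For readers who like to see the bookkeeping explicitly, the path independence asserted by equation (2.2) can be sketched in a few lines of Python; the numbers are invented purely for illustration:

```python
# Hypothetical numbers: two different paths between the same pair of
# thermodynamic states, exchanging different amounts of heat and work.
def energy_change(heat_in, work_in):
    """First law (equation (2.2)): the energy change is Q + W,
    with both taken positive when received by the system."""
    return heat_in + work_in

path_a = energy_change(heat_in=120.0, work_in=-20.0)  # heat in, work done by the system
path_b = energy_change(heat_in=70.0, work_in=30.0)    # less heat, work done on the system
assert path_a == path_b == 100.0  # same state change, independent of path
```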

Energy is defined by a function of a set of *independent* state variables. It is then appropriate to consider the differential increment in total energy as a chain of partial derivative terms, each with respect to a state variable. Each of the terms must then represent a contribution (or component) of elementary heat or work.

### (b) Thermodynamic potential and property

Thermodynamic potentials are quantities from which all characteristics of the system can be deduced. Energy is the obvious thermodynamic potential. We see later that specific forms of energy are often the more suitable choices.

Thermodynamic potential is defined as a function of a set of *independent* state variables. Some obvious examples might include plastic strain or a damage variable. Associated with this set of independent state variables, there is a set of dependent variables, called thermodynamic properties. These play a duality-type role in that each state variable has a thermodynamic property, and it is occasionally desirable to reverse their roles.

Briefly, if we write energy as a function of the state variables, *Π*=*Π*(*χ*_{1},*χ*_{2},…,*χ*_{n}), then the chain rule yields

d*Π*=Σ_{j}(∂*Π*/∂*χ*_{j})d*χ*_{j}. (2.3)

In this very general expression, the derivatives ∂*Π*/∂*χ*_{j} are the thermodynamic properties (sometimes called thermodynamic forces) associated with each independent variable *χ*_{j}.

### (c) Entropy

Associated with every thermodynamic state, there is a level of entropy. Most students have difficulty with the concept of entropy, and the misconception often taints their view of the whole subject. But they might take solace from Truesdell's (1966) insightful assertion:1

> As mechanics is the science of motions and forces, so thermodynamics is the science of forces and entropy. What is entropy? Heads have split for over a century trying to define entropy in terms of other things. Entropy, like force, is an undefined object, and if you try to define it, you will suffer the same fate as the force-definers of the seventeenth and eighteenth centuries: either you will get something too special or you will run around in a circle.

Curiously, we rarely attempt to define entropy *per se*, but rather the change in entropy. Entropy is merely a condition, or property, of the state of a system. It is possible to think of entropy in a material as a tendency towards disorder. But is a gain in entropy a good thing or a bad thing? This is probably a more significant question. Actually, it is neither.

Clausius coined the term entropy from the Greek for transformation. He developed the concept to quantify the heat exchanged in a reversible process at constant temperature. The change in entropy, *S*, from one state to another is represented by the integral of heat exchange over temperature

*S*_{2}−*S*_{1}=∫d*Q*/*Θ*, (2.4)

where d*Q* is actually the heat entering the system (body) across all boundaries in an isothermal process, and *Θ* is the temperature. This leads simply to d*Q*=*Θ* d*S* for a reversible process. More correctly, we write equation (2.4) with the closed boundary integral

*S*_{2}−*S*_{1}=∮(d*q*/*Θ*)d*s*, (2.5)

where d*q* is the inward normal heat flux, and *s* is the boundary coordinate.
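
As a classical illustration of equation (2.4), consider a body heated reversibly with an assumed constant specific heat *c*, so that d*Q*=*mc* d*Θ* and the entropy change integrates to *mc* ln(*Θ*_{2}/*Θ*_{1}). The sketch below checks this closed form against a direct numerical integration of d*Q*/*Θ*; the material values are hypothetical:

```python
import math

# Reversible heating of a body with (assumed) constant specific heat c:
# dQ = m*c*dT, so equation (2.4) gives S2 - S1 = m*c*ln(T2/T1).
def entropy_change(mass, c, T1, T2):
    return mass * c * math.log(T2 / T1)

# Direct midpoint integration of dQ/T as a check on the closed form.
def entropy_change_numeric(mass, c, T1, T2, steps=100_000):
    dT = (T2 - T1) / steps
    return sum(mass * c * dT / (T1 + (i + 0.5) * dT) for i in range(steps))

m, c = 1.0, 460.0        # hypothetical bar: 1 kg of steel, c ~ 460 J/(kg K)
T1, T2 = 293.15, 323.15  # heated from 20 degC to 50 degC
exact = entropy_change(m, c, T1, T2)
approx = entropy_change_numeric(m, c, T1, T2)
assert abs(exact - approx) < 1e-3  # both approximately 44.8 J/K
```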

Entropy is thus associated with a change in state, and plays a crucial role in thermodynamics since it can distinguish between work and heat. That in itself leads to the second law of thermodynamics. The second law recognizes that heat and work are not equivalent because mechanical energy can be completely converted to thermal energy, but thermal energy can only be partially converted to mechanical energy during any cyclic process. There must then be dissipation!

Entropy is in fact the appropriate state variable governing internal energy, associated with the transfer of heat. The elementary exchange of heat is thus

d*Q*=(∂*E*/∂*S*)d*S*. (2.6)

The latter term comes by comparison with equation (2.4), so that we deduce

*Θ*=∂*E*/∂*S*, (2.7)

where *Θ* is the absolute or thermodynamic temperature. In other words, temperature is the thermodynamic property associated with the state variable of entropy.

The incremental change in energy, d*Π*, involves the sum of many partial differential terms, the first of which is the elementary heat exchange of equations (2.4) and (2.6): the remainder must be the change in work. Since the state variables are independent, work is independent of entropy, and so entropy separates heat and work in energy exchange. Furthermore, the rate of heat exchange is thus d(*ΘS*)/d*t*, and so the total heat exchanged is then *ΘS*.

### (d) System and boundary definition

It is clear that the choice of system to investigate, i.e. the definition of domain and boundary of the system, is very important in interpreting thermodynamic processes.

Let us consider a very basic thermal problem. We take a steel bar perfectly insulated along its length and apply heat at one end until the temperature reaches, say, 50 °C and the other end is just above room temperature. We consider steady state conditions where the temperature profile is not changing, i.e. thermostatic equilibrium. By definition, steady state means that heat in at one end equals heat out at the other. Else there would be a net driving force that would change the state, i.e. change the temperature profile.

But steady state conditions in the bar should also mean that the entropy of the bar should be constant. What then of the entropy formula in equation (2.5), since clearly the value is changing in this example? Firstly, the boundary integral reduces to the difference of the terms d*q*/*Θ* at each end. The heat in and out is the same, and so too are the fluxes, but the boundary temperatures are different. That suggests that *S*_{2}−*S*_{1}≠0.

However, it is the entropy of the system, *not* the bar, which is increasing constantly over time. We can in fact construct a system (a closed room for example) around the bar, which includes the heat source and sink that control the temperatures at each end. It is the entropy of this system which changes. It is interesting that we can construct the closed system without actual heat crossing the boundary. Our system includes source, bar and sink. The rate of entropy production at the source is lower than the entropy production at the sink, since the source is at a higher temperature than the sink. Furthermore, the non-zero value from equation (2.5) might be conceptualized as the ‘entropy convection’ or ‘entropy throughput’ across the bar, rather than some change in the entropy of the bar itself.
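
The entropy bookkeeping for this source–bar–sink system can be sketched numerically (the heat flow and temperatures below are invented for illustration): in steady state the bar merely passes entropy through, while the system as a whole produces entropy at a positive rate:

```python
# Steady heat flow q (W) from a source at T_hot, through the bar, to a sink
# at T_cold. The bar's own entropy is constant; the closed system's is not.
def system_entropy_rate(q, T_hot, T_cold):
    source_rate = -q / T_hot  # source gives up entropy at the higher temperature
    sink_rate = q / T_cold    # sink receives entropy at the lower temperature
    return source_rate + sink_rate

# Hypothetical values: 10 W driven from 50 degC down to just above room temperature.
rate = system_entropy_rate(q=10.0, T_hot=323.15, T_cold=295.0)
assert rate > 0.0  # net entropy production: the second law at work
```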

We do not often need to calculate the value of entropy for a material during computation, as its use lies more in formulation. But it may have a useful role in certain computations. Hence, the reader might finally enquire whether equation (2.5) can be used to calculate the value of entropy for the bar itself once it reaches that steady state. We could set our ‘zero entropy’ at time zero for this heating process. Then as heat is applied to one end of the bar and the temperature increases, we could evaluate the integral until steady state is reached. The final value of *S*_{2} from equation (2.5) would be the entropy of the bar at the steady state. We would only then be concerned with a change in entropy of the bar if the bar changed state through some other process.

A point to note for the analyst is that the applied temperature boundary condition has to be slightly above the temperature just inside the boundary so that heat can be driven across it; if the temperatures were identical, then no heat would be exchanged. There would thus be a net heat received by the system at the heated end, equilibrium would not yet have been reached, and entropy would continue to change.

We might return briefly to the Clausius definition of entropy and the formula in equation (2.5) and ask what an isothermal process might be in the mechanical deformation of a material. Consider a second simple example of a testing regime where a sample is being extended in a testing machine beyond its elastic limit. We consider a control mechanism that maintains the temperature of the material at a constant value, as distinct from the normal process of constant applied load or constant strain rates.

This is an isothermal process where the material is changing its state over time. This might be achieved by varying the applied stress through the machine, or by some process of extracting/exchanging heat. Also, depending on how the system was maintained at constant temperature, we could have either an isentropic process (no change in entropy) or one where entropy changed.

### (e) Enthalpy and free energy

We introduce the concept of free energy, being internal energy less heat exchanged, by a Legendre transformation

*F*=*E*−*ΘS*. (2.8)

Then, by simple differentiation, noting that *E* depends on *S* not *Θ*, we have

*S*=−∂*F*/∂*Θ*. (2.9)

The consequence is that temperature becomes the independent variable for free energy and entropy its thermodynamic property. Put another way, as for a fluid flow, temperature is the forcing (or intensive) variable and entropy the quantitative (or extensive) coordinate.

This form of free energy, known as the Helmholtz free energy, is dominant in the study of solids. For a solid, the internal energy is a function of entropy, strain, *ϵ*, and independent variables specific to the problem: *E*=*Ê*(*S*,*ϵ*,…). Thus, the common stress–strain laws are

*σ*=∂*E*/∂*ϵ*=∂*F*/∂*ϵ*. (2.10)

Enthalpy is a concept more suited to gases, but it may play a role in the study of porous media, and pressures in partial phases of water and steam (or humidity), for example. When the internal energy is a function of volume, e.g. *E*=*Ê*(*S*,*Ω*,…), and pressure is its thermodynamic property, then

*p*=−∂*E*/∂*Ω*. (2.11)

In such cases, it is convenient to introduce enthalpy, *H*, as the thermodynamic potential, defined by

*H*=*E*+*pΩ*. (2.12)

When applied to solids, the internal energy is taken as a function of strain, not volume, for which stress is the thermodynamic property, though clearly there is a relationship. Then

*σ*=∂*E*/∂*ϵ*, (2.13)

which is consistent with equation (2.11) because tensile stress is taken here as the positive quantity, whereas pressure, being compressive, would be negative. Enthalpy for a unit volume is then written

*H*=*E*−*σ* : *ϵ*. (2.14)

Free energy is effectively that which can be released to the system. It is the quantity most often used to begin constitutive modelling. The most common descriptions of free energy are the Gibbs and Helmholtz functionals.

Like enthalpy, Gibbs free energy can be applied to systems where the volume of a system is a state variable, and pressure is its thermodynamic property. We apply a transformation to internal energy so that temperature and pressure become the independent variables, and entropy and volume become the thermodynamic properties,

*G*=*E*+*pΩ*−*ΘS*. (2.15)

For a solid, we might again invoke the stress/strain dependence and write, per unit volume,

*G*=*E*−*σ* : *ϵ*−*ΘS*. (2.16)

The quantities *pΩ* and *σ* : *ϵ* might both be seen as work-like quantities that play similar roles in transforming internal energy to enthalpy and then to free energy. For studies of deformation of porous media, where the effects of pressure in the (unsaturated) fluid phase and stress in the skeleton are coupled, one may wish to include both quantities (adjusted for volume representation) in the free energy functional. Recognizing that change in volume is volumetric strain, the expression can easily be constructed to yield the classic effective stress concept of soil mechanics. However, Coussy (1995) warns that constitutive development based on simple phenomenological concepts of effective stress might fail to capture the complexity of porous media, and suggests a more complete thermodynamic basis.

It is clear from equations (2.8) and (2.16) that *F* and *G* (taken per unit volume) form complementary energies,

*F*−*G*=*σ* : *ϵ*, (2.17)

from which the duality of constitutive descriptions follows,

*σ*=∂*F*/∂*ϵ* and *ϵ*=−∂*G*/∂*σ*. (2.18)
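
The duality in equation (2.18) can be illustrated with a minimal one-dimensional linear elastic sketch, taking an assumed free energy density *F*(*ϵ*)=*E*_mod *ϵ*²/2 (here *E*_mod is a hypothetical Young's-modulus-like stiffness, not the internal energy of the text), so that the Gibbs form is *G*(*σ*)=−*σ*²/2*E*_mod; the derivative relations are checked by finite differences:

```python
# One-dimensional linear elastic sketch of equations (2.17)-(2.18).
E_mod = 200.0e9  # Pa, hypothetical steel-like stiffness

def F(eps):
    """Helmholtz free energy density F(eps) = E_mod*eps^2/2 (assumed model)."""
    return 0.5 * E_mod * eps ** 2

def G(sigma):
    """Gibbs free energy density, the Legendre transform G = F - sigma*eps."""
    return -0.5 * sigma ** 2 / E_mod

def ddx(f, x):
    """Central finite difference, step scaled to the argument."""
    h = 1e-6 * max(1.0, abs(x))
    return (f(x + h) - f(x - h)) / (2.0 * h)

eps = 1.0e-3
sigma = ddx(F, eps)        # sigma = dF/d(eps), first relation of (2.18)
eps_back = -ddx(G, sigma)  # eps = -dG/d(sigma), the dual relation
assert abs(sigma - E_mod * eps) < 1.0
assert abs(eps_back - eps) < 1e-9
assert abs((F(eps) - G(sigma)) - sigma * eps) < 1e-3  # F - G = sigma*eps, cf. (2.17)
```

Reversing the roles of *σ* and *ϵ* in this way is exactly the exchange of state variable and thermodynamic property noted in section 2b.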

## 3. Non-equilibrium thermodynamics (thermodynamic processes)

The issue in non-equilibrium thermodynamics is how to deal with systems that are changing, in the thermodynamic sense. Here we wish to outline how the concepts and principles from thermostatics might be applied to states that are not in thermodynamic equilibrium—the reader may refer to the texts like Kondepudi & Prigogine (1998) for a fuller account. We also wish to draw some analogies with mechanical statics and dynamics, concepts more familiar to an engineer. In that sense, we write some standard equations of thermodynamics in alternate forms to those in the main articles of this issue, in order to give a different perspective.

There are three main approaches. The first (classical) approach sees a thermodynamic process as a series of thermostatic equilibria. The second and third use the rational theory of Coleman & Noll (1963) and the theory of internal variables (Lubliner 1972), respectively.

The most important assumption in each case is that thermodynamic temperature and entropy can be used in non-equilibrium processes. But this is not an unusual approach in analysis. Mass is strictly applicable to bodies at rest, but we use it to describe the motion of a particle. Only at high velocity does the approximation break down: so it is with the concepts of thermodynamic temperature and entropy.

Intrinsic variables in solid mechanics are generally taken as relevant quantities per unit volume. For example, the variable associated with transfer of heat is the entropy per unit volume, *η*, and temperature is again its thermodynamic property. With that notion, the internal and free energy relationships and those governing the evolution of systems (Clausius inequality) all follow. The formal statements that ensue will be well described through the articles that follow in this issue.

We will consider the Helmholtz form of free energy for purposes of explanation. We write the internal and Helmholtz energy per unit volume as *e* and *ψ*, respectively, as well as the absolute temperature, *θ*, and entropy per unit volume, *η*. Then the Helmholtz energy is *ψ*=*e*−*θη*. The important result for our purposes then derives from the time derivative of this Legendre transformation

d*ψ*/d*t*=d*e*/d*t*−*η*(d*θ*/d*t*)−*θ*(d*η*/d*t*). (3.1)

An alternative form is to write

d*ψ*/d*t*=d*e*/d*t*−d(*θη*)/d*t*, (3.2)

where

d(*θη*)/d*t*=d*e*/d*t*−d*ψ*/d*t* (3.3)

is effectively the interchange between free and internal energy, i.e. free energy lost to the internal store. The term d(*θη*)/d*t* is actually the rate of heat exchange, and *θη* can be considered a kind of thermodynamic momentum.

### (a) Dissipation

For a reversible process, the change in entropy is due only to the input of heat to the system. By contrast, for an irreversible process (intrinsic changes to material structure, for example), the change in entropy results from (pure) heat input and the irreversibilities that occur. Then, for an isothermal process, we can write the inequality

*S*_{2}−*S*_{1}≥*Q*/*Θ*, (3.4)

where *Q* is the heat received across all boundaries during the transformation. More precisely, we apply the divergence theorem to the closed boundary integral in equation (2.5) and write quantities per unit volume, which leads to the Clausius inequality for deformable solids

d*η*/d*t*≥*r*/*θ*−∇·(*q*/*θ*), (3.5)

where *r* is an internal point heat source per unit volume, and *q* is the heat flux. In other words, the total heat exchanged is greater than that introduced through a source and that resulting from spatial distributions.

The difference is dissipation. This consists of mechanical and thermal dissipation, being heat-like quantities lost from the system due to plasticity and damage, for example.

To balance the inequality equation (3.5), we simply add the dissipation term to the right-hand side, which leads to the thermal balance (or equilibrium) equation (3.8*b*) below. In so doing, we find

*θ*(d*η*/d*t*)=*r*−*θ*∇·(*q*/*θ*)+*φ*. (3.6)

Since dissipation balances the heat exchanged, it must be non-negative, and the resulting dissipation inequality, *φ*≥0, governs the evolution of thermodynamic systems. The expansion of the final term in equation (3.6) reveals the dissipation, *φ*, to be an internal, or thermo-mechanical, dissipation plus that due to heat conduction,

*φ*=*φ*_{mech}−(1/*θ*)*q*·∇*θ*. (3.7)

Combination of the above expressions leads to the Clausius–Duhem inequality, which is a useful representation of the second law of thermodynamics. There are several forms of these expressions. In the papers that follow, internal dissipation will be shown to be the product of the rate of change of intrinsic quantities (like damage and plasticity variables) and their thermodynamic properties.
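
As the simplest instance of that product structure, consider an assumed rigid-plastic bar with yield stress *σ*_{y}: the internal dissipation is the thermodynamic force (the stress) times the rate of the internal variable (the plastic strain rate), and it is non-negative whatever the direction of flow:

```python
# Assumed rigid-plastic bar with yield stress sigma_y: during flow the stress
# sits on the yield surface, co-directional with the plastic strain rate.
sigma_y = 250.0e6  # Pa, hypothetical yield stress

def dissipation(eps_p_rate):
    """Internal dissipation = thermodynamic force * rate of internal variable."""
    sigma = sigma_y if eps_p_rate >= 0.0 else -sigma_y
    return sigma * eps_p_rate

for rate in (1.0e-4, -3.0e-4, 0.0):
    assert dissipation(rate) >= 0.0  # non-negative for any direction of flow
```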

### (b) Balance equations

Again, the purpose is not to present all mathematical background in this commentary, but the discussion is assisted by a physical review of the basic balance equations on which thermodynamics rests.

The balance equations for mechanical momentum and thermal equilibrium are

*ρ*(d*V*/d*t*)=∇·*σ*+*b*, (3.8*a*)

*θ*(d*η*/d*t*)=−∇·*q*+*r*+*φ*_{mech}, (3.8*b*)

where *V* is the displacement rate (or velocity), *ρ* the mass density, *σ* the local stress, *b* a point body force, *q* the heat flux, *r* an *internal* point heat source per unit volume, and *φ*_{mech} the mechanical dissipation.

In that sense, we see entropy change driven by a heat source, dissipation and the divergence of heat flux. We can also recast equation (3.8*b*) by invoking equation (3.2),

d(*θη*)/d*t*=*η*(d*θ*/d*t*)+*r*+*φ*_{mech}−∇·*q*. (3.9)

The ‘driving force’ in equation (3.9) is made up from the rate at which free energy is locked within internal energy, tempered by dissipation from the system, and a point heat source, *r*. In other words, the force is a rate of energy transfer, i.e. heat source and sink.

If there were no ‘locking’ of free energy with respect to the internal energy, then *η*(d*θ*/d*t*)=0, and if there were no dissipation, then the thermal momentum looks very analogous to the mechanical momentum in equation (3.8*a*): d(*θη*)/d*t*=−∇·*q*+*r*.

That is, temperature is driven by the divergence of heat flux and a point heat source, just as displacement (velocity) is driven by the divergence of stress and a point body force.
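
That analogy can be made concrete with a minimal explicit finite-difference step for one-dimensional conduction (all values below are hypothetical): the temperature update at each node is driven by the discrete divergence of the heat flux plus a point source:

```python
# Explicit finite-difference step of theta_t = alpha*theta_xx + source,
# with hypothetical values; the source is lumped as degrees per step at one node.
alpha, dx, dt = 1.0e-5, 0.01, 1.0   # diffusivity (m^2/s), grid (m), step (s)
assert alpha * dt / dx ** 2 <= 0.5  # stability limit of the explicit scheme

theta = [20.0] * 11   # bar initially at a uniform 20 degC
source = [0.0] * 11
source[5] = 0.5       # point heat source at the middle node

new = theta[:]
for i in range(1, 10):  # end temperatures held at 20 degC
    flux_div = (theta[i + 1] - 2.0 * theta[i] + theta[i - 1]) / dx ** 2
    new[i] = theta[i] + dt * (alpha * flux_div + source[i])
theta = new
assert theta[5] > 20.0   # the sourced node warms...
assert theta[0] == 20.0  # ...while the boundary stays put
```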

## 4. Basic elements of the thermo-periodic table

The contributions in this issue will present these concepts through a rigorous thermodynamic framework. However, in digesting the basic analytical concepts presented earlier, it is best to link these to some physical behaviour. To do this, we have invented some extreme and fictitious material behaviour: real behaviour will be left for the corresponding authors in this issue.

We will describe briefly four materials which, in their extreme behaviours, underpin the physical interpretation of the principles of thermodynamics in governing the behaviour of solids. The student may find it instructive to consider these behaviours in light of thermodynamic conditions and relationships, and ask whether thermodynamic principles are applicable to these extreme behaviours.

These materials might be regarded as the basic elements of a ‘thermodynamic periodic table’. The student might further find it interesting to consider which combinations of elements give rise to behaviours akin to real materials of interest. That is, the materials in the Thermo-periodic Table are the first two generations, and real materials might be seen as offspring of these simple forms.

### (a) Rigidium

Rigidium is the first element in the thermodynamic periodic table, and is the basis of all materials in the universe. It does not deform under load or heat, nor generate a change in temperature. Rigidium allows all heat to be perfectly and instantaneously conducted from the material without any change in temperature. In other words, there is no change in state: no change in internal or free energy, no dissipation and no change in entropy. This exotic material is thermodynamically invisible.

The definition of domain boundary is important. If heat is supplied at one side and perfectly conducted out the other, then there is heat flow but not heat exchange. That is, heat received along all boundaries exactly equals the heat lost through all boundaries—an exact balance. With no heat exchange with the exterior, d*Q*=0⇒d*S*=0. Hence, there is no change in entropy.

Alternatively, we might note that since there is no interchange between internal and free energy, then d(*θη*)/d*t*=0. Hence, from equation (3.2) we have

*η*(d*θ*/d*t*)=−*θ*(d*η*/d*t*). (4.1)

(On the left-hand side, entropy should be interpreted as an instantaneous value.) Hence, if there is no change in temperature, there is no change in entropy. It also follows from equation (3.1) that d*ψ*/d*t*=d*e*/d*t*. That is, all heat is conducted instantly and there is no dissipation.

### (b) Isentropia

Isentropia is a particular form of the ideal linear elastic material. Under load it undergoes perfectly linear elastic (and reversible) deformation. It is the elastically deformable offspring of Rigidium. Any change in internal energy is immediately and equitably available as free energy. It conducts heat perfectly and instantaneously across its boundaries without any change in temperature. As such, the total exchange of heat across its total boundary is always zero. Like Rigidium, d*Q*=0, and so entropy again remains constant for all time.

### (c) Agitatum

Agitatum is like Rigidium except that it is affected by load. The material is completely rigid under load, and undergoes no deformation, but it does get hotter. When a weight is placed on a block of Agitatum, stress but not strain is created. There is no strain energy or potential, but the stress in the material creates higher temperatures. This is because stress agitates the molecules and that creates heat to raise the temperature. In fact, some might argue heat is just that: molecular agitation.

The process is reversible, in that once we remove the weight, heat production stops, and the material cools by reversing the molecular agitation. But what about entropy? There is uncertainty about the position of the atoms and so one might argue that entropy increases. We should also ask whether entropy in such a case is reversible at the material level.

Internal energy is created. If this is considered to be available as free energy, then again from equation (3.2), we have d(*θη*)/d*t*=0. But this time d*θ*/d*t*>0 and so d*η*/d*t*<0. Since d*θ*/d*t*>0 by definition, that would imply that entropy was decreasing, which is not appropriate. Hence, we argue that the internal energy is not truly free. It is only available to reverse the molecular agitation and also to reverse the entropy of the material. With d*ψ*/d*t*=0, we can easily have d*θ*/d*t*>0 and d*η*/d*t*>0. (Also, in reality, we should consider integral versions of these quantities if we are trying to conceptualize absolute changes in temperature and entropy.)

Much depends on the system we choose to consider, and the boundaries set. If Agitatum were a slightly different material, namely one where heat was lost across the physical boundary of the material when the load was removed, then temperature would be reversed by heat loss. Entropy of a block of this material would still return to its original value, but the entropy of the whole system would increase.

### (d) Bondygon

Bondygon, like Rigidium, does not change temperature under load, nor does it strain in the traditional elasticity sense. However, it does undergo internal change in crystal structure without any outward appearance. That is, parallel lattices of atoms click past each other, rather like a ratchet system, without showing any true deformation or strain.

In so doing, the material absorbs internal energy and consequently locks in free energy. The new form of Bondygon cannot be reversed, and there is no strain energy to be released after the lattice has reconfigured. Moreover, it has most certainly changed its state and hence entropy.

Now, energy can be put into the material such that there is an increase in internal energy, but it is locked in, which means it is not available as free energy. Hence, Δ*ψ*=0 and so, from equation (3.2), Δ(*θη*)=Δ*e*>0. Referring then to the transformation equation (3.9), with d*θ*/d*t*=0, we have Δ*η*>0, i.e. entropy increases!

This is heat in the form of irreversible crystal changes, which translates directly to dissipation. That is, returning to equations (3.8*b*) and (3.9), we see that *θ*(d*η*/d*t*)=*φ*_{mech}>0.

This might apply to isothermal damage, for example, in which case an increase in entropy corresponds to a tendency to disorder.

## 5. Closure

It is salutary to remember that if one defines a model through the choice of a thermodynamic potential, such as a free energy function, then the model details should be deduced through thermodynamic principles. Any avoidance of this process might lead to a violation of the laws of thermodynamics, such as unwanted energy absorption or negative dissipation for example, and hence render the formulation a nicety at best.

The contributions to this Theme Issue explore a range of topics where considerable progress has been made through application of thermodynamic principles both in constitutive modelling and computational strategies. We trust that these articles will provide incentive for further research in this rich field.

Finally, I would like to express my sincere appreciation to all authors in this issue both for their expert contributions and their patience in its production.

## Footnotes

One contribution of 7 to a Theme Issue ‘Thermodynamics in solid mechanics’.

This quotation came to the author's attention through the obituary of Clifford Truesdell: J. M. Ball & R. D. James 2002 The scientific life and influence of Clifford Ambrose Truesdell III. *Arch. Rat. Mech. Anal.* **161**, 1–26.

© 2005 The Royal Society