Saturday, May 4, 2013

Heart Rate Variability

“Small is the number of them that see with their own eyes and feel with their own hearts.”
Albert Einstein



In my previous post I discussed a paradox regarding the evolution of complex living organisms and entropy. Entropy, in its classical definition, always points away from order and towards disorder. Living systems are able, for a time, to resist this entropic force, maintaining internal integrity by flexibly adapting to the outside environment.

This interplay between internal stability and external adaptability is made possible through the communication of many nested, interacting physiological systems. Autonomic nervous system modulation plays an integral role in many aspects of this complementary interplay, maintaining stability through homeostasis and changing in response to the environment through allostasis. The autonomic nervous system has two complementary sub-components, the sympathetic and parasympathetic branches. The sympathetic nervous system predominates during stress, while the parasympathetic nervous system predominates when relaxed.

So What Is Heart Rate Variability?

Recently there has been a great deal of research on heart rate variability (HRV) as a non-invasive biomarker of stress. Heart rate variability is related to the regulation of the sinoatrial node, the natural pacemaker of the heart.
The rhythm of the heart is primarily under the control of the vagus nerve, which inhibits heart rate and the force of contraction. When you inhale, you take your foot off the parasympathetic brake and your heart rate accelerates. When you exhale, you press down on the parasympathetic brake and your heart rate slows. This beat-to-beat change in heart rate is heart rate variability, and it is a widely used means of studying cardiac autonomic modulation.
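The standard time-domain way to quantify this beat-to-beat variation is to work with the RR intervals (the times between successive beats) rather than with heart rate itself. As a minimal sketch, not the algorithm any particular monitor uses, here are two common indices, SDNN and RMSSD, run on made-up RR values chosen only for illustration:

```python
import math

def sdnn(rr_ms):
    """Standard deviation of the RR (beat-to-beat) intervals, in ms."""
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / len(rr_ms))

def rmssd(rr_ms):
    """Root mean square of successive RR differences, in ms.
    Emphasizes fast beat-to-beat changes, a common index of vagal tone."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))

# Illustrative (made-up) RR intervals in milliseconds:
rest = [1190, 1210, 1180, 1230, 1170, 1220, 1200, 1160]   # ~50 bpm, variable
exercise = [480, 482, 479, 481, 480, 483, 481, 480]       # ~125 bpm, steady

print(rmssd(rest) > rmssd(exercise))  # True: rest shows more variability
```

RMSSD in particular is dominated by the fast, breath-to-breath accelerations and decelerations described above, which is why it is often treated as a vagal index.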
HRV can also be considered a measure of vagal tone: it is elevated in those with active lifestyles and low in those who are sedentary. Paradoxically, HRV is higher when relaxed and lower under stress. A high resting HRV indicates that the heart is manifesting complex patterns, subtly responding to its environment in an adaptive manner. This is a good sign and tends to correspond with a sense of well-being.
Low HRV is thought to reflect excessive sympathetic and/or inadequate parasympathetic activity, and reduced HRV is a powerful, independent predictor of mortality and adverse prognosis in patients with congestive heart failure and other cardiac disease. It has the potential to become a non-invasive diagnostic and prognostic index in clinical practice. HRV is also lowered in psychological disease states such as anxiety, depression, and PTSD.
Julian Thayer has been one of the leading researchers relating neuroimaging studies and HRV indices to stress. Here is a review of much of his work. Thayer and his colleagues propose a model they refer to as the neurovisceral integration model:
"We further propose that the default response to uncertainty is the threat response and may be related to the well known negativity bias. Heart rate variability may provide an index of how strongly ‘top–down’ appraisals, mediated by cortical-subcortical pathways, shape brainstem activity and autonomic responses in the body. If the default response to uncertainty is the threat response, as we propose here, contextual information represented in ‘appraisal’ systems may be necessary to overcome this bias during daily life. Thus, HRV may serve as a proxy for ‘vertical integration’ of the brain mechanisms that guide flexible control over behavior with peripheral physiology, and as such provides an important window into understanding stress and health."

This model specifies a central autonomic network (CAN), a brain network including prefrontal and subcortical regions that functions to support adaptability and health. The primary outputs of the CAN are the sympathetic and parasympathetic neurons that innervate the heart. Top-down (prefrontal over subcortical) inhibition is associated with increased vagal input and thus increased HRV.

A Neurovisceral Integration Model (NIM)
The neurovisceral integration model (NIM) proposes that the autonomic nervous system is a final common pathway linking psychological and physiological states, and that HRV can serve as a useful index of this integration and of an organism's self-regulation. This can be thought of as a Western model of the heart-mind concept familiar from Eastern philosophy, traditional Chinese medicine, and tai chi practice. Or, more simply, as Nelson Mandela said:
"A good head and a good heart are always a formidable combination."
Heart rate variability is also gaining a great deal of interest in the field of endurance exercise. Endurance training is one of my interests, and I own a watch and heart rate monitor that allow me to capture HRV data. Oscar Wilde was onto something when he said “Hearts Live By Being Wounded”, yet there is a fine line between the stress that causes a healthy, progressive adaptation and the stress that leads to imbalance and disharmony. Attention to HRV levels over time may help those undergoing strenuous training attain peak condition while avoiding overtraining.

There are many measures used to quantify heart rate variability, some of which come from complexity, chaos, and information theory. Included among these HRV measures are measures of entropy. In future posts, if time allows, I plan to keep something of a diary capturing my HRV in various activities: rest, biking, running, and the cognitive stress of composing a blog post or writing up statistical results at work. This should let me see how sensitive and reliable these indices are to various activities over time. Here are some examples:

First, here is my HRV data from yesterday at rest, before heading out on my bike to work:

Beats Per Minute = 50.36
Shannon Entropy = 2.84
Correlation Dimension (D2) = 3.44

Beats per minute is simply a measure of heart rate. A low resting heart rate is associated with high HRV, so a resting HR of 50 beats per minute suggests a likely high HRV. For the measure labeled Shannon Entropy, a low score indicates more complexity and high HRV. Compared to this study of engineering students at rest, whose average Shannon Entropy was 3.17, my score of 2.84 is a good sign. HRV declines with age (I am 50 years old), but perhaps my endurance training is paying off (or maybe engineering students have trouble resting). Correlation Dimension (D2) is another measure of complexity; in this case a high score means high HRV. Again my results look good, as the average for the engineering students was 2.83.
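The software doesn't document exactly how it computes its Shannon Entropy score, but a common approach is to bin the RR intervals into a histogram and take the Shannon entropy of the bin probabilities. Here is a minimal sketch of that textbook version; note that under this definition a more irregular series scores higher, so a tool reporting low scores for high complexity is evidently using a different scaling, and scores shouldn't be compared across tools:

```python
import math

def shannon_entropy(rr_ms, n_bins=16):
    """Shannon entropy (bits) of a histogram of RR intervals.
    Bin edges span the observed range; fewer occupied bins -> lower entropy."""
    lo, hi = min(rr_ms), max(rr_ms)
    width = (hi - lo) / n_bins or 1.0   # guard against an all-equal series
    counts = [0] * n_bins
    for x in rr_ms:
        i = min(int((x - lo) / width), n_bins - 1)  # clamp the max into the last bin
        counts[i] += 1
    n = len(rr_ms)
    return -sum((c / n) * math.log2(c / n) for c in counts if c)

# Intervals spread across many bins score higher entropy than intervals
# clustered in a few bins (illustrative, made-up values in ms):
spread = [800, 850, 900, 950, 1000, 1050, 1100, 1150]
clustered = [900, 901, 902, 900, 901, 902, 900, 901]
print(shannon_entropy(spread) > shannon_entropy(clustered))  # True
```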

Now here is my HRV data for the first 30 minutes of my bike ride to work (the software only allows a 30-minute maximum period).

Beats Per Minute = 124.8
Shannon Entropy = 5.32
Correlation Dimension (D2) = 0.547

This all makes sense. My HR speeds up due to the stress of the physical activity. Both the entropy and D2 measures indicate less complexity (thus lower variability) in my heart rate.
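The correlation dimension (D2) is typically estimated with the Grassberger-Procaccia method: delay-embed the series into vectors, count the fraction of point pairs closer than a radius r (the correlation sum C(r)), and read D2 off the slope of log C(r) versus log r. Here is a minimal, illustrative sketch of the correlation sum only; a real D2 estimate requires careful choice of embedding dimension, delay, and the radius range used for the slope fit, and I can't say this matches what my software does:

```python
import math

def correlation_sum(series, m=2, tau=1, r=0.1):
    """Fraction of embedded point pairs closer than r (Grassberger-Procaccia).
    m is the embedding dimension, tau the delay between coordinates."""
    # Delay-embed the series into m-dimensional vectors.
    pts = [tuple(series[i + k * tau] for k in range(m))
           for i in range(len(series) - (m - 1) * tau)]
    n = len(pts)
    close = sum(1 for i in range(n) for j in range(i + 1, n)
                if math.dist(pts[i], pts[j]) < r)
    return 2 * close / (n * (n - 1))

# On a small toy series, every embedded pair lies within a large radius,
# so C(r) reaches 1 as r grows:
series = [i / 10 for i in range(10)]
print(correlation_sum(series, r=10.0))  # 1.0
```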

Now here is my HRV data on a mildly stressful writing task sitting at work:

Beats Per Minute = 60.67
Shannon Entropy = 3.39
Correlation Dimension (D2) = 3.24

Again, this all makes sense. My HR is about 10 beats higher than at rest in the morning. The entropy and D2 measures both indicate less complexity in the HRV, and thus somewhat more stress. Interestingly, the paper I linked above with the engineering students was a study of HRV and stress on exams. According to table 5 in the paper, I wouldn't have qualified as being under stress by either the entropy or the D2 measure (although D2 was very close). The data does suggest more stress than my earlier resting test. So far these measures seem to be doing a good job of quantifying stress loads.

Friday, May 3, 2013

Entropy and Nature's Nested Relationships


Yield, and maintain integrity.
To bend is to be upright;
to be empty is to be full.

Those who have little have much to gain,
but those who have much
may be confused by possessions.
Excerpted from Tao Te Ching Chapter 22. Stan Rosenthal Translation

Entropy (like information, as discussed in my last post here) has many meanings and definitions, which can lead to confusion. Increasing entropy has been described as: increasing disorder, a dispersal of energy, an increase in information, an increase in uncertainty, a decrease in information, movement closer to equilibrium, an increase in freedom or possibility, and even as the flow of time itself.
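The information-theoretic meaning (entropy as uncertainty) is easy to make concrete: Shannon entropy is maximal when every outcome is equally likely and near zero when one outcome is nearly certain. A small worked example:

```python
import math

def H(p):
    """Shannon entropy in bits of a probability distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # maximum uncertainty over 4 outcomes
peaked  = [0.97, 0.01, 0.01, 0.01]   # one outcome nearly certain

print(H(uniform))  # 2.0 bits: every outcome equally likely
print(H(peaked))   # ~0.24 bits: little surprise on average
```

Learning which outcome occurred resolves up to 2 bits of uncertainty in the uniform case but almost none in the peaked case, which is why "increase in uncertainty" and "decrease in information" can both describe rising entropy depending on one's vantage point.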

Much of the confusion with the term, in my view, is related to its great potential for wide applicability across domains. The term entropy originated in physics with the laws of thermodynamics. Those original thermodynamic laws, however, refer to closed systems like a gas within a container, and systems in nature are never completely closed.

The domains in which entropy has been applied range across a wide spectrum, from its origin in the purely physical theory of thermodynamics to its more abstract uses in information theory. In between, entropy has also often been used to describe living systems. Here the great physicist Erwin Schrödinger (of the quantum mechanical 'Schrödinger equation') provides an example from his wonderful book 'What Is Life?':


"Every process, event, happening—call it what you will; in a word, everything that is going on in Nature means an increase of the entropy of the part of the world where it is going on. Thus a living organism continually increases its entropy—or, as you may say, produces positive entropy—and thus tends to approach the dangerous state of maximum entropy, which is death. It can only keep aloof from it, i.e., alive, by continually drawing from its environment negative entropy—which is something very positive as we shall immediately see. What an organism feeds upon is negative entropy. Or, to put it less paradoxically, the essential thing in metabolism is that the organism succeeds in freeing itself from all the entropy it cannot help producing while alive."

As the quote above suggests, the nested quality of nature's systems also adds to the potential for confusion: our interpretation of entropy is relative to our frame of reference. The physicist J.B. Brissaud authored an interesting article in the journal 'Entropy' in 2005 with the title 'The Meanings of Entropy'. Brissaud speaks to how this nesting can affect interpretation:
"Thermodynamics is the physical science which links the microscopic and macroscopic worlds and the meaning of entropy shows a curious mirror effect according to the adopted point of view; what is information for one is lack of information for the other."


Brissaud offers a crucial insight into the paradoxical nature of entropy. For an observer looking from the outside, trying to understand a complex system, a low degree of entropy is associated with many interconnected subsystems that constrain each other, allowing the system to resist the entropic pull towards dissipation and equilibrium. Yet if we consider the system itself as an observer, the reverse is true: from the perspective of the system, entropy is associated with possibility, or freedom of choice. In other words, from the point of view of an agent or organism, intelligent systems maximize their capacity to extract useful information or energy from their environments.

Complex living systems are said to have low entropy because they have many inter-related sub-systems that constrain each other in a complex and constant flux of communication. Low entropy systems, then, are complex due to a large number of internal constraints, which provide many pathways enabling the system to flexibly interact with the outside world under conditions of uncertainty. Very simple systems, by contrast, are less adaptable and more vulnerable to environmental change.

This resolves the paradox. Low entropy complex systems have an intrinsic intelligence that affords them the opportunity to learn from novel environmental conditions through a receptivity to uncertainty. Highly self-organized systems can be said to have the 'intelligence' to respond to a variety of possibilities. These systems can efficiently exploit resources (stored energy) from the environment and dissipate that energy back. Thus from an objective (outside) point of view, complex living systems are maximum entropy producers, while from the organism's point of view, existence is maintained by keeping internal entropy low.

A fascinating new computer simulation study (BBC News reports on it here) adds support to this conception of entropy by connecting maximum entropy production to the emergence and evolution of intelligence. The simulations are based on a 'causal entropic force', which is guided by the goal of keeping as many options open as possible. From the BBC report:

"The simplistic model considers a number of examples, such as a pendulum hanging from a moving cart. Simulations of the causal entropy idea show that the pendulum ends up pointing upward - an unstable situation, but one from which the pendulum can explore a wider variety of positions. The researchers liken this to the development of upright walking. Further simulations showed how the same idea could drive the development of tool use, social network formation and cooperation, and even the maximisation of profit in a simple financial market."
"While there were hints from a variety of other fields such as cosmology, it was so enormously surprising to see that one could take these principles, apply them to simple systems, and effectively for free have such behaviours pop out," Dr Wissner-Gross said.
This reminds me very much of the intention one attempts to achieve in tai chi practice. One of the important principles in tai chi is to maintain the ability to respond to an opponent or the environment by moving in any potential direction. The idea is to never over-commit or predetermine one's physical or cognitive awareness or orientation in any given direction. Initially, when learning the postures, one learns how to place oneself in, and move between, specific positions. With practice the awareness becomes more holistic, focused not on the specific postures but on subtle feedback mechanisms that allow for continuous, flowing movement within the context of some core guiding principles. This is basically a practice in developing an internal body intelligence which, once developed, frees the awareness to be receptive to the potential surprise that unfolds in an uncertain future.

This is similar to the way human physiology maintains its vital systems within a narrow range (homeostasis) with the support of other systems that respond more dynamically to unpredictable changes in the surrounding environment (allostasis). Freedom and constraint, while seeming to conflict and oppose each other, can within the framework of negative feedback systems interact to support an equilibrium in which complex self-organizing systems maintain integrity within (core constancy) while flexibly responding to change. The ability to adapt to the unknown represents maximum entropy from the perspective of the self-organizing system; to an outside observer, the same system's ability to maintain its integrity against the force of entropy represents a far-from-equilibrium, low entropy dynamic.

There is no promise, however, from the nature of entropic forces that complex systems will maintain a persistent integrity. If the ability to exploit and dissipate the environment's stored energy reserves outstrips our capacity to adapt to the ensuing environmental changes, then the system will no longer be sustainable. Consider our current rate of exploitation of our natural resources (something will have to give).
"Yield, and maintain integrity;
be whole, and all things come to you".
Excerpted from Tao Te Ching Chapter 22. Stan Rosenthal Translation


Update: I had to link this wonderfully produced video of a David Foster Wallace commencement speech, which I think beautifully expresses the concepts of intelligence, wisdom, and meaning being hinged to maximum entropy principles.