Thursday, December 22, 2016

Are power laws good for anything?

It is rather amazing that many complex systems, ranging from proteins to stock markets to cities, exhibit power laws, sometimes over many decades.
A critical review is here, which contains the figure below.

Complexity theory makes much of these power laws.

But, sometimes I wonder what the power laws really tell us, and particularly whether for social and economic issues they are good for anything.
Recently, I learnt of a fascinating case. Admittedly, it does not rely on the exact mathematical details (e.g. the value of the power law exponent!).

The case is described in an article by Dudley Herschbach,
Understanding the outstanding: Zipf's law and positive deviance
and in the book Aid on the Edge of Chaos, by Ben Ramalingam.

Here is the basic idea. Suppose that you have a system of many weakly interacting (random) components. Based on the central limit theorem one would expect that a particular random variable would obey a normal (Gaussian) distribution. This means that large deviations from the mean are extremely unlikely. However, now suppose that the system is "complex" and the components are strongly interacting. Then the probability distribution of the variable may obey a power law. In particular, this means that large deviations from the mean can have a probability that is orders of magnitude larger than it would be if the distribution were "normal".
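To make this contrast concrete, here is a minimal numerical sketch. It compares the tail probability of a standard normal distribution with that of a Pareto (power-law) distribution; the exponent and cutoff are purely illustrative choices, not fits to any data.

```python
import math

def normal_tail(z):
    """P(Z > z) for a standard normal random variable."""
    return 0.5 * math.erfc(z / math.sqrt(2))

def pareto_tail(x, alpha=2.5, x_min=1.0):
    """P(X > x) for a Pareto (power-law) variable: (x_min/x)^alpha for x >= x_min.
    alpha and x_min are illustrative, not taken from any real data set."""
    return (x_min / x) ** alpha

# Compare how fast the two tails fall off with distance from the bulk.
for z in [2.0, 3.0, 5.0]:
    g = normal_tail(z)
    p = pareto_tail(z)
    print(f"at {z}: normal {g:.2e}, power law {p:.2e}, ratio {p / g:.1e}")
```

Even a few "standard deviations" out, the power-law tail is already orders of magnitude heavier, which is exactly why rare positive deviants are worth looking for.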

Now, let's make this concrete. Suppose one goes to a poor country and looks at the weight of young children. One will find that the average weight is significantly smaller than in an affluent country and, most importantly, that the average is less than is healthy for brain and physical development. These low weights arise from a complex range of factors related to poverty: limited money to buy food, lack of diversity of diet, ignorance about healthy diet and nutrition, famines, giving more food to working members of the family, ...
However, if the weights of children obey a power-law rather than a normal distribution, one might hope to find some children who have a healthy weight and investigate what factors contribute to that. This leads to the following.
Positive Deviance (PD) is based on the observation that in every community there are certain individuals or groups (the positive deviants), whose uncommon but successful behaviors or strategies enable them to find better solutions to a problem than their peers. These individuals or groups have access to exactly the same resources and face the same challenges and obstacles as their peers. 
The PD approach is a strength-based, problem-solving approach for behavior and social change. The approach enables the community to discover existing solutions to complex problems within the community. 
The PD approach thus differs from traditional "needs based" or problem-solving approaches in that it does not focus primarily on identification of needs and the external inputs necessary to meet those needs or solve problems. A unique process invites the community to identify and optimize existing, sustainable solutions from within the community, which speeds up innovation. 
The PD approach has been used to address issues as diverse as childhood malnutrition, neo-natal mortality, girl trafficking, school drop-out, female genital cutting (FGC), hospital acquired infections (HAI) and HIV/AIDS.

Tuesday, December 20, 2016

More subtleties in protein structure and function

Almost three years ago I posted about the controversy concerning whether the photoactive yellow protein has low-barrier hydrogen bonds [for these the energy barrier for proton transfer is comparable to the zero-point energy]. I highlighted just how difficult it is going to be, both experimentally and theoretically, to definitively resolve the issue, just as for an enzyme I recently discussed.
A key issue concerns how to interpret large proton NMR chemical shifts.

Two recent papers weigh in on the issue

The Low Barrier Hydrogen Bond in the Photoactive Yellow Protein: A Vacuum Artifact Absent in the Crystal and Solution 
Timo Graen, Ludger Inhester, Maike Clemens, Helmut Grubmüller, and Gerrit Groenhof

A Dynamic Equilibrium of Three Hydrogen-Bond Conformers Explains the NMR Spectrum of the Active Site of Photoactive Yellow Protein 
Phillip Johannes Taenzler, Keyarash Sadeghian, and Christian Ochsenfeld

I think the caveats I have offered before need to be kept in mind.
As with the active sites of most proteins, the problem is that we don't have very direct experimental probes, but have to use indirect probes whose results require significant modelling and interpretation.

I thank Steve Boxer for bringing one of these papers to my attention.

Sunday, December 18, 2016

A possible Christmas gift for thoughtful non-scientists?

Are you looking for Christmas gifts?

I think that scientists should be writing popular books for the general public. However, I am disappointed by most that I look at. Too many seem to be characterised by hype, self-promotion, over-simplification, or the promotion of a particular narrow philosophical agenda. The books lack balance and nuance. We should not just be explaining scientific knowledge but also giving an accurate picture of what science is, and what it can and can't do.
(Aside: Some of the problems of the genre, particularly its almost quasi-religious agenda, are discussed in a paper by my UQ history colleague, Ian Hesketh.)

There is one book that I do often hear non-scientists enthusiastically talk about:
A Short History of Nearly Everything by the famous travel (!) writer Bill Bryson.
There is a nice illustrated edition.

I welcome comments from people who have read the book or given it to non-scientists.

Thursday, December 15, 2016

A DMFT perspective on bad metals

Today I am giving a talk in the Applied Physics Department at Stanford. My host is Sri Raghu.
Here is the current version of the slides.

Tuesday, December 13, 2016

The challenge of an optimal enzyme

Carbonic anhydrase is a common enzyme that performs many different physiological functions including maintaining acid-base equilibria. It is one of the fastest enzymes known and its rate is actually limited not by the chemical reaction at the active site but by diffusion of the reactants and products to the active site.

Understanding the details of its mechanism presents several challenges, both experimentally and theoretically. A key issue is the number and exact location of the water molecules near the active site. The most recent picture (from a 2010 x-ray crystallography study) is shown below.

The "water wire" is involved in the proton transfer from the zinc cation to the histidine residue. Of particular note is the short hydrogen bond (2.4 Angstroms) between the OH- group and a neighbouring water molecule.

Such a water network near an active site is similar to what occurs in the green fluorescent protein and KSI.

Reliable knowledge of the finer details of this water network really does matter.

This ties in with theoretical challenges that are related to several issues I have blogged about before. Basic questions concerning proton transport along the wire include:

A. Is the proton transfer sequential or concerted?

B. Is quantum tunnelling involved?

C. What role (if any) does the dynamics of the surrounding protein play?

A 2003 paper by Cui and Karplus considers A., highlighting the sensitivity to the details of the water wire.
Another 2003 paper by Smedarchina, Siebrand, Fernández-Ramos, and Cui looks at both questions through kinetic isotope effects and suggests that tunnelling plays a role.

In 2003 it was not even clear how many water molecules were in the wire and so the authors considered different alternatives.

One can only answer these questions definitively if one has extremely accurate potential energy surfaces. This is challenging because:

Barrier heights and quantum nuclear effects vary significantly with small changes (even 0.05 Angstroms) in H-bond donor-acceptor distances.

The potential surface can vary significantly depending on the level of quantum chemistry theory or density functional that is used in calculations.
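As an aside on question B: the standard back-of-envelope argument is that the semiclassical (zero-point energy) contribution to the kinetic isotope effect is only of order 7-8 at room temperature, so measured values well above that are taken as a signature of tunnelling. Here is a minimal sketch of that estimate, assuming the X-H stretch simply vanishes at the transition state and that deuteration scales its frequency by 1/sqrt(2); the 3000 cm^-1 frequency is a typical order of magnitude, not a value for this enzyme.

```python
import math

# Physical constants (SI units)
h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e10    # speed of light in cm/s (to work with wavenumbers)
kB = 1.380649e-23    # Boltzmann constant, J/K

def semiclassical_kie(nu_H_cm, T):
    """Semiclassical kinetic isotope effect from loss of zero-point energy.

    Assumes the X-H stretch (wavenumber nu_H_cm) is lost entirely at the
    transition state, and that the deuterated frequency is nu_H / sqrt(2)
    (i.e. the reduced mass simply doubles). KIE = exp(delta_ZPE / kB T).
    """
    nu_D_cm = nu_H_cm / math.sqrt(2)
    delta_zpe = 0.5 * h * c * (nu_H_cm - nu_D_cm)  # difference in ZPE, J
    return math.exp(delta_zpe / (kB * T))

# A ~3000 cm^-1 stretch at 300 K gives a KIE of order 8.
print(semiclassical_kie(3000.0, 300.0))
```

Measured kinetic isotope effects substantially larger than this bound, or with anomalous temperature dependence, are the usual experimental hint that quantum tunnelling matters.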

I thank Srabani Taraphder for introducing me to this enzyme. She has recently investigated question C.

Monday, December 12, 2016

Bouncing soap bubbles

My wife and I are often looking for new science demonstrations to do with children. The latest one she found was "bouncing soap bubbles".

For reasons of convenience [laziness?] we actually bought the kit from Steve Spangler.
It is pretty cool.

A couple of interesting scientific questions are:

Why do the gloves help?

The claim is that the grease on your hands makes bursting the bubbles easier.

Why does glycerin make the soap bubbles stronger?

Why does "ageing" the soap solution for 24 hours lead to stronger bubbles?

Journal of Chemical Education is often a source of good ideas and science discussions. Here are two relevant articles.

Clean Chemistry: Entertaining and Educational Activities with Soap Bubbles 
Kathryn R. Williams

Soap Films and the Joy of Bubbles
Mary E. Saecker

Friday, December 9, 2016

Metric madness: McNamara and the military

Previously, I posted about a historical precedent for managing by metrics: economic planning in Stalinist Russia.

I recently learnt of a capitalist analogue, starting with Ford motor company in the USA.
I found the following account illuminating and loved the (tragic) quotes from Colin Powell about the Vietnam war.
Robert McNamara was the brightest of a group of ten military analysts who worked together in Air Force Statistical Control during World War II and who were hired en masse by Henry Ford II in 1946. They became a strategic planning unit within Ford, initially dubbed the Quiz Kids because of their seemingly endless questions and youth, but eventually renamed the Whiz Kids, thanks in no small part to the efforts of McNamara. 
There were ‘four McNamara steps to changing the thinking of any organisation’: state an objective, work out how to get there, apply costings, and systematically monitor progress against the plan. In the 1960s, appointed by J.F. Kennedy as Secretary of Defense after just a week as Chair of Ford, McNamara created another Strategic Planning Unit in the Department of Defense, also called the Whiz Kids, with a similar ethos of formal analysis. McNamara spelled out his approach to defence strategy: ‘We first determine what our foreign policy is to be, formulate a military strategy to carry out that policy, then build the military forces to conduct that strategy.’ 
Obsessed with the ‘formal and the analytical’ to select and order data, McNamara and his team famously developed a statistical strategy for winning the Vietnam War. ‘In essence, McNamara had taken the management concepts from his experiences at the Ford Motor Company, where he worked in a variety of positions for 15 years, eventually becoming president in 1960, and applied them to his management of the Department of Defense.’ 
But the gap between the ideal and the reality was stark. Colin Powell describes his experience on the ground in Vietnam in his biography: 
Secretary McNamara...made a visit to South Vietnam. Every quantitative measurement, he concluded, after forty-eight hours there, shows that we are winning the war. Measure it and it has meaning. Measure it and it is real. Yet, nothing I had witnessed . . . indicated we were beating the Viet Cong. Beating them? Most of the time we could not even find them. 
McNamara’s slide-rule commandos had devised precise indices to measure the immeasurable. This conspiracy of illusion would reach full flower in the years ahead, as we added to the secure-hamlet nonsense, the search-and-sweep nonsense, the body-count nonsense, all of which we knew was nonsense, even as we did it. 
McNamara then used the same principles to transform the World Bank’s systems and operations. Sonja Amadae, a historian of rational choice theory, suggests that, ‘over time . . . the objective, cost-benefit strategy of policy formation would become the universal status quo in development economics—a position it still holds today.’ Towards the end of his life, McNamara himself started to acknowledge that, ‘Amid all the objective-setting and evaluating, the careful counting and the cost-benefit analysis, stood ordinary human beings [who] behaved unpredictably.’
Ben Ramalingam, Aid on the Edge of Chaos, Oxford University Press, 2013, pp. 45-46.
Aside: I am working on posting a review of the book soon.

Given all this dubious history, why are people trying to manage science by metrics?

Wednesday, December 7, 2016

Pseudo-spin lattice models for hydrogen-bonded ferroelectrics and ice

The challenge of understanding phase transitions and proton ordering in hydrogen-bonded ferroelectrics (such as KDP, squaric acid, croconic acid) and different crystal phases of ice has been a rich source of lattice models for statistical physics.
Models include ice-type models (six-vertex model, Slater's KDP model), transverse field Ising model, and some gauge theories. Some of the classical (quantum) models are exactly soluble in two (one) dimensions.

An important question that seems to be skimmed over is the following: under what assumptions can one actually "derive" these models starting from the actual crystal structure and electronic and vibrational properties of a specific material?

That quantum effects, particularly tunnelling of protons, are important in some of the materials is indicated by the large shifts (of the order of 100 percent) seen in the transition temperatures upon H/D isotope substitution.

In 1963 de Gennes argued that the transverse field Ising model should describe the collective excitations of protons tunnelling between different molecular units in an H-bonded ferroelectric. Some of this is discussed in detail in an extensive review by Blinc and Zeks.
An important issue is whether the phase transition is an "order-disorder" transition or a "displacive" transition. I think what this means is the following. In the former case, the transition is driven by the ordering of the pseudo-spin variables and there is no soft lattice mode associated with the transition; in the latter, the transition is driven by the softening of a lattice vibration.
Perhaps, in different language: is it appropriate to "integrate out" the vibrational degrees of freedom?
[Aside: this reminds me of some issues that I looked at in a Holstein model about 20 years ago].

There are a lot of papers that make quantitative comparisons between experimental properties and the predictions of a transverse field Ising model (usually treated in the mean-field approximation).
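For concreteness, here is what the mean-field treatment gives. Linearising the self-consistency condition for the transverse field Ising model yields tanh(Gamma/k_B T_c) = Gamma/(J z), which makes the isotope effect explicit: deuteration reduces the tunnelling amplitude Gamma and so raises T_c, and for Gamma > J z the ordered phase disappears entirely. The parameter values below are purely illustrative, not fits to any material.

```python
import math

def tc_mean_field(gamma, Jz):
    """Mean-field transition temperature (units with k_B = 1) of the
    transverse field Ising model H = -Gamma sum_i s_i^x - J sum_<ij> s_i^z s_j^z.

    Linearising m = (J z m / E) tanh(E / T) about m = 0 (E -> Gamma) gives
    tanh(Gamma / T_c) = Gamma / (J z). Returns None when Gamma >= J z,
    where tunnelling destroys the ordered phase (a quantum paramagnet).
    """
    x = gamma / Jz
    if x >= 1.0:
        return None
    return gamma / math.atanh(x)

Jz = 1.0
gamma_H = 0.8   # illustrative proton tunnelling amplitude
gamma_D = 0.3   # smaller for the heavier deuteron
print(tc_mean_field(gamma_H, Jz), tc_mean_field(gamma_D, Jz))
```

In the limit Gamma -> 0 one recovers the classical mean-field Ising result k_B T_c = J z, so the large H/D shifts in T_c come out naturally once tunnelling is appreciable.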
One example (which also highlights the role of isotope effects) is

Quantum phase transition in K3D1−xHx(SO4)2 
Y. Moritomo, Y. Tokura, N. Nagaosa, T. Suzuki, and K. Kumagai

One problem I am puzzling over is that the model parameters that they (and others) extract are different from what I would expect from knowing the actual bond lengths and vibrational frequencies in the system, and the energetics of different H-bond states. I can only "derive" pseudo-spin models with quite restrictive assumptions.

A recent paper that looks at some of the rich physics associated with collective quantum effects is
Classical and quantum theories of proton disorder in hexagonal water ice 
Owen Benton, Olga Sikora, and Nic Shannon

Monday, December 5, 2016

Hydrogen bonding at Berkeley

On Friday I am giving a talk in the Chemistry Department at Berkeley.
Here is the current version of the slides.

There is some interesting local background history I will briefly mention in the talk. One of the first people to document correlations between different properties (e.g. bond lengths and vibrational frequencies) of diverse classes of H-bond complexes was George Pimentel. 
Many correlations were summarised in a classic book, "The Hydrogen Bond" published in 1960.
He also promoted the idea of a four-electron, three-orbital bond, which has similarities to the diabatic state picture I am promoting.
There is even a lecture theatre on campus named after him!

Friday, December 2, 2016

A central result of non-equilibrium statistical physics

Here is a helpful quote from William Bialek. It is a footnote in a nice article, Perspectives on theory at the interface of physics and biology.
The Boltzmann distribution is the maximum entropy distribution consistent with knowing the mean energy, and this sometimes leads to confusion about maximum entropy methods as being equivalent to some sort of equilibrium assumption (which would be obviously wrong). But we can build maximum entropy models that hold many different expectation values fixed, and it is only when we fix the expectation value of the Hamiltonian that we are describing thermal equilibrium. What is useful is that maximum entropy models are equivalent to the Boltzmann distribution for some hypothetical system, and often this is a source of both intuition and calculational tools.
This type of approach features in the statistical mechanics of income distributions.
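A minimal sketch of Bialek's point: fixing the expectation value of a single quantity and maximising the entropy gives a distribution of Boltzmann form, with the Lagrange multiplier playing the role of an inverse temperature, even when nothing is in thermal equilibrium. The three-level "energies" and target mean below are purely illustrative.

```python
import math

def max_ent_given_mean(energies, target_mean, lo=-50.0, hi=50.0, tol=1e-10):
    """Maximum-entropy distribution over discrete states with a fixed mean.

    The solution has Boltzmann form p_i proportional to exp(-beta * E_i);
    beta is the Lagrange multiplier enforcing the mean constraint, found
    here by bisection (the mean is monotonically decreasing in beta).
    """
    def mean_at(beta):
        w = [math.exp(-beta * e) for e in energies]
        Z = sum(w)
        return sum(e * wi for e, wi in zip(energies, w)) / Z

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_at(mid) > target_mean:
            lo = mid   # mean still too large: increase beta
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    w = [math.exp(-beta * e) for e in energies]
    Z = sum(w)
    return beta, [wi / Z for wi in w]

# Three-level system with the mean pinned below its unconstrained value of 1.0,
# so the constraint forces a positive "inverse temperature" beta.
beta, p = max_ent_given_mean([0.0, 1.0, 2.0], 0.6)
print(beta, p)
```

Replacing the mean energy by the expectation value of any other observable gives the same exponential-family structure, which is why the Boltzmann intuition and machinery carry over to voting patterns, flocks, and antibodies.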

Examples where Bialek has applied this include voting patterns of the US Supreme Court, flocking of birds, and antibody diversity.

For a gentler introduction to this profound idea [which I still struggle with] see
* James Sethna's textbook, Entropy, Order Parameters, and Complexity.
* review articles on large deviation theory by Hugo Touchette, such as this and this.
I thank David Limmer for bringing the latter to my attention.