Tuesday, July 31, 2012

The Wilsons on quality in quantum chemistry

Ken Wilson is very well known for developing the renormalisation group and applying it to critical phenomena and the Kondo problem. There is a long and interesting (but somewhat meandering) interview with him about his career. Amongst various choice tidbits there is the exchange below about his father, E. Bright Wilson, a pioneer in quantum chemistry.

PoS

    Did your father use computers? Did you know of his models for getting infrared spectra? Did things like that play a role in your thinking?

KGW

    What I remember from discussions with my father was that he used to get very wrought up about computational quantum chemists. Garbage in, garbage out. That set me up to spend some time at Ohio State studying quantum chemistry. I had done a little bit towards the end of my stay at Cornell, but took it more seriously while I was at Ohio State. And so then I had to find out from my father who were the good people. And he knew them, he had a list of them.

PoS

    And who did he say were the good people?

KGW

    There was [Isaiah] Shavitt. There was Ernie Davidson, John Pople from Carnegie Mellon. What's interesting about this is that later when I became interested in the history of physics and [Thomas S.] Kuhn's book, one of the characteristics of the pre-paradigm phase he discusses, and there really are pre-paradigm phases -- you know, people don't want to admit that -- is that everybody is arguing with each other and somebody comes in from outside and tries to figure out what's going on, like my father interacting with the quantum chemists, they learn who the good people are. And yet, they won't admit that there are good people, unless you ask them, otherwise they are more interested in complaining about the poor quality of the research by others in the field....

PoS

    And the criterion for being good? I mean what makes a person good?

KGW

    These are the people who are smart, take serious problems to work on...

PoS

    The generative quality comes in?

KGW

    For instance, consider quantum chemistry for a moment. What I found was that the people who did the important work worked on algorithms. They improved the algorithms for solving quantum chemistry problems on computers. They couldn't do the calculations they wanted to do, so they worked on algorithms. And it was the algorithmic work that was absolutely essential. When the computers got better, and they could do serious things, it was the work on algorithms that made the difference and the people that my father knew made contributions to serious algorithm developments. At the same time, there was just a lot of stuff published where people were running programs and they were paying no attention to whether they worked or didn't work, and claiming all sorts of fancy things.

Monday, July 30, 2012

Improper hydrogen bonds

Most common hydrogen bonds involve an interaction of the form X-H...Y where the donor X and acceptor Y are highly electronegative atoms such as O, N, and F. Signatures of H bond formation include lengthening of the X-H bond, softening of the X-H stretch frequency, and increase in the X-H stretch IR intensity. For strong H bonds these effects on the X-H bond are substantial, of the order of 10-100 per cent. I presented a unified picture of this in a recent paper.

However, over the past 15 years it has been discovered that there is a class of (very weak) bonds, best described as improper H bonds, which are distinctly different. Generally, the donor X is not particularly electronegative (e.g. a carbon atom) and bond formation results in
  • contraction of the X-H bond (by a few milliAngstroms)
  • hardening of the X-H stretch frequency (by less than one per cent)
  • decrease in the X-H stretch IR intensity
The key idea is as follows:

The factors which affect the X−H bond in all X−H···Y HBs can be divided into two parts:  (a) The electron affinity of X causes a net gain of electron density at the X−H bond region in the presence of Y and encourages an X−H bond contraction. (b) The well understood attractive interaction between the positive H and electron rich Y forces an X−H bond elongation. For electron rich, highly polar X−H bonds (proper HB donors) the latter almost always dominates and results in X−H bond elongation, whereas for less polar, electron poor X−H bonds (pro-improper HB donors) the effect of the former is noticeable if Y is not a very strong HB acceptor.


In other words, due to the interaction with the acceptor, the relative amounts of covalent and ionic character of the X-H bond change.
Hence, I think this goes beyond my simple two diabatic state picture, which does not allow the X-H diabatic state to vary in character with the interaction.
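To make the last point concrete, here is a generic sketch of what a two diabatic state model looks like (this is schematic, not the specific parametrisation used in my paper): one diabatic state describes an X-H bond with Y as a spectator, the other a H-Y bond (the proton-transferred configuration), and the two are mixed by an off-diagonal coupling,

\[
H = \begin{pmatrix} \epsilon_{XH}(r,R) & \Delta(R) \\ \Delta(R) & \epsilon_{HY}(r,R) \end{pmatrix},
\]

where r is the X-H bond length and R the donor-acceptor distance. In such a model the character (e.g. the covalent/ionic mix) of each diabatic state is fixed by construction; only the energies and the coupling vary with geometry. The improper H-bond behaviour described above seems to require that the character of the X-H diabatic state itself changes in the presence of Y.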

I thank Pranav Shirhatti for stimulating my interest in this problem.

Friday, July 27, 2012

The Higgs boson and condensed matter physics

This week at the Quantum Science seminar Ben Powell gave a tutorial about the Higgs boson, highlighting its conceptual origin in condensed matter physics. The talk followed some of Section 12.6 of Piers Coleman's nice book Introduction to Many-Body Physics (free online), which gives a clear and helpful discussion.

One of the key ideas, first emphasized by Phil Anderson in 1963, was that a massless gauge field can acquire a mass in the presence of a coupling to a spontaneously broken field. A concrete realisation of this occurs in superconductors. In the Meissner effect a superconductor thicker than the penetration depth expels magnetic fields. This is equivalent to the photon acquiring a mass inside the superconductor.

In the electro-weak theory of Weinberg-Salam there is a combined U(1) x SU(2) gauge symmetry. Due to the coupling to the Higgs field (whose expectation value spontaneously breaks this symmetry), one gauge field remains massless (the photon) and the other three become massive: the W+, W-, and Z bosons.

In a type II superconductor, vortices are allowed in the superconducting order parameter field. Can such vortices occur in the Higgs field? They may have been important in the early universe.
One fascinating thing I learnt is that for the Higgs field the crucial ratio [between the London penetration depth and the superconducting coherence length] that determines whether type II behaviour is possible is the ratio of the Higgs boson mass to the W boson mass. The LHC results suggest that type II behaviour is possible!
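A rough way to see this (a sketch in natural units, ignoring factors of order one that depend on conventions): in the Ginzburg-Landau description the penetration depth is set by the inverse mass of the gauge boson and the coherence length by the inverse mass of the Higgs (amplitude) mode,

\[
\lambda \sim \frac{1}{m_W}, \qquad \xi \sim \frac{1}{m_H}, \qquad \kappa \equiv \frac{\lambda}{\xi} \sim \frac{m_H}{m_W}.
\]

Type II behaviour requires kappa > 1/sqrt(2). With m_H of about 125 GeV and m_W of about 80 GeV this gives kappa of roughly 1.6, comfortably on the type II side.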

In summary, here is an extract from Coleman's book (page 246).
Shortly after the importance of this mechanism for relativistic Yang-Mills theories was noted by Higgs and Anderson, Weinberg and Salam independently applied the idea to develop the theory of “electro-weak” interactions. According to this picture, the universe we live in is a kind of cosmological Meissner phase, formed in the early universe, which excludes the weak force by making the vector bosons which carry it become massive. It is a remarkable thought that the very same mechanism that causes superconductors to levitate lies at the heart of the weak nuclear force responsible for nuclear fusion inside stars. In trying to discover the Higgs particle, physicists are in effect trying to probe the cosmic superconductor above its gap energy scale.
Aside: Later Coleman discusses how (in a slave boson formulation) "the Anderson-Higgs effect in the Kondo problem endows the composite f−electron with charge."

Thursday, July 26, 2012

A sad tale of publishing gone mad

I found this sad story interesting and disturbing because of what it reveals about journal impact factors, university rankings, self-citations, Elsevier, scientific crackpots, lawsuits....
More of the weird history is here and here.

Pauling on the role of quantum theory in chemistry

In an article The Nature of the Chemical Bond - 1992, Linus Pauling makes the following fascinating statement:
The concept of quantum mechanical resonance and the theorem that in quantum mechanics the actual structure of a system has a lower energy than any other structure have turned out to be especially important in chemistry. The energy could be calculated for an assumed wave function for a molecule. Any change that lowered the energy indicated some addition to the picture of the chemical bond. The polarization of bond orbitals and the partial ionic character of bonds were discovered in this way. The minimum-energy theorem led to the formulation of the electronegativity scale. Modern chemistry and molecular biology are the products of quantum mechanics. Chemistry has been changed by quantum mechanics even more than physics.
In 1936, in the Preface to The Nature of the Chemical Bond, he also emphasized how quantum physics led to new chemical concepts.

Wednesday, July 25, 2012

The gravity of the situation

On Monday night I heard Paul Davies give an interesting public lecture, The origin and the end of the Universe, in Brisbane. One aspect I found particularly interesting was the discussion of the second law of thermodynamics in gravitational systems. He emphasized the following puzzle (in my words).

For an isolated system the entropy can never decrease. In some sense this means that the "order" cannot increase. However, a long time ago matter in the universe was relatively uniform, and now it is ordered not just into galaxies and stars, but even into biological life!

The key to resolving this is to realise that in a gravitational system the second law looks different. Uniform "disordered" states do not have high entropy. If you start with a fairly uniform system, it is not in an equilibrium state. The natural tendency of the system is to evolve to a non-uniform state with clumps of matter. Hence, the "clumpy" state [with a sun which transfers energy to order and sustain biological systems] can actually have the higher entropy.

Clarifying these issues, particularly in a quantitative manner, turns out not to be easy. I found a nice one-page article by Mark Buchanan in Nature Physics. It summarises a 2009 paper, Gravity, Entropy, and Cosmology: a search for clarity, by the philosopher of physics David Wallace.

Tuesday, July 24, 2012

Blowing ourselves up

APS News has a fascinating article, A Cold-War Folly?, by Nina Byers. She describes a course entitled Nuclear Power: Power plants and weapons of war that she teaches to UCLA undergraduates.

A few things I found particularly interesting. First, the graph below showing the dramatic variation in nuclear weapon stockpiles with time. Second, the diverse views that physicists (especially famous ones) had about the use of nuclear weapons, both against Japan and after the war. Third, it was a good reminder that our current undergraduates were actually born after the end of the Cold War!

A great strength of the US college system compared to Australia and Europe is the flexibility of the curriculum and broad general education requirements that allow and encourage such courses.

Monday, July 23, 2012

Caution and skepticism win again

On Nanoscale views, Doug Natelson has an excellent post Exotic (quasi)particles, and why experimental physics is challenging.
He discusses recent theoretical work showing that the zero-bias conductance peak seen in recent experiments is not conclusive evidence for Majorana particles.



The basics of electronegativity

Electronegativity is a simple and profound concept for understanding and giving a semi-quantitative description of broad classes of chemical bonds. It reflects the brilliant intuition of Linus Pauling, who first introduced it in 1932. The bold idea is to assign a single number to each element of the periodic table; the difference between the numbers for two elements then determines the polarity (charge distribution) of a chemical bond between them.

I have been reviewing the concept because it is a key element in understanding hydrogen bonds; the recent IUPAC definition states:
the hydrogen bond is an attractive interaction between a hydrogen atom from a molecule or a molecular fragment X-H in which X is more electronegative than H, ...
Unlike earlier definitions it does not require that the acceptor be more electronegative than H, only that for X-H...Y-Z
the acceptor is an electron-rich region such as, but not limited to, a lone pair in Y or a pi-bonded pair in Y-Z. 
The Wikipedia page on electronegativity is a useful introduction, but I found Section 6.4 in the classic Coulson's Valence extremely helpful.

There are several alternative definitions of electronegativity (Pauling, Mulliken, Allred-Rochow, Allen, ...). This highlights a few things:

  • like most intuitive chemical concepts, it is not something that can be defined rigorously, without ambiguity, or in a reductionist manner (a point highlighted by Roald Hoffmann)

  • the concept is useful for understanding semi-quantitative trends

  • the different definitions actually highlight the power of the concept, because they show how a wide range of chemical and physical properties (bonding energies, dipole moments, charge distributions, ...) are correlated.
The graph below shows Pauling vs. Mulliken electronegativities.

The "clearest" and most "precise" definition is that of Mulliken, where the electronegativity is the average of the ionisation energy and the electron affinity of the atom. This equals half of the ground state energy difference between the cation and the anion.
This means that if A and B have the same electronegativity, the ionic valence bond (VB) structures A+B- and A-B+ will have the same energy and so contribute equally to the full VB wave function, leading to no charge polarity.
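To spell out that argument (a minimal sketch, treating both ionic structures at the same internuclear separation so that the electrostatic terms cancel): with I the ionisation energy and A the electron affinity of an atom,

\[
\chi_M(A) = \tfrac{1}{2}(I_A + A_A) = \tfrac{1}{2}\left[E(A^+) - E(A^-)\right],
\]
\[
E(A^+B^-) - E(A^-B^+) = (I_A - A_B) - (I_B - A_A) = 2\left[\chi_M(A) - \chi_M(B)\right].
\]

So if the Mulliken electronegativities of A and B are equal, the two ionic structures are degenerate and there is no reason for the bond to be polarised one way rather than the other.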

To me the success of the concept also highlights the fact that chemical bonding is predominantly local.

Like most chemical concepts, electronegativity has exceptions to its naive application. For example, carbon is less electronegative than oxygen, and so one might expect that in carbon monoxide (CO) there would be a net negative charge on the oxygen atom. However, the opposite is true.

Thursday, July 19, 2012

The best measure of research impact is ...

"whether the candidate's research has changed the community's view of chemistry in a positive way".

This is the main point of a nice Editorial, Assessing Academic Researchers, by Richard Zare that just appeared in Angewandte Chemie. He points out that this is the main criterion used to decide whether or not Assistant Professors in Chemistry at Stanford get tenure.
We do not look into how much funding the candidate has brought to the university in the form of grants. We do not count the number of published papers; we also do not rank publications according to authorship order. We do not use some elaborate algorithm that weighs publications in journals according to the impact factor of the journal. We seldom discuss h-index metrics, which aim to measure the impact of a researcher’s publications. We simply ask outside experts, as well as our tenured faculty members, whether a candidate has significantly changed how we understand chemistry.
I thank Seth Olsen for bringing the article to my attention.

Tuesday, July 17, 2012

Details do matter in photosynthesis

In Telluride I heard David Coker give a fascinating talk about his recent work on quantum decoherence in photosynthetic proteins, reported in a recent paper with Jeremy Moix, Jianlan Wu, Pengfei Huo, and Jianshu Cao.

A 2007 Nature paper claimed to report evidence for quantum coherence of electronic excitations for a few hundred femtoseconds at liquid nitrogen temperatures in the FMO complex. This led to a flurry of theoretical activity [and various silly (at least to me) and grandiose claims about "quantum biology" and "green quantum computers"].
[Aside: it is often overlooked that the claimed coherence is just between 2 chromophores, not all seven in the complex.]

However, these experiments were not done on the complete photosynthetic complex, but on a subset with 7 chromophores. Recent structural studies have shown that there is actually an 8th chromophore. Surely, such small details don't matter...
But, they do. Coker's group has shown that this extra chromophore changes the energy transfer pathway through the complex. Moreover, coherent oscillations between the two chromophores are no longer present. The energy transfer is incoherent in the native complex. It functions fine without quantum coherence.

Monday, July 16, 2012

Quantum physics as a political metaphor

There is a somewhat amusing column, Our Political Black Hole, in the New York Times by Gail Collins. It considers the fictional response of US presidential candidates to the discovery of the Higgs boson. I found it amusing, but in some ways it is painfully close to the truth.
This is a bit like how I feel when I watch the classic BBC political comedy, Yes Minister.

Perhaps Gail Collins' use of a physics metaphor was inspired by David Javerbaum's earlier NYT column A Quantum Theory of Mitt Romney.

Tuesday, July 10, 2012

Students need to walk before they can run

There is an interesting Editorial in the Journal of Chemical Education
Science Education for Global Sustainability: What Is Necessary for Teaching, Learning, and Assessment Strategies?
by Uri Zoller.

I agree with many of his concerns and sentiments. Clearly he has thought about the relevant issues and worked hard at trying to implement them. I particularly like the sample exam questions he uses to illustrate his goals.

However, I feel that some of his proposed solutions are unrealistic (e.g., no textbooks) and unhelpful to students, particularly beginning undergraduates. I think (and my anecdotal observations suggest) that multi-disciplinary courses are either too hard or too superficial for such students. Students need to learn basic chemistry, physics, and biology before they can cope with integrating these ideas in biophysics, materials science, or environmental policy.

I thank Ross Jansen-van Vuuren for bringing the article to my attention.

Monday, July 9, 2012

Books on quantum many-body theory

Previously at UQ we have run a few successful book reading groups for postdocs and graduate students. We have worked through parts of
Electron Correlations in Molecules and Solids by Peter Fulde
Advanced Solid State Physics by Philip Phillips
A Chemist's Guide to Valence Bond Theory by Shaik and Hiberty

A postdoc, Tony Wright, and I are considering starting a new group based on a book on quantum many-body theory. We want a book that makes a strong connection to experiment. I welcome suggestions.

Here are my current suggestions, in order of roughly decreasing preference.

The Kondo Problem to Heavy Fermions by Alex Hewson.
Although focussed on the Kondo problem, it covers techniques and concepts that are more broadly applicable including Fermi liquid theory, scaling, and slave bosons.
It does connect strongly to experiment.
The e-book is available through the library.

Introduction to Many-Body Physics by Piers Coleman.
This is particularly clear, emphasizes key concepts, and has beautiful illuminating illustrations. But, perhaps we want more connection to experiment.
Available free on-line.

Many-Body Quantum Theory in Condensed Matter Physics by Henrik Bruus and Karsten Flensberg.
No path integrals. Too many Feynman diagrams? Again, perhaps we want more connection to experiment.
Multiple copies are available in the library.

Friday, July 6, 2012

Reservations about the five year h-index

Recently I encountered a new metric, the "5-year h-index", which was being used to evaluate someone's research performance. Explicitly, one counts the number h of papers published in the last 5 years that have been cited at least h times. Perhaps one might argue that this is a good metric for deciding whether to give someone a grant now. After all, just because someone published highly cited papers 10 or 20 years ago does not mean that they are at the cutting edge right now. However, I do not agree.
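For concreteness, here is a minimal sketch of how such a windowed index is computed (the function names and the citation record below are made up purely for illustration):

```python
def h_index(citation_counts):
    """Standard h-index: the largest h such that h papers have >= h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def five_year_h_index(papers, current_year, window=5):
    """h-index restricted to papers published in the last `window` years.

    `papers` is a list of (publication_year, citation_count) pairs.
    """
    recent = [c for (year, c) in papers if current_year - year < window]
    return h_index(recent)

# A made-up publication record: one old highly cited paper,
# plus recent papers that are (so far) lightly cited.
papers = [(2003, 250), (2008, 40), (2009, 8), (2010, 5), (2011, 3), (2012, 1)]
print(h_index([c for _, c in papers]))   # lifetime h-index: 4
print(five_year_h_index(papers, 2012))   # 5-year h-index: 3
```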

I think this is a highly unreliable metric because there is significant noise. Except for a few rare exceptional papers, citations within a few years of publication will be low (1-10?). Hence, I would contend that comparing two people via their 5-year h-indexes, say one with a 6 and another with a 10, is meaningless.

Two of the most cited papers in Physical Review journals are the EPR (Einstein Podolsky Rosen) paper and Steven Weinberg's electro-weak interaction paper. The latter attracted about one citation per year for the first 5 years after publication! The former attracted about 10 citations in the first 2 years and then none for more than ten years!

When Jorge Hirsch introduced the h-index the whole point was to find some measure of a lifetime of scientific achievement. I still think it does provide a useful coarse-grained measure for that.

Thursday, July 5, 2012

Overselling ab initio computational chemistry

The great appeal of computational quantum chemistry is that it aims to be ab initio: one simply calculates the properties of molecules from Schrodinger's equation and Coulomb's law. However, the painful reality is that many methods have to include parameters (or make choices about approximations) that are determined by comparison with experiment. Indeed, there is now a whole industry of people who tweak parameters in density functionals in order to get better agreement with experiment. Although I am not enthusiastic about this, I can live with it as long as people are transparent about what they are doing.

A recent JACS paper, Mechanism for Singlet Fission in Pentacene and Tetracene: From Single Exciton to Two Triplets, from Martin Head-Gordon's group is not as transparent as it could be about whether it is ab initio. It states:
The originally proposed mechanism for SF [Singlet Fission] is based on model Hamiltonians that couple monomer states between adjacent molecules. The low coupling in the model between the single-exciton and ME states requires that a CT state be invoked as an intermediate (i.e., an indirect mechanism). This requires the assumption that the CT state is relatively low in energy and thus energetically accessible. Herein, systematic ab initio study of the low-lying excited states in tetracene and pentacene provides an alternative mechanism for the photophysics of these materials. This study provides evidence that CT states need not be directly relevant to SF in acenes.
Because these ab initio simulations capture the correlation of many electrons, they are distinct from model Hamiltonian studies (for instance ref 22). The current understanding of SF comes from model Hamiltonians, where certain electronic states of two monomers are employed as basis sets. While model Hamiltonian studies can yield deep insights into complex physical processes such as SF [Singlet Fission], these invariably require assumptions about the physics which are embedded as model parameters. By contrast, ab initio calculations in principle allow the essential features to emerge directly from simulations.... 
However, if one looks at the Computational Details section of the paper (which comes after the Conclusion), one finds the statement:
One deficiency of CASSCF and RAS-2SF theories is the overestimation of excitation energies due to the limited degree of dynamic correlation. To overcome this difficulty, we shift the excitation energies of T1 and S1 at the equilibrium geometry to the experimental values for the acene crystals.
Surely, this fitting to experiment [of two key observables] undermines the authors' claim to be doing ab initio calculations, or to be superior to model calculations. Furthermore, most of the calculations in the paper involve a small number of molecules, which is surely a model for the infinite solid, in which significant screening effects may be present. Arguably, ignoring this is comparable to the significant physical assumptions present in model calculations.

Tuesday, July 3, 2012

Developing science demonstrations that actually teach science

Demonstrations to school students can easily degenerate into the following format. First, one does something spectacular such as the Coke-Mentos fountain or the barrel crush. Second, one tells the students how it works. I confess I have often done this. However, this is actually terrible because it reinforces the misconception that science is a noun, not a verb. It teaches nothing about the scientific method.

This week my wife and I demonstrated the Coke-Mentos fountain to a group of kids at a holiday club that my church was running. In order to promote critical thinking we did some comparative measurements. The fountain was made with diet Coke, Solo (a lemon drink), and generic brand (Coles) cola. We also compared Mentos bought in the USA (on my recent trip) and in Australia. It turned out that the former are much more effective. I later learnt that we had been scooped in this important scientific discovery: it had already been published on YouTube!



I also discovered there is some nice literature on the subject.

Mentos and the Scientific Method: A Sweet Combination in the Journal of Chemical Education.

Diet Coke and Mentos: What is really behind this physical reaction? in the American Journal of Physics. The authors have some impressive apparatus for making quantitative measurements. They also found that playground sand was almost as good as Mentos.
They report surface analysis studies of the Mentos, highlighting the importance of surface roughness in providing nucleation sites for CO2 bubbles.

The Ultrasonic Soda Fountain: A Dramatic Demonstration of Gas Solubility in Aqueous Solutions in the Journal of Chemical Education

“Can we do That Again?” Engaging Learners and Developing Beyond the “Wow” Factor in the Science Education Review.

Finally, having good apparatus helps. From Steve Spangler Science we purchased a Geyser Tube, which feeds the Mentos into the Coke. With the recommended 7 US Mentos we observed fountains of 2-3 metres!

Monday, July 2, 2012

Kagome lattice antiferromagnet IS a Z_2 spin liquid

At the Journal Club for Condensed Matter there is a very nice and clear commentary Identifying a spin liquid on Kagome lattice by quantum entanglement by Ashvin Vishwanath. I learnt a lot from reading it.

It reviews two recent preprints which use numerical methods (density matrix renormalisation group = DMRG) to establish topological order in the ground state of the Heisenberg spin-1/2 model on the Kagome lattice.
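The entanglement diagnostic, roughly stated (this is the standard topological entanglement entropy argument, not anything specific to these preprints): for a gapped two-dimensional ground state the entanglement entropy of a region with boundary length L behaves as

\[
S(L) = \alpha L - \gamma + \ldots,
\]

where the coefficient alpha is non-universal but the constant term gamma is universal; gamma = ln 2 for a Z_2 spin liquid, and, as I understand it, a value consistent with ln 2 is what these DMRG calculations extract.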

An earlier post considered DMRG evidence that was suggestive of a spin liquid, with an energy gap, but did not establish topological order.

Sunday, July 1, 2012

Seeking simplicity in complex systems

One thing I really enjoyed and appreciated about the workshop this week was the emphasis on developing "simple" models to describe systems that are structurally and chemically complex.
Here "simple" means that there are just a few degrees of freedom and a few parameters in the model; "complex" means there are many degrees of freedom.
Even when people are doing very large and demanding molecular dynamics simulations of solvated proteins, the goal has been to understand the essential physics and chemistry of what is going on.

Here are a few examples.

Phil Geissler considered a simple model for force generation in cellular processes, showing how Actin filament curvature biases branching direction.

Frank Brown considered an analytical model that could be used for Interpreting neutron spin echo experiments on lipid bilayer membranes without introducing a "fudge factor" for the value of the solvent viscosity that experimentalists had been using.

Abe Nitzan considered the simplest possible effective Hamiltonians that could be used to describe Electromagnetic and magnetic effects in molecular conduction.

Greg Voth discussed the importance of coarse graining and described an unbiased numerical method for reducing the dynamics of a protein to just a few sites. He then described recent work applying this to electron transfer in an iron hydrogenase.

Rob Coalson considered the problem of how a particular membrane pore protein works: how a polymer brush collapses when exposed to a critical concentration of binding nanoparticles. He showed how mean-field theory for a simple lattice gas type model could capture the phase transition associated with the collapse of the brush.

Dmitry Matyushov spoke about extending elastic network models for proteins to describe their response to external forces including local and global electric fields.