Thursday, July 31, 2014

Chemistry is local. In praise of Wannier.

In several posts I have emphasised that Chemistry is local.
This is illustrated by the fact that specific bonds within a molecule have approximately the same length, energy, and vibrational frequency regardless of the details of the molecule, particularly the distant parts.
This locality leads to useful concepts and theoretical approaches such as Atoms in Molecules, Natural Bond Orbitals, and Valence Bond theory.
However, this locality is at variance with Molecular Orbital theory and the Kohn-Sham orbitals in Density Functional Theory (DFT); the orbitals can be completely delocalised over the whole molecule.

What are the implications for solid state physics?
Band theory is the analogue of molecular orbital theory. Bloch electronic wave functions are completely delocalised throughout the crystal.
Wannier orbitals are the physics analogue of Boys orbitals in chemistry.
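For concreteness [the standard construction; see also the Reviews of Modern Physics article linked below]: the Wannier function for band n, localised at the lattice vector R, is the Fourier transform of the Bloch states over the Brillouin zone,

w_{n\mathbf{R}}(\mathbf{r}) = \frac{V}{(2\pi)^3}\int_{\mathrm{BZ}} d\mathbf{k}\; e^{-i\mathbf{k}\cdot\mathbf{R}}\, \psi_{n\mathbf{k}}(\mathbf{r}).

Because the Bloch states are only defined up to a k-dependent phase, the Wannier functions are not unique; choosing the phases to minimise the spatial spread gives the "maximally localised" Wannier functions.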

In 1984 Phil Anderson wrote:
The Wannier functions are still one of the most useful but underutilized methodologies of solid state physics, and in particular it is in the language of Wannier functions that I feel the chemical implications of band theory are most effectively expressed. 
Of course the reason is that most of the concepts of chemistry are local concepts, such as bonds, ions, complexes, etc., while band theory is a global structure, in which the wave functions permeate the entire system and the eigenenergies depend on the position of every atom everywhere. There is no a priori reason why band theory should lead to such a chemically intuitive result as that the carbon—carbon single bond should have roughly the same energy and bond length whenever it appears, the Oxygen anion should have a constant radius and negative electron affinity, etc. 
This same weakness is shared by band theory’s chemical equivalent, the molecular orbital theory of Hund and Mulliken. From the very first, there was a vain attempt to restore locality by the use of atomic states and the valence-bond idea, very much advocated by Pauling, but only Pauling’s great ingenuity in applying the vague concept of “resonance” and his enormous prestige kept this scheme afloat as long as it has been: it is just not a valid way of doing quantum mechanics, and fails completely in the case of metals and of the organic chemical equivalent of metals, namely aromatic compounds and graphite, and is not very useful elsewhere except in the hands of a master empiricist such as Pauling. 
Nonetheless many local basis: how is this compatible with quantum mechanics? This is the enigma to which Wannier functions give us a very precise and clear answer. Not only that, but with a bit of ingenuity it is possible to modify the local functions in such a way as to give one a simple, accurate and serviceable method for quantum chemical calculations.
P.W. Anderson, Chemical Pseudopotentials, Physics Reports 1984

The discussion of resonating valence bond (RVB) theory is a bit harsh ("it is just not a valid way of doing quantum mechanics") and ironic given that only three years later Anderson introduced his RVB theory of superconductivity!

Indeed, the past two decades have seen a resurgence of interest in and use of Wannier functions.
Here is a recent Reviews of Modern Physics on the subject.

Wednesday, July 30, 2014

Seeing the effects of relativity with the naked eye

Our natural tendency is to think that to see the effects of Einstein's special theory of relativity you have to be travelling at some significant fraction of the speed of light. However, this is not the case. In solid state physics I am aware of three concrete phenomena that are purely due to relativistic effects.

1. Gold metal is the colour "gold".
According to Wikipedia, "non-relativistic gold would be white. The relativistic effects are raising the 5d orbital and lowering the 6s orbital."

2. Mercury is a liquid at room temperature.
This is nicely discussed in a blog post by Henry Rzepa concerning a recent paper that shows that relativistic effects shift the melting temperature by about 100 K.

3. Magnetic anisotropy and hysteresis in ferromagnets.
This results from spin-orbit coupling which is a consequence of relativity.

Tuesday, July 29, 2014

Journals should publicise their retraction index

Here are several interesting and related things about retracted journal articles.

1. Some retracted articles continue to get cited!
For example, today I found an interesting reference to this Science paper from 2001, only to learn it had been retracted. Furthermore, Google Scholar shows the paper has been cited several times in the past 4 years. Indeed, some of the Schon-Batlogg papers are still cited, for scientific reasons, not just as examples of scientific fraud. (For example, this recent JACS).

2. The Careers section of Nature has an interesting article Retractions: A Clean Slate, which makes the case that if you make an "innocent" mistake the best thing you can do is promptly make a retraction. But, there are some pitfalls. One thing that is still not clear to me is how in some cases one decides between complete retraction, partial retraction, and an erratum.

3. There is a correlation between journal impact factor and the frequency of retractions.
Somehow I did not find the graph below surprising.
This is described in an interesting editorial Retracted Science and the Retraction Index in the journal Infection and Immunity.
We defined a “retraction index” for each journal as the number of retractions in the time interval from 2001 to 2010, multiplied by 1,000, and divided by the number of published articles with abstracts. 
 ... the disproportionally high payoff associated with publishing in higher-impact journals could encourage risk-taking behavior by authors in study design, data presentation, data analysis, and interpretation that subsequently leads to the retraction of the work. 
Another possibility is that the desire of high-impact journals for clear and definitive reports may encourage authors to manipulate their data to meet this expectation. In contradistinction to the crisp, orderly results of a typical manuscript in a high-impact journal, the reality of everyday science is often a messy affair littered with nonreproducible experiments, outlier data points, unexplained results, and observations that fail to fit into a neat story. In such situations, desperate authors may be enticed to take short cuts, withhold data from the review process, overinterpret results, manipulate images, and engage in behavior ranging from questionable practices to outright fraud (26). 
Alternatively, publications in high-impact journals have increased visibility and may accordingly attract greater scrutiny that results in the discovery of problems eventually leading to retraction. It is possible that each of these explanations contributes to the correlation between retraction index and impact factor.
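As an aside, the arithmetic of the index is simple. A trivial sketch in Python, with made-up numbers purely for illustration:

def retraction_index(n_retractions_2001_2010, n_articles_with_abstracts):
    # Retraction index as defined in the editorial: retractions over
    # 2001-2010, multiplied by 1000, divided by the number of
    # published articles with abstracts in the same period.
    return 1000.0 * n_retractions_2001_2010 / n_articles_with_abstracts

# Hypothetical journal with 10 retractions and 25,000 articles:
print(retraction_index(10, 25000))   # 0.4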
I look forward to the day when journals publicise their retraction index and university managers discourage their staff from publishing in certain journals because their retraction index is too high.

Monday, July 28, 2014

A realistic debate about climate change in the media

One of the many problems of the news media is that they love conflict and controversy. So much so, that they will not just amplify it and feed it, but even create it. This certainly happens with the issue of climate change. This video nicely illustrates the point with humour.

 

Thursday, July 24, 2014

NORDITA workshop on water

I have written many posts about what a fascinating, difficult, and important subject water is. I think it is one of the classic hard problems that does not get the attention it deserves. Science increasingly follows the latest fashionable topic that has "low-lying fruit" to pick.
Hence, I was delighted to learn last year that NORDITA [the Nordic Institute for Theoretical Physics] is planning a month-long program this year on Water - the Most Anomalous Liquid.

I was even happier when I was invited to be part of a "Working Group" in week one to focus on "Quantum effects", led by Tom Markland. Hopefully this will generate some interesting discussions, science, and blog posts!

To increase the visual appeal of this post I searched on Google Images for "water quantum" and got some scary results, including this video marketing the "Quantum BioEnergy water clamp". I am not sure whether we should laugh or cry!

Tuesday, July 22, 2014

A key concept in glasses: the entropy crisis

The figure below introduces the idea of an "entropy crisis" and the Kauzmann temperature in glasses. It also leads to profound and controversial questions about the intimate connection between thermodynamics and kinetics in glasses.

Each solid curve shows the temperature dependence of the entropy of a supercooled liquid, relative to that of the crystal, above T_g, the glass transition temperature. T_m is the melting temperature of the crystal. The dashed curves are entropy in the glassy state.
The figure is taken from a very helpful review and adapted from Walter Kauzmann's classic 1948 paper.

What is going on?
The entropy of a liquid is greater than that of a solid [think of the latent heat of melting], so Delta S = S_liquid - S_crystal is positive. But the specific heat capacity of a liquid is also greater than that of a solid [the vibrational, translational, and rotational degrees of freedom are all "softer" and less constrained]. Hence, the slope of Delta S vs. T must be positive.
Now, suppose that the liquid is cooled so incredibly slowly that a glass never forms and you keep lowering the temperature. Then at some temperature the extrapolated Delta S vanishes, and below that it would become negative. This extrapolated temperature [see the light blue straight line] is known as the Kauzmann temperature.
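In equations [standard bookkeeping, not specific to any particular material]: integrating the specific heat difference down from the melting point gives

\Delta S(T) = \Delta S_m - \int_T^{T_m} \frac{\Delta C_p(T')}{T'}\, dT', \qquad \Delta S(T_K) = 0,

where \Delta S_m is the entropy of melting, \Delta C_p = C_p^{liquid} - C_p^{crystal} > 0, and the Kauzmann temperature T_K is defined by the vanishing of the extrapolated \Delta S.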

Why does this matter?
By the third law of thermodynamics, the entropy of the crystal goes to zero as the temperature goes to zero. Thus, below the Kauzmann temperature the supercooled liquid would have negative entropy, which is physically nonsense.
Formation of the glass prevents this possibility. But, formation of the glass involves kinetics. So is there some deep connection between thermodynamics and kinetics? The review  discusses some possible connections. The extent of that connection is one of the controversial questions in glasses.

Monday, July 21, 2014

Don't try and do any work on your vacation

I sometimes hear the following:

"I will be on vacation [holidays/leave] next week but I am planning to do some work on the paper".

"It turns out things were busier than I thought and I did not get to do much [or anything] on the paper."

Vacations are meant to be vacations. Work is for work time. You need the break. Furthermore, trying to simultaneously work and spend time with family or friends is usually stressful and frustrating for everyone involved.

Switch off. Take a break and enjoy it. Your productivity when you return will be greater.

Saturday, July 19, 2014

Undergrads should be taught that hydrogen bonds are quantum

There is a very nice paper in Chemistry Education Research and Practice 
What is a hydrogen bond? Resonance covalency in the supramolecular domain 
Frank Weinhold and Roger A. Klein

It relates to issues I have posted about before. In an earlier article Weinhold and Klein reviewed how most introductory chemistry textbooks claim that hydrogen bonding is essentially a classical electrostatic phenomenon [some sort of dipole-dipole interaction], in spite of the fact that it is largely due to coherent quantum effects.
Similar electrostatics-type assumptions are deeply embedded in the empirical point-charge potentials of widely used molecular dynamics (MD) and Monte Carlo (MC) simulation methods (Leach, 2001). These methods make no pretense to describe chemical bonding and reactivity phenomena, but are widely presumed to adequately describe H-bonding phenomena. The ubiquity of such simulation potentials in many areas of materials and biochemical research tends to reinforce and perpetuate the corresponding electrostatics-type rationalizations of H-bonding in elementary textbooks. Neither the manner in which H-bonding is now taught to beginning students nor how it is “simulated” in MD/MC potentials has changed appreciably in the past half-century.
Building on the recent revised IUPAC definition of the H-bond, they propose several definitions that might be appropriate for inclusion in introductory texts. Here is the most technical version:
A fractional chemical bond due to partial intermolecular A–H⋯:B ↔ A:⋯H–B resonance delocalization (partial 3-center/4-electron proton-sharing between Lewis bases), arising most commonly from quantum mechanical n_B → σ*_AH donor–acceptor interaction.

I have emphasised before that there is a "smoking gun" for this quantum view of the hydrogen bond: the existence of an excited electronic state that is a superposition of the same two "resonating" basis states [diabatic states] that make up the ground state. It should be observable in quantum chemistry calculations and in UV absorption experiments.
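A minimal two-state sketch of the structure being described [an illustration only, not any specific parameterisation]: let |1⟩ = A–H⋯:B and |2⟩ = A:⋯H–B be the two diabatic states, with energies \epsilon_1, \epsilon_2 and coupling \Delta. Then

H = \begin{pmatrix} \epsilon_1 & -\Delta \\ -\Delta & \epsilon_2 \end{pmatrix},
\qquad
|g\rangle = \cos\theta\,|1\rangle + \sin\theta\,|2\rangle,
\qquad
|e\rangle = -\sin\theta\,|1\rangle + \cos\theta\,|2\rangle,

where the mixing angle \theta is set by the ratio of \Delta to \epsilon_2 - \epsilon_1. The excited state |e\rangle, separated from the ground state by \sqrt{(\epsilon_1 - \epsilon_2)^2 + 4\Delta^2}, is the "smoking gun" superposition referred to above.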

Weinhold and Klein also made the recommendation:
Expose students ASAP to modern theoretical discovery tools 
The ready web-based availability of WebMO and other resources for calculating and visualizing accurate wavefunctions places a powerful tool in the hands of chemical educators and their laptop-toting students in the modern WiFi-activated classroom. With suitable guidebooks or Youtube tutorials [e.g., Marcel Patek, Christopher C. Cummins, or other web-based tutorial materials listed here and here], students can soon be using the same powerful computational tools that are driving chemical discovery in research laboratories around the globe. With such access, the student's laptop or mobile device can serve not only as an in-class discovery tool but also as a patient tutor and pedagogical “oracle” to provide accurate answers (and vivid graphical imagery) concerning details of valency, hybridization, and bonding in chosen chemical species, long before mathematical mastery of the underlying quantum theory is attained.
I thank Ross van Vuuren for bringing the article to my attention.

The paper is part of a special issue on Physical Chemistry Education.

Aside: it warmed my heart that the authors referenced my post Chemistry is Quantum Science, that highlighted one of Weinhold's earlier articles.

Thursday, July 17, 2014

A quantum lower bound for the charge diffusion constant in strongly correlated metals?

Previously I posted about some interesting theory and cold atom experiments that suggest that the spin diffusion constant D has a lower bound of about hbar/m, where m is the particle mass.

Coincidentally, on the same day Sean Hartnoll posted a preprint, Theory of universal incoherent metallic transport. Based on results involving holographic duality [AdS/CFT] he conjectures that the diffusion constant satisfies the bound,

D ≳ hbar v_F^2 / (k_B T)

where v_F is the Fermi velocity.
I have pointed out to Sean that the ratio of this lower bound for D to the cold atom one (hbar/m) is 2 T_F/T, where T_F is the Fermi temperature and T the temperature. Thus, the experiments [when normalised for trap effects] and the theory give a value of D about an order of magnitude smaller than Sean's lower bound. [My earlier post also references 2D cold atom experiments that give values for D several orders of magnitude smaller].
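Explicitly, assuming a parabolic band so that k_B T_F = E_F = \tfrac{1}{2} m v_F^2, the ratio of the two bounds is

\frac{\hbar v_F^2/(k_B T)}{\hbar/m} = \frac{m v_F^2}{k_B T} = \frac{2\, k_B T_F}{k_B T} = \frac{2\, T_F}{T}.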
Sean raises the issue about how much m and T_F are renormalised by interactions. However, given that the spin susceptibility undergoes a small renormalisation it is not clear to me this will be significant.
Also, in a strongly interacting system charge and spin diffusion constants might be different.

In my post I pointed out the paucity of derivations of the central equation, the "Einstein relation", D = conductivity/susceptibility. However, Sean's preprint has a nice simple derivation of this based on conservation laws, which also shows how particle-hole asymmetry complicates things.
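For reference, the relation is, schematically,

D = \frac{\sigma}{\chi},

where \chi is the static susceptibility associated with the conserved quantity being transported: the compressibility \partial n/\partial\mu for charge diffusion, and the spin susceptibility for spin diffusion.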

Wednesday, July 16, 2014

A simple model for double proton transfer

I just finished a paper

Here is the abstract.

Four diabatic states are used to construct a simple model for double proton transfer in hydrogen bonded complexes. Key parameters in the model are the proton donor-acceptor separation R and the ratio, D1/D2, between the proton affinity of a donor with one and two protons. Depending on the values of these two parameters the model describes four qualitatively different ground state potential energy surfaces, having zero, one, two, or four saddle points. In the limit D2=D1 the model reduces to two decoupled hydrogen bonds. As R decreases a transition can occur from a concerted to a sequential mechanism for double proton transfer.

I welcome comments and suggestions.

Tuesday, July 15, 2014

The conceptual chasm between neuroscience and psychology

The New York Times has a nice op-ed piece The Trouble with Brain Science by Gary Marcus, a psychologist. It is worth reading for several reasons. First, it is a nice accessible discussion of the status and challenges of neuroscience. Second, it illustrates the scientific challenges of understanding emergent phenomena. Third, it highlights some funding/political/strategic issues that are relevant to other fields.

The piece is stimulated by controversy concerning the Human Brain Project, "an approximately $1.6 billion effort that aims to build a complete computer simulation of the human brain", funded by the European Commission. The US has also funded a massive project, The Brain Initiative, focussed on developing new measurement techniques.
The controversy serves as a reminder that we scientists are not only far from a comprehensive explanation of how the brain works; we’re also not even in agreement about the best way to study it, or what questions we should be asking.
.... a critical question that is too often ignored in the field: What would a good theory of the brain actually look like?
..... biological complexity is only part of the challenge in figuring out what kind of theory of the brain we’re seeking. What we are really looking for is a bridge, some way of connecting two separate scientific languages — those of neuroscience and psychology....
We know that there must be some lawful relation between assemblies of neurons and the elements of thought, but we are currently at a loss to describe those laws.......
  
The problem with both of the big brain projects is that too few of the hundreds of millions of dollars being spent are devoted to spanning this conceptual chasm.
Some of the scientific and political issues here are also relevant to other areas of science. In most fields involving complex systems advances require a combination of advances in instrumentation, materials preparation, computational models, analytical model development, and concepts. All are necessary and interdependent. At any one time a challenge for setting priorities and allocating resources is to have an appropriate balance between all of these approaches and areas.

Unfortunately, currently it seems funding agencies think it easier to convince politicians to fund big projects involving large scale instrumentation and/or computation. Smaller single-investigator grants, and particularly those focusing on conceptual issues and simple models, are getting squeezed out.

Monday, July 14, 2014

Papers make the impact not the journals

There is an interesting PLOS ONE editorial about impact factors that ends:
With the usual flurry of Impact Factor announcements due to start any day now, it’s a good time to remember that it is the papers, not the journals they’re published in, that make the impact.
 It also shows the graph below of the citation distribution for the journal. Note how incredibly broad and asymmetrical the distribution is. I found this interesting because several years ago I wondered what the error bar was on Impact factors, which are often reported to several decimal places.
The editorial points out that the distribution is probably broader for PLOS ONE because it publishes articles from a diversity of fields. Hence, I would still like to see distributions from other journals.

Saturday, July 12, 2014

When disciplines lose confidence

Increasingly I hear talk about how different academic disciplines have lost confidence in their identity, autonomy, and legitimacy. They are "in crisis", "have an uncertain future", or "need to re-invent themselves". These concerns range from physical chemistry to political science. For example, yesterday I heard a talk along these lines from David Armitage, Chair of the History Department at Harvard.

Generally, I think this anxiety is bad for the discipline and is often for the wrong reasons.
Furthermore, those who stay the course will ultimately be successful. Those who get caught up in the latest soul searching re-invention will dissipate their energies on some passing fad.

Why do people lose confidence in their discipline and the value of what they are doing?

A. Unhealthy comparisons with other disciplines and jealousy. The rates of advance of knowledge vary across disciplines and at different periods of time. There may be a decade where one discipline will see some stunning advances where another will just "plod along".
This pressure to be as "successful" as other disciplines is sometimes measured in superficial terms such as funding levels, student numbers, department size, shiny new buildings, ....

B. Disillusionment sometimes occurs when the discipline does not live up to the hype of some research area.
Cosmology hasn't told us "why we are here." The Human Genome Project has not cured diseases. There is no quantum computer with hundreds of qubits. Fusion power is still 30 years away...  Previously, I posted about how Marcelo Gleiser became disillusioned with high energy physics because it did not live up to its reductionist propaganda.

C. Pressure to be a "service industry" for other disciplines. Currently, some biologists seem to think that physics and chemistry should play this role for them, i.e., providing instrumentation, techniques and tools, not concepts and theories. The attitude and issues are embodied in the 1997 Physics Today article "Harnessing the hubris: useful things physicists could do in biology" by V.A. Parsegian and the rebuttal by Bob Austin.

Previously, I posted Ahmed Zewail's persuasive arguments for the value and distinctness of chemical physics.

Why do I think disciplines are valid and will endure?
It is because they reflect the stratification of reality. The consequent autonomy of disciplines is nicely discussed in Phil Anderson's classic, More is Different.


In the humanities, in the first half of the twentieth century, the theologian Karl Barth resisted the pressure to reduce theology to other disciplines such as sociology, history, anthropology, philosophy, ... His colleagues who succumbed have now mostly been forgotten. Elsewhere I explore Barth's resistance to reductionism.

Thursday, July 10, 2014

Stay in touch with your reference letter writers

One ingredient to surviving [and "succeeding"] in science is having a few individuals write supportive letters of reference when you apply for jobs, tenure, and/or promotion. Previously, I posted some thoughts about Who should I get to write a letter of reference?

Avoid making last minute requests to people. This may lead to hastily written letters, no letter, or just "recycling" of old letters. It is worth thinking about who you may need or want to write a letter for you in the next year or so. Then maintain and/or cultivate that relationship. In particular, that means making sure they know what you have been up to scientifically for the last few years. It is idealistic to think that your former advisor/supervisor has been reading all your latest papers, particularly if (hopefully) you have moved into different areas. Hence, occasional update emails, visits, and chats at conferences are a good investment.

I suspect that one unfortunate consequence of the rise of metrics is that letters are less influential than they used to be, except at the best institutions. Nevertheless, they still play a role.

Wednesday, July 9, 2014

Quantum research in a different era

Physics Today has an interesting review by Noah Graham of the recent book
Exploring Quantum Mechanics: A Collection of 700+ Solved Problems for Students, Lecturers, and Researchers, by Victor Galitski, Boris Karnakov, Vladimir Kogan, and Victor Galitski Jr 
As Galitski Jr points out in the preface, this sort of thorough, detailed collection is a product of “people living and working in completely different times, and they were quite different from us, today’s scientists: with their attention spans undiminished by constant exposure to email, internet, and television, and with their minds free of petty worries about citation counts, indices, and rankings, they were able to devote 100% of their attention to science and take the time to focus on difficult problems that really mattered.”
It looks like a great book.

Saturday, July 5, 2014

Seeing zero-point energy with a pH meter

Acids become weaker in heavy water (deuterium oxide, D2O) than in regular water.

The pKa is a quantitative measure of the strength of an acid, i.e. how readily it gives up protons. It is related to the equilibrium constant Ka and the Gibbs free energy change associated with the dissociation reaction. This is all nicely described on Wikipedia.
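For reference, the standard relations are

\mathrm{p}K_a = -\log_{10} K_a, \qquad \Delta G^\circ = -RT\,\ln K_a = (\ln 10)\, RT\, \mathrm{p}K_a,

so at room temperature one pKa unit corresponds to about 5.7 kJ/mol of free energy.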

The figure below [taken from this article] shows the isotope effect on pKa, i.e. the difference between the value of the pKa in heavy and regular water.
There appears to be a rough correlation with the magnitude of pKa, but for most acids Delta pKa ~ 0.5.
The largest value is for neat water (pKa=14).


So how is Delta pKa related to zero-point energy?
This way of looking at the problem is stated in a 50-year-old J. Chem. Ed. paper by Kreevoy, who says it allows students to see concrete effects of Heisenberg’s uncertainty principle.
It has the nice picture below, for acetic acid, which explains the basic physics. Dissociation of the acid lowers the zero-point energy of the proton/deuterium. Due to the H/D mass difference this energy change is less favourable for deuterium than hydrogen.
Why does dissociation [i.e.  H/D removal] lower the zero-point energy?
Basically, the released H/D bonds to water to form H3O+, which will hydrogen bond with other waters to form units like the Zundel cation [H5O2+], in which the O-H stretch frequency becomes much softer.
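To get a feel for the numbers, here is a deliberately crude one-oscillator estimate in Python. The assumptions [all illustrative, not taken from any calculation] are that only the stretch zero-point energy of the transferred H/D changes on dissociation, that the stretch softens from about 3000 cm^-1 in the intact acid to about 1400 cm^-1 in the hydrated product, and that O-D frequencies are the O-H ones divided by sqrt(2).

import math

CM1_TO_KJMOL = 0.011963   # 1 cm^-1 expressed in kJ/mol
R = 8.314e-3              # gas constant in kJ/(mol K)
T = 298.0                 # temperature in K

omega_acid = 3000.0       # assumed A-H stretch in the undissociated acid (cm^-1)
omega_product = 1400.0    # assumed effective stretch after dissociation (cm^-1)

def zpe_change(mass_factor):
    # Zero-point energy change (kJ/mol) on dissociation;
    # mass_factor = 1 for H, sqrt(2) for D (frequency scales as 1/sqrt(mass)).
    return 0.5 * CM1_TO_KJMOL * (omega_product - omega_acid) / mass_factor

delta_G_H = zpe_change(1.0)
delta_G_D = zpe_change(math.sqrt(2.0))

# Delta pKa = pKa(D2O) - pKa(H2O) from the difference in free energy changes
delta_pKa = (delta_G_D - delta_G_H) / (R * T * math.log(10.0))
print("Estimated Delta pKa:", round(delta_pKa, 2))   # about 0.5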

Actually, doing a real calculation of this is rather non-trivial.
I am unaware of any attempts to theoretically produce the curve above showing the correlation between  Delta pKa and pKa.

Update and correction (21 July, 2014).
Tom Markland kindly pointed out that if one considers a diverse family of compounds the correlation shown above between Delta pKa and pKa does not hold. The figure below is found on page 359 [Figure 11.4] of the book, Isotope Effects in the Chemical, Geological, and BioSciences.
The solid line shows a correlation that does hold for weak inorganic acids, including water.

Friday, July 4, 2014

Why do student grades only get adjusted upwards?

Why don't we ever bump students down?
When considering final grades for students near a particular cutoff you will hear statements such as:
"He got 48 % but he handed in all the assignments and worked really hard. We should bump him up to 50% so he can pass". 
"She got 78% but she asked lots of good questions in class so we should bump her up to 80% and give her an A." 
This seems reasonable and compassionate. However, if you said something like the following, people might say you were being harsh and unfair.
"She got 80% but talking to her showed she really had a superficial understanding and just crammed for the exam. We should bump her down to 79% and give her a B." 
"He got 51% but skipped most of the lectures and appeared not to do much work. We should bump him down to 49% and fail him."
If we were consistent we would be willing to consider such arguments.
Bumping should go both ways.

I can think of one related exception to this. At my university, students can make a formal application for regrading of a piece of assessment. If this occurs, the regrading is done by another faculty member and the student must accept the new grade even if it is lower. Sometimes it is. I like this policy.

Wednesday, July 2, 2014

Key concepts in glasses, I.

In 1995 a group of distinguished scientists were asked by Science magazine about outstanding problems that should receive attention in the following decade. The answers are compiled here, and ironically entitled, "Through a glass lightly". Phil Anderson said:
The deepest and most interesting unsolved problem in solid state theory is probably the theory of the nature of glass and the glass transition. This could be the next breakthrough in the coming decade.
Although I know this remains an important problem it has been a bit of a mystery to me. However, my understanding has increased by hearing a couple of nice talks in Telluride by David Reichman. This has been solidified [pun intended!] by reading a very accessible (and short) review Supercooled liquids and the glass transition by Pablo Debenedetti and Frank Stillinger. I think I now have a crude/basic understanding of a few of the key ideas including
  • defining the glass transition temperature
  • strong versus fragile glasses
  • dynamical heterogeneity
  • violations of the Stokes-Einstein relation between viscosity and diffusion constant
  • mode coupling theory
Hopefully, I will post about some of these. First, here is the "Angell plot" that distinguishes strong and fragile glasses. It shows the viscosity [on a logarithmic scale] of a supercooled liquid [i.e. a liquid cooled below its melting temperature without crystallising] vs. Tg/T, where T is temperature and Tg is the glass transition temperature. The latter can be defined as the temperature at which the viscosity reaches 10^13 poise. [For comparison, the viscosity of water at room temperature and pressure is about 0.01 poise!].
In a normal liquid the temperature dependence of the viscosity is activated [eta ~ exp(A/T)], and so this plot should give a straight line (Arrhenius behaviour).

Angell made this plot in 1995 for a wide range of glasses and found they fell into two distinct categories, that he defined as strong and fragile.

The horizontal scale is from 0 to 1.
Note that the data on the vertical scale covers 15 orders of magnitude!

The strong glasses have a simple activated form for the temperature dependence of the viscosity. The fragile glasses have an effective activation energy that increases with decreasing temperature.
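A standard way to parameterise the difference [an empirical fit, not specific to the review]: strong liquids follow the Arrhenius form, while fragile liquids are commonly fitted by the Vogel-Fulcher-Tammann (VFT) form,

\eta_{\mathrm{strong}}(T) = \eta_0\, e^{A/T}, \qquad \eta_{\mathrm{fragile}}(T) \approx \eta_0\, \exp\!\left(\frac{B}{T - T_0}\right),

with T_0 < T_g, so that the effective activation energy B\,T/(T - T_0) grows as the temperature is lowered.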
It is amazing that such chemically and structurally diverse systems exhibit such universal behaviour.