tag:blogger.com,1999:blog-229733572016-10-26T17:59:08.020-04:00BackreactionEvents on the world lines of two theoretical physicists, from the horizon to timelike infinity.<br> A scientifically minded blog with varying amounts of entertainment, distractions, and everyday trivialities.Sabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.comBlogger1619125tag:blogger.com,1999:blog-22973357.post-81082818278447017502016-10-23T02:55:00.002-04:002016-10-23T03:03:07.722-04:00The concordance model strikes back<a href="http://backreaction.blogspot.com/2016/10/what-if-dark-matter-is-not-particle.html">Two weeks ago, I summarized</a> <a href="https://arxiv.org/abs/1609.05917">a recent paper by McGaugh <i>et al</i></a> who reported a correlation in galactic structures. The researchers studied a data-set with the rotation curves of 153 galaxies and showed that the gravitational acceleration inferred from the rotational velocity (including dark matter), g<sub><span style="font-size: xx-small;">obs</span></sub>, is strongly correlated with the gravitational acceleration from the normal matter (stars and gas), g<sub><span style="font-size: xx-small;">bar</span></sub>.<br /><br /><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-6wlhiNHgt5w/V_3sJnQR6XI/AAAAAAAADPk/Gy75kH79fO8i9BIB-axNI8G8ckQU20DSgCLcB/s320/tf.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://3.bp.blogspot.com/-6wlhiNHgt5w/V_3sJnQR6XI/AAAAAAAADPk/Gy75kH79fO8i9BIB-axNI8G8ckQU20DSgCLcB/s320/tf.jpg" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;"><span style="background-color: white; color: #444444; font-family: "arial" , "tahoma" , "helvetica" , "freesans" , sans-serif; font-size: 10.4px;">Figure from arXiv:</span><a 
href="https://arxiv.org/abs/1609.05917" style="background-color: white; color: #4d469c; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: 10.4px; text-decoration: none;">1609.05917 [astro-ph.GA]</a><span style="background-color: white; color: #444444; font-family: "arial" , "tahoma" , "helvetica" , "freesans" , sans-serif; font-size: 10.4px;"> </span></td></tr></tbody></table><br />This isn’t actually new data or a new correlation, but a new way to look at correlations in previously available data. <br /><br />The authors of the paper were very careful not to jump to conclusions from their results, but merely stated that this correlation requires some explanation. That galactic rotation curves have surprising regularities, however, has been evidence in favor of modified gravity for two decades, so the implication was clear: Here is something that the concordance model might have trouble explaining. <br /><br />As I remarked <a href="http://backreaction.blogspot.com/2016/10/what-if-dark-matter-is-not-particle.html">in my previous blogpost</a>, while the correlation does seem to be strong, it would be good to see the results of a simulation with the concordance model that describes dark matter, as usual, as a pressureless, cold fluid. In this case too one would expect there to be some relation. Normal matter forms galaxies in the gravitational potentials previously created by dark matter, so the two components should have some correlation with each other. The question is how much.<br /><br />Just the other day, <a href="https://arxiv.org/abs/1610.06183">a new paper appeared on the arxiv</a>, which looked at exactly this. The authors of the new paper analyzed the result of a specific numerical simulation within the concordance model. 
And they find that the correlation in this simulated sample is actually stronger than the observed one!<br /><br /><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-i-d0WJ3s2-k/WAxdpWQeH4I/AAAAAAAADQM/Dzt_mognGuAS0je_UL-o2EQxxZBn8nq3QCLcB/s1600/tf02.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="297" src="https://4.bp.blogspot.com/-i-d0WJ3s2-k/WAxdpWQeH4I/AAAAAAAADQM/Dzt_mognGuAS0je_UL-o2EQxxZBn8nq3QCLcB/s320/tf02.jpg" width="320" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">Figure from <a href="https://arxiv.org/abs/1610.06183">arXiv:1610.06183 [astro-ph.GA]</a></td></tr></tbody></table><br /><br />Moreover, they demonstrate that in the concordance model, the slope of the best-fit curve should depend on the galaxies’ redshift (z), i.e., the age of the galaxy. 
This would be a way to test which explanation is correct.<br /><br /><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-icQj7Km-CBs/WAxdttrFGDI/AAAAAAAADQQ/7MfBvPEa81sjJCfLYFbUWCY_pu1bJlncQCLcB/s1600/z.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="308" src="https://1.bp.blogspot.com/-icQj7Km-CBs/WAxdttrFGDI/AAAAAAAADQQ/7MfBvPEa81sjJCfLYFbUWCY_pu1bJlncQCLcB/s320/z.jpg" width="320" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;"><span style="font-size: 12.8px;">Figure from </span><a href="https://arxiv.org/abs/1610.06183" style="font-size: 12.8px;">arXiv:1610.06183 [astro-ph.GA]</a></td></tr></tbody></table><br />I am not familiar with the specific numerical code that the authors use and hence I am not sure what to make of this. It’s been known for a long time that the concordance model has difficulties getting structures on galactic scales right, especially galactic cores, and so it isn’t clear to me just how many parameters this model needs to work right. If the parameters were previously chosen to match observations, then this result is hardly surprising.<br /><br />McGaugh, one of the authors of the first paper, <a href="https://tritonstation.wordpress.com/2016/10/21/la-fin-de-quoi/">has already offered some comments</a> (ht Yves). He notes that the sample size of the galaxies in the simulation is small, which might at least partly account for the small scatter. He is also skeptical of the results: “It is true that a single model does something like this as a result of dissipative collapse. 
It is not true that an ensemble of such models are guaranteed to fall on the same relation.”<br /><br />I am somewhat puzzled by this result because, as I mentioned above, the correlation in the McGaugh paper is based on previously known correlations, such as the brightness-velocity relation which, to my knowledge, hadn’t been explained by the concordance model. So I would find it surprising should the results of the new paper hold up. I’m sure we’ll hear more about this in the near future.Sabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com24http://backreaction.blogspot.com/2016/10/the-concordance-model-strikes-back.htmltag:blogger.com,1999:blog-22973357.post-70493207632012190862016-10-19T04:04:00.000-04:002016-10-20T00:39:43.932-04:00Dear Dr B: Where does dark energy come from and what’s it made of?<blockquote><i>“As the universe expands and dark energy remains constant (negative pressure) then where does the ever increasing amount of dark energy come from? Is this genuinely creating something from nothing (bit of lay man’s hype here), do conservation laws not apply? Puzzled over this for ages now.”</i></blockquote><div style="text-align: right;">-- pete best </div><blockquote><i>“When speaking of the Einstein equation, is it the case that the contribution of dark matter is always included in the stress energy tensor (source term) and that dark energy is included in the cosmological constant term? If so, is this the main reason to distinguish between these two forms of ‘darkness’? 
I ask because I don’t normally read about dark energy being ‘composed of particles’ in the way dark matter is discussed phenomenologically.”</i></blockquote><div style="text-align: right;">-- CGT </div><br />Dear Pete, CGT:<br /><br /><div class="separator" style="clear: both; text-align: center;"></div><div class="separator" style="clear: both; text-align: center;"><a href="http://www.prime-spot.de/Bilder/BR/lambda_large.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" src="http://www.prime-spot.de/Bilder/BR/lambda_large.jpg" height="198" width="200" /></a></div>Dark energy is often portrayed as very mysterious. But when you look at the math, it’s really the simplest aspect of general relativity. <br /><br />Before I start, allow me to clarify that your questions refer to “dark energy” but are specifically about the cosmological constant, which is a certain type of dark energy. For all we know, the cosmological constant fits all existing observations. Dark energy could be more complicated than that, but let’s start with the cosmological constant.<br /><br />Einstein’s field equations can be derived from very few assumptions. First, there’s the equivalence principle, which can be formulated mathematically as the requirement that the equations be tensor-equations. Second, the equations should describe the curvature of space-time. Third, the source of gravity is the stress-energy tensor and it’s locally conserved. <br /><br />If you write down the simplest equations which fulfill these criteria you get Einstein’s field equations with two free constants. One constant can be fixed by deriving the Newtonian limit and it turns out to be Newton’s constant, G. The other constant is the cosmological constant, usually denoted Λ. 
You can make the equations more complicated by adding higher order terms, but at low energies these two constants are the only relevant ones.<br /><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"><a href="http://www.math.fsu.edu/~dli/einsteinfieldeq.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="http://www.math.fsu.edu/~dli/einsteinfieldeq.jpg" height="158" width="400" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">Einstein's field equations. [<a href="http://www.math.fsu.edu/~dli/einsteinfieldeq.jpg">Image Source</a>]</td></tr></tbody></table>If the cosmological constant is not zero, then flat space-time is no longer a solution of the equations. If the constant is positive-valued in particular, space will undergo accelerated expansion if there are no other matter sources, or these are negligible in comparison to Λ. Our universe presently seems to be in a phase that is dominated by a positive cosmological constant – that’s the easiest way to explain the observations which were awarded the <a href="https://www.nobelprize.org/nobel_prizes/physics/laureates/2011/">2011 Nobel Prize in physics</a>.<br /><br />Things get difficult if one tries to find an interpretation of the rather unambiguous mathematics. You can for example take the term with the cosmological constant and not think of it as geometrical, but instead move it to the other side of the equation and think of it as some stuff that causes curvature. If you do that, you might be tempted to read the entries of the cosmological constant term as if it was a kind of fluid. It would then correspond to a fluid with constant density and with constant, negative pressure. That’s something one can write down. But does this interpretation make any sense? I don’t know. 
There isn’t any known fluid with such behavior.<br /><br />Since the cosmological constant is also present if matter sources are absent, it can be interpreted as the energy-density and pressure of the vacuum. Indeed, one can calculate such a term in quantum field theory, except that the result is infamously 120 orders of magnitude too large. But that’s a different story and shall be told another time. The cosmological constant term is therefore often referred to as the “vacuum energy,” but that’s sloppy. It’s an energy-density, not an energy, and that’s an important difference. <br /><br />How can it possibly be that an energy density remains constant as the universe expands, you ask. Doesn’t this mean you need to create more energy from somewhere? No, you don’t need to create anything. This is a confusion which comes about because you interpret the density which has been assigned to the cosmological constant as a density of matter, but that’s not what it is. If it was some kind of stuff we know, then, yes, you would expect the density to dilute as space expands. But the cosmological constant is a property of space-time itself. As space expands, there’s more space, and that space still has the same vacuum energy density – it’s constant! <br /><br />The cosmological constant term is indeed conserved in general relativity, and it’s conserved separately from the other energy and matter sources. It’s just that conservation of stress-energy in general relativity works differently than you might be used to from flat space. <br /><br />According to Noether’s theorem there’s a conserved quantity for every (continuous) symmetry. A flat space-time is the same at every place and at every moment of time. We say it has a translational invariance in space and time. These are symmetries, and they come with conserved quantities: Translational invariance of space conserves momentum, translational invariance in time conserves energy. 
<br /><br />In a curved space-time, generically, neither symmetry is fulfilled, hence neither energy nor momentum are conserved. So, if you take the vacuum energy density and you integrate it over some volume to get an energy, then the total energy does indeed grow with the volume. It’s just not conserved. How strange! But that makes perfect sense: It’s not conserved because space expands and hence we have no invariance in time. Consequently, there’s no conserved quantity for invariance in time. <br /><br />But general relativity has a more complicated type of symmetry to which Noether’s theorem can be applied. This gives rise to a local conservation of stress-energy when coupled to gravity (the stress-energy tensor is covariantly conserved). <br /><br />The conservation law for the density of a pressureless fluid, for example, works as you expect it to work: As space expands, the density goes down with the volume. For radiation – which has pressure – the energy density falls faster than that of matter because wavelengths also redshift. And if you put the cosmological constant term with its negative pressure into the conservation law, both energy density and pressure remain the same. It’s all consistent: They are conserved if they are constant. <br /><br />Dark energy now is a generalization of the cosmological constant, in which one invents some fields which give rise to a similar term. There are various fields that theoretical physicists have played with: chameleon fields and phantom fields and quintessence and such. The difference from the cosmological constant is that these fields’ densities do change with time, albeit slowly. There is however presently no evidence that this is the case. <br /><br />As to the question of which dark stuff to include in which term: Dark matter is usually assumed to be pressureless, which means that as far as its gravitational pull is concerned it behaves just like normal matter. 
Dark energy, in contrast, has negative pressure and does odd things. That’s why they are usually collected in different terms. <br /><br />Why don’t you normally read about dark energy being made of particles? Because you need some really strange stuff to get something that behaves like dark energy. You can’t make it out of any kind of particle that we know – this would either give you a matter term or a radiation term, neither of which does what dark energy needs to do. <br /><br />If dark energy was some kind of field, or some kind of condensate, then it would be made of something else. In that case its density might indeed also vary from one place to the next and we might be able to detect the presence of that field in some way. Again though, there isn’t presently any evidence for that.<br /><br />Thanks for your interesting questions! Sabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com71http://backreaction.blogspot.com/2016/10/dear-dr-b-where-does-dark-energy-come.htmltag:blogger.com,1999:blog-22973357.post-47976940017796910922016-10-12T04:06:00.004-04:002016-10-12T06:41:20.630-04:00What if dark matter is not a particle? The second wind of modified gravity.Another year has passed and Vera Rubin was not awarded the Nobel Prize. She’s 88 and the prize can’t be awarded posthumously, so I can’t shake the impression the Royal Academy is waiting for her to die while they work off a backlog of condensed-matter breakthroughs. <br /><br />Sure, nobody knows whether galaxies actually contain the weakly interacting and non-luminous particles we have come to call dark matter. And Fritz Zwicky was first to notice a cluster of galaxies which moved faster than the visible mass alone could account for – and the one to coin the term dark matter. 
But it was Rubin who pinned down the evidence that galaxies systematically misbehave, by showing that the rotational velocities of spiral galaxies don’t decline with distance from the galactic center – as if there was unseen extra mass in the galaxies. And Zwicky is dead anyway, so the Nobel committee doesn’t have to worry about him. <br /><br />After Rubin’s discovery, many other observations confirmed that we were missing matter, and not only a little bit, but 80% of all matter in the universe. It’s there, but it’s not some stuff that we know. The fluctuations in the cosmic microwave background, gravitational lensing, the formation of large-scale structures in the universe – none of these would fit with the predictions of general relativity if there wasn’t additional matter to curve space-time. And if you go through all the particles in the standard model, none of them fits the bill. They’re either too light or too heavy or too strongly interacting or too unstable. <br /><br />But once physicists had the standard model, every problem began to look like a particle, and so, beginning in the mid-1980s, dozens of experiments started to search for dark matter particles. So far, they haven’t found anything. No WIMPs, no axions, no wimpzillas, neutralinos, sterile neutrinos, or other things that would be good candidates for the missing matter.<br /><br />This might not mean much. It might mean merely that the dark matter particles are even more weakly interacting than expected. It might mean that the particle types we’ve dealt with so far were too simple. Or maybe it means dark matter isn’t made of particles.<br /><br />It’s an old idea, though one that never rose to popularity, that rather than adding new sources for gravity we could instead keep the known sources but modify the way they gravitate. And the more time passes without a dark matter particle caught in a detector, the more appealing this alternative starts to become. 
Maybe gravity doesn’t work the way Einstein taught us. <br /><br />Modified gravity had an unfortunate start because its best known variant – Modified Newtonian Dynamics or MOND – is extremely unappealing from a theoretical point of view. It’s in contradiction with general relativity and that makes it a non-starter for most theorists. Meanwhile, however, there are variants of modified gravity which are compatible with general relativity. <br /><br />The benefit of modifying gravity is that it offers an explanation for observations that particle dark matter has nothing to say about: Many galaxies show regularities in the way their stars’ motion is affected by dark matter. Clouds of dark particles collecting in halos around galaxies can be flexibly adapted to match the observations of each individual galaxy. But precisely because dark matter particles are so flexible, it’s difficult for them to reproduce regularities. <br /><br />The best known of these regularities is the Tully-Fisher relation, a correlation between the luminosity of a galaxy and the velocity of the outermost stars. Nobody has succeeded in explaining this with particle dark matter, but modified gravity can explain it. <br /><a href="https://arxiv.org/abs/1609.05917"><br /></a><a href="https://arxiv.org/abs/1609.05917">In a recent paper</a>, a group of researchers from the United States offers a neat new way to quantify these regularities. They compare the gravitational acceleration that must be acting on stars in galaxies as inferred from observation (g<sub><span style="font-size: xx-small;">obs</span></sub>) with the gravitational acceleration due to the observed stars and gas, i.e., baryonic matter (g<sub><span style="font-size: xx-small;">bar</span></sub>). As expected, the observed gravitational acceleration is much larger than the visible mass alone can account for. The two are also, however, strongly correlated with each other (see figure below). It’s difficult to see how particle dark matter could cause this. 
(Though I would like to see how this plot looks for a ΛCDM simulation. I would still expect some correlation and would prefer not to judge its strength by gut feeling.)<br /><br /><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-6wlhiNHgt5w/V_3sJnQR6XI/AAAAAAAADPk/Gy75kH79fO8i9BIB-axNI8G8ckQU20DSgCLcB/s1600/tf.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="306" src="https://3.bp.blogspot.com/-6wlhiNHgt5w/V_3sJnQR6XI/AAAAAAAADPk/Gy75kH79fO8i9BIB-axNI8G8ckQU20DSgCLcB/s320/tf.jpg" width="320" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">Figure from arXiv:<a href="https://arxiv.org/abs/1609.05917">1609.05917 [astro-ph.GA]</a> </td></tr></tbody></table><br /><br />This isn’t so much new evidence as an improved way to quantify existing evidence for regularities in spiral galaxies. Lee Smolin, always quick on his feet, <a href="https://arxiv.org/abs/1610.01968">thinks he can explain this correlation</a> with quantum gravity. I don’t quite share his optimism, but it’s arguably intriguing.<br /><br />Modifying gravity however has its shortcomings. While it seems to work reasonably well on the level of galaxies, it’s hard to make it work for galaxy clusters too. Observations for example of the Bullet cluster (image below) seem to show that the visible mass can be at a different place than the gravitating mass. 
That’s straightforward to explain with particle dark matter but difficult to make sense of with modified gravity.<br /><br /><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"><a href="https://apod.nasa.gov/apod/image/0608/bulletcluster_comp_f2048.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="231" src="https://apod.nasa.gov/apod/image/0608/bulletcluster_comp_f2048.jpg" width="320" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">The bullet cluster. <br />In red: estimated distribution of baryonic mass. <br />In blue: estimated distribution of gravitating mass, extracted from gravitational lensing. <br />Source: <a href="https://apod.nasa.gov/apod/ap060824.html">APOD</a>.</td></tr></tbody></table><br />The explanation I presently find most appealing is that <a href="http://aeon.co/essays/is-dark-matter-subatomic-particles-a-superfluid-or-both">dark matter is a type of particle whose dynamical equations sometimes mimic those of modified gravity</a>. This option, pursued by, among others, Stefano Liberati and Justin Khoury, combines the benefits of both approaches without the disadvantages of either. There is, however, a lot of data in cosmology and it will take a long time to find out whether this idea can fit the observations as well as – or better than – particle dark matter.<br /><br />But regardless of what dark matter turns out to be, Rubin’s observations have given rise to one of the most active research areas in physics today. I hope that the Royal Academy eventually wakes up and honors her achievement. 
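The kind of comparison made in the McGaugh <i>et al</i> paper can be sketched with a toy calculation – made-up numbers for illustration only, not the actual data from the paper: take a flat rotation curve, infer g<sub>obs</sub> = v²/r, and compare it with the acceleration g<sub>bar</sub> produced by a hypothetical baryonic mass model.

```python
import numpy as np

# Toy illustration (made-up numbers, not the data from the paper):
# compare the acceleration inferred from a flat rotation curve, g_obs = v^2/r,
# with the acceleration expected from the visible (baryonic) mass alone.
G = 4.30e-6  # Newton's constant in kpc * (km/s)^2 / solar mass

r = np.linspace(1.0, 30.0, 50)           # galactocentric radius in kpc
v_obs = np.full_like(r, 180.0)           # flat rotation curve in km/s
M_bar = 5e10 * (1.0 - np.exp(-r / 3.0))  # enclosed baryonic mass in solar masses

g_obs = v_obs**2 / r      # observed centripetal acceleration
g_bar = G * M_bar / r**2  # acceleration from visible mass alone

# in the outer galaxy the observed acceleration far exceeds the baryonic one:
# the "missing mass" that Rubin's rotation curves revealed
assert g_obs[-1] > 3 * g_bar[-1]
```

With a real galaxy sample one would plot log g<sub>obs</sub> against log g<sub>bar</sub> point by point, which is the correlation shown in the figures above.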
Sabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com88http://backreaction.blogspot.com/2016/10/what-if-dark-matter-is-not-particle.htmltag:blogger.com,1999:blog-22973357.post-87279385948975920642016-10-05T07:41:00.001-04:002016-10-05T10:12:59.374-04:00Demystifying Spin 1/2Theoretical physics is the most math-heavy of disciplines. We don’t use all that math because we like to be intimidating, but because it’s the most useful and accurate description of nature we know. <br /><br />I am often asked to please explain this or that mathematical description in layman terms – and I try to do my best. But truth is, it’s not possible. The mathematical description <i>is</i> the explanation. The best I can do is to summarize the conclusions we have drawn from all that math. And this is pretty much how popular science accounts of theoretical physics work: By summarizing the consequences of lots of math. <br /><br />This, however, makes science communication in theoretical physics a victim of its own success. If readers get away thinking they can follow a verbal argument, they’re left to wonder why physicists use all that math to begin with. Sometimes I therefore wish articles reporting on recent progress in theoretical physics would on occasion have an asterisk that notes “It takes several years of lectures to understand how B follows from A.”<br /><br />One of the best examples for the power of math in theoretical physics – if not <i>the</i> best example to illustrate this – are spin 1/2 particles. They are usually introduced as particles that have to be rotated twice to return to the same initial state. I don’t know if anybody who didn’t know the math already has ever been able to make sense of this explanation – certainly not me when I was a teenager.<br /><br />But this isn’t the only thing you’ll stumble across if you don’t know the math. 
Your first question may be: Why have spin 1/2 to begin with?<br /><br />Well, one answer to this is that we need spin 1/2 particles to describe observations. Such particles are fermionic and therefore won’t occupy the same quantum state. (It takes several years of lectures to understand <a href="https://en.wikipedia.org/wiki/Spin%E2%80%93statistics_theorem">how B follows from A</a>.) This is why for example electrons – which have spin 1/2 – sit in shells around the atomic nucleus rather than clumping together. <br /><br />But a better answer is “Why not?” (Why not?, it turns out, is also a good answer to most why-questions that kindergartners come up with.)<br /><br />Mathematics allows you to classify everything a quantum state can do under rotations. If you do that you not only find particles that return to their initial state after 1, 1/2, 1/3 and so on of a rotation – corresponding to spin 1, 2, 3... etc – you also find particles that return to their initial state after 2, 2/3, 2/5 and so on of a rotation – corresponding to spin 1/2, 3/2, 5/2 etc. The spin, generally, is the inverse of the fraction of rotations necessary to return the particle to itself. The one exception is spin 0, which doesn’t change at all.<br /><br />So the math tells you that spin 1/2 is a thing, and it’s there in our theories already. It would be stranger if nature didn’t make use of it. <br /><br />But how come the math gives rise to such strange and non-intuitive particle behaviors? It comes from the way that rotations (or symmetry transformations more generally) act on quantum states, which is different from how they act on non-quantum states. A symmetry transformation acting on a quantum state must be described by a unitary transformation – this is a transformation which, most importantly, ensures that probabilities always add up to one. And the full set of all symmetry transformations must be described by a “unitary representation” of the group. 
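What such a unitary representation does can be made concrete in the simplest case – a standard textbook computation, not spelled out in this post. Rotating a spin-1/2 state about the z-axis by an angle θ is represented by the 2×2 unitary matrix exp(−iθσ<sub>z</sub>/2), and this matrix returns to the identity only after a 4π rotation:

```python
import numpy as np

def Rz(theta):
    # spin-1/2 rotation about the z-axis: exp(-i * theta * sigma_z / 2)
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

# the representation is unitary, so probabilities always add up to one
U = Rz(0.7)
assert np.allclose(U @ U.conj().T, np.eye(2))

# one full turn flips the sign of the state ...
assert np.allclose(Rz(2 * np.pi), -np.eye(2))
# ... and only two full turns bring it back to where it started
assert np.allclose(Rz(4 * np.pi), np.eye(2))
```

The overall sign flip after 2π doesn’t change measurement probabilities for an isolated particle, but it becomes observable in interference experiments.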
<br /><br />Symmetry groups, however, can be difficult to handle, and so physicists prefer to instead work with the algebra associated to the group. The algebra can be used to build up the group, much like you can build up a grid from right-left steps and forwards-backwards steps, repeated sufficiently often. But here’s where things get interesting: If you use the algebra of the rotation group to describe how particles transform, you don’t get back merely the rotation group. Instead you get what’s called a “double cover” of the rotation group. It means – guess! – you have to turn the state around twice to get back to the initial state.<br /><br />I’ve been racking my brain trying to find a good metaphor for “double-cover” to use in the-damned-book I’m writing. Last year, I came across the perfect illustration in real life when we took the kids to a Christmas market. Here it is:<br /><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://1.bp.blogspot.com/-Yi31rWW4rG4/V_Tj98Lb_UI/AAAAAAAADO4/9YhuGUncw7oYlFio5uNPB6fBV2Lvtor5QCLcB/s1600/16928925114_c073aa9b44_b.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="225" src="https://1.bp.blogspot.com/-Yi31rWW4rG4/V_Tj98Lb_UI/AAAAAAAADO4/9YhuGUncw7oYlFio5uNPB6fBV2Lvtor5QCLcB/s400/16928925114_c073aa9b44_b.jpg" width="400" /></a></div><br /><br />I made a sketch of this for my book:<br /><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://1.bp.blogspot.com/--JJXWaQdpBU/V_TkGxi8NcI/AAAAAAAADO8/I-DVx7VU998l2FecwDaInT8dcW-qmpG3wCLcB/s1600/fw04.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="84" src="https://1.bp.blogspot.com/--JJXWaQdpBU/V_TkGxi8NcI/AAAAAAAADO8/I-DVx7VU998l2FecwDaInT8dcW-qmpG3wCLcB/s320/fw04.jpg" width="320" /></a></div><br /><br />The little trolley has to make two full rotations to get back to the starting point. 
And that’s pretty much how the double-cover of the rotation group gives rise to particles with spin 1/2. Though you might have to wrap your head around it twice to understand how it works.<br /><br />I later decided not to use this illustration in favor of one easier to generalize to higher spin. But you’ll have to buy the-damned-book to see how this works :pSabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com66http://backreaction.blogspot.com/2016/10/demystifying-spin-12.htmltag:blogger.com,1999:blog-22973357.post-25268474029130229592016-09-27T03:16:00.000-04:002016-09-27T03:23:42.482-04:00Dear Dr B: What do physicists mean by “quantum gravity”?<blockquote><table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: right; margin-left: 1em; text-align: right;"><tbody><tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-OGSN4JXrdd0/V-obXLLTaoI/AAAAAAAADOI/LHCQD4aizlob943GnrwBgwGS-DF2qAgCQCLcB/s1600/giphy.gif" imageanchor="1" style="clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" height="150" src="https://4.bp.blogspot.com/-OGSN4JXrdd0/V-obXLLTaoI/AAAAAAAADOI/LHCQD4aizlob943GnrwBgwGS-DF2qAgCQCLcB/s200/giphy.gif" width="200" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">[Image Source: <a href="https://giphy.com/gifs/hu-ease-sto-FzgSikew2N0c">giphy.com</a>]</td></tr></tbody></table><i>“please could you give me a simple definition of "quantum gravity"?</i><br /><i><br /></i><i>J.”</i></blockquote><br />Dear J,<p>By “quantum gravity,” physicists refer not so much to a specific theory as to the sought-after solution to various problems in the established theories. The most pressing problem is that the standard model combined with general relativity is internally inconsistent. If we just use both as they are, we arrive at conclusions which do not agree with each other. So just throwing them together doesn’t work. 
Something else is needed, and that something else is what we call quantum gravity. </p>Unfortunately, the effects of quantum gravity are very small, so presently we have no observations to guide theory development. In all experiments made so far, it’s sufficient to use unquantized gravity.<br /><br />Nobody knows how to combine a quantum theory – like the standard model – with a non-quantum theory – like general relativity – without running into difficulties (<a href="https://arxiv.org/abs/1208.5874">except for me, but nobody listens</a>). Therefore the main strategy has become to find a way to give quantum properties to gravity. Or, since Einstein taught us gravity is nothing but the curvature of space-time, to give quantum properties to space and time. <br /><br />Just combining quantum field theory with general relativity doesn’t work because, as confirmed by countless experiments, all the particles we know have quantum properties. This means (among many other things) they are subject to Heisenberg’s uncertainty principle and can be in quantum superpositions. But they also carry energy and hence should create a gravitational field. In general relativity, however, the gravitational field can’t be in a quantum superposition, so it can’t be directly attached to the particles, as it should be. <br /><br />One can try to find a solution to this conundrum, for example by not directly coupling the energy (and related quantities like mass, pressure, momentum flux and so on) to gravity, but instead only coupling the average value, which behaves more like a classical field. This solves one problem, but creates a new one. The average value of a quantum state must be updated upon measurement. This measurement postulate is a non-local prescription and general relativity can’t deal with it – after all Einstein invented general relativity to get rid of the non-locality of Newtonian gravity. 
(Neither decoherence nor many worlds removes the problem; you still have to update the probabilities, somehow, somewhere.)<br /><br />The quantum field theories of the standard model and general relativity clash in other ways. If we try to understand the evaporation of black holes, for example, we run into another inconsistency: Black holes emit Hawking-radiation due to quantum effects of the matter fields. This radiation doesn’t carry information about what formed the black hole. And so, if the black hole entirely evaporates, this results in an irreversible process because from the end-state one can’t infer the initial state. This evaporation, however, can’t be accommodated in a quantum theory, where all processes can be time-reversed – it’s another contradiction that we hope quantum gravity will resolve.<br /><br />Then there is the problem with the singularities in general relativity. Singularities, where the space-time curvature becomes infinitely large, are not mathematical inconsistencies. But they are believed to be physical nonsense. Using dimensional analysis, one can estimate that the effects of quantum gravity should become large close to the singularities. And so we think that quantum gravity should replace the singularities with a better-behaved quantum space-time.<br /><br />The sought-after theory of quantum gravity is expected to solve these three problems: tell us how to couple quantum matter to gravity, explain what happens to information that falls into a black hole, and avoid singularities in general relativity. Any theory which achieves this we’d call quantum gravity, whether or not you actually get it by quantizing gravity. <br /><br />Physicists are presently pursuing various approaches to a theory of quantum gravity, notably string theory, loop quantum gravity, asymptotically safe gravity, and causal dynamical triangulation, to name just the most popular ones. But none of these approaches has experimental evidence speaking for it. 
Indeed, so far none of them has made a testable prediction. <br /><br />This is why, in the area of quantum gravity phenomenology, we’re bridging the gap between theory and experiment with simplified models, some of which are motivated by specific approaches (hence: string phenomenology, loop quantum cosmology, and so on). These phenomenological models don’t aim to directly solve the above-mentioned problems; they merely provide a mathematical framework – consistent in its range of applicability – to quantify and hence test the presence of effects that could be signals of quantum gravity, for example space-time fluctuations, violations of the equivalence principle, deviations from general relativity, and so on.<br /><br />Thanks for an interesting question! Sabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com26http://backreaction.blogspot.com/2016/09/dear-dr-b-what-do-physicists-mean-by.htmltag:blogger.com,1999:blog-22973357.post-6386494097743599312016-09-21T01:51:00.001-04:002016-09-21T06:52:07.351-04:00We understand gravity just fine, thank you.<a href="https://3.bp.blogspot.com/-lC3ptosIeJo/V-IeQ5RMeRI/AAAAAAAADNg/64OL3w1Xfhs5TC97rY5Nr7jxCOfXEsNeQCLcB/s1600/hif01.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" height="150" src="https://3.bp.blogspot.com/-lC3ptosIeJo/V-IeQ5RMeRI/AAAAAAAADNg/64OL3w1Xfhs5TC97rY5Nr7jxCOfXEsNeQCLcB/s200/hif01.jpg" width="200" /></a>Yesterday I came across a Q&A on the website of Discover magazine, titled <a href="http://discovermagazine.com/2016/oct/the-root-of-gravity">“The Root of Gravity - Does recent research bring us any closer to understanding it?”</a> Jeff Lepler from Michigan has the following question: <br /><blockquote>“<b>Q</b>: Are we any closer to understanding the root cause of gravity between objects with mass? 
Can we use our newly discovered knowledge of the Higgs boson or gravitational waves to perhaps negate mass or create/negate gravity?”</blockquote>A person by the name of Bill Andrews (unknown to me) gives the following answer: <br /><blockquote>“<b>A</b>: Sorry, Jeff, but scientists still don’t really know why gravity works. In a way, they’ve just barely figured out how it works.”</blockquote>The answer continues, but let’s stop right there where the nonsense begins. What does it even mean that scientists don’t know “why” gravity works? And did the Bill person really think he could get away with swapping “why” for a “how” and nobody would notice? <p>The purpose of science is to explain observations. We have a theory by the name of General Relativity that explains literally all data of gravitational effects. Indeed, that General Relativity is so dramatically successful is a great frustration for all those people who would like to revolutionize science a la Einstein. So in which sense, please, do scientists barely know how it works?</p>For all we can presently tell, gravity is a fundamental force, which means we have no evidence for an underlying theory from which gravity could be derived. Sure, theoretical physicists are investigating whether there is such an underlying theory that would give rise to gravity as well as the other interactions, a “theory of everything”. (Please submit nomenclature complaints to your local language police, not to me.) Would such a theory of everything explain “why” gravity works? No, because that’s not a meaningful scientific question. A theory of everything could potentially explain how gravity can arise from more fundamental principles, similar to the way, say, the ideal gas law arises from statistical properties of many atoms in motion. But that still wouldn’t explain why there should be something like gravity, or anything, in the first place. 
<br /><br />Either way, even if gravity arises within a larger framework like, say, string theory, the effects of what we call gravity today would still come about because energy-densities (and related quantities like pressure and momentum flux and so on) curve space-time, and fields move in that space-time. It’s just that these quantities might no longer be fundamental. We’ve known how this works for 101 years.<br /><br />After a few words on Newtonian gravity, the answer continues: <br /><blockquote>“Because the other forces use “force carrier particles” to impart the force onto other particles, for gravity to fit the model, all matter must emit gravitons, which physically embody gravity. Note, however, that gravitons are still theoretical. Trying to reconcile these different interpretations of gravity, and understand its true nature, are among the biggest unsolved problems of physics.”</blockquote>Reconciling which different interpretations of gravity? These are all the same “interpretation.” It is correct that we don’t know how to quantize gravity so that the resulting theory remains viable also when gravity becomes strong. It’s also correct that the force-carrying particle associated with the quantization – the graviton – hasn’t been detected. But the question was about gravity, not quantum gravity. Reconciling the graviton with unquantized gravity is straightforward – it’s called perturbative quantum gravity – and exactly the reason most theoretical physicists are convinced the graviton exists. It’s just that this reconciliation breaks down when gravity becomes strong, which means it’s only an approximation.<br /><blockquote>“But, alas, what we do know does suggest antigravity is impossible.”</blockquote>That’s correct on a superficial level, but it depends on what you mean by antigravity. If you mean by antigravity that you can let any of the matter which surrounds us “fall up,” it’s correct. 
But there are modifications of general relativity that have effects one can plausibly call anti-gravitational. That’s a longer story though and shall be told another time. <br /><br />A sensible answer to this question would have been:<br /><blockquote>“Dear Jeff,<br /><br />The recent detection of gravitational waves has been another confirmation of Einstein’s theory of General Relativity, which still explains all the gravitational effects that physicists know of. According to General Relativity the root cause of gravity is that all types of energy curve space-time and all matter moves in this curved space-time. Near planets, such as our own, this can be approximated to good accuracy by Newtonian gravity. <br /><br />There isn’t presently any observation which suggests that gravity itself emerges from another theory, though it is certainly a speculation that many theoretical physicists have pursued. There thus isn’t any deeper root for gravity because it’s presently part of the foundations of physics. The foundations are the roots of everything else.<br /><br />The discovery of the Higgs boson doesn’t tell us anything about the gravitational interaction. The Higgs boson is merely there to make sure particles have mass in addition to energy, but gravity works the same either way. The detection of gravitational waves is exciting because it allows us to learn a lot about the astrophysical sources of these waves. But the waves themselves have proved to be as expected from General Relativity, so from the perspective of fundamental physics they didn’t bring news.<br /><br />Within the incredibly well confirmed framework of General Relativity, you cannot negate mass or its gravitational pull. 
</blockquote><blockquote>You might also enjoy hearing what Richard Feynman had to say when he was asked a similar question about the origin of the magnetic force:</blockquote><center><iframe allowfullscreen="" frameborder="0" height="253" src="https://www.youtube.com/embed/MO0r930Sn_8?rel=0" width="450"></iframe></center><br /><br />This answer really annoyed me because it’s a lost opportunity to explain how well physicists understand the fundamental laws of nature.Sabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com59http://backreaction.blogspot.com/2016/09/we-understand-gravity-just-fine-thank.htmltag:blogger.com,1999:blog-22973357.post-11660058631863533922016-09-15T06:31:00.002-04:002016-09-19T06:41:32.346-04:00Experimental Search for Quantum Gravity 2016<a href="https://2.bp.blogspot.com/-ZauqrR_2c5U/V0b0Quj_voI/AAAAAAAADFY/0O6bruG5o0EIPb4xcK_HjSDgyaj7_8AEQCLcB/w1200-h630-p-nu/qg_lens.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" height="105" src="https://2.bp.blogspot.com/-ZauqrR_2c5U/V0b0Quj_voI/AAAAAAAADFY/0O6bruG5o0EIPb4xcK_HjSDgyaj7_8AEQCLcB/w1200-h630-p-nu/qg_lens.jpg" width="200" /></a>Research in quantum gravity is quite a challenge since we neither have a theory nor data. But some of us like a challenge.<br /><br />So far, most effort in the field has gone into using requirements of mathematical consistency to construct a theory. It is impossible of course to construct a theory based on mathematical consistency alone, because we can never prove our assumptions to be true. All we know is that the assumptions give rise to good predictions in the regime where we’ve tested them. Without assumptions, no proof. 
Still, you may hope that mathematical consistency tells you where to look for observational evidence.<br /><br />But in the second half of the 20th century, theorists used the weakness of gravity as an excuse to not think about how to experimentally test quantum gravity at all. This isn’t merely a sign of laziness, it’s a regression to the days when philosophers believed they could find out how nature works by introspection. Just that now many theoretical physicists believe mathematical introspection is science. Particularly disturbing to me is how frequently I speak with students or young postdocs who have never even given thought to the question of what makes a theory scientific. That’s one of the reasons the disconnect between physics and philosophy worries me. <br /><br />In any case, the cure clearly isn’t more philosophy, but more phenomenology. The effects of quantum gravity aren’t necessarily entirely out of experimental reach. Gravity isn’t generally a weak force, not in the same way that, for example, the weak nuclear force is weak. That’s because the effects of gravity get stronger with the amount of mass (or energy) that exerts the force. Indeed, this property of the gravitational force is the very reason why it’s so hard to quantize. <p>Quantum gravitational effects hence were strong in the early universe, they are strong inside black holes, and they can be non-negligible for massive objects that have pronounced quantum properties. Furthermore, the theory of quantum gravity can be expected to give rise to deviations from general relativity or the symmetries of the standard model, which can have consequences that are observable even at low energies.</p>The often-repeated argument that we’d need to reach enormously high energies – close to the Planck energy, 16 orders of magnitude higher than LHC energies – is simply wrong. 
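For the record, the number in that often-repeated argument comes from simple dimensional analysis: the Planck energy is √(ħc⁵/G) ≈ 1.2 × 10¹⁹ GeV, and a typical parton collision at the LHC involves about a TeV. A back-of-the-envelope check (the constants are CODATA values; taking 1 TeV as the LHC reference scale is my choice):

```python
import math

# Planck energy from dimensional analysis: E_Pl = sqrt(hbar * c^5 / G)
hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # Newton's constant, m^3 kg^-1 s^-2
joule_per_gev = 1.602176634e-10

e_planck_gev = math.sqrt(hbar * c**5 / G) / joule_per_gev
e_lhc_gev = 1e3  # ~1 TeV, a typical parton collision energy at the LHC

print(f"Planck energy: {e_planck_gev:.2e} GeV")   # ~1.22e19 GeV
print(f"orders of magnitude above a TeV: {math.log10(e_planck_gev / e_lhc_gev):.0f}")
```

The point of the post stands either way: the size of this ratio says nothing about whether indirect, long-distance consequences of Planck-scale physics are observable.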
Physics is full of examples of short-distance phenomena that give rise to effects at longer distances, such as atoms causing Brownian motion, or quantum electrodynamics allowing stable atoms to begin with.<br /><br /><div class="separator" style="clear: both; text-align: center;"></div>I have spent the last 10 years or so studying the prospects to find experimental evidence for quantum gravity. Absent a fully-developed theory, we work with models to quantify effects that could be signals of quantum gravity, and aim to test these models with data. The development of such models is relevant to identify promising experiments to begin with.<br /><br />Next week, <a href="https://indico.fias.uni-frankfurt.de/event/2/">we will hold the 5th international conference on Experimental Search for Quantum Gravity, here in Frankfurt</a>. And I dare to say we have managed to pull together an awesome selection of talks.<br /><br />We’ll hear about the prospects of finding evidence for quantum gravity in the CMB (Bianchi, Krauss, Vennin) and in quantum oscillators (Paternostro). We have a lecture about the interface between gravity and quantum physics, both on long and short distances (Fuentes), and a talk on how to look for moduli and axion fields that are generic consequences of string theory (Conlon). Of course we’ll also cover Loop Quantum Cosmology (Barrau), asymptotically safe gravity (Eichhorn), and causal sets (Glaser). We’re super up-to-date by having a talk about constraints from the LIGO gravitational-wave measurements on deviations from general relativity (Yunes), and several of the usual suspects speaking about deviations from Lorentz-invariance (Mattingly), Planck stars (Rovelli, Vidotto), vacuum dispersion (Giovanni), and dimensional reduction (Magueijo). 
There’s neutrino physics (Paes), a talk about what the cosmological constant can tell us about new physics (Afshordi), and, and, and!<br /><br />You can download the abstracts <a href="http://sabinehossenfelder.com/Physics/ESQG_abstracts.pdf">here</a> and the timetable <a href="http://sabinehossenfelder.com/Physics/esqg_schedule.pdf">here</a>.<br /><br />But the best part is that I’m not telling you this to depress you because you can’t be with us, but because our IT guys tell me we’ll both record the talks and livestream them (to the extent that the speakers consent, of course). I’ll share the URL with you here once everything is set up, so stay tuned.<br><br><b>Update:</b> Streaming link will be posted on the <a href="http://fias.uni-frankfurt.de/events/giersch-international-symposion/">institute's main page</a> briefly before the event. <b>Another update</b>: <a href="https://vfs.fias.uni-frankfurt.de/cdn/pub/hls/gis2016.html">Livestream is available here</a>.Sabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com39http://backreaction.blogspot.com/2016/09/experimental-search-for-quantum-gravity.htmltag:blogger.com,1999:blog-22973357.post-9015527925881803332016-09-11T03:11:00.001-04:002016-09-11T05:23:15.209-04:00I’ve read a lot of books recently<a href="https://4.bp.blogspot.com/-ZISihGlyqps/V9UDGbrq5FI/AAAAAAAADM4/3bxOtTan6v0iqiscXFk0S6tRG5kudOEvQCLcB/s1600/lotsofbooks.gif" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" src="https://4.bp.blogspot.com/-ZISihGlyqps/V9UDGbrq5FI/AAAAAAAADM4/3bxOtTan6v0iqiscXFk0S6tRG5kudOEvQCLcB/s1600/lotsofbooks.gif" /></a>[Reading is to writing what eating is to...]<br /><hr /><b>Dreams Of A Final Theory: The Scientist's Search for the Ultimate Laws of Nature</b><br />Steven Weinberg <br />Vintage, Reprint Edition (1994)<br /><br />This book appeared when I was still in high school and I didn’t take note of it then. 
Later it seemed too out-of-date to bother, but meanwhile it’s almost become a historical document. Written with the pretty explicit aim to argue in favor of the Superconducting Supercollider (a US proposal for a large particle collider that was scrapped in the early 90s), it’s the most flawless popular science book about theoretical physics I’ve ever come across.<br /><br />Weinberg’s explanations are both comprehensible and remarkably accurate. The book contains no unnecessary clutter, is both well-structured and well written, and Weinberg doesn’t hold back with his opinions, neither on religion nor on philosophy.<br /><br />It’s also the first time I’ve tried an audio-book. I listened to it while treadmill running. A lot of sweat went into the first chapters. But I gave up halfway through and bought the paperback, which I read on the plane to Austin. Weinberg is one of the people I interviewed for my book.<br /><br />Lesson learned: Audiobooks aren’t for me.<br /><hr /><b>Truth And Beauty – Aesthetics and Motivations in Science</b><br />Subrahmanyan Chandrasekhar<br />University of Chicago Press (1987)<br /><br />I had read this book before but wanted to remind myself of its content. It’s a collection of essays on the role of beauty in physics, mostly focused on general relativity and the early 20th century. Using historical examples like Milne, Eddington, Weyl, and Einstein, Chandrasekhar discusses various aspects of beauty, like elegance, simplicity, or harmony. I find it too bad that Chandrasekhar didn’t bring in more of his own opinion but mostly summarizes other people’s thoughts. <br /><br />Lesson learned: Tell the reader what you think.<br /><hr /><b>Truth or Beauty – Science and the Quest for Order</b><br />David Orrell<br />Yale University Press (2012) <br /><br />In this book, mathematician David Orrell argues that beauty isn’t a good guide to truth. 
It’s an engagingly written book which covers a lot of ground, primarily in physics, from heliocentrism to string theory. But Orrell tries too hard to make everything fit his bad-beauty narrative. Many of his interpretations are over-the-top, like his complaint that<br /><blockquote class="tr_bq"> “[T]he aesthetics of science – and particularly the “hard” sciences such as physics – have been characterized by a distinctly male feel. For example, feminist psychologists have noted that the classical picture of the atom as hard, indivisible, independent, separate, and so on corresponds very closely to the stereotypically masculine sense of self. It must have come as a shock to the young, male, champions of quantum theory when they discovered that their equations describing the atom were actually soft, fuzzy, and uncertain – in other words, stereotypically female.”</blockquote>He further notes that many male physicists like to refer to nature as “she,” that Gell-Mann likes the idea of using particle accelerators to penetrate deeper (into the structure of particles), and quotes Lee Smolin’s remark that “the most cherished goal in physics, as in bad romance novels, is unification.” This is just to illustrate the, erm, depth of Orrell’s arguments.<br /><br />In summary, it’s a nice book, but it’s hard to take Orrell’s argument seriously. Or maybe the whole thing was a joke to begin with.<br /><br />Lesson learned: Don’t try to explain everything.<br /><hr /><b>The End Of Physics - The Myth Of A Unified Theory</b><br />David Lindley<br />Basic Books (1994)<br /><br />This is a strange book. While reading, I got the impression that the author is constantly complaining about something, but it didn’t become clear to me what. Lindley tells the story of how physicists discovered increasingly more fundamental and also more unified laws of nature, and how they are hoping to finally develop a theory of everything. This, so he writes, would be the end of physics. 
Just that, as he explains in the next sentence, it of course wouldn’t be the end of physics. <br /><br />Lindley likes words and likes to use a lot of them. Consequently the book reads like he wanted to cram in the whole history of physics, from the beginning to the end, with him having the last word.<br /><br>His argument for why a theory of everything would remain a “myth” is essentially that it would be hard to test, something that nobody can really disagree on. But “hard to test” doesn’t mean “impossible to test,” and Lindley is clearly out of his depth when it comes to evaluating experimental prospects of, say, probing quantum gravity, so he sticks with superficial polemics. Of course the book is 20 years old, and I can’t blame the author for not knowing what’s happened since, but from today’s perspective his rant seems baseless. <br /><br />In summary, it’s a well-written book, but it has a fuzzy message. (Also, the reprint quality is terrible.)<br /><br />Lesson learned: If you have something to say, say it.<br /><hr /><b>Why Beauty Is Truth – A History of Symmetry</b><br />Ian Stewart<br />Basic Books (2007)<br /><br />This is a book not about the physics but about the mathematics of symmetries: symmetry groups, Lie groups, Lie algebras, quaternions, global symmetries, local symmetries, and all that. Stewart also discusses the relevance of these structures for physics, but his emphasis is on it being an application of mathematics. The book is held together by stories of the mathematicians who led the way. The title of the book is somewhat misleading. Stewart actually doesn’t much discuss the question of “why” beauty is truth. He merely demonstrates through examples that many truths are beautiful.<br /><br />It’s a pretty good book, both interesting and well-written, if somewhat too long for my taste. 
It doesn’t seem to have gotten the attention it deserves.<br /><br />Lesson learned: It’s hard to write a popular science book that anyone will still recall a decade later.<br /><hr /><b>Eyes On The Sky: A Spectrum of Telescopes</b><br />Francis Graham-Smith<br />Oxford University Press (2016)<br /><br />This is a book about telescopes, from then to now, from the radio regime to gamma rays. It’s not a book about astrophysics, it’s not a book about cosmology, and it’s not a book about history. It’s a book about telescopes. It is a thoroughly useful book, full of facts and figures and images, but you need to be really interested in telescopes to get through it. I read this book because I wanted to write a paragraph about the development of modern telescopes but figured I didn’t actually know much about modern telescopes. Now I’m much wiser.<br /><br />Lesson learned: If you need to read a 200-page book to write a single paragraph, you’ll never get done.<br /><hr /><b>Beauty and Revolution in Science</b><br />James McAllister<br />Cornell University Press (1999)<br /><br />Philosopher James McAllister reexamines the Kuhnian idea of paradigm changes. He proposes that it should be amended, and argues that what characterizes a revolution is not the change of the entire scientific paradigm, but merely the change of aesthetic ideals. To back up his argument, he discusses several historical cases. This is not a popular science book, and it’s not always the most engaging read, but I have found it to be very insightful. It is somewhat unfortunate though that he didn’t spend more time illuminating the social dynamics that go with the prevalence of beauty ideals in science.<br /><br />Lesson learned: Philosophy isn’t dead.<br /><hr /><b>Higher Speculations</b><br />Helge Kragh<br />Oxford University Press (2011)<br /><br />Kragh’s is a book about the failure of speculative ideas in physics. 
The steady state universe, mechanism, cyclic models of the universe, and various theories of everything are laid out in historical perspective. I have found this book both interesting and useful, but some parts are quite heavy reads. Kragh doesn’t offer an analysis or draw a lesson, and he mostly refrains from judgment. He simply tells the reader what happened.<br /><br />Lesson learned: Even smart people sometimes believe really strange things.<br /><hr /><b>Supersymmetry: Unveiling The Ultimate Laws Of Nature</b><br />Gordy Kane<br />Basic Books (2001)<br /><br />Particle physicist Gordon Kane explains why the supersymmetric extension of the standard model has become so popular and how it could be tested. Whether or not you are convinced by supersymmetry, you get to learn a lot about particle physics. It’s a straightforward pop-science book that does a good job explaining why theorists have spent so much time on supersymmetry.<br /><br />Lesson learned: You don’t need to write fancy to write well. <br /><hr /><b>Nature’s Blueprint - Supersymmetry and the Search for a Unified Theory of Matter and Force </b><br />Dan Hooper<br />Smithsonian (2008)<br /><br />A book about high energy particle physics, the standard model, unification and the appeal of supersymmetry. It’s a well-written book that gives the reader a pretty good idea how particle physicists work and think. Hooper does a great job getting across the excitement that comes with the hope of being about to discover a new fundamental law of nature. The book’s publication date was well timed, just before the LHC started taking data.<br /><br />Lesson learned: Your book might become history faster than you think. 
Sabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com32http://backreaction.blogspot.com/2016/09/ive-read-lot-of-books-recently.htmltag:blogger.com,1999:blog-22973357.post-48895668088133287062016-09-06T07:14:00.000-04:002016-09-06T09:47:47.843-04:00Sorry, the universe wasn’t made for you<p>Last month, game reviewers were all over <i>No Man’s Sky</i>, a new space adventure launched to much press attention. Unlike previous video games, this one calculates players’ environments from scratch rather than revealing hand-crafted landscapes and creatures. The calculations populate <i>No Man’s Sky’s</i> virtual universe with about 10<sup><span style="font-size: xx-small;">19</span></sup> planets, all with different flora and fauna – at least that’s what we’re told, not like anyone actually checked. That seems a ginormous number but is still fewer than there are planets in the actual universe, <a href="http://scienceblogs.com/startswithabang/2013/01/05/how-many-planets-are-in-the-universe/">estimated at roughly 10<sup><span style="font-size: xx-small;">24</span></sup></a>. </p><center><iframe allowfullscreen="" frameborder="0" height="253" src="https://www.youtube.com/embed/1jpHldhY_V0?rel=0" width="450"></iframe></center><br /><br />Users’ expectations of <i>No Man’s Sky</i> were high – and were highly disappointed. All the different planets, it turns out, still get a little repetitive with their limited set of options and features. It’s hard to code a universe as surprising as reality and run it on processors that occupy only a tiny fraction of that reality.<br /><br />Theoretical physicists, meanwhile, have the opposite problem: The fictive universes they calculate are more surprising than they’d like them to be. 
<br /><br />Having failed on their quest for a theory of everything, many theoretical physicists in the area of quantum gravity now accept that a unique theory can’t be derived from first principles. Instead, they believe, additional requirements must be used to select the theory that actually describes the universe we observe. That, of course, is what we’ve always done to develop theories – the additional requirement being empirical adequacy. <br /><br />The new twist is that many of these physicists think the missing observational input is the existence of life in our universe. I hope you just raised an eyebrow or two because physicists don’t normally have much business with “life.” And indeed, they usually only speak about preconditions of life, such as atoms and molecules. But that the sought-after theory must be rich enough to give rise to complex structures has become the most popular selection principle.<br /><br />Known as the “anthropic principle,” this argument allows physicists to discard all theories that can’t produce sentient observers on the rationale that we don’t inhabit a universe that lacks them. One could of course instead just discard all theories with parameters that don’t match the measured values, but that would be so last century.<br /><br /><a href="https://4.bp.blogspot.com/-qDsc2KcgYs0/V86fo87iJPI/AAAAAAAADMk/GvLB1-9pdXUGmCQDVoea-9fTzsfJ9f-pACLcB/s1600/tuning.jpg" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="150" src="https://4.bp.blogspot.com/-qDsc2KcgYs0/V86fo87iJPI/AAAAAAAADMk/GvLB1-9pdXUGmCQDVoea-9fTzsfJ9f-pACLcB/s200/tuning.jpg" width="200" /></a>The anthropic principle is often brought up in combination with the multiverse, but logically it’s a separate argument. The anthropic principle – that our theories must be compatible with the existence of life in our universe – is an observational requirement that can lead to constraints on the parameters of a theory. 
This requirement must be fulfilled whether or not universes for different parameters actually exist. In the multiverse, however, the anthropic principle is supposedly the <i>only</i> criterion by which to select the theory for our universe, at least probabilistically, so that we are likely to find ourselves here. Hence the two are often discussed together.<br /><br />Anthropic selection had a promising start with Weinberg’s prescient estimate for the cosmological constant. But the anthropic principle hasn’t solved the problem it was meant to solve, because it does not single out one unique theory either. This has been known for at least a decade, but the myth that our universe is “fine-tuned for life” still hasn’t died.<br /><br />The general argument against the success of anthropic selection is that all evidence for the fine-tuning of our theories explores only a tiny space of all possible combinations of parameters. A typical argument for fine-tuning goes like this: If parameter X was only a tiny bit larger or smaller than the observed value, then atoms couldn’t exist or all stars would collapse or something similarly detrimental to the formation of large molecules would happen. Hence, parameter X must have a certain value to high precision. However, these arguments for fine-tuning – of which there exist many – don’t take into account simultaneous changes in several parameters and are therefore inconclusive.<br /><br />Importantly, besides this general argument there also exist explicit counterexamples. In the 2006 paper <a href="http://arxiv.org/abs/hep-ph/0604027"><i>A Universe Without Weak Interactions</i></a>, Harnik, Kribs, and Perez discussed a universe that seems capable of complex chemistry and yet has fundamental particles entirely different from our own. 
More recently, <a href="http://arxiv.org/abs/1312.0613">Abraham Loeb from Harvard argued</a> that primitive forms of life might have been possible already in the early universe under circumstances very different from today’s. And a recent paper (<a href="https://www.newscientist.com/article/2104223-stars-burning-strangely-make-life-in-the-multiverse-more-likely">ht Jacob Aron</a>) adds another example:<br /><br><ul><b>Stellar Helium Burning in Other Universes: A solution to the triple alpha fine-tuning problem</b><br />By Fred C. Adams and Evan Grohs<br /><a href="http://arxiv.org/abs/1608.04690">1608.04690 [astro-ph.CO]</a></ul><br>In this work the authors show that some combinations of fundamental constants would actually make it easier for stars to form Carbon, an element often assumed to be essential for the development of life. <br /><br />This is a fun paper because it extends the work of Fred Hoyle, who was the first to use the anthropic principle to make a prediction (though some historians question whether that was his actual motivation). He understood that it’s difficult for stars to form heavy elements because the chain is broken in the first steps by Beryllium. Beryllium has atomic number 4, but the version that’s created in stellar nuclear fusion from Helium (with atomic number 2) is unstable and therefore can’t be used to build even heavier nuclei. <br /><br />Hoyle suggested that the chain of nuclear fusion avoids Beryllium and instead goes from three Helium nuclei straight to Carbon (with atomic number 6). Known as the triple-alpha process (because Helium nuclei are also referred to as alpha-particles), the chances of this happening are slim – unless the Helium merger hits a resonance of the Carbon nucleus. 
Which it does if the parameters are “just right.” Hoyle hence concluded that such a resonance must exist, and that was later experimentally confirmed.<br /><br />Adams and Grohs now point out that there are altogether different sets of parameters for which Beryllium is simply stable and the Carbon resonance doesn’t have to be finely tuned. In their paper, they do not deal with the fundamental constants that we normally use in the standard model – they instead discuss nuclear structure, which has constants that are derived from the standard model constants, but are quite complicated functions thereof (if known at all). Still, they have basically invented a fictional universe that seems at least as capable of producing life as ours.<br /><br />This study is hence another demonstration that a chemistry complex enough to support life can arise under circumstances that are not anything like the ones we experience today. <br /><br />I find it amusing that many physicists believe the evolution of complexity is the exception rather than the rule. Maybe it’s because they mostly deal with simple systems, at or close to equilibrium, with few particles, or with many particles of the same type – systems that the existing math can deal with.<br /><br />It makes me wonder how many more fictional universes physicists will invent and write papers about before they bury the idea that anthropic selection can single out a unique theory. Fewer, I hope, than there are planets in <i>No Man’s Sky</i>. Sabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com38http://backreaction.blogspot.com/2016/09/sorry-universe-wasnt-made-for-you.htmltag:blogger.com,1999:blog-22973357.post-48036690818058108032016-08-29T06:45:00.000-04:002016-08-29T06:45:15.408-04:00Dear Dr. B: How come we never hear of a force that the Higgs boson carries?<ul><i>“Dear Dr. Hossenfelder,<br /><br />First, I love your blog. 
You provide a great insight into the world of physics for us laymen. I have read in popular science books that the bosons are the ‘force carriers.’ For example the photon carries the electromagnetic force, the gluon, the strong force, etc. How come we never hear of a force that the Higgs boson carries?<br /><br><a href="http://www.irontardigrade.com/">Ramiro Rodriguez</a>”</i></ul>Dear Ramiro,<br /><br />The short answer is that you never hear of a force that the Higgs boson carries because it doesn’t carry one. The longer answer is that not all bosons are alike. This of course raises the question of just how the Higgs-boson is different, so let me explain.<p>The standard model of particle physics is based on gauge symmetries. This basically means that the laws of nature have to remain invariant under transformations in certain internal spaces, and these transformations can change from one place to the next and one moment to the next. They are what physicists call “local” symmetries, as opposed to “global” symmetries whose transformations don’t change in space or time.</p>Amazingly enough, the requirement of gauge symmetry automatically explains how particles interact. It works like this. You start with fermions, which are particles of half-integer spin, like electrons, muons, quarks and so on. And you require that the fermions’ behavior must respect a gauge symmetry, which is classified by a symmetry group. Then you ask what equations you can possibly get that do this.<br /><br />Since the fermions can move around, the equations that describe what they do must contain derivatives both in space and in time. This causes a problem, because if you want to know how the fermions’ motion changes from one place to the next you’d also have to know what the gauge transformation does from one place to the next, otherwise you can’t tell apart the change in the fermions from the change in the gauge transformation. 
But if you’d need to know that transformation, then the equations wouldn’t be invariant. <br /><br />From this you learn that the only way the fermions can respect the gauge symmetry is if you introduce additional fields – the gauge fields – which exactly cancel the contribution from the space-time dependence of the gauge transformation. In the standard model the gauge fields all have spin 1, which means they are bosons. That's because to cancel the terms that came from the space-time derivative, the fields need to have the same transformation behavior as the derivative, which is that of a vector, hence spin 1. <br /><br />To really follow this chain of arguments – from the assumption of gauge symmetry to the presence of gauge-bosons – requires several years’ worth of lectures, but the upshot is that the bosons which exchange the forces aren’t added by hand to the standard model, they are a consequence of symmetry requirements. You don’t get to pick the gauge-bosons, neither their number nor their behavior – their properties are determined by the symmetry.<br /><br />In the standard model, there are 12 such force-carrying bosons: the photon (γ), the W+, W-, Z, and 8 gluons. They belong to three gauge symmetries, U(1), SU(2) and SU(3). Whether a fermion does or doesn’t interact with a gauge-boson depends on whether the fermion is “gauged” under the respective symmetry, ie transforms under it. Only the quarks, for example, are gauged under the SU(3) symmetry of the strong interaction, hence only the quarks couple to gluons and participate in that interaction. The so-introduced bosons are sometimes specifically referred to as “gauge-bosons” to indicate their origin. <br /><br />The Higgs-boson in contrast is not introduced by a symmetry requirement. It has an entirely different function, which is to break a symmetry (the electroweak one) and thereby give mass to particles. The Higgs doesn’t have spin 1 (like the gauge-bosons) but spin 0. 
Indeed, it is the only presently known elementary particle with spin zero. Sheldon Glashow has charmingly referred to the Higgs as the “flush toilet” of the standard model – it’s there for a purpose, not because we like the smell.<br /><br />The distinction between fermions and bosons can be removed by postulating an exchange symmetry between these two types of particles, known as supersymmetry. It works basically by generalizing the concept of a space-time direction to not merely be bosonic, but also fermionic, so that there is now a derivative that behaves like a fermion. <br /><br />In the supersymmetric extension of the standard model there are then partner particles to all already known particles, denoted either by adding an “s” before the particle’s name if it’s a boson (selectron, stop quark, and so on) or adding “ino” after the particle’s name if it’s a fermion (Wino, photino, and so on). There is then also the Higgsino, which is the partner particle of the Higgs and has spin 1/2. It is gauged under the standard model symmetries, hence participates in the interactions, but still is not itself a consequence of a gauge symmetry.<br /><br />In the standard model most of the bosons are also force-carriers, but bosons and force-carriers just aren’t the same category. To use a crude analogy, just because most of the men you know (most of the bosons in the standard model) have short hair (are force-carriers) doesn’t mean that to be a man (to be a boson) you must have short hair (exchange a force). Bosons are defined by having integer spin, as opposed to the half-integer spin that fermions have, and not by their ability to exchange interactions.<br /><br />In summary, the answer to your question is that certain types of bosons – the gauge bosons – are a consequence of symmetry requirements from which it follows that these bosons do exchange forces. The Higgs isn’t one of them. <br /><br />Thanks for an interesting question! 
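For readers who like to see the cancellation spelled out, here is a minimal sketch for the simplest case, a U(1) gauge symmetry like that of electromagnetism (conventions, such as the sign and placement of the coupling g, vary between textbooks):

```latex
% A local U(1) transformation rotates the fermion field by a
% position-dependent phase:
\psi(x) \;\to\; e^{i\alpha(x)}\,\psi(x)

% The plain derivative then picks up an unwanted extra term:
\partial_\mu \psi \;\to\; e^{i\alpha(x)}
  \left(\partial_\mu \psi + i\,(\partial_\mu \alpha)\,\psi\right)

% Introduce a gauge field that absorbs exactly that term,
A_\mu \;\to\; A_\mu + \tfrac{1}{g}\,\partial_\mu \alpha \,,

% and replace the derivative by the covariant derivative:
D_\mu \psi = \left(\partial_\mu - i g A_\mu\right)\psi

% The extra terms now cancel, and the equations stay gauge invariant:
D_\mu \psi \;\to\; e^{i\alpha(x)}\,D_\mu \psi
```

Because A<sub>μ</sub> must carry the same space-time index as the derivative it compensates, it is a vector field – which is the one-line version of why the gauge bosons have spin 1.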
<br /><br /><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-wGNjZ78a__I/V8QLBpPo23I/AAAAAAAADME/dE1Dv9v8MscHsXHaTcfdiQMsp6uJ9Lk_wCLcB/s1600/higgs.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="223" src="https://4.bp.blogspot.com/-wGNjZ78a__I/V8QLBpPo23I/AAAAAAAADME/dE1Dv9v8MscHsXHaTcfdiQMsp6uJ9Lk_wCLcB/s400/higgs.jpg" width="400" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">Peter Higgs receiving the Nobel Prize from the King of Sweden.<br />[Img Credits: <a href="http://www.itv.com/news/topic/higgs-boson/">REUTERS/Claudio Bresciani/TT News Agency</a>]</td></tr></tbody></table><br /><hr><br>Previous Dear-Dr-B’s that you might also enjoy: <ul><li><a href="http://backreaction.blogspot.com/2016/05/dear-dr-b-if-photons-have-mass-would.html">If photons have a mass, would this mean special relativity is no longer valid?</a></li><li><a href="http://backreaction.blogspot.com/2016/03/dear-dr-b-what-are-requirements-for.html">What are the requirements for a successful theory of quantum gravity?</a></li><li><a href="http://backreaction.blogspot.com/2015/12/dear-dr-b-is-string-theory-science.html">Is string theory science?</a></li><li><a href="http://backreaction.blogspot.com/2015/12/ask-dr-b-is-multiverse-science-is.html">Is the multiverse real?</a></li><li><a href="http://backreaction.blogspot.com/2015/11/dear-dr-b-what-do-physicists-mean-when.html">What do physicists mean when they say time doesn’t exist?</a></li><li><a href="http://backreaction.blogspot.com/2016/04/dear-dr-b-why-is-lorentz-invariance-in.html">Why is Lorentz-invariance in conflict with discreteness?</a></li></uL>Sabine 
Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com51http://backreaction.blogspot.com/2016/08/dear-dr-b-how-come-we-never-hear-of.htmltag:blogger.com,1999:blog-22973357.post-90585064890786174482016-08-24T11:27:00.002-04:002016-08-24T13:15:12.201-04:00What if the universe was like a pile of laundry?<ul><i>What if the universe was like a pile of laundry?<br /><br />Have one. <br /><br />See this laundry pile? Looks just like our universe. <br /><br />No? <br /><br />Here, have another.<br /><br />See it now? It’s got three dimensions and all. <br /><br />But look again. <br /><br />The shirts and towels, they’re really crinkled and interlocked two-dimensional surfaces. <br /><br />Wait.<br /><br />It’s one-dimensional yarn, knotted up tightly. <br /><br />You ok? <br /><br />Have another.<br /><br />I see it clearly now. It’s everything at once, one-two-three dimensional. Just depends on how closely you look at it. <br /><br />Amazing, don’t you think? What if our universe was just like that?</i></ul><br /><br /><table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: right; margin-left: 1em; text-align: right;"><tbody><tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-avls1vqD8ws/V728dtz2UsI/AAAAAAAADLw/vce0QXxjRK4_1jShWVL8KEMEmkhodZp7ACLcB/s1600/pile-of-clothes.jpg" imageanchor="1" style="clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" height="138" src="https://4.bp.blogspot.com/-avls1vqD8ws/V728dtz2UsI/AAAAAAAADLw/vce0QXxjRK4_1jShWVL8KEMEmkhodZp7ACLcB/s200/pile-of-clothes.jpg" width="200" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">Universal Laundry Pile.<br />[Img Src: <a href="http://www.clipartkid.com/pile-of-clothes-cliparts/">Clipartkid</a>]</td></tr></tbody></table><p>It doesn’t sound like a sober thought, but it’s got math behind it, so physicists think there might be something to it. Indeed the math piled up lately. 
They call it “dimensional reduction,” the idea that space on short distances has fewer than three dimensions – and it might help physicists to quantize gravity.</p>We’ve gotten used to space with additional dimensions, rolled up so small we can’t observe them. But how do you get rid of dimensions instead? To understand how it works we first have to clarify what we mean by “dimension.” <br /><br />We normally think about dimensions of space by picturing lines which spread from a point. How quickly the lines dilute with the distance from the point tells us the “Hausdorff dimension” of a space. The faster the lines diverge from each other with distance, the larger the Hausdorff dimension. If you speak through a pipe, for example, sound waves spread less and your voice carries farther. The pipe hence has a lower Hausdorff dimension than our normal 3-dimensional office cubicles. It’s the Hausdorff dimension that we colloquially refer to as just dimension.<br /><br />For dimensional reduction, however, it is not the Hausdorff dimension which is relevant, but instead the “spectral dimension,” which is a slightly different concept. We can calculate it by first getting rid of the “time” in “space-time” and making it into space (period). We then place a random walker at one point and measure the probability that it returns to the same point during its walk. The smaller the average return probability, the higher the probability the walker gets lost, and the higher the number of spectral dimensions. <br /><br />Normally, for a non-quantum space, both notions of dimension are identical. However, add quantum mechanics and the spectral dimension at short distances goes down from four to two. The return probability for short walks becomes larger than expected, and the walker is less likely to get lost – this is what physicists mean by “dimensional reduction.” <br /><br />The spectral dimension is not necessarily an integer; it can take on any value. 
This value starts at 4 when quantum effects can be neglected, and decreases when the walker’s sensitivity to quantum effects at shortest distances increases. Physicists therefore also like to say that the spectral dimension “runs,” meaning its value depends on the resolution at which space-time is probed.<br /><br />Dimensional reduction is an attractive idea because quantizing gravity is considerably easier in lower dimensions, where the infinities that plague traditional attempts to quantize gravity go away. A theory with a reduced number of dimensions at shortest distances therefore has a much higher chance of remaining consistent, and so of providing a meaningful theory for the quantum nature of space and time. Not surprisingly, then, dimensional reduction has received quite some attention among physicists lately.<br /><br />This strange property of quantum-spaces was first found in Causal Dynamical Triangulation (<a href="https://arxiv.org/abs/hep-th/0505113">hep-th/0505113</a>), an approach to quantum gravity that relies on approximating curved spaces by triangular patches. In this work, the researchers did a numerical simulation of a random walk in such a triangulated quantum-space, and found that the spectral dimension goes down from four to two. Or actually to 1.80 ± 0.25 if you want to know precisely. <br /><br />Instead of doing numerical simulations, it is also possible to study the spectral dimension mathematically, which has since been done in various other approaches. For this, physicists exploit the fact that the behavior of the random walk is governed by a differential equation – the diffusion equation – which depends on the curvature of space. In quantum gravity, the curvature has quantum fluctuations, and then it’s the average value which enters the diffusion equation. From the diffusion equation one then calculates the return probability for the random walk. 
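None of these quantum-gravity calculations can be reproduced in a few lines, but the basic diagnostic – reading off a dimension from return probabilities – can be illustrated with an ordinary classical random walk. The sketch below uses a walk on a d-dimensional grid whose coordinates each step ±1 independently per tick (chosen purely because its return probability then factorizes into one-dimensional pieces); the function names are mine, not from any of the cited papers:

```python
from math import comb, log

def return_prob(t: int, d: int) -> float:
    """Probability that a walker on a d-dimensional grid, whose coordinates
    each step +1 or -1 independently per tick, is back at the origin
    after t ticks (t must be even)."""
    p1 = comb(t, t // 2) / 2**t  # one coordinate back at zero: binomial count
    return p1**d                 # coordinates are independent, so multiply

def spectral_dimension(d: int, t1: int = 200, t2: int = 400) -> float:
    """Read off d_s from the scaling P(t) ~ t^(-d_s/2) by comparing the
    return probability at two walk durations."""
    slope = (log(return_prob(t2, d)) - log(return_prob(t1, d))) \
            / (log(t2) - log(t1))
    return -2 * slope

for d in (2, 3, 4):
    print(d, round(spectral_dimension(d), 2))
```

On a classical grid the estimate lands near the Hausdorff dimension, as it should; the quantum-gravity result is precisely that the analogous short-distance estimate drifts down toward two.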
<br /><br />In this way, physicists have also inferred the spectral dimension in Asymptotically Safe Gravity (<a href="https://arxiv.org/abs/hep-th/0508202">hep-th/0508202</a>), an approach to quantum gravity which relies on the resolution-dependence (the “running”) of quantum field theories. And they found the same drop from four to two spectral dimensions. <br /><br />Another indication comes from Loop Quantum Gravity, where the scaling of the area operator with length changes at short distances. In this case it is somewhat questionable whether the notion of curvature makes sense at all on short distances. But ignoring this, one can construct the diffusion equation and finds that the spectral dimension drops from four to two (<a href="https://arxiv.org/abs/0812.2214">0812.2214</a>). <br /><br />And then there is Horava-Lifshitz gravity, yet another modification of gravity which some believe helps with quantizing it. Here too, dimensional reduction has been found (<a href="https://arxiv.org/abs/0902.3657">0902.3657</a>).<br /><br />It is difficult to visualize what is happening with the dimensionality of space if it goes down continuously, rather than in discrete steps as in the example with the laundry pile. Maybe a good way to picture it, as Calcagni, Eichhorn and Saueressig suggest, is to think of the quantum fluctuations of space-time hindering a particle’s random walk, thereby slowing it down. It wouldn’t have to be that way. Quantum fluctuations could also kick the particle around wildly, thereby increasing the spectral dimension rather than decreasing it. But that’s not what the math tells us. <br /><br />One shouldn’t take this picture too seriously though, because we’re talking about a random walk in space, not space-time, and so it’s not a real physical process. Turning time into space might seem strange, but it is a common mathematical simplification which is often used for calculations in quantum theory. 
Still, it makes it difficult to interpret what is happening physically.<br /><br />I find it intriguing that several different approaches to quantum gravity share a behavior like this. Maybe it is a general property of quantum space-time. But then, there are many different types of random walks, and while these different approaches to quantum gravity share a similar scaling behavior for the spectral dimension, they differ in the type of random walk that produces this scaling (<a href="https://arxiv.org/abs/1304.7247">1304.7247</a>). So maybe the similarities are only superficial. <br /><br />And of course this idea has no observational evidence speaking for it. Maybe it never will. But one day, I’m sure, all the math will click into place and everything will make perfect sense. Meanwhile, <a href="https://arxiv.org/abs/1605.05694">have another</a>.<br><hr><i>[This article first appeared on Starts With A Bang under the title <a href="http://www.forbes.com/forbes/welcome/?toURL=http://www.forbes.com/sites/startswithabang/2016/07/26/dimensional-reduction-the-key-to-physics-greatest-mystery/&refURL=&referrer=">Dimensional Reduction: The Key To Physics' Greatest Mystery?</a>]</i> Sabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com27http://backreaction.blogspot.com/2016/08/what-if-universe-was-like-pile-of.htmltag:blogger.com,1999:blog-22973357.post-24057296793390462022016-08-19T07:39:00.000-04:002016-08-19T07:39:16.124-04:00Away NoteI'll be in Stockholm next week for a program on <a href="http://agenda.albanova.se/conferenceDisplay.py?confId=4986">Black Holes and Emergent Spacetime</a>, so please be prepared for some service interruptions. 
Sabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com1http://backreaction.blogspot.com/2016/08/away-note.htmltag:blogger.com,1999:blog-22973357.post-42719948790175760262016-08-15T08:32:00.001-04:002016-08-15T10:15:38.519-04:00The Philosophy of Modern Cosmology (srsly)<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: right; margin-left: 1em; text-align: right;"><tbody><tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-vKi8cRyCwmo/V7GzxSuQsHI/AAAAAAAADLM/hQ8WpL98c4wsHekwEvFyHFqu_jRmnd07wCLcB/s1600/Expanded%2BUniverse_8C10.15.JPG" imageanchor="1" style="clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" height="150" src="https://2.bp.blogspot.com/-vKi8cRyCwmo/V7GzxSuQsHI/AAAAAAAADLM/hQ8WpL98c4wsHekwEvFyHFqu_jRmnd07wCLcB/s200/Expanded%2BUniverse_8C10.15.JPG" width="200" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">Model of Inflation.<br />img src: <a href="https://sharepoint.umich.edu/lsa/physics/demolab/SitePages/8C10.15%20-%20Expanding%20Universe%20Balloon.aspx">umich.edu</a></td></tr></tbody></table>I wrote my recent post on the “<a href="http://backreaction.blogspot.com/2016/08/the-unbearable-lightness-of-philosophy.html">Unbearable Lightness of Philosophy</a>” to introduce a paper summary, but it got somewhat out of hand. I don’t want to withhold the actual body of my summary though. The paper in question is <br /><br><ul><b>Scientific Realism and Primordial Cosmology</b><br />Feraz Azhar, Jeremy Butterfield<br /><a href="https://arxiv.org/abs/1606.04071">arXiv:1606.04071 [physics.hist-ph]</a></ul><br>Before we start I have to warn you that the paper speaks a lot about <i>realism</i> and <i>underdetermination,</i> and I couldn’t figure out what exactly the authors mean by these words. Sure, I looked them up, but that didn’t help because there doesn’t seem to be an agreement on what the words mean. 
It’s philosophy after all.<p>Personally, I subscribe to a philosophy I’d like to call agnostic instrumentalism, which means I think science is useful and I don’t care what else you want to say about it – anything from realism to solipsism to Carroll’s “poetic naturalism” is fine by me. In newspeak, I’m a whateverist – now go away and let me science.</p>The authors of the paper, in contrast, position themselves as follows: <br /><blockquote>“We will first state our allegiance to scientific realism… We take scientific realism to be the doctrine that most of the statements of the mature scientific theories that we accept are true, or approximately true, whether the statement is about observable or unobservable states of affairs.”</blockquote>But rather than explaining what this means, the authors next admit that this definition contains “vague words,” and apologize that they “will leave this general defense to more competent philosophers.” Interesting approach. A physics-paper in this style would say: “This is a research article about General Relativity which has something to do with curvature of space and all that. This is just vague words, but we’ll leave a general defense to more competent physicists.” <br /><br />In any case, it turns out that it doesn’t matter much for the rest of the paper exactly what realism means to the authors – it’s a great paper also for an instrumentalist because it’s long enough so that, rolled up, it’s good to slap flies. The focus on scientific realism seems somewhat superfluous, but I notice that the paper is to appear in “The Routledge Handbook of Scientific Realism” which might explain it.<br /><br />It also didn’t become clear to me what the authors mean by underdetermination. Vaguely speaking, they seem to mean that a theory is underdetermined if it contains elements unnecessary to explain existing data (which is also what Wikipedia offers by way of definition). 
But the question what’s necessary to explain data isn’t a simple yes-or-no question – it’s a question that needs a quantitative analysis.<br /><br />In theory development we always have a tension between simplicity (fewer assumptions) and precision (better fit) because more parameters normally allow for better fits. Hence we use statistical measures to find out in which case a better fit justifies a more complicated model. I don’t know how one can claim that a model is “underdetermined” without such quantitative analysis. <br /><br />The authors of the paper for the most part avoid the need to quantify underdetermination by using sociological markers, ie they treat models as underdetermined if cosmologists haven’t yet agreed on the model in question. I guess that’s the best they could have done, but it’s not a basis on which one can discuss what will remain underdetermined. The authors for example seem to implicitly believe that evidence for a theory at high energies can only come from processes at such high energies, but that isn’t so – one can also use high precision measurements at low energies (at least in principle). In the end it comes down, again, to quantifying which model is the best fit. <br /><br />With this advance warning, let me tell you the three main philosophical issues which the authors discuss.<br /><br /><b>1. Underdetermination of topology. </b><br /><br />Einstein’s field equations are local differential equations which describe how energy-densities curve space-time. This means these equations describe how space changes from one place to the next and from one moment to the next, but they do not fix the overall connectivity – the topology – of space-time<sup>*</sup>. <br /><br />A sheet of paper is a simple example. It’s flat and it has no holes. If you roll it up and make a cylinder, the paper is still flat, but now it has a hole. 
You could find out about this without reference to the embedding space by drawing a circle onto the cylinder and around its perimeter, so that it can’t be contracted to zero length while staying on the cylinder’s surface. This could never happen on a flat sheet. And yet, if you look at any one point of the cylinder and its surroundings, it is indistinguishable from a flat sheet. The flat sheet and the cylinder are locally identical – but they are globally different.<br /><br />General Relativity thus can’t tell you the topology of space-time. But physicists don’t normally worry much about this because you can parameterize the differences between topologies, compute observables, and then compare the results to data. Topology is, in that, no different than any other assumption of a cosmological model. Cosmologists can, and have, looked for evidence of non-trivial space-time connectivity in the CMB data, but they haven’t found anything that would indicate our universe wraps around itself. At least so far. <br /><br />In the paper, the authors point out an argument raised by someone else (Manchak) which claims that different topologies can’t be distinguished almost everywhere. I haven’t read the paper in question, but this claim is almost certainly correct. The reason is that while topology is a global property, you can change it on arbitrarily small scales. All you have to do is punch a hole into that sheet of paper, and whoops, it’s got a new topology. Or if you want something without boundaries, then identify two points with each other. Indeed you could sprinkle space-time with arbitrarily many tiny wormholes and in that way create the most abstruse topological properties (and, most likely, lots of causal paradoxes). <br /><br />The topology of the universe is hence, like the topology of the human body, a matter of resolution. On distances visible to the eye you can count the holes in the human body on the fingers of your hand. 
On shorter distances though you’re all pores and ion channels, and on subatomic distances you’re pretty much just holes. So, asking what’s the topology of a physical surface only makes sense when one specifies at which distance scale one is probing this (possibly higher-dimensional) surface. <br /><br />I thus don’t think any physicist will be surprised by the philosophers’ finding that cosmology severely underdetermines global topology. What the paper fails to discuss though is the scale-dependence of that conclusion. Hence, I would like to know: Is it still true that the topology will remain underdetermined on cosmological scales? And to what extent, and under which circumstances, can the short-distance topology have long-distance consequences, as eg suggested by the ER=EPR idea? What effect would this have on the separation of scales in effective field theory?<br /><br /><b>2. Underdetermination of models of inflation.</b><br /><br />The currently most widely accepted model for the universe assumes the existence of a scalar field – the “inflaton” – and a potential for this field – the “inflation potential” – in which the field moves towards a minimum. While the field is getting there, space is exponentially stretched. 
At the end of inflation, the field’s energy is dumped into the production of particles of the standard model and dark matter.<br /><br />This mechanism was invented to solve various finetuning problems that cosmology otherwise has, notably that the universe seems to be almost flat (the “flatness problem”), that the cosmic microwave background has almost the same temperature in all directions except for tiny fluctuations (the “horizon problem”), and that we haven’t seen any funky things like magnetic monopoles or domain walls that tend to be plentiful at the energy scale of grand unification (the “monopole problem”).<br /><br />Trouble is, there’s loads of inflation potentials that one can cook up, and most of them can’t be distinguished with current data. Moreover, one can invent more than one inflaton field, which adds to the variety of models. So, clearly, the inflation models are severely underdetermined.<br /><br />I’m not really sure why this overabundance of potentials is interesting for philosophers. This isn’t so much philosophy as sociology – that the models are underdetermined is why physicists get them published, and if there was enough data to extract a potential that would be the end of their fun. Whether there will ever be enough data to tell them apart, only time will tell. Some potentials have already been ruled out with incoming data, so I am hopeful.<br /><br />The questions that I wish philosophers would take on are different ones. To begin with, I’d like to know which of the problems that inflation supposedly solves are actual problems. It only makes sense to complain about finetuning if one has a probability distribution. In this, the finetuning problem in cosmology is distinctly different from the finetuning problems in the standard model, because in cosmology one can plausibly argue there is a probability distribution – it’s that of fluctuations of the quantum fields which seed the initial conditions. 
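Earlier I said that deciding when a better fit justifies a more complicated model needs a quantitative analysis. Standard model-selection tools do exactly that; here is a toy sketch (my own illustration, nothing to do with actual inflationary data analysis) using the Akaike information criterion, where lower is better and every extra parameter costs two points:

```python
import numpy as np

rng = np.random.default_rng(1)

# Fake "observations": a straight line plus Gaussian noise.
x = np.linspace(0.0, 1.0, 50)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.1, x.size)

def aic(degree: int) -> float:
    """AIC for a least-squares polynomial fit of the given degree:
    n*ln(RSS/n) + 2k, with k = degree + 1 fit parameters."""
    coeffs = np.polyfit(x, y, degree)
    rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
    k = degree + 1
    return x.size * np.log(rss / x.size) + 2 * k

# The degree-5 polynomial always achieves a smaller residual, but the
# 2k penalty decides whether that improvement was worth 4 extra parameters.
print("line:", round(aic(1), 1), "quintic:", round(aic(5), 1))
```

With honestly linear data the penalty typically favors the line. The same logic – reward fit, penalize parameters – is what one would need to call an inflation potential “underdetermined” in more than a sociological sense.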
<br /><br />So, I believe that the horizon problem is a well-defined problem, assuming quantum theory remains valid close to the Planck scale. I’m not so sure, however, about the flatness problem and the monopole problem. I don’t see what’s wrong with just assuming the initial value for the curvature is tiny (finetuned), and I don’t know why I should care about monopoles given that we don’t know grand unification is more than a fantasy. <br /><br />Then, of course, the current data indicates that the inflation potential too must be finetuned which, as Steinhardt has aptly complained, means that inflation doesn’t really solve the problem it was meant to solve. But to make that statement one would have to compare the severity of finetuning, and how does one do that? Can one even make sense of this question? Where are the philosophers when one needs them?<br /><br />Finally, I have a more general conceptual problem that falls into the category of underdetermination, which is to what extent the achievements of inflation are actually independent of each other. Assume, for example, you have a theory that solves the horizon problem. Under which circumstances does it also solve the flatness problem and give the right tilt for the spectral index? I suspect that the assumptions for this do not require the full mechanism of inflation with potential and all, and almost certainly not a very specific type of potential. Hence I would like to know what’s the minimal theory that explains the observations, and which assumptions are really necessary.<br /><br /><b>3. Underdetermination in the multiverse.</b><br /><br />Many models for inflation create not only one universe, but infinitely many of them, a whole “multiverse”. In the other universes, fundamental constants – or maybe even the laws of nature themselves – can be different. How do you make predictions in a multiverse? You can’t, really. 
But you can make statements about probabilities, about how likely it is that we find ourselves in this universe with these particles and not any other. <br /><br />To make statements about the probability of the occurrence of certain universes in the multiverse one needs a probability distribution or a measure (in the space of all multiverses or their parameters respectively). Such a measure should also take into account anthropic considerations, since there are some universes which are almost certainly inhospitable to life, for example because they don’t allow the formation of large structures.<br /><br />In their paper, the authors point out that the combination of a universe ensemble and a measure is underdetermined by observations we can make in our universe. It’s underdetermined in the same way that if I give you a bag of marbles and say the most likely pick is red, you can’t tell what’s in the bag. <br /><br />I think physicists are well aware of this ambiguity, but unfortunately the philosophers don’t address why physicists ignore it. Physicists ignore it because they believe that one day they can deduce the theory that gives rise to the multiverse and the measure on it. To make their point, the philosophers would have had to demonstrate that this deduction is impossible. I think it is, but I’d rather leave the case to philosophers.<br /><br />For an agnostic instrumentalist like me a different question is more interesting, which is whether one stands to gain anything from taking a “shut-up-and-calculate” attitude to the multiverse, even if one distinctly dislikes it. Quantum mechanics too uses unobservable entities, and that formalism – however much you detest it – works very well. It really adds something new, regardless of whether or not you believe the wave-function is “real” in some sense. As far as the multiverse is concerned, I am not sure about this. 
So why bother with it?<br /><br />Consider the best-case multiverse outcome: Physicists will eventually find a measure on some multiverse according to which the parameters we have measured are the most likely ones. Hurray. Now forget about the interpretation and think of this calculation as a black box: You put math in on one side and out comes a set of “best” parameters on the other side. You could always reformulate such a calculation as an optimization problem which allows one to calculate the correct parameters. So, independent of the thorny question of what’s real, what do I gain from thinking about measures on the multiverse rather than just looking for an optimization procedure straight away?<br /><br />Yes, there are cases – like bubble collisions in eternal inflation – that would serve as independent confirmation of the existence of another universe. But no evidence for that has been found. So for me the question remains: under which circumstances is doing calculations in the multiverse an advantage rather than unnecessary mathematical baggage?<br /><br />I think this paper makes a good example of the difference between philosophers’ and physicists’ interests which I wrote about in my previous post. 
It was a good (if somewhat long) read and it gave me something to think about, though I will need some time to recover from all the -isms.<br><hr>* Note added: The word connectivity in this sentence is a loose stand-in for those who do not know the technical term “topology.” It does <i>not</i> refer to the technical term “connectivity.”Sabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com81http://backreaction.blogspot.com/2016/08/the-philosophy-of-modern-cosmology-srsly.htmltag:blogger.com,1999:blog-22973357.post-61848156087526924482016-08-12T07:54:00.001-04:002016-08-12T07:54:16.597-04:00The Unbearable Lightness of Philosophy<div class="separator" style="clear: both; text-align: center;"><a href="https://3.bp.blogspot.com/-eI-5JrsOEDI/V621tSU5--I/AAAAAAAADK4/kSL53h9rTg8aSgt3s-32E0FVG_gvRDKOQCLcB/s1600/hotairballoon.JPG" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" height="150" src="https://3.bp.blogspot.com/-eI-5JrsOEDI/V621tSU5--I/AAAAAAAADK4/kSL53h9rTg8aSgt3s-32E0FVG_gvRDKOQCLcB/s200/hotairballoon.JPG" width="200" /></a></div>Philosophy isn’t useful for practicing physicists. On that, I am with <a href="http://www.pitt.edu/~mem208/courses/phph_s15/documents/weinberg_against_philosophy.pdf">Steven Weinberg</a> and <a href="http://www.scientificamerican.com/article/the-consolation-of-philos/">Lawrence Krauss</a> who have expressed similar opinions. But I think it’s an unfortunate situation because physicists – especially those who work on the foundations of physics – could use help from philosophers. 
<p>Massimo Pigliucci, a professor of philosophy at CUNY-City College, <a href="http://rationallyspeaking.blogspot.de/2012/04/lawrence-krauss-another-physicist-with.html">has ingeniously addressed physicists’ complaints about the uselessness of philosophy</a> by declaring that “the business of philosophy is not to advance science.” Philosophy, hence, isn’t just useless, but it’s useless on purpose. I applaud. At least that means it has a purpose. </p>But I shouldn’t let Massimo Pigliucci speak for his whole discipline.<br /><br />I’ve been told that, as far as physics is concerned, there are presently three good philosophers roaming Earth: David Albert, Jeremy Butterfield, and Tim Maudlin. It won’t surprise you to hear that I have a few bones to pick with each of these gentlemen, but mostly they seem reasonable indeed. I would even like to nominate a fourth Good Philosopher, Steven Weinstein from UoW, with whom even I haven’t yet managed to disagree.<br /><br />The good Maudlin, for example, had <a href="http://www.pbs.org/wgbh/nova/blogs/physics/2015/04/physics-needs-philosophy/">an excellent essay last year on PBS NOVA</a>, in which he argued that “Physics needs Philosophy.” I really liked his argument until he wrote that “Philosophers obsess over subtle ambiguities of language,” which pretty much sums up all that physicists hate about philosophy.<br /><br />If you want to know “what follows from what,” as Maudlin writes, you have to convert language into mathematics and thereby remove the ambiguities. Unfortunately, philosophers never seem to take that step, hence physicists’ complaints that it’s just words. Or, as Arthur Koestler put it, “the systematic abuse of a terminology specially invented for that purpose.” <br /><br />Maybe, I admit, it shouldn’t be the philosophers’ job to spell out how to remove the ambiguities in language. Maybe that should already be the job of physicists. 
But regardless of whom you want to assign the task of reaching across the line, presently little crosses it. Few practicing physicists today care what philosophers do or think.<br /><br />And as someone who has tried to write about topics on the intersection of both fields, I can report that this disciplinary segregation is meanwhile institutionalized: The physics journals won’t publish on the topic because it’s too much philosophy, and the philosophy journals won’t publish because it’s too much physics.<br /><br /><a href="https://aeon.co/essays/the-string-theory-wars-show-us-how-science-needs-philosophy?preview=true">In a recent piece on Aeon</a>, Pigliucci elaborates on the demarcation problem, how to tell science from pseudoscience. He seems to think this problem is what underlies some physicists’ worries about string theory and the multiverse, worries that were the topic of <a href="http://www.forbes.com/sites/startswithabang/2015/12/10/why-trust-a-theory-physicists-and-philosophers-debate-the-scientific-method/">a workshop that both he and I attended last year</a>. <br /><br />But he got it wrong. While I know lots of physicists critical of string theory for one reason or the other, none of them would go so far as to declare it pseudoscience. No, the demarcation problem that physicists worry about isn’t that between science and pseudoscience. It’s that between science and philosophy. It is not without irony that Pigliucci in his essay conflates the two fields. Or maybe the purpose of his essay was an attempt to revive the “string wars,” in which case, wake me when it’s over.<br /><br />To me, the part of philosophy that is relevant to physics is what I’d like to call “pre-science” – sharpening questions sufficiently so that they can eventually be addressed by scientific means. Maudlin in his above-mentioned essay expressed a very similar point of view.<br /><br />Philosophers in that area are necessarily ahead of scientists. 
But they also never get the credit for actually answering a question, because for that they’ll first have to hand it over to scientists. Like a psychologist, the philosopher of physics thus succeeds by eventually making themselves superfluous. It seems a thankless job. There’s a reason I preferred studying physics instead.<br /><br />Many of the “bad philosophers” are those who aren’t quick enough to notice that a question they are thinking about has been taken over by scientists. That this failure to notice can evidently persist, in some cases, for decades is another institutionalized problem that originates in the lack of communication between both fields.<br /><br />Hence, I wish there were more philosophers willing to make it their business to advance science and to communicate across the boundaries. Maybe physicists would complain less that philosophy is useless if it wasn’t useless.Sabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com95http://backreaction.blogspot.com/2016/08/the-unbearable-lightness-of-philosophy.htmltag:blogger.com,1999:blog-22973357.post-90546438296361607942016-08-06T05:53:00.000-04:002016-08-06T05:58:11.464-04:00The LHC “nightmare scenario” has come true. <table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: right; margin-left: 1em; text-align: right;"><tbody><tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-FMxASH9lmIk/V6WzHfv31pI/AAAAAAAADKg/e2isEw-JPO023KQXwKX3eG9X_dyKs7c6QCLcB/s1600/atlas_cms_diphoton_2015-1.png" imageanchor="1" style="clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" height="200" src="https://3.bp.blogspot.com/-FMxASH9lmIk/V6WzHfv31pI/AAAAAAAADKg/e2isEw-JPO023KQXwKX3eG9X_dyKs7c6QCLcB/s200/atlas_cms_diphoton_2015-1.png" width="190" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">The recently deceased diphoton<br />bump. 
<a href="https://profmattstrassler.com/2015/12/16/is-this-the-beginning-of-the-end-of-the-standard-model/">Img Src: Matt Strassler.</a></td></tr></tbody></table><p>I finished high school in 1995. It was the year the top quark was discovered, a prediction dating back to 1973. As I read the articles in the news, I was fascinated by the mathematics that allowed physicists to reconstruct the structure of elementary matter. It wouldn’t have been difficult to predict in 1995 that I’d go on to do a PhD in theoretical high energy physics.</p>Little did I realize that for more than 20 years the provisional-looking standard model would remain the undefeated world champion of accuracy, irritatingly successful in its arbitrariness and yet impossible to surpass. We added neutrino masses in the late 1990s, but this idea dates back to the 1950s. The prediction of the Higgs, discovered in 2012, originated in the early 1960s. And while the poor standard model has been discounted as “ugly” by everyone from Stephen Hawking to Michio Kaku to Paul Davies, it’s still the best we can do. <br /><br />Since I entered physics, I’ve seen grand unified models proposed and falsified. I’ve seen loads of dark matter candidates not being found, followed by a ritual parameter adjustment to explain the lack of detection. I’ve seen supersymmetric particles being “predicted” with constantly increasing masses, from some GeV to some 100 GeV to LHC energies of some TeV. And now that the LHC hasn’t seen any superpartners either, particle physicists are <a href="http://www.scientificamerican.com/article/the-collider-that-could-save-physics/">more than willing to once again move the goalposts.</a><br /><br />During my professional career, all I have seen is failure. A failure of particle physicists to uncover a more powerful mathematical framework to improve upon the theories we already have. Yes, failure is part of science – it’s frustrating, but not worrisome. 
What worries me much more is our failure to learn from failure. Rather than trying something new, we’ve been trying the same thing over and over again, expecting different results.<br /><br />When I look at the data what I see is that our reliance on gauge-symmetry and the attempt at unification, the use of naturalness as guidance, and the trust in beauty and simplicity aren’t working. The cosmological constant isn’t natural. The Higgs mass isn’t natural. The standard model isn’t pretty, and the concordance model isn’t simple. Grand unification failed. It failed again. And yet we haven’t drawn any consequences from this: Particle physicists are still playing today by the same rules as in 1973.<br /><br />For the last ten years you’ve been told that the LHC must see some new physics besides the Higgs because otherwise nature isn’t “natural” – a technical term invented to describe the degree of numerical coincidence of a theory. I’ve been laughed at when I explained that <a href="http://backreaction.blogspot.de/2009/12/what-is-natural.html">I don’t buy into naturalness because it’s a philosophical criterion, not a scientific one</a>. But on that matter I got the last laugh: Nature, it turns out, doesn’t like to be told what’s presumably natural. <br /><br />The idea of naturalness that has been preached for so long is plainly not compatible with the LHC data, regardless of what else will be found in the data yet to come. And now that naturalness is in the way of moving predictions for so-far undiscovered particles – yet again! – to higher energies, particle physicists, opportunistic as always, are suddenly more than willing to discard naturalness to justify the next larger collider. <br /><br /><a href="http://www.scientificamerican.com/article/hope-for-new-particle-fizzles-at-the-lhc1/">Now that the diphoton bump is gone</a>, we’ve entered what has become known as the “nightmare scenario” for the LHC: The Higgs and nothing else. 
Many particle physicists thought of this as the worst possible outcome. It has left them without guidance, lost in a thicket of rapidly multiplying models. Without some new physics, they have nothing to work with that they haven’t already had for 50 years, no new input that can tell them in which direction to look for the ultimate goal of unification and/or quantum gravity. <br /><br />That the LHC hasn’t seen evidence for new physics is to me a clear signal that we’ve been doing something wrong, that our experience from constructing the standard model is no longer a promising direction to continue. We’ve maneuvered ourselves into a dead end by relying on aesthetic guidance to decide which experiments are the most promising. I hope that this latest null result will send a clear message that you can’t trust the judgement of scientists whose future funding depends on their continued optimism. <br /><br />Things can only get better. <br /><hr /><i>[This post previously appeared in a longer version on <a href="http://www.forbes.com/sites/startswithabang/2016/06/28/why-a-physicist-hopes-that-the-lhc-discovers-no-more-new-particles/#27e696a3d6f9">Starts With A Bang</a>.]</i>Sabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com71http://backreaction.blogspot.com/2016/08/the-lhc-nightmare-scenario-has-come-true.htmltag:blogger.com,1999:blog-22973357.post-2993404838122384162016-08-02T02:34:00.002-04:002016-08-03T02:10:03.318-04:00Math blind<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: right; margin-left: 1em; text-align: right;"><tbody><tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-51ElUKraBjk/V6A9qHhV4VI/AAAAAAAADKA/uXJiy-p46YwgwdpRX8eusM4RKDM0ka71gCLcB/s1600/equations.jpg" imageanchor="1" style="clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" height="155" 
src="https://2.bp.blogspot.com/-51ElUKraBjk/V6A9qHhV4VI/AAAAAAAADKA/uXJiy-p46YwgwdpRX8eusM4RKDM0ka71gCLcB/s200/equations.jpg" width="200" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">[Img Src: <a href="http://www.livescience.com/26681-most-beautiful-mathematical-equations.html">LifeScience</a>]</td></tr></tbody></table>Why must school children suffer through so much math which they will never need in their life? That’s one of those questions which I see opinion pieces about every couple of months. Most of them go back to a person by the name of Andrew Hacker <a href="http://www.newyorker.com/magazine/2016/06/27/andrew-hacker-debates-the-value-of-math?mbid=social_twitter">whose complaint is that</a>:<br /><blockquote>“Every other subject is about something. Poetry is about something. Even most modern art is about something. Math is about nothing. Math describes much of the world but is all about itself, and it has the most fantastic conundrums. But it is not about the world.”</blockquote><p>Yes, mathematics is an entirely self-referential language. That’s the very reason why it’s so useful. Complaining that math isn’t about some thing is like complaining that paint isn’t an image – and even Hacker concedes that math can be used to describe much of the world. For most scientists the discussion stops at this point. The verdict in my filter bubble is unanimous: mathematics is the language of nature, and if schools teach one thing, that’s what they should teach.</p>I agree with that of course. And yet, the argument that math is the language of nature preaches to the converted. For the rest it’s meaningless rhetoric, countered by the argument that schools should teach what’s necessary: necessary to fill in a tax return, calculate a mortgage rate, or maybe estimate how many bricks you need to build a wall along the US-Mexican border. <br /><br />School curriculums have to be modernized every now and then, no doubt about this. 
But the goal cannot be to cut back on a subject merely on the grounds that it’s difficult. Math is the basis of scientific literacy. You need math to understand risk assessments, to read statistics, and to understand graphs. You need math to understand modern science and tell it from pseudoscience. Much of the profusion of quack medicine like quantum healing or homeopathy is due to people’s inability to grasp even the basics of the underlying theories (or their failure to notice the absence thereof). For that you’d need, guess what, math.<br /><br />But most importantly, you need math to understand what it even means to understand. The only real truths are mathematical truths, and so proving theorems is the only way to learn how to make watertight arguments. That doesn’t mean that math teaches you how to make <i>successful</i> arguments, in the sense of convincing someone. But it teaches you how to make <i>correct</i> arguments. And that skill should be worth something, even if Hacker might complain that the arguments are about nothing.<br /><br />I thought of this recently when my daughters had their school enrollment checkup. <br /><br />One of the twins, Lara, doesn’t have stereo vision. We know this because she’s had regular eye exams, and while she sees well with each eye separately, she doesn’t see anything on the 3d test card. I’ve explained to her why it’s important she wears her eye-cover and I try to coax her into doing some muscle-building exercises. But she doesn’t understand. <br /><br />And how could she? She’s never seen 3d. She doesn’t know what she doesn’t see. And it’s not an obvious disability: Lara tells distances by size and context. She knows that birds are small and cars are large and hence small cars are far away. For all she can tell, she sees just as well as everybody else. There are few instances when stereo-vision really makes a difference; one of them is catching a ball. But at 5 years she’s just as clumsy as all the other kids. 
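<br /><br />Stereo vision is, incidentally, a nice example of the kind of math at stake: the brain implicitly solves a triangulation problem. By similar triangles, the distance to an object is the separation of the eyes times the focal length, divided by the disparity – the shift of the object’s image between the two eyes. A toy sketch (the numbers are merely illustrative, not physiological data):

```python
# Toy stereo triangulation: the depth information that binocular vision
# adds follows from similar triangles,
#     depth = baseline * focal_length / disparity,
# where disparity is the shift of the object's image between the two eyes.
# All numbers here are illustrative, not physiological measurements.

def depth_from_disparity(baseline_m, focal_m, disparity_m):
    """Distance to an object from the image shift between two viewpoints."""
    return baseline_m * focal_m / disparity_m

baseline = 0.065  # ~6.5 cm between the eyes (assumed value)
focal = 0.017     # ~17 mm effective focal length of the eye (assumed value)

# The closer the object, the larger the disparity:
for disparity in (1e-3, 1e-4, 1e-5):
    depth = depth_from_disparity(baseline, focal, disparity)
    print(f"disparity {disparity:.0e} m  ->  depth {depth:.1f} m")
```

Without the disparity signal, the formula has nothing to work with – which is why Lara falls back on size and context instead.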
<br /><br />Being math-blind too is not an obvious disability. You can lead a pleasant life without mathematics because it’s possible to fill in the lack of knowledge with heuristics and anecdotes. And yet, without math, you’ll never see reality for what it is – you’ll lead your life in the fudgy realm of maybe-truths.<br /><br />Lara doesn’t know triangulation and she doesn’t know vector spaces, and when I give her examples of what she’s missing, she’ll just put on this blank look that children reserve for incomprehensible adult talk, listen politely, and then reply “Today I built a moon rocket in kindergarten.”<br /><br />I hear an echo of my 5-year-old’s voice in these essays about the value of math education. It’s trying to tell someone they are missing part of the picture, and getting a reply like “I have never used the quadratic formula in my personal life.” Fine then, but totally irrelevant. Rather than factoring polynomials, let’s teach kids differential equations or network growth, which is arguably more useful for understanding the world. <br /><br />Math isn’t going away. On the contrary, it’s bound to increase dramatically in significance as the social sciences become more quantitative. We need that precision to make informed decisions and to avoid reinventing the wheel over and over again. And just as schools teach the basics of political theory so that children understand the use of democracy, they must teach mathematics so that children understand the use of quantitative forecasts and uncertainties, and, most of all, learn to recognize the boundary between fact and opinion. 
Sabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com59http://backreaction.blogspot.com/2016/08/math-blind.htmltag:blogger.com,1999:blog-22973357.post-28750925543824695362016-07-24T07:14:00.000-04:002016-07-25T00:29:33.214-04:00Can we please agree what we mean by “Big Bang”?<div style="text-align: right;"></div><table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody><tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-dtxWh2TzZ40/V5TPfct3saI/AAAAAAAADJg/eR3oFOLv9yMOgxiGHDAufiiRPoyvUGbHACLcB/s1600/duck_bang.jpg" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" height="200" src="https://3.bp.blogspot.com/-dtxWh2TzZ40/V5TPfct3saI/AAAAAAAADJg/eR3oFOLv9yMOgxiGHDAufiiRPoyvUGbHACLcB/s200/duck_bang.jpg" width="172" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;"><br /></td></tr></tbody></table>Can you answer the following question?<br /><br />At the Big Bang the observable universe had the size of: <br /><ul>A) A point (no size).<br />B) A grapefruit.<br />C) 168 meters.</ul><br />The right answer would be “all of the above.” And that’s not because I can’t tell a point from a grapefruit, it’s because physicists can’t agree what they mean by Big Bang!<p>For someone in quantum gravity, the Big Bang is the initial singularity that occurs in General Relativity when the current expansion of the universe is extrapolated back to the beginning of time. At the Big Bang, then, the universe had size zero and an infinite energy density. Nobody believes this to be a physically meaningful event. 
We interpret it as a mathematical artifact which merely signals the breakdown of General Relativity.</p>If you ask a particle physicist, they’ll therefore sensibly put the Big Bang at the time when the density of matter was at the Planck scale – about 80 orders of magnitude higher than the density of a neutron star. That’s where General Relativity breaks down; it doesn’t make sense to extrapolate back farther than this. At this Big Bang, space and time were subject to significant quantum fluctuations and it’s questionable whether even speaking of size makes sense, since that would require a well-defined notion of distance.<br /><br />Cosmologists tend to be even more conservative. The currently most widely used model for the evolution of the universe posits that briefly after the Planck epoch an exponential expansion, known as inflation, took place. At the end of inflation, so the assumption goes, the energy of the field which drives the exponential expansion is dumped into particles of the standard model. Cosmologists like to put the Big Bang at the end of inflation because inflation itself hasn’t been observationally confirmed. But they can’t agree how long inflation lasted, and so the estimates for the size of the universe <a href="http://www.forbes.com/sites/startswithabang/2015/12/26/ask-ethan-how-big-was-the-universe-when-it-was-first-born/#7671891174d4">range between a grapefruit and a football field</a>.<br /><br />Finally, if you ask someone in science communication, they’ll throw up their hands in despair and then explain that the Big Bang isn’t an event but a theory for the evolution of the universe. Wikipedia engages in the same obfuscation – if you look up “<a href="https://en.wikipedia.org/wiki/Big_Bang">Big Bang</a>” you get instead an explanation for “Big Bang theory,” leaving you to wonder what it’s a theory of. <br /><br />I admit it’s not a problem that bugs physicists a lot because they don’t normally debate the meaning of words. 
They’ll write down whatever equations they use, and this prevents further verbal confusion. Of course the rest of the world should also work this way, by first writing down definitions before entering unnecessary arguments.<br /><br />While I am waiting for mathematical enlightenment to catch on, I find this state of affairs terribly annoying. I recently had an argument on twitter about whether or not the LHC “recreates the Big Bang,” as the popular press likes to claim. <a href="http://backreaction.blogspot.de/2008/07/recreating-big-bang.html">It doesn’t</a>. But it’s hard to make a point if no two references agree on what the Big Bang is to begin with, not to mention that it was neither big nor did it bang. If biologists adopted physicists’ standards, they’d refer to infants as blastocysts, and if you complained about it they’d explain both are phases of pregnancy theory.<br /><br />I find this nomenclature unfortunate because it gives the impression we understand far less about the early universe than we do. If physicists can’t agree whether the universe at the Big Bang had the size of the White House or of a point, would you give them 5 billion dollars to slam things into each other? Maybe they’ll accidentally open a portal to a parallel universe where the US Presidential candidates are Donald Duck and Brigitta MacBridge.<br /><br />Historically, the term “Big Bang” was coined by Fred Hoyle, a staunch believer in steady state cosmology. He used the phrase to make fun of Lemaitre, who, in 1927, had found a solution to Einstein’s field equations according to which the universe wasn’t eternally constant in time. Lemaitre showed, for the first time, that matter caused space to expand, which implied that the universe must have had an initial moment from which it started expanding. 
They didn’t then worry about exactly when the Big Bang would have been – back then they worried whether cosmology was science at all.<br /><br />But we’re not in the 1940s any more, and precise science deserves precise terminology. Maybe we should rename the different stages of the universe into “Big Bang,” “Big Bing” and “Big Bong.” This idea has much potential by allowing further refinement to “Big Bång,” “Big Bîng” or “Big Böng.” I’m sure Hoyle would approve. Then he would laugh and quote Niels Bohr, “Never express yourself more clearly than you are able to think.”<br /><br />You can count me in the Planck epoch camp. Sabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com43http://backreaction.blogspot.com/2016/07/can-we-please-agree-what-we-mean-by-big.htmltag:blogger.com,1999:blog-22973357.post-68834243687148317772016-07-18T03:47:00.003-04:002016-07-18T05:55:34.328-04:00Can black holes tunnel to white holes?<div class="separator" style="clear: both; text-align: center;"><a href="https://4.bp.blogspot.com/-YxzO_7K5ll4/V4yHwr0cpBI/AAAAAAAADI0/vDj3t-1hXZ07RfZEwsPy8Of3oUeFbXPjwCLcB/s1600/blackwhite.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" src="https://4.bp.blogspot.com/-YxzO_7K5ll4/V4yHwr0cpBI/AAAAAAAADI0/vDj3t-1hXZ07RfZEwsPy8Of3oUeFbXPjwCLcB/s1600/blackwhite.jpg" /></a></div><b>Tl;dr:</b> Yes, but it’s unlikely.<br /><br />If black holes attract your attention, white holes might blow your mind. <p>A white hole is a time-reversed black hole, an anti-collapse. While a black hole contains a region from which nothing can escape, a white hole contains a region to which nothing can fall in. Since the time-reversal of a solution of General Relativity is another solution, we know that white holes exist mathematically. 
But are they real?</p>Black holes were originally believed to merely be of mathematical interest, solutions that exist but cannot come into being in the natural world. As physicists understood more about General Relativity, however, the exact opposite turned out to be the case: It is hard to avoid black holes. They generically form from matter that collapses under its own gravitational pull. Today it is widely accepted that the black hole solutions of General Relativity describe to high accuracy astrophysical objects which we observe in the real universe. <br /><br />The simplest black hole solutions in General Relativity are the Schwarzschild-solutions, or their generalizations to rotating and electrically charged black holes. These solutions, however, are not physically realistic because they are entirely time-independent, which means such black holes must have existed forever. Schwarzschild black holes, since they are time-reversal invariant, also necessarily come together with a white hole. Realistic black holes, by contrast, which are formed from collapsing matter, do not have to be paired with white holes.<br /><br />(Aside: Karl Schwarzschild was German. Schwarz means black, Schild means shield. Probably a family crest. It’s got nothing to do with children.)<br /><br />But there are many things we don’t understand about black holes, most prominently how they handle information about the matter that falls in. Solving the black hole information loss problem requires that information finds a way out of the black hole, and this could be done for example by flipping a black hole over to a white hole. In this case the collapse would not complete, and instead the black hole would burst, releasing all that it had previously swallowed. <br /><br />It’s an intriguing and simple option. 
This black-to-white-hole transition has been discussed in the literature for some time, recently by <a href="http://backreaction.blogspot.de/2014/02/can-planck-stars-exist.html">Rovelli and Vidotto in the Planck star idea</a>. It’s also the subject of <a href="http://arxiv.org/abs/1607.03480">a paper from last week by Barcelo and Carballo-Rubio</a>.<br /><br />Is this a plausible solution to the black hole information loss problem? <br /><br />It is certainly possible to join part of the black hole solution with part of the white hole solution. But doing this brings some problems.<br /><br />The first problem is that at the junction the matter must get a kick that transfers it from one state into the other. This kick cannot be achieved by any known physics – we know this from the singularity theorems. There isn’t anything in known physics that can prevent a black hole from collapsing entirely once the horizon is formed. Whatever provides this kick hence needs to violate one of the energy conditions; it must be new physics. <br /><br />Something like this could happen in a region with quantum gravitational effects. But this region is normally confined to deep inside the black hole. A transition to a white hole could therefore happen, but only if the black hole is very small, for example because it has evaporated for a long time.<br /><br />But this isn’t the only problem. <br /><br />Before we think about the stability of black holes, let us think about a simpler question. Why doesn’t dough unmix into eggs and flour and sugar neatly separated? Because that would require an entropy decrease. The unmixing can happen, but it’s exceedingly unlikely, hence we never see it. <br /><br />A black hole too has entropy. It has indeed enormous entropy. It saturates the possible entropy that can be contained within a closed surface. If matter collapses to a black hole, that’s a very likely process. 
Consequently, if you time-reverse this collapse, you get an exceedingly unlikely process. This solution exists, but it’s not going to happen unless the black hole is extremely tiny, close to the Planck scale. <br /><br />It is possible that the white hole which a black hole supposedly turns into is not the exact time-reverse, but instead another solution that further increases entropy. But in that case I don’t know where this solution comes from. And even then I would suspect that the kick required at the junction must be extremely finetuned. Either way, it’s not a problem I’ve seen addressed in the literature. (If anybody knows a reference, please let me know.)<br /><br />In <a href="http://arxiv.org/abs/1607.00364">a paper written for the 2016 Awards for Essays on Gravitation, Haggard and Rovelli make an argument </a>in favor of their idea, but in doing so they mainly highlight the problem with it. They claim that small quantum fluctuations around the semi-classical limit, which is General Relativity, can add up over time, eventually resulting in large deviations. Yes, this can happen. But the probability for this is tiny, otherwise the semi-classical limit wouldn’t be the semi-classical limit. <br /><br />The most likely thing to happen instead is that quantum fluctuations average out to give back the semi-classical limit. Hence, no white-hole transition. For the black-to-white-hole transition one would need quantum fluctuations to conspire together in just the right way. That’s possible. But it’s exceedingly unlikely. <br /><br />In<a href="http://arxiv.org/abs/1607.03480"> the other recent paper</a> the authors find a surprisingly large transition rate from black to white holes. But they use a highly symmetrized configuration with very few degrees of freedom. This must vastly overestimate the transition probability. 
It’s an interesting mathematical example, but it has very little to do with real black holes out there.<br /><br /><b>In summary:</b> That black holes transition to white holes and in this way release information is an idea that appeals because of its simplicity. But I remain unconvinced, because I am missing a good argument demonstrating that such a process is likely to happen.<br /><br /><i>Sabine Hossenfelder</i> [<a href="http://backreaction.blogspot.com/2016/07/can-black-holes-tunnel-to-white-holes.html">permalink</a>]<br /><br /><hr /><b>Pulsars could probe black hole horizons</b> <i>(2016-07-12)</i><br /><br /><table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody><tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-sqwF05GAUO0/V4TSZpdahzI/AAAAAAAADIY/_p6UOauMsLwFWoZO0EmzH9p7XatmfoVIgCLcB/s1600/2014_meerkatant_26.jpg" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" height="200" src="https://1.bp.blogspot.com/-sqwF05GAUO0/V4TSZpdahzI/AAAAAAAADIY/_p6UOauMsLwFWoZO0EmzH9p7XatmfoVIgCLcB/s200/2014_meerkatant_26.jpg" width="149" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">The first antenna of MeerKAT,<br />an SKA precursor in South Africa.<br />[<a href="http://www.ska.ac.za/releases/20140327.php">Image Source</a>.]</td></tr></tbody></table><p>It’s hard to see black holes – after all, their defining feature is that they swallow light. But it’s also hard to discourage scientists from trying to shed light on mysteries. 
In a recent paper, a group of researchers from Long Island University and Virginia Tech has proposed a new way to probe the near-horizon region of black holes and, potentially, quantum gravitational effects.</p><ul><b>Shining Light on Quantum Gravity with Pulsar-Black Hole Binaries</b><br />John Estes, Michael Kavic, Matthew Lippert, John H. Simonetti<br />arXiv:<a href="http://arxiv.org/abs/1607.00018">1607.00018 [hep-th]</a></ul><br />The idea is simple and yet promising: Search for a binary system in which a pulsar and a black hole orbit each other, then analyze the pulsar signal for unusual fluctuations.<br /><br />A pulsar is a rapidly rotating neutron star that emits a focused beam of electromagnetic radiation. This beam points in the direction of the poles of the magnetic field, which is normally not aligned with the neutron star’s axis of rotation. The beam therefore sweeps around with a regular period like a lighthouse beacon. If Earth is located within the beam’s reach, our telescopes receive a pulse every time the beam points in our direction.<br /><br />Pulsar timing can be extremely precise. Some known pulsars have been flashing every few milliseconds for decades, timed to a precision of a few microseconds. This high regularity allows astrophysicists to search for signals which might affect the timing. Fluctuations of space-time itself, for example, would increase the pulsar-timing uncertainty, <a href="http://backreaction.blogspot.de/2015/11/mysteriously-quiet-space-baffles.html">a method that has been used to derive constraints on the stochastic gravitational wave background</a>. And if a pulsar is in a binary system with a black hole, the pulsar’s signal might scrape by the black hole and thus encode information about the horizon which we can catch on Earth. 
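To see why such timing is so powerful, here is a small toy fit in plain Python. The numbers are my own illustrative assumptions (a 5-millisecond period, 2-microsecond per-pulse jitter), not values from the paper; the point is only that a straight-line fit through many pulse arrival times pins down the period far more precisely than any single pulse could.

```python
import random

# Illustrative numbers only (assumed, not from the paper):
PERIOD = 5e-3     # true rotation period in seconds
JITTER = 2e-6     # per-pulse arrival-time uncertainty in seconds
N = 100_000       # pulses collected over many observing sessions

rng = random.Random(42)
pulse_number = range(N)
arrival_times = [n * PERIOD + rng.gauss(0.0, JITTER) for n in pulse_number]

# Least-squares fit of t = P*n + t0; the slope P is the measured period.
mean_n = sum(pulse_number) / N
mean_t = sum(arrival_times) / N
cov = sum((n - mean_n) * (t - mean_t) for n, t in zip(pulse_number, arrival_times))
var = sum((n - mean_n) ** 2 for n in pulse_number)
fitted_period = cov / var

print(f"true period:   {PERIOD:.12f} s")
print(f"fitted period: {fitted_period:.12f} s")
print(f"error:         {abs(fitted_period - PERIOD):.1e} s")
```

Because the fitted slope averages over all pulses, the period error comes out many orders of magnitude below the per-pulse jitter, which is why even tiny systematic disturbances of the arrival times can in principle become visible.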
<br /><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://3.bp.blogspot.com/-5CDvowReF5w/V4TTBGdzIwI/AAAAAAAADIg/ZTHCmNqCpaUKnTRdQ_Lslehb_nS_Oe67ACLcB/s1600/pulsar.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="143" src="https://3.bp.blogspot.com/-5CDvowReF5w/V4TTBGdzIwI/AAAAAAAADIg/ZTHCmNqCpaUKnTRdQ_Lslehb_nS_Oe67ACLcB/s400/pulsar.jpg" width="400" /></a></div><br />No such pulsar-black hole binaries are known to date. But upcoming experiments, most notably the <a href="https://www.skatelescope.org/">Square Kilometer Array</a> (SKA), will almost certainly detect new pulsars. In their paper, the authors estimate that the SKA might observe up to 100 new pulsar-black hole binaries, and they put the probability that a newly discovered system has a suitable orientation at roughly one in a hundred. If they are right, the SKA would have a good chance of finding a promising binary. <br /><br />Much of the paper is dedicated to arguing that the timing accuracy of such a binary pulsar could carry information about quantum gravitational effects. This is not impossible, but it is speculative. Quantum gravitational effects are normally expected to be strong towards the black hole singularity, i.e. well inside the black hole and hidden from observation. Naïve dimensional estimates indicate that quantum gravity should be unobservably small near the horizon.<br /><br />However, this argument has recently been questioned in the aftermath of the firewall controversy surrounding black holes, because one solution to the black hole firewall paradox is that quantum gravitational effects can stretch over much longer distances than the dimensional estimates lead one to expect. 
<a href="http://arxiv.org/abs/1605.05341">Steve Giddings</a> has long been a proponent of such long-distance fluctuations, and scenarios like black hole fuzzballs, or <a href="https://aeon.co/essays/is-the-black-hole-at-our-galaxy-s-centre-a-quantum-computer">Dvali’s Bose-Einstein Computers</a>, also lead to horizon-scale deviations from general relativity. It is hence something that one should definitely look for.<br /><br />Previous proposals to test the near-horizon geometry were based on measurements of gravitational waves from merger events or of the black hole shadow, each of which could reveal deviations from general relativity. However, so far these were quite general ideas lacking quantitative estimates. To my knowledge, this paper is the first to demonstrate that such a test is technologically feasible. <br /><br />Michael Kavic, one of the authors of this paper, will attend our September conference on “<a href="https://indico.fias.uni-frankfurt.de/event/2/">Experimental Search for Quantum Gravity</a>.” We’re still planning to live-stream the talks, so stay tuned and you’ll get a chance to listen in.<br /><br /><i>Sabine Hossenfelder</i> [<a href="http://backreaction.blogspot.com/2016/07/pulsars-could-probe-black-hole-horizons.html">permalink</a>]<br /><br /><hr /><b>Why the LHC is such a disappointment: A delusion by name “naturalness”</b> <i>(2016-07-04)</i><br /><br /><table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: right; margin-left: 1em; text-align: right;"><tbody><tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-t4EBtBYJLdo/V3pQR56_LbI/AAAAAAAADIA/M-FCb1muk48sOoj0EbhU3edTaBpIBiltwCLcB/s1600/unnatural_apples.jpg" imageanchor="1" style="clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" height="179" 
src="https://3.bp.blogspot.com/-t4EBtBYJLdo/V3pQR56_LbI/AAAAAAAADIA/M-FCb1muk48sOoj0EbhU3edTaBpIBiltwCLcB/s200/unnatural_apples.jpg" width="200" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">Naturalness, according to physicists.</td></tr></tbody></table><p>Before the LHC turned on, theoretical physicists had high hopes that the collisions would reveal new physics besides the Higgs. The chances of that happening get smaller by the day. The possibility still exists, but the absence of new physics so far has already taught us an important lesson: Nature isn’t natural. At least not according to theoretical physicists.</p>The reason that many in the community expected new physics at the LHC was the criterion of naturalness. Naturalness, in general, is the requirement that a theory should not contain dimensionless numbers that are either very large or very small. If that is so, then theorists will complain the numbers are “finetuned” and regard the theory as contrived and hand-made, not to say ugly. <br /><br />Technical naturalness (originally proposed by ‘t Hooft) is a formalized version of naturalness which is applied in particular in the context of effective field theories. Since you can convert any number much larger than one into a number much smaller than one by taking its inverse, it’s sufficient to consider small numbers in the following. A theory is technically natural if all suspiciously small numbers are protected by a symmetry. The standard model is technically natural, except for the mass of the Higgs. <br /><br />The Higgs is the only (fundamental) scalar we know and, unlike all the other particles, its mass receives quantum corrections of the order of the cutoff of the theory. The cutoff is assumed to be close to the Planck energy – that means the estimated mass is 15 orders of magnitude larger than the observed mass. This too-large mass of the Higgs could be remedied simply by subtracting a similarly large term. 
This term however would have to be delicately chosen so that it almost, but not exactly, cancels the huge Planck-scale contribution. It would hence require finetuning. <br /><br />In the framework of effective field theories, a theory that is not natural is one that requires a lot of finetuning at high energies to get the theory at low energies to work out correctly. The degree of finetuning can be, and has been, quantified in various measures of naturalness. Finetuning is thought of as unacceptable because the theory at high energy is presumed to be more fundamental. The physics we find at low energies, so the argument goes, should not be highly sensitive to the choice we make for that more fundamental theory.<br /><br />Until a few years ago, most high energy particle theorists therefore would have told you that the apparent need to finetune the Higgs mass means that new physics must appear near the energy scale where the Higgs is produced. The new physics, for example supersymmetry, would avoid the finetuning. <br /><br />There’s a standard tale about the use of naturalness arguments, which goes somewhat like this: <br /><br />1) The electron mass isn’t natural in classical electrodynamics, and if one wants to avoid finetuning this means new physics has to appear at around 70 MeV. Indeed, new physics appears even earlier in form of the positron, rendering the electron mass technically natural. <br /><br />2) The difference between the masses of the neutral and charged pion is not natural because it’s suspiciously small. To prevent finetuning one estimates new physics must appear around 700 MeV, and indeed it shows up in form of the rho meson. <br /><br />3) The lack of flavor changing neutral currents in the standard model means that a parameter which could a priori have been anything must be very small. To avoid finetuning, the existence of the charm quark is required. 
And indeed, the charm quark shows up in the estimated energy range.<br /><br />Of these three examples, only the last was an actual prediction (Glashow, Iliopoulos, and Maiani, 1970). To my knowledge this is the only prediction that technical naturalness has ever given rise to – the other two examples are post-dictions.<br /><br />Not exactly a great score card.<br /><br />But well, given that the standard model – in hindsight – obeys this principle, it seems reasonable enough to extrapolate it to the Higgs mass. Or does it? Seeing that the cosmological constant, the only other known example where the Planck mass comes in, isn’t natural either, I am not very convinced.<br /><br />A much larger problem with naturalness is that it’s a circular argument and thus a merely aesthetic criterion. Or, if you prefer, a philosophical criterion. You cannot make a statement about the likelihood of an occurrence without a probability distribution. And that distribution already necessitates a choice. <br /><br />In the currently used naturalness arguments, the probability distribution is assumed to be uniform (or at least approximately uniform) in a range that can be normalized to one by dividing through suitable powers of the cutoff. Any other type of distribution, say, one that is sharply peaked around small values, would require the introduction of such a small value in the distribution already. But such a small value justifies itself by the probability distribution just like a number close to one justifies itself by its probability distribution. <br /><br />Naturalness hence becomes a chicken-and-egg problem: Put in the number one, get out the number one. Put in 0.00004, get out 0.00004. The only way to break the circle is to just postulate that some number is somehow better than all other numbers. <br /><br />The number one is indeed a special number in that it’s the unit element of the multiplication group. 
One can try to exploit this to come up with a mechanism that prefers a uniform distribution with an approximate width of one by introducing a probability distribution on the space of probability distributions, leading to a recursion relation. But that just leaves one to explain why that particular mechanism. <br /><br />Another way to see that this can’t solve the problem is that any such mechanism will depend on the basis in the space of functions. E.g., you could try to single out a probability distribution by asking that it be the same as its Fourier transform. But the Fourier transform is just one of infinitely many basis transformations in the space of functions. So again, why exactly this one? <br /><br />Or you could try to introduce a probability distribution on the space of transformations among bases of probability distributions, and so on. Indeed I’ve played around with this for a while. But in the end you are always left with an ambiguity: either you have to choose the distribution, or the basis, or the transformation. It’s just pushing the bump around under the carpet. <br /><br />The basic reason there’s no solution to this conundrum is that you’d need another theory for the probability distribution, and that theory per assumption isn’t part of the theory for which you want the distribution. (It’s similar to the issue with the meta-law for time-varying fundamental constants, in case you’re familiar with that argument.)<br /><br />In any case, whether you buy my conclusion or not, it should give you pause that high energy theorists don’t ever address the question where the probability distribution comes from. Suppose there indeed was a UV-complete theory of everything that predicted all the parameters in the standard model. Why then would you expect the parameters to be stochastically distributed to begin with?<br /><br />This missing probability distribution, however, isn’t my main issue with naturalness. 
Let’s just postulate that the distribution is uniform and admit it’s an aesthetic criterion, alrighty then. My main issue with naturalness is that it’s a fundamentally nonsensical criterion. <br /><br />Any theory that we can conceive of which describes nature correctly must necessarily contain hand-picked assumptions which we have chosen “just” to fit observations. If that wasn’t so, all we’d have left to pick assumptions with would be mathematical consistency, and we’d end up in Tegmark’s mathematical universe. In the mathematical universe, admittedly, we’d no longer have to choose a consistent theory. But we’d instead have to figure out where we are, and that’s the same question in a different guise. <br /><br />All our theories contain lots of assumptions like Hilbert spaces and Lie algebras and Hausdorff measures and so on. For none of these is there any explanation other than “it works.” In the space of all possible mathematics, the selection of this particular math is infinitely fine-tuned already – and it has to be, for otherwise we’d be lost again in Tegmark space. <br /><br />The mere idea that we can justify the choice of assumptions for our theories in any way other than requiring them to reproduce observations is logical mush. The existing naturalness arguments single out a particular type of assumption – parameters that take on numerical values – but what’s worse about this hand-selected assumption than any other hand-selected assumption?<br /><br />This is not to say that naturalness is always a useless criterion. It can be applied in cases where one knows the probability distribution, for example for the typical distances between stars or the typical quantum fluctuations in the early universe, etc. 
I also suspect that it is possible to find an argument for the naturalness of the standard model that does not necessitate postulating a probability distribution, but I am not aware of one.<br /><br />It’s somewhat of a mystery to me why naturalness has become so popular in theoretical high energy physics. I’m happy to see it go out of the window now. Keep your eyes open in the next couple of years and you’ll witness that turning point in the history of science when theoretical physicists stopped dictating to nature what’s supposedly natural.<br /><br /><i>Sabine Hossenfelder</i> [<a href="http://backreaction.blogspot.com/2016/07/why-lhc-is-such-disappointment-delusion.html">permalink</a>]<br /><br /><hr /><b>Where can new physics hide?</b> <i>(2016-06-24)</i><br /><br /><table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: right; margin-left: 1em; text-align: right;"><tbody><tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-PIj5yOQaNYo/V204yC9KjmI/AAAAAAAADHk/VtbWMkWYWlsam5EbizXuoHF_pmChGtaWACLcB/s1600/new.jpg" imageanchor="1" style="clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" height="160" src="https://1.bp.blogspot.com/-PIj5yOQaNYo/V204yC9KjmI/AAAAAAAADHk/VtbWMkWYWlsam5EbizXuoHF_pmChGtaWACLcB/s200/new.jpg" width="200" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">Also an acronym for “Not Even Wrong.”</td></tr></tbody></table><p>The year is 2016, and physicists are restless. Four years ago, the LHC confirmed the Higgs boson, the last outstanding prediction of the standard model. The chances were good, so they thought, that the LHC would also discover other new particles – naturalness seemed to demand it. But their hopes were disappointed. </p>The standard model and general relativity do a great job, but physicists know this can’t be it. 
Or at least they think they know: The theories are incomplete, not only at odds with each other, staring each other in the face without talking, but inadmissibly wrong, giving rise to paradoxes with no known cure. There has to be more to find, somewhere. But where?<br /><br />The hiding places for novel phenomena are getting smaller. But physicists haven’t yet exhausted their options. Here are the most promising areas where they currently search: <br /><br /><b>1. Weak Coupling</b><br /><br />Particle collisions at high energies, like those reached at the LHC, can produce all existing particles up to the energy that the colliding particles had. How often the new particles are produced, however, depends on the strength with which they couple to the particles that were brought to collision (for the LHC that’s protons, or rather their constituents, quarks and gluons). A particle that couples very weakly might be produced so rarely that it could have gone unnoticed so far.<br /><br />Physicists have proposed many new particles which fall into this category because weakly interacting stuff generally looks a lot like dark matter. Most notably there are the weakly interacting massive particles (WIMPs), sterile neutrinos (neutrinos which don’t couple to the known leptons), and axions (proposed to solve the strong CP problem and also a dark matter candidate). <br /><br />These particles are being looked for both by direct detection measurements – monitoring large tanks in underground mines for rare interactions – and by looking out for unexplained astrophysical processes that could make for an indirect signal.<br /><br /><b>2. High Energies </b><br /><br />If the particles are not of the weakly interacting type, we would have noticed them already unless their mass is beyond the energy that we have reached so far with particle colliders. 
In this category we find all the supersymmetric partner particles, which are much heavier than the standard model particles because supersymmetry is broken. At high energies there could also hide excitations of particles that exist in models with compactified extra dimensions. These excitations are similar to the higher harmonics of a string and show up at certain discrete energy levels which depend on the size of the extra dimensions.<br /><br />Strictly speaking, it isn’t the mass that is relevant to the question of whether a particle can be discovered, but the energy necessary to produce the particle, which includes binding energy. An interaction like the strong nuclear force, for example, displays “confinement,” which means that it takes a lot of energy to tear quarks apart even though their masses are not all that large. Hence, quarks could have constituents – often called “preons” – with an interaction – dubbed “technicolor” – similar to the strong nuclear force. The most obvious models of technicolor however ran into conflict with data decades ago. The idea isn’t entirely dead, though, and while the surviving models aren’t presently particularly popular, some variants are still viable.<br /><br />These phenomena are being looked for at the LHC and also in highly energetic cosmic ray showers.<br /><br /><b>3. High Precision</b><br /><br />High precision tests of standard model processes are complementary to high energy measurements. They can be sensitive to the tiniest effects stemming from virtual particles with energies too high to be produced at colliders, but which still make a contribution at lower energies due to quantum effects. Examples for this are proton decay, neutron-antineutron oscillation, the muon g-2, the neutron electric dipole moment, and Kaon oscillations. There are existing experiments for all of these, searching for deviations from the standard model, and the precision of these measurements is constantly increasing. 
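The reach of such precision tests can be estimated with standard effective-field-theory power counting. The sketch below is my own illustration of that scaling, not a calculation from the text: assuming the new physics enters through a dimension-six operator, a heavy particle of mass Λ shifts a low-energy observable measured at energy E by a relative amount of order (E/Λ)², so the reach in Λ grows as the square root of the inverse measurement precision.

```python
def probed_scale_gev(energy_gev: float, relative_precision: float) -> float:
    """Mass scale (in GeV) that a measurement of the given relative
    precision at the given energy is roughly sensitive to, assuming a
    dimension-six operator, i.e. a relative shift of (E / Lambda)**2."""
    return energy_gev / relative_precision ** 0.5

# A relative precision of 1e-10 on a ~1 GeV observable reaches scales
# around 1e5 GeV = 100 TeV, far beyond the direct reach of the LHC.
print(f"{probed_scale_gev(1.0, 1e-10):.0e} GeV")
```

On this rough counting, precision measurements at GeV energies can be sensitive to mass scales of order 100 TeV, which is the sense in which they are complementary to direct production at colliders.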
<br /><br />A somewhat different high precision test is the search for neutrinoless double-beta decay, which would demonstrate that neutrinos are Majorana particles, an entirely new type of particle. (When it comes to fundamental particles, that is. Majorana particles have recently been produced as emergent excitations in condensed matter systems.)<br /><br /><b>4. Long ago </b><br /><br />In the early universe, matter was much denser and hotter than we can ever hope to achieve in our particle colliders. Hence, signatures left over from this time can deliver a bounty of new insights. The temperature fluctuations in the cosmic microwave background (B-modes and non-Gaussianities) may be able to test scenarios of inflation or its alternatives (like phase transitions from a non-geometric phase), whether our universe had a big bounce instead of a big bang, and – with some optimism – even whether gravity was quantized back then. <br /><br /><b>5. Far away</b><br /><br />Some signatures of new physics appear at long distances rather than at short ones. An outstanding question is, for example, the shape of the universe: Is it really infinitely large, or does it close back onto itself? And if it does, then how does it do this? One can study these questions by looking for repeating patterns in the temperature fluctuations of the cosmic microwave background (CMB). If we live in a multiverse, it might occasionally happen that two universes collide, and this too would leave a signal in the CMB.<br /><br />New insights might also hide in some of the well-known problems with the cosmological concordance model, such as galaxy cusps that are too pronounced and dwarf galaxies that are too numerous to fit observations well. It is widely believed that these problems are numerical issues or due to a lack of understanding of astrophysical processes, and not pointers to something fundamentally new. 
But who knows?<br /><br />Another novel phenomenon that would become noticeable at long distances is a fifth force, which would lead to subtle deviations from general relativity. This might have all kinds of effects, from violations of the equivalence principle to a time-dependence of dark energy. Hence, there are experiments testing the equivalence principle and the constancy of dark energy to ever higher precision.<br /><br /><b>6. Right here</b><br /><br />Not all experiments are huge and expensive. While tabletop discoveries have become increasingly unlikely simply because we’ve pretty much tried all that could be done, there are still areas where small-scale lab experiments reach into unknown territory. This is the case notably in the foundations of quantum mechanics, where nanoscale devices, single-photon sources and detectors, and increasingly sophisticated noise-control techniques have enabled previously impossible experiments. Maybe one day we’ll be able to solve the dispute over the “correct” interpretation of quantum mechanics simply by measuring which one is right.<br /><br />So, physics isn’t over yet. It has become more difficult to test new fundamental theories, but we are pushing the limits in many currently running experiments. <br /><hr /><i>[This post previously appeared on <a href="http://www.forbes.com/sites/startswithabang/2016/05/24/where-is-new-physics-hiding-and-how-can-we-find-it/#599416681dd3">Starts With a Bang</a>.] 
</i><br /><br /><i>Sabine Hossenfelder</i> [<a href="http://backreaction.blogspot.com/2016/06/where-can-new-physics-hide.html">permalink</a>]<br /><br /><hr /><b>Wissenschaft auf Abwegen</b> <i>(2016-06-24)</i><br /><br />On Monday I was in Regensburg, where I gave a public lecture on the topic “Wissenschaft auf Abwegen” (“Science Gone Astray”) for a series titled “<a href="http://was-ist-wirklich.de/was-ist-wirklich/">Was ist Wirklich?</a>” (“What is Real?”). The whole thing is now on YouTube. The video consists of about 30 minutes of lecture followed by another hour of discussion. All in German. Only for true fans ;) <br><br> <center><iframe width="450" height="253" src="https://www.youtube.com/embed/DSbkIMh-aoY" frameborder="0" allowfullscreen></iframe></center><br /><br /><i>Sabine Hossenfelder</i> [<a href="http://backreaction.blogspot.com/2016/06/wissenschaft-auf-abwegen.html">permalink</a>]<br /><br /><hr /><b>New study finds no sign of entanglement with other universes</b> <i>(2016-06-18)</i><br /><br /><table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: right; margin-left: 1em; text-align: right;"><tbody><tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-qfrZgI0TWQQ/V2VOgeMUdlI/AAAAAAAADHI/y1QlEjkf0tAzgNQrsMQu38LyNi20NCotACLcB/s1600/multiverse.jpg" imageanchor="1" style="clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" height="125" src="https://2.bp.blogspot.com/-qfrZgI0TWQQ/V2VOgeMUdlI/AAAAAAAADHI/y1QlEjkf0tAzgNQrsMQu38LyNi20NCotACLcB/s200/multiverse.jpg" width="200" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">Somewhere in the multiverse<br />you’re having a good day.</td></tr></tbody></table>The German Autobahn is famous for its lack of speed limits, and yet the greatest speed limit of all comes 
from a German: Nothing, Albert Einstein taught us, is allowed to travel faster than light. This doesn’t prevent our ideas from racing, but sometimes it prevents us from ticketing them.<p>If we live in an eternally inflating multiverse that contains a vast number of universes, then the other universes recede from us faster than light. We are hence “causally disconnected” from the rest of the multiverse, separated from the other universes by the ongoing exponential expansion of space, unable to ever make a measurement that could confirm their existence. It is this causal disconnect that has led multiverse critics to complain the idea isn’t within the realm of science.</p>There are however some situations in which a multiverse can give rise to observable consequences. One is that our universe might in the past have collided with another universe, which would have left a tell-tale signature in the cosmic microwave background. Unfortunately, no evidence for this has been found.<br /><br />Another proposal for how to test the multiverse is to exploit the subtle non-locality that quantum mechanics gives rise to. If we live in an ensemble of universes, and these universes started out in an entangled quantum state, then we might be able to detect relics of their past entanglement today.<br /><br />This idea was made concrete by Richard Holman, Laura Mersini-Houghton, and Tomo Takahashi ten years ago. In their model (<a href="http://arxiv.org/abs/hep-th/0611223">hep-th/0611223</a>, <a href="http://arxiv.org/abs/hep-th/0612142">hep-th/0612142</a>), the original entanglement present among universes in the landscape decays and effectively leaves a correction to the potential that gives rise to inflation in our universe. This corrected potential in turn affects observables that we can measure today. <br /><br />The particular way of Mersini-Houghton and Holman to include entanglement in the landscape isn’t by any means derived from first principles. 
It is a phenomenological construction that implicitly makes many assumptions about the way quantum effects are realized on the landscape. But, hey, it’s a model that makes predictions, and in theoretical high energy physics today that’s something to be grateful for.<br /><br />They predicted back then that such an entanglement-corrected cosmology would in particular affect the physics on very large scales, giving rise to a modulation of the power spectrum that makes the cold spot a more likely appearance, a suppression of the power at large angular scales, and an alignment in the directions in which large structures move – the so-called “dark flow.” The tentative evidence for a dark flow, which had been reported in <a href="http://arxiv.org/abs/0810.5388">2008</a>, had gone by <a href="http://arxiv.org/abs/1303.5090">2013</a>. But this disagreement with the data didn’t do much to diminish the <a href="http://www.dailymail.co.uk/sciencetech/article-2326869/Is-universe-merely-billions-Evidence-existence-multiverse-revealed-time-cosmic-map.html">popularity of the model</a> <a href="http://www.thesundaytimes.co.uk/sto/news/uk_news/Science/article1261602.ece">in the press</a>.<br /><br />In a recent paper, William Kinney from the University at Buffalo put the multiverse-entanglement model to the test with the most recent cosmological data:<br /><ul><b>Limits on Entanglement Effects in the String Landscape from Planck and BICEP/Keck Data</b><br />William H. Kinney<br /><a href="http://arxiv.org/abs/1606.00672">arXiv:1606.00672</a> [astro-ph.CO]</ul>The brief summary is that not only hasn’t he found any evidence for the entanglement-modification, he has also ruled out the formerly proposed model for two general types of inflationary potentials. The first, a generic exponential inflation, is by itself incompatible with the data, and adding the entanglement correction doesn’t help to make it fit. 
The second, Starobinsky inflation, is by itself a good fit to the data, but the entanglement correction spoils the fit. <br /><br />Much to my puzzlement, his analysis also shows that some of the predictions of the original model (such as the modulation of the power spectrum) weren’t predictions to begin with, because Kinney found in his calculation that there are choices of parameters for which these effects don’t appear at all. <br /><br />Leaving aside that this sheds a rather odd light on the original predictions, it’s not even clear exactly what has been ruled out here. What Kinney’s analysis does is exclude a particular form of the effective potential for inflation (the one with the entanglement modification). This potential is, in the model by Holman and Mersini-Houghton, a function of the original potential (the one without the entanglement correction). Rather than ruling out the entanglement-modification, I can hence interpret this result to mean that the original potential just wasn’t the right one.<br /><br />Or, in other words, how am I to know that one can’t find some other potential that will fit the data after adding the entanglement correction? The only difficulty I see in this would be to ensure that the uncorrected potential still leads to eternal inflation. <br /><br />To top off this story of an unfalsifiable idea that made predictions which weren’t, one of the authors who proposed the entanglement model, Laura Mersini-Houghton, is apparently quite unhappy with Kinney’s paper and is trying to use an intellectual property claim to get it removed from the arXiv (see comments for details). I will resist the temptation to comment on the matter and simply direct you to the Wikipedia entry on the <a href="https://en.wikipedia.org/wiki/Streisand_effect">Streisand Effect</a>. 
Dear Internet, please do your job.<br /><br />For better or worse, I have in the last few years been dragged into a discussion about what is and isn’t science, which has forced me to think more about the multiverse than I and my infinitely many copies believe is good for our sanity. After this latest episode, the status is that I side with Joe Silk, <a href="http://astrogeo.oxfordjournals.org/content/48/2/2.30.full">who captured it well</a>: “[O]ne can always find inflationary models to explain whatever phenomenon is represented by the flavour of the month.” Sabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com34http://backreaction.blogspot.com/2016/06/new-study-finds-no-sign-of-entanglement.htmltag:blogger.com,1999:blog-22973357.post-24998296347733102782016-06-13T06:32:00.001-04:002016-06-14T02:09:51.400-04:00String phenomenology of the somewhat different kind<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: right; margin-left: 1em; text-align: right;"><tbody><tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-jvVbW0ut-rI/V16JgKw4fNI/AAAAAAAADGs/v5gZuG25h-EFl8VWwWsafatg6HzAH_RcwCLcB/s1600/catscradle.jpg" imageanchor="1" style="clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" height="138" src="https://2.bp.blogspot.com/-jvVbW0ut-rI/V16JgKw4fNI/AAAAAAAADGs/v5gZuG25h-EFl8VWwWsafatg6HzAH_RcwCLcB/s200/catscradle.jpg" width="200" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">[Cat’s cradle. <a href="http://www.ifyoulovetoread.com/book/chten_cats1105.htm">Image Source</a>.]</td></tr></tbody></table>Ten years ago, I didn’t take the “string wars” seriously. To begin with, referring to such an esoteric conflict as “war” seems disrespectful to the millions caught in actual wars. In comparison to their suffering it’s hard to take anything seriously. 
<p>Leaving aside my discomfort with the nomenclature, the focus on string theory struck me as odd. String theory as a research area stands out in hep-th and gr-qc merely because of its large number of followers, not because of supposedly controversial research practices. For anybody working in the field it is apparent that string theorists don’t differ in their single-minded focus from physicists in other disciplines. Overspecialization is a common disease of academia, but one that necessarily goes along with division of labor, and often it is an efficient route to fast progress.</p>No, I thought back then, string theory wasn’t the disease, it was merely a symptom. The underlying disease was one that would surely soon be recognized and addressed: Theoreticians – as scientists whose most-used equipment is their own brain – must be careful to avoid systematic bias introduced by their apparatuses. In other words, scientific communities, and especially those which lack timely feedback from data, need guidelines to avoid social and cognitive biases. <br /><br />This is so obvious that it came as a surprise to me when, in 2006, everybody was piling on Lee Smolin for pointing out what everybody knew anyway: that string theorists, lacking experimental feedback for decades, had drifted off into a math bubble of questionable relevance for the description of nature. It’s somewhat ironic that, from my personal experience, the situation is actually worse in Loop Quantum Gravity, an approach pioneered, among others, by Lee Smolin. At least the math used by string theorists seems to be good for something. The same cannot be said about LQG. <br /><br />Ten years later, it is clear that I was wrong in thinking that just drawing attention to the problem would seed a solution. Not only has the situation not improved, it has worsened. 
We now have <a href="http://www.nature.com/news/scientific-method-defend-the-integrity-of-physics-1.16535">some theoretical physicists who argue that we should alter the scientific method</a> so that the success of a theory can be assessed by means other than empirical evidence. This idea, which has sprung up in the philosophy community, isn’t all that bad in principle. In practice, however, it will merely serve to exacerbate social streamlining: If theorists can draw on criteria other than the ability of a theory to explain observations, the first criterion they’ll take into account is aesthetic value, and the second is popularity with their colleagues. Nothing good can come out of this.<br /><br />And nothing good has come out of it, nothing has changed. The string wars clearly were more interesting for sociologists than they were for physicists. In the last couple of months, several articles have appeared that comment on various aspects of this episode; I’ve read them and want to briefly summarize them for you. <br /><br />First, there is<br /><ul><b>Collective Belief, Kuhn, and the String Theory Community</b><br />James Owen Weatherall and Margaret Gilbert<br /><a href="http://philsci-archive.pitt.edu/11413/">philsci-archive:11413</a></ul>This paper is a very Smolin-centric discussion of whether string theorists are exceptional in their group beliefs. The authors argue that, no, actually string theorists just behave like normal humans and “these features seem unusual to Smolin not because they are actually unusual, but because he occupies an unusual position from which to observe them.” He is unusual, the authors explain, for having worked on string theory but then deciding not to continue in the field. 
<br /><br />It makes sense, the authors write, that people whose well-being depends to some extent on acceptance by the group will adapt to the group: <br /><blockquote>“Expressing a contrary view – bucking the consensus – is an offense against the other members of the community… So, irrespective of their personal beliefs, there are pressures on individual scientists to speak in certain ways. Moreover, insofar as individuals are psychologically disposed to avoid cognitive dissonance, the obligation to speak in certain ways can affect one’s personal beliefs so as to bring them into line with the consensus, further suppressing dissent from within the group.”</blockquote>Furthermore: <br /><blockquote>“As parties to a joint commitment, members of the string theory community are obligated to act as mouthpieces of their collective belief.”</blockquote>I actually thought we’ve known this since 1895, when Le Bon published his “Study of the Popular Mind.”<br /><br />The authors of the paper then point out that it’s normal for members of a scientific community not to jump ship at the slightest indication of conflicting evidence, because often such evidence turns out to be misleading. It didn’t become clear to me what evidence they might be referring to; supposedly it’s non-empirical.<br /><br />They further argue that a certain disregard for what is happening outside one’s own research area is also normal: “Science is successful in part because of a distinctive kind of focused, collaborative research,” and due to their commitment to the agenda “participants can be expected to resist change with respect to the framework of collective beliefs.”<br /><br />This is all reasonable enough. Unfortunately, the authors entirely miss the main point, the very reason for the whole debate. The question isn’t whether string theorists’ behavior is that of normal humans – I don’t think that was ever in doubt – but whether that “normal human behavior” is beneficial for science. 
Scientific research requires, in a very specific sense, non-human behavior. It’s not normal for individuals to disregard subjective assessments and to not pay attention to social pressure. And yet, that is exactly what good science would require. <br /><br />The second paper is<br /><ul><b>Contested Boundaries: The String Theory Debates and Ideologies of Science </b><br />Sophie Ritson and Kristian Camilleri <br /><a href="http://www.mitpressjournals.org/doi/abs/10.1162/POSC_a_00168#.V15LGPl97Dc">Perspectives on Science, Vol. 23, No. 2, pp 192-227</a>. </ul>This paper is basically a summary of the string wars that focuses on the question of whether or not string theory can be considered science. This “demarcation problem” is a topic that philosophers and sociologists love to discuss, but to me it really isn’t particularly interesting how you classify some research area; the question is whether it’s good for something. This is a question which should be decided by the community, but as long as decision making is influenced by social pressures and cognitive biases I can’t trust the community’s judgement.<br /><br />The article has a lot of fun quotations from very convinced string theorists, for example by David Gross: “String theory is full of qualitative predictions, such as the production of black holes at the LHC.” I’m not sure what the difference is between a qualitative prediction and no prediction, but either way it’s certainly not a prediction that was very successful. Also nice is John Schwarz claiming that “supersymmetry is the major prediction of string theory that could appear at accessible energies” and that “some of these superpartners should be observable at the LHC.” Lots of coulds and shoulds that didn’t quite pan out. <br /><br />While the article gives a good overview of the opinions about string theory that were voiced during the 2006 controversy, the authors themselves clearly don’t know the topic they are writing about very well. 
A particularly odd statement that highlights their skewed perspective is: “String theory currently enjoys a privileged status by virtue of being the dominant paradigm within theoretical physics.” <br /><br />I find it quite annoying how frequently I encounter this extrapolation from a particular research area – be that string theory, supersymmetry, or multiverse cosmology – to all of physics. <a href="http://backreaction.blogspot.com/2013/05/what-do-most-physicists-work-on.html">The vast majority of physicists</a> work in fields like quantum optics, photonics, hadronic and nuclear physics, statistical mechanics, atomic physics, solid state physics, low-temperature physics, plasma physics, astrophysics, condensed matter physics, and so on. They have nothing whatsoever to do with string theory, and certainly would be very surprised to hear that it’s “the dominant paradigm.”<br /><br />In any case, you might find this paper useful if you didn’t follow the discussion 10 years ago.<br /><br />Finally, there is this paper<br /><ul><b>‘Crackpots’ and ‘active researchers’: The controversy over links between arXiv and the scientific blogosphere</b><br />Sophie Ritson<br /><a href="http://sss.sagepub.com/content/early/2016/06/03/0306312716647508.abstract?rss=1">Social Studies of Science, 1-22, 2016</a></ul><br />The title of the paper doesn’t explicitly refer to string theory, but most of it is also a discussion of the demarcation problem, using the example of arXiv trackbacks. (I suspect this paper is a spin-off of the previous paper.)<br /><br />ArXiv trackbacks, in case you didn’t know, are links to blogposts that show up on some papers’ arXiv pages when a blogpost has referred to the paper. Exactly which blogs get trackbacks, and who decides whether they do, is one of the arXiv’s best-kept secrets. 
<a href="http://www.math.columbia.edu/~woit/wordpress/">Peter Woit’s blog</a>, infamously, doesn’t show up in the arXiv trackbacks, for the rather spurious reason that he supposedly doesn’t count as an “active researcher.” The paper tells the full 2006 story with lots of quotes from bloggers you are probably familiar with.<br /><br />The arXiv recently conducted a user survey which, among other things, asked about the trackback feature, which makes me think they might have some updates planned. <br /><br />On the question of who counts as a crackpot, the paper (unsurprisingly) doesn’t come to a conclusion other than noting that scientists deal with the issue by stating “we know one when we see one.” I don’t think there can be any other definition than that. To me the notion of “crackpot” is an excellent example of an emergent feature – it’s a demarcation that the community creates during its operation. Any attempt to come up with a definition from first principles is hence doomed to fail. <br /><br />The rest of the paper is a general discussion of the role of blogs in science communication, but I didn’t find it particularly insightful. The author comes to the (correct) conclusion that blog content turned out not to have as short a lifetime as many feared, but otherwise basically just notes that there are as many ways to use blogs as there are bloggers. But then if you are reading this, you already knew that. <br /><br />One of the main benefits that I see in blogs isn’t mentioned in the paper at all, which is that blogs support communication between scientific communities that are only loosely connected. In my own research area, I read the papers, hear the seminars, and go to conferences, and I therefore know pretty well what is going on – with or without blogs. But I use blogs to keep up to date in adjacent fields, like cosmology, astrophysics, and, to a lesser extent, condensed matter physics and quantum optics. 
For this purpose I find blogs considerably more useful than popular science news, because the latter often don’t provide a useful amount of detail and commentary, not to mention that they all tend to latch onto the same three papers that made big unsubstantiated claims.<br /><br />Don’t worry, I haven’t suddenly become obsessed with string theory. I’ve read through these sociology papers mainly because I cannot not write a few paragraphs about the topic in my book. But I promise that’s it from me about string theory for a while. <br><br><b>Update:</b> <a href="http://www.math.columbia.edu/~woit/wordpress/?p=8578">Peter Woit has some comments on the trackback issue</a>.Sabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com47http://backreaction.blogspot.com/2016/06/string-phenomenology-of-somewhat.htmltag:blogger.com,1999:blog-22973357.post-34627707771360593022016-06-06T07:25:00.001-04:002016-06-08T05:52:47.642-04:00Dear Dr B: Why not string theory?[I got this question in reply to last week’s <a href="http://backreaction.blogspot.com/2016/05/book-review-why-string-theory-by-joseph.html">book review of <i>Why String Theory?</i> by Joseph Conlon</a>.]<br /><br />Dear Marco:<br /><a href="http://4.bp.blogspot.com/-CnPW42VNL2Y/Vg0ax2y2UjI/AAAAAAAACw8/wLZ9g9ylcM0/s200/120324-kitten-wool.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" src="https://4.bp.blogspot.com/-CnPW42VNL2Y/Vg0ax2y2UjI/AAAAAAAACw8/wLZ9g9ylcM0/s200/120324-kitten-wool.jpg" /></a><br />Because we might be wasting time and money and, ultimately, risk that progress stalls entirely. <p>In contrast to many of my colleagues I do not think that trying to find a quantum theory of gravity is an endeavor purely for the sake of knowledge. Instead, it seems likely to me that finding out what the quantum properties of space and time are will further our understanding of quantum theory in general. 
And since that theory underlies all modern technology, this is research which bears relevance for applications. Not in ten years and not in 50 years, but maybe in 100 or 500 years. </p>So far, string theory has scored in two areas. First, it has proved interesting for mathematicians. But I’m not one to easily get floored by pretty theorems – I care about math only to the extent that it’s useful to explain the world. Second, string theory has proved useful for pushing ahead with the lesser-understood aspects of quantum field theories. This seems a fruitful avenue and is certainly something to continue. However, this has nothing to do with string theory as a theory of quantum gravity and a unification of the fundamental interactions.<br /><br />As far as quantum gravity is concerned, string theorists’ main argument seems to be “Well, can you come up with something better?” Then of course if someone answers this question with “Yes,” they would never agree that something else might possibly be better. And why would they – there’s no evidence forcing them one way or the other.<br /><br />I don’t see what one learns from discussing which theory is “better” based on philosophical or aesthetic criteria. That’s why I decided to stay out of this and instead work on quantum gravity phenomenology. As far as testability is concerned, all existing approaches to quantum gravity do equally badly, and so I’m equally unconvinced by all of them. It is somewhat of a mystery to me why string theory has become so dominant.<br /><br />String theorists are very proud of having a microcanonical explanation for the black hole entropy. But we don’t know whether that’s actually a correct description of nature, since nobody has ever seen a black hole evaporate. In fact one could read the firewall problem as a demonstration that this cannot be a correct description of nature. Therefore, this calculation leaves me utterly unimpressed. <br /><br />But let me be clear here. 
Nobody (at least nobody whose opinion matters) says that string theory is a research program that should just be discontinued. The question is instead one of balance – does the promise justify the amount of funding spent on it? And the answer to this question is almost certainly no. <br /><br />The reason is that academia is currently organized so that it invites communal reinforcement, prevents researchers from leaving fields whose promise is dwindling, and supports a rich-get-richer trend. That institutional assessments use the quantity of papers and citation counts as a proxy for quality creates a bonus for fields in which papers can be cranked out quickly. Hence it isn’t surprising that an area whose mathematics its own practitioners frequently describe as “rich” would flourish. What does mathematical “richness” tell us about the use of a theory in the description of nature? I am not aware of any known relation.<br /><br />In his book <i>Why String Theory?</i>, Conlon tells the history of the discipline from a string theorist’s perspective. As a counterpoint, let me tell you how a cynical outsider might tell this story:<br /><br />String theory was originally conceived as a theory of the strong nuclear force, but it was soon discovered that quantum chromodynamics was more up to the task. After noting that string theory contains a particle that could be identified as the graviton, it was reconsidered as a theory of quantum gravity. <br /><br />It turned out, however, that string theory only makes sense in a 25-dimensional space. To make that compatible with observations, 22 of the dimensions were moved out of sight by rolling them up (compactifying them) to a radius so small they couldn’t be observationally probed. <br /><br />Next it was noted that the theory also needs supersymmetry. This brings down the number of space dimensions to 9, but also brings a new problem: The world, unfortunately, doesn’t seem to be supersymmetric. 
Hence, it was postulated that supersymmetry is broken at an energy scale so high we wouldn’t see the symmetry. Even with that problem fixed, however, it was quickly noticed that moving the superpartners out of direct reach would still induce flavor-changing neutral currents that, among other things, would lead to proton decay and so be in conflict with observation. Thus, theorists invented R-parity to fix that problem. <br /><br />The next problem that appeared was that the cosmological constant turned out to be positive instead of zero or negative. While a negative cosmological constant would have been easy to accommodate, string theorists didn’t know what to do with a positive one. But it only took some years to come up with an idea to make that happen too. <br /><br />It was hoped that string theory would be a unique completion of the standard model including general relativity. Instead, it slowly became clear that there is a huge number of different ways to get rid of the additional dimensions, each of which leads to a different theory at low energies. String theorists are now trying to deal with that problem by inventing some probability measure according to which the standard model is at least a probable occurrence in string theory.<br /><br />So, you asked, why not string theory? Because it’s an approach that has been fixed over and over again to make it compatible with conflicting observations. Every time that’s been done, string theorists became more convinced of their ideas. And every time they did this, <i>I</i> became more convinced they are merely building a mathematical toy universe.<br /><br />String theorists of course deny that they are influenced by anything but objective assessment. One noteworthy exception is Joe Polchinski, who has considered whether social effects play a role but came to the conclusion that they aren’t relevant. I think it speaks for his intellectual sincerity that he at least considered it. 
<br /><br />At the Munich workshop last December, David Gross (in an exchange with Carlo Rovelli) explained that funding decisions have no influence on whether theoretical physicists choose to work in one field or the other. Well, that’s easy to say if you’re a Nobel Prize winner.<br /><br />Conlon in his book provides “evidence” that social bias plays no role by explaining that there was only one string theorist on a panel that (positively) evaluated one of his grants. To begin with, anecdotes can’t replace data, and there is ample evidence that social biases are common human traits, so by default scientists should be susceptible. But even considering his anecdote, I’m not sure why Conlon thinks leaving decisions to non-experts limits bias. My expectation would be that it amplifies bias because it requires drawing on simplified criteria, like the number of papers published and how often they’ve been cited. And what does that depend on? On how many people there are in the field and how many peers favorably reviewed papers on the topic of your work.<br /><br />I am listing these examples to demonstrate that it is quite common for theoretical physicists (not string theorists in particular) to dismiss the mere possibility that social dynamics influences research decisions.<br /><br />How large a role do social dynamics and cognitive biases play, and how much do they slow down progress on the foundations of physics? I can’t tell you. But even though I can’t tell you how much faster progress could be, I am sure it’s slowed down. I can tell that in the same way that I can tell you diesel in Germany is sold below market value even though I don’t know the market value. I know that because it’s subsidized. And in the same way I can tell that string theory is overpopulated and its promise is overestimated, because it’s an idea that benefits from biases which humans demonstrably possess. 
But I can’t tell you what its real value would be.<br /><br />The reproduction crisis in the life-sciences and psychology has spurred a debate about better measures of statistical significance. Experimentalists go to great lengths to put in place all kinds of standardized procedures so as not to draw the wrong conclusions from what their apparatuses measure. In theory development, we have our own crisis, but nobody talks about it. The apparatuses that we use are our own brains, and the biases we should guard against are cognitive and social biases: communal reinforcement, sunk-cost fallacy, wishful thinking, and status-quo bias, to mention just the most common ones. These, however, are presently entirely unaccounted for. Is this the reason why string theory has gathered so many followers?<br /><br />Some days I side with Polchinski and Gross and don’t think it makes that much of a difference. It really is an interesting topic and it’s promising. On other days I think we’ve wasted 30 years studying bizarre aspects of a theory that doesn’t bring us any closer to understanding quantum gravity, and it’s nothing but an empty bubble of disappointed expectations. Most days I have to admit I just don’t know. <br /><br />Why not string theory? Because enough is enough.<br /><br />Thanks for an interesting question. Sabine Hossenfelderhttps://plus.google.com/111136225362929878171noreply@blogger.com60http://backreaction.blogspot.com/2016/06/dear-dr-b-why-not-string-theory.html