Wednesday, June 18, 2014

A philosopher (brings his two cents) to the physicist's table

The spectacle of Nature is a banquet where the phenomenological soup must be rich in varied mathematical models
In which the blogger tries to argue for the need to compare the various mathematical models proposed by physicists, so as to understand and explore reality further, doing so in his usual way*, that is, through a quotation:
From the times of Niels Bohr, many physicists, mathematicians and biologists have been attentive to philosophical aspects of our doing. Most of us are convinced that the frontier situation of our research can point to aspects of some philosophical relevance - if only the professional philosophers would take the necessary time to become familiar with our thinking. Seldom, however, we read something of the philosophers which can inspire us. The US-American philosopher Charles Sanders Peirce (1839-1914) is an admirable exception. In his semiotics and pragmaticist (he avoids the word “pragmatic”) thinking, he provides a wealth of ideas, spread over an immense life work. It seems to me that many of his ideas, comments, and concepts can shed light on the why and how of mathematization...
 The quality of a mathematical model is not how similar it is to the segment of reality under consideration, but whether it provides a flexible and goal-oriented approach, opening for doubts and indicating ways for the removal of doubts (later trivialized by Popper’s falsification claim). More precisely, Peirce claims
  •  Be aware of differences between different approaches! 
  • Try to distinguish different goals (different priorities) of modelling as precise as possible! 
  • Investigate whether different goals are mutually compatible, i.e., can be reached simultaneously!
  • Behave realistically! Don’t ask: How well does the model reflect a given segment of the world? But ask: Does this model of a given segment of the world support the wanted and possibly wider activities / goals better than other models?
I may add: we have to strike a balance between Abstraction vs. construction, Top-down vs. bottom-up, and Unification vs. specificity. We better keep aware of the variety of Modelling purposes and the multifaceted relations between Theory - model - experiment. Our admiration for the Power of mathematization, the Unreasonable effectiveness of mathematics (Wigner) should not blind us for the Staying and deepening limitations of mathematization opposite new tasks.


*Transtextual remarks (or a portrait of the blogger in metacognition)
Somewhere deep in his inner Self, the transcyberphysicist dreams of himself as an unknown soldier of the epistemological war waged between the defenders of the various scientific models of quantum gravity (superstring theories, loop quantum gravity, the tensorial track, noncommutative spectral geometry...); but through a discourse based essentially on an immoderate use of excerpts from his own readings, he also sees himself as a kind of Sancho Panza (his Id, in short ;-), the unfaithful virtual fellow traveler of a famous polemicist science blogger (and at times a sorry fellow) whose tribulations he sometimes recounts in the metatext of this blog.

Sunday, May 25, 2014

Simple as the new (but already old) minimal standard model

Category: No comment //or almost


A model simple before being beautiful...
//In early May we discussed the apparent simplicity of the laws of Nature, which today are almost all condensed into the Minimal Standard Model (MSM) of particle physics and the theory of general relativity (see this post for a glimpse of the mathematical expression of this relative simplicity). We do say almost, because since the completion of the MSM in the early 1970s, new experimental facts have come to light in the late 1990s and early 2000s that the model does not predict...
There exist many possible directions to go beyond the Minimal Standard Model (MSM): supersymmetry, extra dimensions, extra gauge symmetries (e.g., grand unification), etc. They are motivated to solve aesthetic and theoretical problems of the MSM, but not necessarily to address empirical problems. It is embarrassing that all currently proposed frameworks have some phenomenological problems, e.g., excessive flavor-changing effects, CP violation, too-rapid proton decay, disagreement with electroweak precision data, and unwanted cosmological relics. In this letter, we advocate a different and conservative approach to physics beyond the MSM. We include the minimal number of new degrees of freedom to accommodate convincing (e.g., > 5σ) evidence for physics beyond the MSM. We do not pay attention to aesthetic problems, such as fine-tuning, the hierarchy problem, etc. We stick to the principle of minimality seriously to write down the Lagrangian that explains everything we know. We call such a model the New Minimal Standard Model (NMSM). In fact, the MSM itself had been constructed in this spirit, and it is a useful exercise to follow through with the same logic at the advent of the major discoveries we have witnessed. Of course, we require it to be a consistent Lorentz-invariant renormalizable four-dimensional quantum field theory, the way the MSM was constructed.

 





Hooman Davoudiasl, Ryuichiro Kitano, Tianjun Li, Hitoshi Murayama, The New Minimal Standard Model, 12/05/2004

 


...to cosmological predictions already 10 years old...
... The spectrum index of the ϕ^2 chaotic inflation model is predicted to be 0.96. This may be confirmed in improved cosmic microwave background anisotropy data, with more years of WMAP and Planck. The tensor-to-scalar ratio is 0.16.
Id.
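For the curious reader, the two quoted numbers follow from the textbook slow-roll formulas for a quadratic potential, n_s ≈ 1 − 2/N and r ≈ 8/N, evaluated at N ≈ 50 e-folds before the end of inflation (a standard derivation sketched here for illustration, not an excerpt from the quoted paper):

```python
# Slow-roll predictions of V(phi) = (1/2) m^2 phi^2 chaotic inflation.
# In reduced Planck units: epsilon = eta = 2/phi^2 and N ~ phi^2/4 e-folds,
# hence n_s = 1 - 6*epsilon + 2*eta = 1 - 2/N and r = 16*epsilon = 8/N.

def phi2_inflation_predictions(N):
    """Spectral index n_s and tensor-to-scalar ratio r after N e-folds."""
    return 1.0 - 2.0 / N, 8.0 / N

n_s, r = phi2_inflation_predictions(50)
print(n_s, r)  # 0.96 0.16, the two values quoted above
```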

... and still robust today, until proven otherwise
The Planck nominal mission temperature anisotropy measurements, combined with the WMAP large-angle polarization, constrain the scalar spectral index to ns=0.9603±0.0073.
Subtracting the various dust models and re-deriving the r constraint still results in high significance of detection. For the model which is perhaps the most likely to be close to reality (DDM2 cross) the maximum likelihood value shifts to r = 0.16 +0.06/−0.05 with r = 0 disfavored at 5.9σ.

What about a(/the) next (extension of the) new minimal standard model?
http://arxiv.org/abs/1309.1231
http://arxiv.org/pdf/1111.0273.pdf
//Writing still in progress

Sunday, May 18, 2014

(A journey) inside the head of Gerard 't Hooft


Category: Curiositêtes (1)
The blogger is starting a new category whose aim is to present physicists more or less well known to the science-loving public, with the emphasis on original work more or less recognized by their peers.

The Head, squared
Gerardus 't Hooft is a Dutch theoretical physicist whose surname could be rendered in English as "The Head" ('t being a contraction of the Dutch article het, and hoofd meaning head). Although he received the prestigious Nobel Prize fifteen years ago already, he is still scientifically active (*) and does not hesitate to take an active part in the visibility and defense of his ideas on the internet:
  • his latest papers are always posted in open access on arXiv;
  • he continues to be invited for seminars at prestigious scientific institutions, as can be seen here; the web is also rich in other videos of his talks, and we particularly recommend this one, aimed at the general public;
  • his personal website is a gold mine for the curious mind who wants to get inside the head of a physicist as generous in sharing his work and ideas as he is great in the importance of his scientific contributions and in the clarity with which he presents contemporary physics and his more original ideas;
  • let us finally point out that he is, to our knowledge, the only physics Nobel laureate to have an account and take part on the collaborative public site Physics Stack Exchange.
(* receiving a Nobel Prize is not only a reward: it is also a workload of numerous and varied social obligations, which can hamper the creativity and productivity of the happy recipient.)


The last of the heroes of the (Standard Model and of) quantum (field) theory?
As his Nobel Prize attests, G. 't Hooft has secured his place in the history of science through his decisive contribution to the completion of the Standard Model.
Recall that this model, whose essential part, the electroweak unification theory, was for the most part already built by the end of the sixties (thanks in particular to the work of Glashow, Salam and Weinberg), still awaited genuine recognition from the physics community at large in the early seventies. That recognition came with the proof by 't Hooft, then still a student, ably supported by his thesis advisor Veltman, of the renormalizability of this theory: a fundamental property that makes it possible to tame the infinities that systematically appear in quantum field theory calculations, constantly threatening their predictive power and casting doubt on their internal consistency.
One could go on to mention the prominent place 't Hooft also occupies in the other sector of the Standard Model: the one dealing with the strong interaction, modeled by quantum chromodynamics, a theory whose topological and non-perturbative nature he was among the first to understand, but whose phenomenological aspects he perhaps did not master well enough to appreciate the importance of his own results, nor the soundness of another physicist's advice to publish his work quickly...
I announced at that meeting my finding that the coefficient determining the running of the coupling strength, that he called β(g^2), for non-Abelian gauge theories is negative, and I wrote down Eq. (5.3) on the blackboard. [Kurt] Symanzik was surprised and skeptical. “If this is true, it will be very important, and you should publish this result quickly, and if you won’t, somebody else will,” he said. I did not follow his advice. A long calculation on quantum gravity with Veltman had to be finished first.
G. 't Hooft, WHEN WAS ASYMPTOTIC FREEDOM DISCOVERED? or THE REHABILITATION OF QUANTUM FIELD THEORY 09/1998
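The coefficient 't Hooft mentions can be made concrete: at one loop, for SU(N) with n_f quark flavours, β(g) = −g³ b₀ / (16π²) with b₀ = 11N/3 − 2n_f/3, so the sign of β hinges on b₀. A minimal numerical sketch of this textbook one-loop result (not the notation of the quoted talk):

```python
# One-loop beta-function coefficient for an SU(N) gauge theory with
# n_f fermion flavours in the fundamental representation:
#   beta(g) = -g^3 * b0 / (16 * pi^2),   b0 = 11*N/3 - 2*n_f/3.
# b0 > 0 means beta < 0: the coupling weakens at high energy
# (asymptotic freedom), the negative beta 't Hooft announced to Symanzik.

def b0(N, n_f):
    """One-loop coefficient b0 for SU(N) with n_f fundamental flavours."""
    return 11.0 * N / 3.0 - 2.0 * n_f / 3.0

print(b0(3, 6))    # QCD with 6 flavours: 7.0 > 0, asymptotically free
print(b0(3, 17))   # too many flavours: negative, freedom lost
```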
Reading the last sentence of the preceding quotation, one sees that 't Hooft was already involved in an even broader research program, aiming to integrate the last known fundamental interaction, gravitation, into the framework of quantum field theory.


One of the founders of the quantum view of black holes, and father of the holographic principle
What, then, have 't Hooft's contributions to the program of unifying fundamental physics been since? The written transcript of a lecture given in 1993 in honor of Abdus Salam (another hero of the Standard Model), revised and corrected in 2009, gives us part of the answer:
I am given the opportunity to contemplate some very deep questions concerning the ultimate unification that may perhaps be achieved when all aspects of quantum theory, particle theory and general relativity are combined. One of these questions is the dimensionality of space and time... When we quantize gravity perturbatively we start by postulating a Fock space in which basically free particles roam in a three plus one dimensional world. Naturally, when people discuss possible cut-off mechanisms, they think of some sort of lattice scheme either in 3+1 dimensional Minkowski space or in 4 dimensional Euclidean space. The cut-off distance scale is then suspected to be the Planck scale. Unfortunately any such lattice scheme seems to be in conflict with local Lorentz invariance or Euclidean invariance, as the case may be, and most of all also with coordinate reparametrization invariance. It seems to be virtually impossible to recover these symmetries at large distance scales, where we want them. So the details of the cut-off are kept necessarily vague. The most direct and obvious physical cut-off does not come from non-renormalizability alone, but from the formation of microscopic black holes as soon as too much energy would be accumulated into too small a region. From a physical point of view it is the black holes that should provide for a natural cut-off all by themselves. This has been this author’s main subject of research for over a decade. 
't Hooft, Dimensional Reduction in Quantum Gravity, 1993-2009

To get a simpler view of 't Hooft's original quantum vision of the black hole (an elementary particle like any other at the Planck scale?), one can turn to the following excerpt:
For an intuitive understanding of our world, the Hawking effect seems to be quite welcome. It appears to imply that black holes are just like ordinary forms of matter: they absorb and emit things, they have a finite temperature, and they have a finite lifetime. One would have to admit that there are still important aspects of their internal dynamics that are not yet quite understood, but this could perhaps be considered to be of later concern. Important conclusions could already be drawn: the Hawking effect implies that black holes come in a denumerable set of distinct quantum states. This also adds to a useful and attractive picture of what the dynamical properties of space, time and matter may be like at the Planck scale: black holes seem to be a natural extension of the spectrum of elementary physical objects, which starts from photons, neutrinos, electrons and all other elementary particles.
 't Hooft, Quantum gravity without space-time singularities or horizons, 18/09/2009

In the end, these reflections led him to formulate several conjectures, the best known and most widely endorsed of which seems to be the holographic principle:
What is known for sure is that Quantum Mechanics works, that the gravitational force exists, and that General Relativity works. The approach advocated by me during the last decades is to consider in a direct way the problems that arise when one tries to combine these theories, in particular the problem of gravitational instability. These considerations have now led to what is called “the Holographic Principle”, and it in turn led to the more speculative idea of deterministic quantum gravity ... 

't Hooft, The Holographic Principle, 2000

The last sentence of the preceding excerpt ends on another, far more controversial idea: that of a deterministic theory underlying quantum mechanics.

(Understanding physics at) the Planck scale is well worth an (unorthodox) 't Hooft conjecture
Let us then look in a bit more detail at what can lead a specialist of quantum theory to question its standard interpretation:
It is argued that the so-called holographic principle will obstruct attempts to produce physically realistic models for the unification of general relativity with quantum mechanics, unless determinism in the latter is restored. The notion of time in GR is so different from the usual one in elementary particle physics that we believe that certain versions of hidden variable theories can – and must – be revived. A completely natural procedure is proposed, in which the dissipation of information plays an essential role.
't Hooft, Quantum Gravity as a Dissipative Deterministic System, 03-04/1999
Beneath Quantum Mechanics, there may be a deterministic theory with (local) information loss. This may lead to a sufficiently complex vacuum state, and to an apparent non-locality in the relation between the deterministic (“ontological”) states and the quantum states, of the kind needed to explain away the Bell inequalities. Theories of this kind would not only be appealing from a philosophical point of view, but may also be essential for understanding causality at Planckian distance scales.
't Hooft,  DETERMINISM BENEATH QUANTUM MECHANICS, 12/2002
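For context on the Bell inequalities mentioned above: in the CHSH form, any local deterministic hidden-variable model satisfies |S| ≤ 2, while quantum mechanics on the singlet state reaches 2√2 ≈ 2.83. A minimal numerical check, using the standard optimal angle choice (illustrative only, not a model from 't Hooft's papers):

```python
import math

# Quantum correlation for two spin-1/2 particles in the singlet state,
# measured along directions at angles a and b from a fixed axis:
#   E(a, b) = -cos(a - b)

def E(a, b):
    return -math.cos(a - b)

# Standard settings maximizing the CHSH combination
a1, a2 = 0.0, math.pi / 2               # Alice's two settings
b1, b2 = math.pi / 4, 3 * math.pi / 4   # Bob's two settings

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

# Any local deterministic model obeys |S| <= 2; quantum mechanics
# reaches 2*sqrt(2) (Tsirelson's bound).
print(abs(S))
```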

Naturally, this conjecture of 't Hooft's seems to arouse quite a bit of skepticism, all the more so as it calls into question nothing less than the program of quantum computing, as we shall see in the next section. The most explicit criticisms are, of course, voiced on the blogosphere. To get an idea of the more official reaction of his peers, one can read this recent interview with 't Hooft, in which he recounts a brief discussion with the physicist John Bell; but that encounter dates from the eighties, and Bell has since passed away, while the model developed by 't Hooft (based on cellular automata) has been refined since then.




A fine example of online scientific exchange on Physics Stack Exchange
Fortunately, the internet offers us another interesting virtual venue for exchanging points of view, through a question-and-answer site where one can follow an online dialogue between 't Hooft and a major figure of quantum information, namely Peter Shor (father of the eponymous algorithm):

The problem with these blogs is that people are inclined to start yelling at each other. (I admit, I got infected and it's difficult not to raise one's electronic voice.) I want to ask my question without an entourage of polemics.
My recent papers were greeted with scepticism. I've no problem with that. What disturbs me is the general reaction that they are "wrong". My question is summarised as follows:
Did any of these people actually read the work and can anyone tell me where a mistake was made?
... A revised version of my latest paper was now sent to the arXiv ... Thanks to you all. My conclusion did not change, but I now have more precise arguments concerning Bell's inequalities and what vacuum fluctuations can do to them.
asked Aug 15 '12 at 9:35 G. 't Hooft

Answer: I can tell you why I don't believe in it. I think my reasons are different from most physicists' reasons, however. Regular quantum mechanics implies the existence of quantum computation. If you believe in the difficulty of factoring (and a number of other classical problems), then a deterministic underpinning for quantum mechanics would seem to imply one of the following.

  • There is a classical polynomial-time algorithm for factoring and other problems which can be solved on a quantum computer.
  • The deterministic underpinnings of quantum mechanics require 2^n resources for a system of size O(n).
  • Quantum computation doesn't actually work in practice.
None of these seem at all likely to me ... For the third, I haven't seen any reasonable way you could make quantum computation impossible while still maintaining consistency with current experimental results.
answered Aug 17 '12 at 14:11 Peter Shor
Comment: @Peter Shor: I have always opted for your 3rd possibility: the "error correcting codes" will eventually fail. The quantum computer will not work perfectly (It will be beaten by a classical computer, but only if the latter would be scaled to Planckian dimensions). This certainly has not yet been contradicted by experiment. – G. 't Hooft Aug 17 '12 at 20:45
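For the reader curious about Shor's first alternative: his algorithm factors N by finding the multiplicative order r of a base a modulo N, and only that order-finding step needs a quantum computer; the rest is classical number theory. A toy sketch of the classical reduction, with brute-force (exponential-time) order finding standing in for the quantum step:

```python
from math import gcd

def order(a, N):
    """Multiplicative order of a modulo N, by brute force.
    This is the only step Shor's quantum algorithm accelerates."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_from_order(N, a):
    """Classical part of Shor's reduction: if the order r of a mod N is
    even and a^(r/2) != -1 mod N, then gcd(a^(r/2) - 1, N) is a factor."""
    r = order(a, N)
    assert r % 2 == 0 and pow(a, r // 2, N) != N - 1, "unlucky base a"
    return gcd(pow(a, r // 2, N) - 1, N)

print(factor_from_order(15, 7))  # 3 (order(7, 15) = 4 and 7^2 = 4 mod 15)
```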

Let us note that, for the moment, experimental physics does not yet seem to have refuted 't Hooft's prediction.

Is Nature crazier at the Planck scale than string theorists can imagine(?)
Here is a formula borrowed almost literally from a passage of the seminal 1993 article by 't Hooft cited above. The reader may see it as a nod to a famous blogger, as critical today as ever of the deterministic model that 't Hooft proposes as underlying quantum mechanics. Lubos Motl, not to name him, took advantage of the preprint of a long review article by the Dutch physicist on this subject to attack one of its postulates: the existence of an ontological basis in the Hilbert space describing the possible states of a quantum system. As usual, Lubos develops an argument resting on examples of great pedagogical value for anyone who wants to understand quantum physics; but his analysis of the thesis he criticizes seems to us too superficial for the reader to form a precise idea of its heuristic power and of its epistemological stakes.
For our part, we will content ourselves (for now) with highlighting the following points, which strike us as interesting:
... I do find that local deterministic models reproducing quantum mechanics, do exist; they can easily be constructed. The difficulty signalled by Bell and his followers, is actually quite a subtle one. The question we do address is: where exactly is the discrepancy? If we take one of our classical models, what goes wrong in a Bell experiment with entangled particles? Were assumptions made that do not hold? Or do particles in our models refuse to get entangled? ...
The evolution is deterministic. However, this term must be used with caution. “Deterministic” cannot imply that the outcome of the evolution process can be foreseen. No human, nor even any other imaginable intelligent being, will be able to compute faster than Nature itself. The reason for this is obvious: our intelligent being would also have to employ Nature’s laws, and we have no reason to expect that Nature can duplicate its own actions more efficiently than itself. ...
... There are some difficulties with our theories that have not yet been settled. A recurring mystery is that, more often than not, we get quantum mechanics alright, but a hamiltonian emerges that is not bounded from below. In the real world there is a lower bound, so that there is a vacuum state. A theory without such a lower bound not only has no vacuum state, but it also does not allow a description of thermodynamics using statistical physics. Such a theory would not be suitable for describing our world. How serious do we have to take this difficulty? We suspect that there will be several ways to overcome it, the theory is not yet complete, but a reader strongly opposed to what we are trying to do here, may well be able to find a stick that seems suitable to destroy our ideas. Others, I hope, will be inspired to continue along this path. There are many things to be further investigated, one of them being superstring theory. This theory seems to be ideally suited for the approach we are advocating.  
G. 't Hooft, The Cellular Automaton Interpretation of Quantum Mechanics, 7/05/2014



A prophecy about an exact local conformal symmetry of Nature (spontaneously broken below the Planck scale)
Beyond this debate over the interpretation of quantum mechanics, 't Hooft's work offers a chance to watch a researcher in action, ready to elaborate and defend bold hypotheses by building models as precise as possible (and, to some extent, falsifiable), in order to see how far, from his point of view, the concepts that have served physics so well, such as causality and locality, can guide him.
Here, finally, is one last conjecture of his, one that may have a greater future, as presented on his personal website:
I claim to have found how to put quantum gravity back in line so as to restore quantum mechanics for pure black holes. It does not happen automatically, you need a new symmetry. It is called local conformal invariance. This symmetry is often used in superstring and supergravity theories, but very often the symmetry is broken by what we call “anomalies”. These anomalies are often looked upon as a nuisance but a fact of life. I now claim that black holes only behave as required in a consistent theory if all conformal anomalies cancel out. This is a very restrictive condition, and, very surprisingly, this condition also affects the Standard Model itself. All particles are only allowed to interact with gravity and with each other in very special ways. Conformal symmetry must be an exact local symmetry, which is spontaneously broken by the vacuum, exactly like in the Higgs mechanism.

This leads to the prediction that models exist where all unknown parameters of the Standard Model, such as the fine-structure constant, the proton-electron mass ratio, and in fact all other such parameters are computable. Up till now these have been freely adjustable parameters of the theory, to be determined by experiment but they were not yet predicted by any theory.
I am not able to compute these numbers today because the high energy end of the elementary particle properties is not known. There is one firm prediction: constants of Nature are truly constant. All attempts to detect possible space and time dependence of the Standard Model parameters will give negative results. This is why I am highly interested in precision measurements of possible space-time dependence of constants of Nature, such as the ones done by using a so-called "frequency comb". These are high precision comparisons between different spectral frequencies in atoms and molecules. They tell us something very special about the world we live in. 
't Hooft

//Drafted, with final editorial touches, on Thursday, May 22, 2014.

Thursday, May 15, 2014

Searching for: beauty (in retrospect), simplicity (now) and surprise (to come)

S for simplicity?
In retrospect, one can say that twentieth-century theoretical physics was marked by the explicit embrace of beauty as a heuristic criterion, through the discoveries of researchers such as Albert Einstein and Paul Dirac in the first half of the century, then those of Chen Ning Yang, for example, in the second half. It seems, however, that at the beginning of the twenty-first century, what now preoccupies theorists is rather to better understand the simplicity of the laws of Nature, as shown by the following summary of the program of a conference that ended today at Princeton, and whose title inspired this post:
... recent data from Cosmic Microwave Background measurements, the Large Hadron Collider at CERN and Dark Matter experiments that show the universe to be surprisingly simple on both the microphysical and macrophysical scale: there is a striking absence -- thus far -- of new particles, WIMP dark matter or non-gaussianity. The recent report by BICEP2 of the detection of primordial gravitational waves produces some tension with current results from the Planck and WMAP satellites that may indicate unexpected complexity. However, this workshop will occur at a time when the results have yet to be confirmed, so we are free to imagine various scenarios. At the same time, there is the intriguing fact (clue?) that the measured Higgs and top quark mass lie within the narrow range corresponding to a metastable Higgs vacuum in the standard model. What could all this mean? What ideas need to be jettisoned, revised or created to naturally explain this simplicity?
SEARCHING FOR SIMPLICITY, 12-15 May 2014

If the simplicity of Nature intrigues theorists, it is because, oddly, it is not as "natural" as they would like it to be. And behind this idea of naturalness lie aesthetic criteria, forged by past theoretical successes, which perhaps must now be abandoned.


S for surprise!
Nevertheless, to challenge the overly narrow theoretical conceptions of the previous century, the physicist must also lean on them, imagining their boldest extrapolations and testing their most tenuous consequences, in the secret hope of glimpsing the unexpected. To arrive at a surprising discovery like those that marked the beginnings of quantum physics (the structure of the atom, the photoelectric effect, electron spin): that is the secret quest of every physicist:
During a visit to the Super Proton Synchrotron [Margaret Thatcher] spoke to John Ellis, who introduced himself as a theoretical physicist. The conversation continued:

Thatcher: “What do you do?”
Ellis: “Think of things for the experiments to look for, and hope they find something different.”
Thatcher: “Wouldn’t it be better if they found what you predicted?”
Ellis: “Then we would not learn how to go further!”
Aidan Randle-Conde, Blog Quantum Diaries, Margaret Thatcher, politician, scientist, 15/04/2013



Tuesday, May 13, 2014

No retraction (but more uncertainty) concerning a recent experimental proof of cosmological inflation

From one rumor that inflates to another that deflates it (or: welcome to the era of Open Science)
A few weeks ago, the transcyberphysicist relayed here the media hype orchestrated around the announcement of a possible major astrophysical discovery. Peter Coles, an English cosmologist but also a well-known blogger, spoke very well about this event:
When the BICEP2 team announced that a “major astrophysics discovery” would be announced this Monday I have to admit that I was quite a bit uncomfortable about the way things were being done. I’ve never been keen on “Science by Press Release” and when it became clear that the press conference would be announcing results that hadn’t yet been peer-reviewed my concerns deepened.
However, the BICEP2 team immediately made available not only the “discovery” paper but also the data products, so people with sufficient expertise (and time) could try to unpick the content. This is fully in the spirit of open science and I applaud them for it. Indeed one could argue that putting everything out in the open the way they have is ensuring that their work is being peer-reviewed in the open by the entire cosmological community, not secretly and by one or two anonymous individuals. The more I think about it the more convinced I am becoming that this is a better way of doing peer review than the traditional method, although before I decide that for sure I’d like to know whether the BICEP2 actually does stand up!
One of the particularly interesting developments in this case is the role social media are playing in the BICEP2 story. A Facebook Group was set up in advance of Monday’s announcement and live discussion started immediately the press conference started. The group now has well over 700 members, including many eminent cosmologists. And me. There’s a very healthy scientific discussion going on there which may well prove to be a model of how such things happen in the future. Is this a sign of a major change in the way science is done, the use of digital technology allowing science to break free from the shackles placed on it by traditional publication processes? Maybe.
 Telescoper (alias Peter Coles), Blog In the dark, Bicep2, Social Media and Open Science 14/05/2014

In another of his posts, he also discussed, precisely and clearly, the uncertainties weighing on the veracity of the discovery in question, or more precisely on the cosmological origin of the signal detected by the BICEP2 experiment. But the technical discussions around the experimental uncertainties were proceeding quietly, it seems, in the blogosphere, until the situation changed three days ago, following the publication of a post by Adam Falkowski, particle physicist (and "mad" blogger, alias Jester ;-), claiming to relay a rumor according to which the researchers of the BICEP2 collaboration had admitted to making an error in the evaluation of a spurious signal.


A word from a skeptical scrutinizer (whistleblower or rumor spreader?)
Here, among other things, is what the post in question says:
The BICEP claim of detecting the primordial B-mode in the polarization of the Cosmic Microwave Background was a huge news. If confirmed, it would be an evidence of gravity waves produced during cosmic inflation, and open a window on physics at an incredibly high energy scale of order 10^16 GeV.
Barring a loose cable, the biggest worry about the BICEP signal is that the collaboration may have underestimated the galactic foreground emission. BICEP2 performed the observations at only one frequency of 150 GHz which is very well suited to study the CMB, but less so for polarized dust or synchrotron emission. As for the latter, more can be learned by going to higher frequencies, while combining maps at different frequencies allows one to separate the galactic and the CMB component. Although the patch of the sky studied by BICEP is well away from the galactic plane, the recently published 353 GHz polarized map from Planck demonstrates that there may be significant emission from these parts of the sky.
... New data from Planck, POLARBEAR, ACTpole, and Keck Array should clarify the situation within a year from now.  However, at this point, there seems to be no statistically significant evidence for the primordial B-modes of inflationary origin in the CMB.
 Jester (alias Adam Falkowski), blog Résonaances, Is BICEP wrong?, 12/05/2014
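To make concrete what "combining maps at different frequencies" means in this discussion, here is a deliberately minimal sketch (my own toy illustration, not the BICEP2 or Planck pipeline): in thermodynamic temperature units the CMB component is the same at every frequency, while polarized dust emission rises steeply with frequency, so maps at two frequencies suffice in principle to solve for the two components pixel by pixel. The dust spectral index and the amplitudes below are assumed values for illustration only.

```python
# Toy two-frequency component separation (illustrative only; real CMB
# analyses use many frequencies, full spectral models and noise weighting).
import numpy as np

def separate(map_150, map_353, beta=1.6, nu0=150.0, nu1=353.0):
    """Solve m(nu) = s_cmb + s_dust*(nu/nu0)**beta for the two components.

    In thermodynamic units the CMB component is frequency-independent,
    while polarized dust emission rises steeply with frequency (here a
    crude power law with an assumed index beta).
    """
    A = np.array([[1.0, 1.0],
                  [1.0, (nu1 / nu0) ** beta]])
    b = np.array([map_150, map_353])
    s_cmb, s_dust = np.linalg.solve(A, b)
    return s_cmb, s_dust

# Fake sky pixel: 0.3 uK of CMB B-mode plus 0.1 uK of dust at 150 GHz.
true_cmb, true_dust = 0.3, 0.1
m150 = true_cmb + true_dust
m353 = true_cmb + true_dust * (353.0 / 150.0) ** 1.6
cmb, dust = separate(m150, m353)
print(cmb, dust)  # recovers ~0.3 and ~0.1
```

With more frequencies the same idea becomes a least-squares fit; the real difficulty, as the debate below shows, is that the dust spectral behaviour is itself uncertain.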


The scrutinized scientists respond (truth-seekers or promoters of a premature discovery?)
Jester's somewhat needling post immediately drew many reactions, as well as the attention of the English-language popular science media, which sought responses from the researchers whose work was being called into question:
Clement Pryke, a cosmologist at the University of Minnesota, Twin Cities, and a co-principal investigator for the BICEP team, acknowledges that the foreground map is an important and thorny issue. Part of the problem is that the Planck team has not made the raw foreground data available, he says. Instead, BICEP researchers had to do the best they could with a PDF file of that map that the Planck team presented at a conference.
... The BICEP team will not be revising or retracting its work, which it posted to the arXiv preprint server, Pryke says: "We stand by our paper."

On 12 May, a rumour emerged on the physics blog Résonaances that the BICEP2 team has already admitted defeat. The blogger, particle physicist Adam Falkowski at CERN, says he has heard through the scientific grapevine that the BICEP2 collaboration misinterpreted a preliminary Planck map in its analysis.
... "We tried to do a careful job in the paper of addressing what public information there was, and also being upfront about the uncertainties. We are quite comfortable with the approach we have taken." [says principal investigator John Kovac at Harvard University]
Lisa Grossman, New Scientist, Rumours swirl over credibility of big bang ripple find, 13/05/2014

As these lines are being written (May 15, 11:50 a.m. Paris time), we learn that the prestigious Princeton Center for Theoretical Science is holding this very day (on short notice, it seems) a special event to discuss precisely the latest progress in understanding the disputed experimental uncertainties:

Special Event: May 15, 2014
Towards an Understanding of Foregrounds in the BICEP2 Region
Speakers: Raphael Flauger (IAS and NYU) with discussion by Lyman Page (Princeton)
Thursday, May 15 at 9:30 am
PCTS Seminar Room, Room 407
(video recording and slides will be made available after the talk)

SLIDES

Bicep2 versus Planck?
While awaiting fresh first-hand scientific information on the question, we can try to bring out what is at stake in the debate by cross-checking sources of information. Let us begin by comparing the opinions of these two bloggers, each more or less skilled in the art of provocation:
... the BICEP2 experiment announced a significant detection of the primordial B-mode in the CMB power spectrum... 
  • If this holds up, it's huge, comparable in magnitude to the discovery of the Higgs boson. Probably even more exciting because of the surprise element...
  • If you hear a sledgehammer in the corridor of your lab, that may be your local Planck member banging his head on the wall. Yeah, apart from many noble aspects, science also has this lowly competition side. A billion dollar experiment that misses a Nobel-prize-worth low-hanging fruit... I wouldn't wish  to be in their skin if BICEP is right. 
Jester, blog Résonaances, Curly impressions, 17/03/2014

The total price of the Planck satellite was €700 million, almost a billion of dollars. On the other hand, BICEP2's expenses are comparable to $10 million, about one hundred times smaller. That's why Planck is the Goliath and BICEP2 was the David ...
If your budget is 100 times smaller than the budget of someone else, it may be and feel more likely that you won't win but it simply does not imply that you can't discover something important before the Goliath does. Even though "chance" could be enough to explain all these unexpected events, the victories of the underdogs, there are also detailed reasons why BICEP2 has apparently done the discovery before Planck.
BICEP2 had a vision. They were focusing on the B-modes, assuming from the beginning that there could be something new over there. Their devices were sufficiently optimized to do the job and they have apparently succeeded in the job. In comparison, Planck has gotten into a pessimistic mode in which people assume that "they can't discover something really new, anyway" which is why they don't even try so hard.
Lubos Motl, blog The Reference Frame, BICEP2 vs Planck: nothing wrong with screen scraping, 14/05/2014

But let us leave this sociologically tinged polemic aside and return to concerns more centered on scientific knowledge.

Bicep2's results depend on those of Planck! (or do they?)
Let us look through blog readers' comments for interesting questions, remarks, and information:


Nick said... 13 May 2014 07:29 (blog Résonaances)
Is it correct then that BICEP2 was systematically unable to precede PLANCK? Surely in their methodology before getting funded someone would have asked about how they would measure the foreground, the answer would have been "We will have to wait for PLANCK!" So in this sense this very modest telescope was never designed to compete for a Nobel Prize but was always complementary to PLANCK? 

Jester said... 13 May 2014 08:29  (blog Résonaances)
Nick, my understanding is that the amount of polarized foreground in the BICEP patch is a bit of a surprise. That region is rather clean in temperature maps, apparently it is less clean in polarization. But, right, it was always clear that for a fully reliable estimation of the foregrounds we need measurements at several frequencies, and Planck is by far best suited to do that.

pion says: May 14, 2014 at 2:40 am (blog Not Even Wrong)

It seems to me that Planck is not our best hope to settle this issue mainly due to the fact that it is a satellite, and information from certain ground-based telescopes might be more credible.
Since the CMB polarization level is obtained from differencing two intensity measurements toward the same direction on the sky, any optical imperfection of the detectors can potentially leak the dominant intensity to the faint B-mode polarization. A few of these spurious signals can be potentially mitigated by the telescope’s scanning strategy; basically, each pixel in the field is observed multiple times with (ideally) different orientations of the polarimeter. Ideally, this would suppress a large fraction of the systematic but not entirely.
Most ground-based telescopes benefit from the earth rotation, others use a half-wave plate, and in general the scanning strategy could be optimized for minimizing the intensity-to-B-mode leakage. Satellites in orbit, however, are limited and their typical scanning strategy is sub-optimal if not poor. We know, as a fact, that Planck never published their B-maps, and I guess that this is partially due to the issue of B-mode systematics which is always a challenge, for satellites in particular...
My best bet is that a joint effort of ground-based instruments (which are located off the pole) might ultimately provide a conclusive answer to this thorny issue. The problem with this alternative, though, is that ground-based experiments are limited to a relatively narrow frequency window. Hopefully, two or three frequency bands will suffice for the construction of a reliable polarized dust model, but this is not a priori guaranteed.
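The intensity-to-polarization leakage described in this comment can be caricatured in a few lines (a toy model of my own, not any experiment's actual systematics budget): give the two detectors of a differencing pair slightly mismatched gains, and the bright temperature signal bleeds into the much fainter polarization difference.

```python
# Toy illustration of intensity-to-polarization leakage from a detector
# gain mismatch (numbers are invented for illustration).
import numpy as np

rng = np.random.default_rng(0)
n = 10000
I = 100.0 + rng.normal(0.0, 1.0, n)   # total intensity (uK), dominant
Q = rng.normal(0.0, 0.01, n)          # true polarization, much fainter

# Two detectors measure (I+Q)/2 and (I-Q)/2; polarization is their difference.
g1, g2 = 1.000, 1.001                  # 0.1% gain mismatch
d1 = g1 * (I + Q) / 2.0
d2 = g2 * (I - Q) / 2.0
Q_meas = d1 - d2                       # ~ Q + (g1 - g2)/2 * I

leak = np.mean(Q_meas) - np.mean(Q)    # spurious offset ~ (g1-g2)/2 * <I>
print(leak)  # about -0.05 uK: the bright I signal leaks into faint Q
```

Rotating the polarimeter orientation between revisits of a pixel, as the comment notes, averages part of this leakage down, which is why the scanning strategy matters so much.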

The last word goes to a still-dubious cosmologist
I repeat what I’ve said before in response to the BICEP2 analysis, namely that the discussion of foregrounds in their paper is disappointing. I’d also say that I think the foreground emission at these frequencies is so complicated that none of the simple approaches that were available to the BICEP2 team are reliable enough to be convincing. ... I think BICEP2 has definitely detected something at 150 GHz but we simply have no firm evidence at the moment that it is primordial. That will change shortly, with the possibility of other experiments (specifically Planck, but also possibly SPTPol) supplying the missing evidence.

I’m not particularly keen on the rumour-mongering that has gone on, but then I’m not very keen either on the way the BICEP2 result has been presented in some quarters as being beyond reasonable doubt when it clearly doesn’t have that status. Yet.

Rational scepticism is a very good thing. It’s one of the things that makes science what it is. But it all too easily turns into mudslinging. 

Telescoper (alias Peter Coles), Blog In the dark, That BICEP Rumour…, 14/05/2014

Conclusion and moral of the story
The highly publicized announcement of the first experimental sign of primordial gravitational waves, and of proof of cosmic inflation, was bound to be a little inflated ...
Le transcyberphysicien, 15/05/2014

So much for the tongue-in-cheek conclusion; as for the moral, there is one that imposes itself again and again, at least every time an extraordinary scientific discovery is publicly announced:
Extraordinary claims require extraordinary evidence
Carl Sagan, Cosmos, Episode 12 - Encyclopaedia Galactica, 14/12/1980


//The writing of this post was completed on Thursday, 15/05/2014

Sunday, May 4, 2014

One scalar can hide another

Dévissage column (4)

This blog, and others, have already said a great deal about the scalar Higgs boson; we have also already mentioned the hypothetical existence of other scalar particles that would be closely associated with it, one of which is denoted by the letter σ (sigma). Now it turns out that the physics literature contains yet another particle, denoted by the same letter and scalar as well, but which is not, or is no longer, hypothetical because - as we shall see in this post - it has apparently already been detected experimentally, and this even before the discovery of the Higgs! One sees that the situation can be confusing for the intrepid reader who would seek to learn more by browsing the scientific literature. Let us therefore try to untangle this muddle with the reader, thanks to an article by Martin Schumacher on this subject, entitled Nambu's Nobel Prize, the σ meson and the mass of visible matter, dated 07/04/2014.


A Nobel Prize that recalls another
The 2013 Nobel Prize in Physics has been awarded to François Englert and Peter Higgs ”for the theoretical discovery of a mechanism that contributes to our understanding of the origin of mass of subatomic particles, and which recently was confirmed through the discovery of the predicted fundamental particle, by the ATLAS and CMS experiments at CERN’s Large Hadron Collider”. This award followed a related one where one half of the 2008 Nobel Prize in Physics was awarded to Yoichiro Nambu “for the discovery of the mechanism of spontaneous broken symmetry in subatomic physics”. In order to understand the importance of this latter award it is very important to read the Nobel Lecture which was presented by Giovanni Jona-Lasinio, a younger coauthor of the famous Nambu–Jona-Lasinio model [3] published in 1961. This Nobel Lecture having the title “Spontaneous symmetry breaking in particle physics: A case of cross fertilization” describes the way from superconductivity to particle physics which led to the Nambu– Jona-Lasinio (NJL) Lagrangian ...  

In the standard model of particle physics, the NJL model may be regarded as an effective theory for the QCD with respect to generation of the so-called constituent masses. In analogy to different descriptions of superconductivity the NJL model goes over to the linear σ model (LσM) of Gell-Mann and Lévy [4]. In the Nobel Lecture Nambu makes the important statement: ”If this analogy turns out real, the Higgs field might be an effective description of the underlying dynamics.” This Higgs field is the Higgs field of strong interaction represented by the σ meson.

(Starting) from the Higgs boson (to get back) to the sigma meson (σ)
An explanation of the constituent-quark mass in terms of symmetry breaking mediated by the σ meson remained uncertain as long as the σ meson had not been observed. This, however, has changed dramatically in the last years after the σ meson has been observed as part of the constituent-quark structure via Compton scattering by the nucleon. This experiment was carried out at MAMI (Mainz) and published in 2001 [5,6]. The final interpretation of the results obtained required some further theoretical studies which were published in 2010, 2011 and 2013. Through this experimental and theoretical work the σ meson is by now well investigated and the process of mass generation of constituent quarks well understood ... 
For the Higgs boson the theoretical research started with the work of Goldstone (1961 [10] and 1962 [11]) where it was shown that spontaneous symmetry breaking leads to massless particles in addition to a heavy particle. This is not a problem for the σ meson where the light π mesons are massless in the chiral limit and have only a small mass as real particles, serving as pseudo-Goldstone bosons. For the Higgs boson these massless Goldstone bosons are strongly unwanted particles because they seem not to be present in nature. Therefore, in a number of papers scenarios were developed leading to symmetry breaking without Goldstone bosons. This essential modification is related to the introduction of massless gauge bosons which swallow the Goldstone bosons and in this way generate mass and a longitudinal field component ... 

The two (essential) sources of mass generation for quantum particles
Nowadays the origin of the theory of spontaneous symmetry breaking is most frequently attributed to the work of Peter Higgs ... But it was Peter Higgs himself who correctly pointed out ... that the fact that vacuum expectation values of scalar fields might play a role in the breaking of symmetries was first noted by Schwinger. This means that strong and electroweak symmetry breaking both can be traced back to the seminal work of Schwinger ... and that the introduction of the σ meson inspired electroweak symmetry breaking, though these two processes take place at completely different scales. The interest in this interplay between the two sectors of symmetry breaking remains of importance up to the present. 
Translated into present-day language, Schwinger introduced a generic relation between the mass m of a particle and the vacuum expectation value of a scalar field φ(0), given in the form 
m ∝ g⟨φ(0)⟩.                                                           (2)
For the purpose of the present paper we translate Eq. (2) into three related equations, viz. 
 m_l = g_Hll (1/√2) v,                                                       (3)
m^0_q = g_Hqq (1/√2) v                                                     (4)
and
m^cl_q = g_σqq f^cl_π .                                                        (5) 
Eqs. (3) and (4) relate the lepton and current-quark masses, respectively, to the electroweak vacuum expectation value v of the Higgs field, and Eq. (5) relates the constituent-quark mass in the chiral limit (cl), where the effects of the Higgs field are turned off, to the pion decay constant f^cl_π in the chiral limit. In the case of Eqs. (3) and (4) the Higgs-lepton and Higgs-quark coupling constants g_Hll and g_Hqq can be calculated from the known electroweak vacuum expectation value v = 246 GeV and the known lepton mass m_l and current-quark mass m^0_q, respectively. In the case of Eq. (5) the pion decay constant in the chiral limit is f^cl_π = 89.8 MeV and g_σqq = 2π/√3 = 3.62 ... 
In (3) – (5) two sources of mass are discussed, viz symmetry breaking mediated by the Higgs field (Eqs. 3 and 4) and spontaneous symmetry breaking mediated by the σ field (Eq. 5). These are the main sources of mass generation. In case of strong interaction there are in addition dynamic effects related to excited states, the interaction of spins and effects due to gluons where the latter effects show up in the form of glueballs and the UA(1) anomaly.
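As a quick numerical cross-check of Eqs. (3) and (5), using only the inputs quoted in the article (the variable names, e.g. g_Hee for the Higgs-electron coupling, are mine):

```python
# Numerical check of Schumacher's Eqs. (3) and (5), assuming his quoted
# inputs: v = 246 GeV, f_pi^cl = 89.8 MeV, g_sigma-qq = 2*pi/sqrt(3).
from math import sqrt, pi

v = 246_000.0          # electroweak vacuum expectation value, in MeV
m_e = 0.511            # electron mass, MeV

# Eq. (3): m_l = g_Hll * v / sqrt(2)  ->  Higgs-electron coupling
g_Hee = sqrt(2) * m_e / v
print(g_Hee)           # ~2.9e-6: tiny, as expected for the electron

# Eq. (5): constituent-quark mass in the chiral limit
g_sigma_qq = 2 * pi / sqrt(3)   # = 3.62...
f_pi_cl = 89.8                  # pion decay constant in the chiral limit, MeV
m_q_cl = g_sigma_qq * f_pi_cl
print(m_q_cl)          # ~326 MeV, roughly a third of the nucleon mass
```

The ~326 MeV constituent mass is indeed roughly one third of the nucleon mass, which is the sense in which the σ meson accounts for most of the mass of visible matter.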

A probably fundamental Higgs boson and an indubitably composite sigma meson
The discovery of the electroweak Higgs boson has led to models of the vacuum where an overall Higgs field is assumed to exist. For a mass of the Higgs boson of 126 GeV this Higgs field is expected to be elementary, i.e. not composed of fermion-antifermion pairs. The reason for this conclusion is that Higgs bosons composed of t̄t pairs or techniquark-antitechniquark pairs are predicted to have higher masses. In parallel to this, for the strong-interaction counterpart an overall σ field may be assumed to exist in the QCD vacuum, which is composed of ūu and d̄d pairs forming the structure σ = 1/√2(ūu + d̄d). Then the generation of the mass of the constituent quarks may be understood in terms of a q̄q condensate attached to the current quarks. In this condensate the q̄q pairs are ordered to form mesons, like the π meson isospin triplet and the σ meson.

A 600 MeV meson to explain the mass of the nucleons and a 126 GeV boson to understand that of their components
Such is the surprising result of the most recent experiments carried out to understand the mass generation of quantum particles.  
The most important recent discovery is that the largest part of the electric polarizability and the total diamagnetic polarizability of the nucleon are properties of the σ meson as part of the constituent-quark structure, as expected from the mechanism of chiral symmetry breaking. This view is supported by an experiment on Compton scattering by the proton carried out in the second resonance region, where a large contribution from the σ meson enters into the scattering amplitudes. This experiment led to a determination of the mass of the σ meson of mσ = 600 ± 70 MeV. From the experimental αp and the predicted differences (αn − αp), neutron polarizabilities in the range αn = 12.0–13.4 are predicted ...
Martin Schumacher, Dispersion theory of nucleon Compton scattering and polarizabilities 27/04/2013

Sigma mesons of yesterday and today (are they the same?)
(draft still in progress)

Saturday, May 3, 2014

(We must not) give up the paradigm of (low-energy) supersymmetry ...

No comment // or almost

//The blogger continues his dogged inquiry into supersymmetry and the naturalness problem, already raised several times (such as here and there), pausing today on the recent contributions of two physicists who are experienced yet still young, and hence hungry for discoveries!

... before the second LHC "round" at higher energies has exhausted the space of possibilities down to the 10^-3 level?
//That is the view defended by H. Murayama, a Japanese theoretical physicist, in a recent article that surveys the prospects opened up for experimental high-energy physics by the most recent advances:
The discovery of a “Higgs-like particle” on July 4, 2012 was a truly historic moment in the history of science ... So far, what we’ve seen looks minimal ... It took a whopping eighty years to come to the point where we now have a UV-complete theory of strong, weak, and electromagnetic forces with all of the parameters measured ...
Despite this achievement, or rather because of it, there is a building anxiety in the community. How come we don't see anything else? Will the progress stop? There is no sign of physics beyond the Standard Model in the LHC data. For a typical search for supersymmetric particles, for example, squarks and gluinos are excluded up to 1.3 TeV or so. On the other hand, the conventional arguments based on the naturalness concept suggested that we should have new particles that stabilize the electroweak scale below TeV. It appears that "natural and simple" models have been excluded ...
I have to point out, however, that certain levels of fine-tuning do occur in nature. All examples I'm aware of, with the glaring exception of the cosmological constant, are at most at the level of a few per-mille. The current LHC limit has not quite reached that level; the next runs at 13-14 TeV may well reveal new physics as we hoped for.
 Hitoshi Murayama, Future experimental programs, 01/11/2013

Do not give up [the best(?) potential solution for] solving a naturalness problem that has already led to several fundamental discoveries  
//Here is the argument developed further on in Murayama's article:
... the best argument we have right now to expect new physics in the TeV range is the naturalness: we would like to avoid fine-tuning between the bare m2h and the radiative correction ... Even though many in the community are ditching the naturalness altogether, I still take the argument seriously because it has worked many times before.
One example I always bring up is the discovery of the positron [24, 25]. In classical electrodynamics, the Coulomb self-energy of the electron is linearly divergent, ... It would have required a fine cancellation between the “bare” mass of the electron ... and the correction to yield a small mass [of] 0.511 MeV. However, the discovery of the positron and quantum mechanics told us that the vacuum is always fluctuating, producing a pair of e+e−, that annihilates back to the vacuum within the time allowed by the uncertainty principle ... When you place an electron in this fluctuating vacuum, it may find a (virtual) positron near it and decide to annihilate it. Then the other electron that was originally in the vacuum fluctuation is now left out and becomes a “real” particle. It turns out that this process cancels the linear divergence exactly, leaving only a logarithmic divergence. Even for an electron as small as the Planck distance, it amounts to only 9% correction. The cancellation is guaranteed by a (softly broken) chiral symmetry. You can see that the naturalness problem was solved by doubling the number of particles!
The idea of supersymmetry was pushed to repeat the history. Because the Higgs boson must repel itself, it also has a divergent self-repulsion energy ... But by doubling the number of particles (namely introducing superpartners), there is a cancellation between the self-repulsion among Higgs bosons, and the induced attraction arising from the loop of higgsinos (fermionic partner of the Higgs boson). Again, the correction is down to a logarithmic divergence ...

In the case of the electron, new physics (positron) appears “early” at [a] Compton wave length  [of] 400 fm well before we get down to the smaller “classical radius of electron” rc = e2/mec2 ≈ 1 fm where the theory becomes fine-tuned. In another well-known case, however, nature did fine-tune it so that the discovery was delayed. 
The example is COBE (Cosmic Background Explorer) that discovered the CMB anisotropy. People expected anisotropy at the level of 10−5 so that the observed large-scale structure can be explained. But the search went on, so much so that people started writing articles questioning the inflationary cosmology itself. When COBE discovered the quadrupole moment, it was small. Actually, compared to our best prediction today based on the WMAP data, it was nearly an order of magnitude smaller than theory. This is usually understood today as a consequence of cosmic variance, namely that the quadrupole moment has only 2l + 1 = 5 numbers to be measured and hence is subject to a statistical uncertainty of O(1/√5). I find the observed quadrupole moment to be fine-tuned at the 2% level. Note that the inflation was invented to solve the naturalness problems, horizon problem and flatness problem of the standard Big Bang cosmology. It worked: the current data beautifully confirm predictions of inflation. But it was a little fine-tuned and it required patience and more work. So the moral I draw from these examples is that the naturalness argument generally does work. But there are cases where fine-tuning at the level of a few percent or even a few per-mille does occur (some examples in nuclear physics are well-known, see [26]). ... we have not fully explored down to that level of not-that-fine-tuning yet. And it took ten years for the Tevatron to discover [the] top [quark]. Patience pays, hence my optimism. 
Ibid. 
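Murayama's "only 2l + 1 numbers" remark is the standard cosmic-variance statement: the measured C_l at multipole l carries an irreducible fractional uncertainty of √(2/(2l+1)). A two-line sketch (my own illustration of the textbook formula):

```python
# Rough cosmic-variance estimate for the CMB power spectrum: at multipole
# l there are only 2l+1 independent modes on one sky, so C_l carries an
# irreducible relative uncertainty sqrt(2/(2l+1)).
from math import sqrt

def cosmic_variance(l):
    """Fractional uncertainty on C_l from having only one sky to observe."""
    return sqrt(2.0 / (2 * l + 1))

print(cosmic_variance(2))     # ~0.63: the quadrupole is poorly constrained
print(cosmic_variance(1000))  # ~0.03: high-l modes are much better measured
```

This is why a low observed quadrupole, by itself, is a weak argument against inflation: the expected scatter at l = 2 is of order one.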

History cannot serve as scientific justification, but retrospective analysis teaches us a great deal about past errors and missed opportunities
//This is what N. Arkani-Hamed, a young (the same age as the blogger ;-) and brilliant American theoretical physicist, shows in an article that sets out clearly and concisely the theoretical challenge posed by the experimental discovery of the Higgs boson:

As is well known, for massive particles of spin one like the W and Z bosons, merely ignoring mass at high energies is missing one piece of physics, given the mismatch in the degrees of freedom between the three massive polarization states and the two helicities of the massless spin 1 particles. Furthermore, the interactions of the third ‘longitudinal’ component become large at high energies. Something new is needed to unitarize scattering processes involving the longitudinal W’s and Z’s, and from the list of consistent possibilities for particle interactions, only spin 0 particles—the simplest possibility being a single Higgs particle—can do the job. It is remarkable that this simplest possibility is actually realized in Nature ... why is the Higgs mass so much smaller than ultraviolet scales like the Planck scale? For massless particles with spin, we have a satisfying answer to this question: as we have already remarked, the number of spin degrees of freedom are discretely different for massless versus massive particles: 3≠2 for spin 1, and 5 ≠ 2 for spin 2. Therefore, interactions cannot change massless particles into massive ones. But the situation is different for the Higgs: 1 = 1, there is no difference in degrees of freedom between massive and massless spin 0 particles, and thus no direct understanding for the lightness of the Higgs, absent new physics at the weak scale.
This is not the first time naturalness arguments have arisen in the development of fundamental physics. Indeed, issues similar to the Higgs tuning problem arose three times in the 20 th century, and were resolved in the expected way, with ‘natural’ new physics showing up where expected... naturalness issues arose in the context of [(i) the linearly divergent energy stored in the electric field surrounding a point-like electron, soften by the quantum field effects of the positron, (ii)] the mass splitting between the charged and neutral pions, where the ρ meson cuts off the quadratically divergent contribution to the charged pion mass from the photon loop at the required scale, [(iii)] as well as with K–K¯ mixing, where the charm quark appeared where needed to cut off the quadratically divergent contribution to ΔmK. 
Of course arguments from history are suspect, and can be used to illustrate any polemical point one wishes to make. If we go back further in time, naturalness arguments have also failed spectacularly. A good example relates to the heliocentric model of the solar system, which was already advanced in ancient times by Aristarchos. He had already brilliantly determined the distance from the Earth to the Sun, which was thought to be enormously large. But by putting the Sun at the center of the solar system, his theory made a prediction of parallax for the distant stars. This is too small to be seen by the naked eye. Of course he knew a way out—he had to declare that the distant stars were even more ridiculously far away than the Sun. This struck most of his contemporaries as obviously wrong, and special pleading for this model: why should the earth be so close to one object, and so far away from all the others, conveniently far enough for the model not to be false? Their notion of naturalness thus led them to reject the simple and correct model of the solar system.
Nima Arkani-Hamed, Beyond the Standard Model theory, 01/10/2013
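The opening argument about longitudinal W and Z modes can be made concrete with a small kinematic sketch (textbook formulas, my own illustration): the longitudinal polarization vector of a massive spin-1 particle moving along z is ε_L = (|p|/m, 0, 0, E/m), properly normalized at any energy but with components that grow like E/m, which is what makes amplitudes involving W_L and Z_L grow with energy and demands a unitarizing spin-0 state.

```python
# Sketch of why longitudinal W/Z interactions grow with energy: the
# longitudinal polarization vector of a massive spin-1 particle moving
# along z is eps_L = (|p|/m, 0, 0, E/m), whose components grow like E/m.
from math import sqrt

def eps_L(E, m):
    """Longitudinal polarization 4-vector (t, x, y, z) for momentum along z."""
    p = sqrt(E * E - m * m)
    return (p / m, 0.0, 0.0, E / m)

def minkowski_sq(v):
    t, x, y, z = v
    return t * t - x * x - y * y - z * z

m_W = 80.4  # W boson mass in GeV
for E in (100.0, 1000.0):
    e = eps_L(E, m_W)
    # The normalization eps.eps = -1 holds at any energy, but the
    # components themselves scale like E/m_W, which is what makes
    # W_L W_L scattering amplitudes grow without a Higgs-like state.
    print(round(minkowski_sq(e), 6), round(e[3], 1))
```

The same degrees-of-freedom counting (3 vs 2 for spin 1) is what has no analogue for spin 0, which is Arkani-Hamed's point about the Higgs.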

//It remains to be seen whether today's physicists are not, in a way, repeating the past mistake of Aristarchos's contemporaries by neglecting, for now, certain models that do not fit the perhaps too-narrow framework of their notion of naturalness ... but that is another story, debated elsewhere by the blogger.