sábado, 21 de setembro de 2013

CAST - Matter and antimatter



CAST - CERN Axion Solar Telescope 


Nobel Prizes - (4) Max Planck, Sheldon Glashow, Howard Georgi and Steven Weinberg

Hypothetical particles called axions could explain differences between matter and antimatter - and we may find them at the centre of the Sun.


The CERN Axion Solar Telescope (CAST) is an experiment to search for hypothetical particles called "axions". These have been proposed by some theoretical physicists to explain why there is a subtle difference between matter and antimatter in processes involving the weak force, but not the strong force. If axions exist, they could be found in the centre of the Sun and they could also make up invisible dark matter.

CAST is searching for these particles with a telescope designed to detect axions from the Sun. It uses an unexpected hybrid of equipment from particle physics and astronomy. The telescope is made from a prototype of a dipole magnet for the Large Hadron Collider, with its hollow beam pipes acting as viewing tubes. 

To allow the magnet to operate in a superconducting state, it is supplied with cryogenic infrastructure previously used by the Large Electron-Positron collider's DELPHI experiment. A focusing mirror system for X-rays (recovered from the German space programme), an X-ray detector at each end, and a moving platform add the final touches to turn the magnet into a telescope that tracks the Sun.

The idea is that the magnetic field acts as a catalyst to transform axions into X-rays, making them relatively easy to detect. The strength and length of the superconducting dipole magnet ensure the efficiency of the conversion process. CAST brings together techniques from particle physics and astronomy, and benefits from CERN’s expertise in accelerators, X-ray detection, magnets and cryogenics.


Expanding Universe

The galaxies we see in all directions are moving away from the Earth, as evidenced by their red shifts. Hubble's law describes this expansion.


The fact that we see all other galaxies moving away from us does not imply that we are the center of the universe! In an expanding universe, every galaxy sees all other galaxies moving away from it. A rising loaf of raisin bread is a good visual model: each raisin sees all other raisins moving away from it as the loaf expands.

The fact that the universe is expanding then raises the question "Will it always expand?" Since the action of gravity works against the expansion, if the density were large enough the expansion would stop and the universe would collapse in a "big crunch".

This is called a closed universe. If the density were small enough, the expansion would continue forever (an open universe). At a certain precise critical density, the universe would asymptotically approach zero expansion rate, but never collapse. Remarkably, all evidence indicates that the universe is very close to that critical density. Discussions about the expansion of the universe often refer to a density parameter Ω, which is the density divided by the critical density, such that Ω = 1 represents the critical density condition.
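The critical density follows from the Friedmann equation as rho_c = 3H0^2/(8πG). A minimal sketch of the arithmetic, assuming H0 = 70 km/s per megaparsec (close to the WMAP value quoted elsewhere in this article):

```python
import math

# Physical constants (SI units)
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
MPC_IN_M = 3.086e22    # one megaparsec in metres

# Hubble constant: 70 km/s per Mpc, converted to s^-1
H0 = 70 * 1000 / MPC_IN_M

# Critical density from the Friedmann equation: rho_c = 3 H0^2 / (8 pi G)
rho_c = 3 * H0**2 / (8 * math.pi * G)

print(f"H0 = {H0:.3e} per second")
print(f"critical density = {rho_c:.2e} kg/m^3")
```

The result is roughly 9 x 10^-27 kg/m^3, on the order of five or six hydrogen-atom masses per cubic metre; the density parameter Ω is then just the actual density divided by this value.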


Hubble's Law

Hubble's law is a statement of a direct correlation between the distance to a galaxy and its recessional velocity as determined by the red shift. It can be stated as

v = H0 d

where v is the recessional velocity, d is the distance to the galaxy, and H0 is the Hubble parameter.
The reported value of the Hubble parameter has varied widely over the years, a testament to the difficulty of astronomical distance measurement. But with high-precision experiments after 1990 the range of reported values has narrowed greatly, to values close to 70 km/s per megaparsec.

An often mentioned problem for the Hubble law is Stephan's Quintet. Four of these five galaxies have similar red shifts, but the fifth is quite different, and they appear to be interacting.


The Particle Data Group documents quote a "best modern value" of the Hubble parameter as 72 km/s per megaparsec (+/- 10%). This value comes from the use of type Ia supernovae (which give relative distances to about 5%) along with data from Cepheid variables gathered by the Hubble Space Telescope. The WMAP mission data leads to a Hubble constant of 71 +/- 5% km/s per megaparsec.

Hubble Parameter 

The proportionality between recession velocity and distance in the Hubble Law is called the Hubble constant, or more appropriately the Hubble parameter since it does depend upon time. In recent years the value of the Hubble parameter has been considerably refined, and the current value given by the WMAP mission is 71 km/s per megaparsec.

Edwin Powell Hubble

The recession velocities of distant galaxies are known from the red shift, but the distances are much more uncertain. Distance measurement to nearby galaxies uses Cepheid variables as the main standard candle, but more distant galaxies must be examined to determine the Hubble constant since the direct Cepheid distances are all within the range of the gravitational pull of the local cluster. Use of the Hubble Space Telescope has permitted the detection of Cepheid variables in the Virgo cluster which have contributed to refinement of the distance scale.


Another approach to the Hubble parameter gives emphasis to the fact that space itself is expanding, and at any given time can be described by a dimensionless scale factor R(t). The Hubble parameter is the ratio of the rate of change of the scale factor to the current value of the scale factor R:

H = (dR/dt) / R

The scale factor R for a given observed object in the expanding universe, relative to R0 = 1 at the present time, may be implied from the z-parameter expression of the redshift. The Hubble parameter has the dimensions of inverse time, so a Hubble time tH may be obtained by inverting the present value of the Hubble parameter:

tH = 1/H0


One must use caution in interpreting this "Hubble time" since the relationship of the expansion time to the Hubble time is different for the radiation dominated era and the mass dominated era. Projections of the expansion time may be made from the expansion models.
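The inversion is straightforward to check numerically. A minimal sketch, assuming H0 = 70 km/s per megaparsec, together with the standard scale-factor relation R = 1/(1 + z):

```python
# Hubble time t_H = 1/H0, with H0 converted from km/s/Mpc to s^-1
MPC_IN_M = 3.086e22            # one megaparsec in metres
SECONDS_PER_YEAR = 3.156e7

H0 = 70 * 1000 / MPC_IN_M      # Hubble constant in s^-1
t_hubble = 1 / H0              # Hubble time in seconds

print(f"Hubble time = {t_hubble / SECONDS_PER_YEAR:.2e} years")

# Scale factor at emission for a given redshift z, with R0 = 1 now
z = 0.1
R = 1 / (1 + z)
print(f"scale factor at z = {z}: R = {R:.3f}")
```

The Hubble time comes out at about 1.4 x 10^10 years, which is why it is so tempting (and, as the text cautions, so hazardous) to read it directly as the age of the universe.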



Hubble Parameter and Red Shifts

The Hubble Law states that the distance to a given galaxy is proportional to the recessional velocity as measured by the Doppler red shift. The red shift of the spectral lines is commonly expressed in terms of the z-parameter, which is the fractional shift in the spectral wavelength:

z = (λobserved - λrest) / λrest

and can be calculated from the wavelength shift of any spectral line. For small z, the recessional velocity is v = zc, and the Hubble distance is given by

r = v/H0 = zc/H0

For example, the hydrogen red line at 656.3 nm redshifted by 10% (z = 0.1) with a Hubble constant of 70 km/s/Mpc gives r of roughly 430 Mpc, or about 1400 Mly.

Mpc = megaparsecs
Mly = million light years
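A minimal sketch of the calculation, using the default values the original interactive page mentioned (the hydrogen red line, a 10% redshift, and a Hubble constant of 70); the 656.3 nm rest wavelength is the standard H-alpha value:

```python
C_KM_S = 2.998e5        # speed of light, km/s
H0 = 70.0               # Hubble constant, km/s per Mpc
MLY_PER_MPC = 3.262     # million light years per megaparsec

lambda_rest = 656.3         # hydrogen red (H-alpha) line, nm
lambda_obs = 656.3 * 1.10   # redshifted by 10%

z = (lambda_obs - lambda_rest) / lambda_rest  # fractional wavelength shift
v = z * C_KM_S                                # recessional velocity (small-z approximation)
r_mpc = v / H0                                # Hubble distance in Mpc
r_mly = r_mpc * MLY_PER_MPC

print(f"z = {z:.2f}, v = {v:.0f} km/s")
print(f"r = {r_mpc:.0f} Mpc = {r_mly:.0f} Mly")
```

Note that v = zc is only valid for small redshifts; for z approaching 1 and beyond, the relativistic Doppler expression must be used instead.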


The Antimatter Problem 

Why is there such a predominance of matter over antimatter in the universe? From Trefil, p. 38: "after the beginning of the particle era, there is no known process which can change the net particle number of the universe"; "...by the time the universe is a millisecond old, the balance between matter and antimatter is fixed forever."

Clearly there is some asymmetry in the way nature treats matter and antimatter. One promising line of investigation is that of CP symmetry violations in the decay of particles by the weak interaction. The main experimental evidence comes from the decay of neutral kaons, which shows a small violation of CP symmetry. In the decay of the kaons to electrons, we have a clear distinction between matter and antimatter, and this could be at least one of the keys to the predominance of matter over antimatter in the universe.

A newer discovery at the Large Hadron Collider is a 0.8% difference in the decay rates of the D-meson and its antiparticle, which could be another contribution to the solution of the antimatter problem.

The Galaxy Formation Problem 

Random nonuniformities in the expanding universe are not sufficient to allow the formation of galaxies. In the presence of the rapid expansion, the gravitational attraction is too slow for galaxies to form with any reasonable model of turbulence created by the expansion itself. "...the question of how the large-scale structure of the universe could have come into being has been a major unsolved problem in cosmology" (Trefil, p. 43); "we are forced to look to the period before 1 millisecond to explain the existence of galaxies."

The Horizon Problem 

The microwave background radiation from opposite directions in the sky is characterized by the same temperature to within 0.01%, but the regions of space from which it was emitted, some 500,000 years after the big bang, were more than a light transit time apart and could not have "communicated" with each other to establish the apparent thermal equilibrium - they were beyond each other's "horizon".

This situation is also referred to as the "isotropy problem", since the background radiation reaching us from all directions in space is so nearly isotropic. One way of expressing the problem is to say that the temperature of parts of space in opposite directions from us is almost exactly the same, but how could they be in thermal equilibrium with each other if they cannot communicate with each other?

If you consider the ultimate lookback time as 14 billion years (14 thousand million years), as obtained from a Hubble constant of 71 km/s per megaparsec as suggested by WMAP, then these remote parts of the universe are 28 billion light years apart - so why do they have exactly the same temperature?
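The 28-billion-light-year figure is simple arithmetic; a sketch assuming a 14-billion-year lookback time in each of two opposite directions:

```python
# Horizon-problem arithmetic: two patches of sky seen in opposite directions
lookback_gyr = 14.0                # lookback time to the microwave background, billions of years

# Each patch is ~14 billion light years from us, in opposite directions,
# so their present separation is twice the lookback distance
separation_gly = 2 * lookback_gyr  # billions of light years

print(f"separation = {separation_gly:.0f} billion light years")
# Light could have crossed at most ~14 billion light years since the big bang,
# so without inflation the two patches could never have exchanged signals
```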

Being twice the age of the universe apart is enough to make the point about the horizon problem, but as Schramm points out, if you look at this problem from earlier perspectives it is even more severe. At the time the photons were actually emitted, they would have been 100 times the age of the universe apart, or 100 times causally disconnected.

This problem is one of the lines of thought which led to the inflationary hypothesis put forth by Alan Guth in the early 1980s. The answer to the horizon problem from the inflationary point of view is that there was a period of incredibly rapid inflation very early in the big bang process which increased the size of the universe by a factor of 10^20 or 10^30, and that the present observable universe is "inside" that expansion.

The radiation we see is isotropic because all that space "inflated" from a tiny volume and had essentially identical initial conditions. This is a way to explain why parts of the universe so distant that they could never have communicated with each other look the same.

The Flatness Problem 

Observations indicate that the amount of matter in the universe is surely greater than one-tenth and surely less than ten times the critical amount needed to stop the expansion. It is either barely open or barely closed, or "very nearly flat". There is a good analogy here - a ball thrown up from the earth slows down. With the same velocity from a small asteroid, it might never stop (Trefil pp46-47).

Early in this theoretical toss from the asteroid, it might appear that you have thrown it with just the right velocity to go on forever, slowing toward zero velocity at infinite time and distance. But as time progressed, it would become more and more evident if you had missed the velocity window even a small amount. If after 20 billion years of travel, it still appeared that you had thrown it with the right velocity, then that original throw was precise indeed.

Any departures from "flatness" should become exaggerated with time, and at this stage of the universe, tiny irregularities should have been much amplified. If the density of the present universe appears to be very close to the critical density, then it must have been even closer to "flat" in earlier epochs. Alan Guth credits a lecture by Robert Dicke as one influence which put him on the "inflationary" path; Dicke pointed out that the flatness of today's universe would require that the universe be flat to one part in 10^14 at one second after the big bang. Kaufmann suggests that right after the big bang, the density must have been equal to the critical density to 50 decimal places!

In the early 1980s, Alan Guth proposed that there was a brief period of extremely rapid expansion following the Planck time of 10^-43 seconds. This "inflationary model" was a way of dealing with both the flatness problem and the horizon problem. If the universe inflated by 20 to 30 orders of magnitude, then the properties of an extremely tiny volume which could have been considered to be intimately connected were spread over the whole of the known universe today, contributing both the extreme flatness and the extremely isotropic nature of the cosmic background radiation.

Before 1 Planck Time 

Before a time classified as a Planck time, 10^-43 seconds, all of the four fundamental forces are presumed to have been unified into one force. All matter, energy, space and time are presumed to have exploded outward from the original singularity. Nothing is known of this period.

It is not that we know a great deal about later periods either, it is just that we have no real coherent models of what might happen under such conditions. The electroweak unification has been supported by the discovery of the W and Z particles, and can be used as a platform for discussion of the next step, the Grand Unification Theory (GUT). The final unification has been called a "supergrand unification theory", and becoming more popular is the designation "theory of everything" (TOE). But "theories of everything" are separated by two great leaps beyond the experiments we could ever hope to do on the Earth.

Era of 1 Planck Time 

In the era around one Planck time, 10^-43 seconds, it is projected by present modeling of the fundamental forces that the gravity force begins to differentiate from the other three forces. This is the first of the spontaneous symmetry breaks which lead to the four observed types of interactions in the present universe.

Looking backward, the general idea is that back beyond 1 Planck time we can make no meaningful observations within the framework of classical gravitation. One way to approach the formulation of the Planck time is presented by Hsu. One of the characteristics of a black hole is that there is an event horizon beyond which we can obtain no information - scales smaller than that are hidden from the outside world. For a given enclosed mass m, this limit is on the order of

L = Gm/c^2

where G is the gravitational constant and c is the speed of light. But from the uncertainty principle and the de Broglie wavelength, we can infer that the smallest scale at which we could locate the event horizon would be the Compton wavelength:

λ = h/mc

Equating L and λ, we obtain a characteristic mass called the Planck mass:

mP = sqrt(hc/G)

Substituting this mass back into one of the length expressions gives the Planck length

LP = sqrt(hG/c^3)

and the light travel time across this length is called the Planck time:

tP = LP/c = sqrt(hG/c^5), on the order of 10^-43 seconds

Keep in mind that this is a characteristic time, so its order of magnitude is what should be noted. Sometimes it is defined with the wavelength above divided by 2π (that is, using ħ instead of h), so don't worry about the number of significant digits.
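The characteristic scales are easy to check numerically. A minimal sketch using h as above (the more conventional values use ħ = h/2π, which makes each result smaller by a factor of sqrt(2π)):

```python
import math

# Fundamental constants (SI units)
h = 6.626e-34    # Planck constant, J s
G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8      # speed of light, m/s

# Equating the horizon scale G m / c^2 with the Compton wavelength h / (m c)
planck_mass = math.sqrt(h * c / G)        # kg
planck_length = math.sqrt(h * G / c**3)   # m
planck_time = planck_length / c           # s

print(f"Planck mass   ~ {planck_mass:.2e} kg")
print(f"Planck length ~ {planck_length:.2e} m")
print(f"Planck time   ~ {planck_time:.2e} s")
```

The Planck time comes out around 1.4 x 10^-43 s, consistent with the 10^-43 seconds used throughout this article; only the order of magnitude matters here.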

Separation of the Strong Force 

At a time around 10^-36 seconds, present models project a separation of the strong force, one of the four fundamental forces. Before this time the forces other than gravity would be unified in what is called the grand unification. The spontaneous symmetry breaking which occurs in this era would distinguish as a separate interaction the force which would hold nuclei together in later eras.

In the 1970s, Sheldon Glashow and Howard Georgi proposed the grand unification of the strong, weak, and electromagnetic forces at energies above 10^14 GeV. If the ordinary concept of thermal energy applied at such times, it would require a temperature of 10^27 K for the average particle energy to be 10^14 GeV.
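The temperature estimate follows from equating the average particle energy with the thermal energy, E = kT. A quick check, assuming only the Boltzmann constant and the eV-to-joule conversion:

```python
# Convert the grand-unification energy scale to an equivalent temperature via E = k T
EV_IN_J = 1.602e-19      # one electron volt in joules
K_BOLTZMANN = 1.381e-23  # Boltzmann constant, J/K

E_gut = 1e14 * 1e9 * EV_IN_J   # 10^14 GeV expressed in joules
T = E_gut / K_BOLTZMANN        # equivalent temperature in kelvin

print(f"T ~ {T:.1e} K")
```

This gives roughly 1.2 x 10^27 K, matching the 10^27 K quoted in the text.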

Though the strong force is distinct from gravity and the electroweak force in this era, the energy level is still too high for the strong force to hold protons and neutrons together, so that the universe is still a "sizzling sea of quarks".

Inflationary Period
Triggered by the symmetry breaking that separates off the strong force, models suggest an extraordinary inflationary phase in the era from 10^-36 seconds to 10^-32 seconds. More expansion is presumed to have occurred in this instant than in the entire period (14 billion years?) since.

The inflationary epoch may have expanded the universe by a factor of 10^20 or 10^30 in this incredibly brief time. The inflationary hypothesis offers a way to deal with the horizon problem and the flatness problem of cosmological models.

Lemonick and Nash, in a popular article for Time, describe inflation as an "amendment to the original Big Bang" as follows: "when the universe was less than a billionth of a billionth of a billionth of a second old, it briefly went through a period of supercharged expansion, ballooning from the size of a proton to the size of a grapefruit (and thus expanding at many, many times the speed of light). Then the expansion slowed to a much more stately pace. Improbable as the theory sounds, it has held up in every observation astronomers have managed to make."

Quark-antiquark Period 

As the inflationary period ends, the universe consists mostly of energy in the form of photons, and those particles which exist cannot bind into larger stable particles because of the enormous energy density. They would exist as a collection of quarks and antiquarks along with their exchange particles, a state which has been described as a "sizzling sea of quarks". This time period is estimated at 10^-32 seconds to 10^-5 seconds. During this period the electromagnetic and weak forces undergo the final symmetry break, ending the electroweak unification at about 10^-12 seconds.

Quark Confinement 


When the expansion of the "primordial fireball" had cooled it to 10^13 kelvin, at a time modeled to be about 10^-6 seconds, the collision energies had dropped to about 1 GeV and quarks could finally hang onto each other to form individual protons and neutrons (and presumably other baryons).

At this time, all the kinds of particles which are a part of the present universe were in existence, even though the temperature was still much too high for the formation of nuclei, as indicated by Steven Weinberg in The First Three Minutes.

