What Is the Standard Model?

What is a Photon?

In physics, a photon is an elementary particle, the quantum of the electromagnetic interaction and the basic unit of light and all other forms of electromagnetic radiation. In other words, a photon is a little packet of energy that carries electromagnetic radiation. It is also the force carrier for the electromagnetic force. The effects of this force are easily observable at both the microscopic and macroscopic level, because the photon has no rest mass; this allows for interactions at long distances. Like all elementary particles, photons are currently best explained by quantum mechanics and exhibit wave–particle duality, showing properties of both waves and particles. For example, a single photon may be refracted by a lens or exhibit wave interference with itself, but it also acts as a particle, giving a definite result when its momentum is measured.

The modern concept of the photon was developed gradually by Albert Einstein to explain experimental observations that did not fit the classical wave model of light. In particular, the photon model accounted for the frequency dependence of light's energy, and explained the ability of matter and radiation to be in thermal equilibrium. It also accounted for anomalous observations, including the properties of black-body radiation, that other physicists, most notably Max Planck, had sought to explain using semiclassical models, in which light is still described by Maxwell's equations but the material objects that emit and absorb light are quantized. Although these semiclassical models contributed to the development of quantum mechanics, further experiments validated Einstein's hypothesis that light itself is quantized; the quanta of light are photons.

In the modern Standard Model of particle physics, photons are described as a necessary consequence of physical laws having a certain symmetry at every point in spacetime. The intrinsic properties of photons, such as charge, mass and spin, are determined by the properties of this gauge symmetry. The neutrino theory of light, which attempts to describe the photon as a composite structure, has been unsuccessful so far. The photon concept has led to momentous advances in experimental and theoretical physics, such as lasers, Bose–Einstein condensation, quantum field theory, and the probabilistic interpretation of quantum mechanics. It has been applied to photochemistry, high-resolution microscopy, and measurements of molecular distances. Recently, photons have been studied as elements of quantum computers and for sophisticated applications in optical communication such as quantum cryptography.
Standard model of particle physics

In 1900, Max Planck was working on black-body radiation and suggested that the energy in electromagnetic waves could only be released in "packets" of energy. In his 1901 article in Annalen der Physik he called these packets "energy elements". The word quanta (singular: quantum) was used even before 1900 to mean particles or amounts of different quantities, including electricity. In 1905, Albert Einstein went further by suggesting that electromagnetic waves could only exist as such discrete wave-packets. He called such a wave-packet the light quantum (German: das Lichtquant). The name photon derives from the Greek word for light and was coined in 1926 by the physical chemist Gilbert Lewis, who published a speculative theory in which photons were "uncreatable and indestructible". Although Lewis's theory was never accepted, being contradicted by many experiments, his new name, photon, was adopted immediately by most physicists. Isaac Asimov credits Arthur Compton with defining quanta of energy as photons in 1927.

In physics, a photon is usually denoted by the Greek letter γ (gamma). This symbol probably derives from gamma rays, which were discovered in 1900 by Paul Villard, named by Ernest Rutherford in 1903, and shown to be a form of electromagnetic radiation in 1914 by Rutherford and Edward Andrade. In chemistry and optical engineering, photons are usually symbolized by hν, the energy of a photon, where h is Planck's constant and the Greek letter ν (nu) is the photon's frequency. Much less commonly, the photon can be symbolized by hf, where its frequency is denoted by f.
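As a quick illustration of the relation E = hν mentioned above, the short Python sketch below converts a photon's wavelength into its frequency and energy. The constant values are standard rounded figures, and the 532 nm example wavelength is an arbitrary illustrative choice.

```python
# Photon energy from frequency and wavelength: E = h*nu = h*c/lambda.
# Constants are standard rounded values; 532 nm (green light) is an arbitrary example.
h = 6.626e-34          # Planck's constant, J*s
c = 2.998e8            # speed of light in vacuum, m/s
e_charge = 1.602e-19   # joules per electron-volt, for unit conversion

wavelength = 532e-9                 # m
frequency = c / wavelength          # nu = c / lambda
energy_joules = h * frequency       # E = h * nu
energy_ev = energy_joules / e_charge

print(f"frequency = {frequency:.3e} Hz")
print(f"energy    = {energy_joules:.3e} J = {energy_ev:.2f} eV")
```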
Physical properties

The photon is massless, has no electric charge, and does not decay spontaneously in empty space. A photon has two possible polarization states and is described by exactly three continuous parameters: the components of its wave vector, which determine its wavelength and its direction of propagation. The photon is the gauge boson for electromagnetism, and therefore all other quantum numbers of the photon (such as lepton number, baryon number, and flavour quantum numbers) are zero.

Photons are emitted in many natural processes. For example, when a charge is accelerated it emits synchrotron radiation. During a molecular, atomic or nuclear transition to a lower energy level, photons of various energies will be emitted, from infrared light to gamma rays. A photon can also be emitted when a particle and its corresponding antiparticle are annihilated (for example, electron–positron annihilation). In empty space, the photon moves at c (the speed of light) and its energy and momentum are related by E = pc, where p is the magnitude of the momentum vector p. The photon also carries spin angular momentum that does not depend on its frequency. The magnitude of its spin is √2 ħ, and the component measured along its direction of motion, its helicity, must be ±ħ. These two possible helicities, called right-handed and left-handed, correspond to the two possible circular polarization states of the photon.

To illustrate the significance of these formulae, the annihilation of a particle with its antiparticle in free space must result in the creation of at least two photons, for the following reason. In the center-of-mass frame, the colliding antiparticles have no net momentum, whereas a single photon always has momentum (since it is determined, as we have seen, only by the photon's frequency or wavelength, which cannot be zero). Hence, conservation of momentum (or equivalently, translational invariance) requires that at least two photons are created, with zero net momentum. (However, if the system interacts with another particle or field, annihilation can produce a single photon; when a positron annihilates with a bound atomic electron, for instance, only one photon may be emitted, because the nuclear Coulomb field breaks translational symmetry.) The energy of the two photons, or, equivalently, their frequency, may be determined from conservation of four-momentum. Seen another way, the photon can be considered as its own antiparticle. The reverse process, pair production, is the dominant mechanism by which high-energy photons such as gamma rays lose energy while passing through matter. That process is the reverse of "annihilation to one photon" allowed in the electric field of an atomic nucleus.

The classical formulae for the energy and momentum of electromagnetic radiation can be re-expressed in terms of photon events. For example, the pressure of electromagnetic radiation on an object derives from the transfer of photon momentum per unit time and unit area to that object, since pressure is force per unit area and force is the change in momentum per unit time.
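To make the E = pc relation and the two-photon annihilation argument concrete, here is a minimal Python sketch. It takes the standard electron rest energy of about 511 keV as given and checks that two back-to-back photons produced when an electron and positron annihilate at rest satisfy E = pc and carry zero net momentum; the variable names are purely illustrative.

```python
# Electron-positron annihilation at rest -> two back-to-back photons.
# Each photon carries E = m_e * c^2 (about 511 keV) and momentum p = E / c.
m_e_c2_keV = 511.0          # electron rest energy in keV (standard value)
c = 2.998e8                 # m/s
keV_to_J = 1.602e-16        # joules per keV

E_photon = m_e_c2_keV * keV_to_J     # energy of each photon, J
p_photon = E_photon / c              # momentum magnitude from E = p*c, kg*m/s

# The two photons travel in opposite directions, so their momenta cancel.
net_momentum = p_photon + (-p_photon)

print(f"each photon: E = {m_e_c2_keV} keV, p = {p_photon:.3e} kg*m/s")
print(f"net momentum of the pair: {net_momentum} (zero, as required)")
```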
Experimental checks on photon mass

The photon is currently believed to be strictly massless, but this is an experimental question. If the photon is not a strictly massless particle, it would not move at the exact speed of light in vacuum, c. Its speed would be lower and would depend on its frequency. Relativity would be unaffected by this; the so-called speed of light, c, would then not be the actual speed at which light moves, but a constant of nature that is the maximum speed any object could theoretically attain in spacetime. Thus, it would still be the speed of spacetime ripples (gravitational waves and gravitons), but it would not be the speed of photons.

A massive photon would have other effects as well. Coulomb's law would be modified and the electromagnetic field would have an extra physical degree of freedom. These effects yield more sensitive experimental probes of the photon mass than the frequency dependence of the speed of light. If Coulomb's law were not exactly valid, an external electric field would produce an electric field inside a hollow conductor; this allows Coulomb's law to be tested to very high precision. Sharper upper limits have been obtained in experiments designed to detect effects caused by the galactic vector potential. Although the galactic vector potential is very large because the galactic magnetic field exists over very long length scales, only the magnetic field is observable if the photon is massless. If the photon were massive, the mass term would affect the galactic plasma. The fact that no such effects are seen implies an upper bound on the photon mass of m < 3×10⁻²⁷ eV/c². The galactic vector potential can also be probed directly by measuring the torque exerted on a magnetized ring. Such methods were used to obtain the sharper upper limit of 10⁻¹⁸ eV/c² given by the Particle Data Group.
These sharp limits from the non-observation of effects caused by the galactic vector potential have been shown to be model dependent. If the photon mass is generated via the Higgs mechanism, then the upper limit of m ≲ 10⁻¹⁴ eV/c² from the test of Coulomb's law is valid.
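The frequency dependence of the speed of a hypothetically massive photon can be estimated from the usual relativistic energy–momentum relation E² = (pc)² + (mc²)², which is standard physics but is not spelled out in the text above. The Python sketch below is purely illustrative: it assumes a hypothetical photon rest energy at the quoted Coulomb-law limit of about 10⁻¹⁴ eV and an arbitrary visible-light photon energy, and shows how tiny the resulting speed deficit would be.

```python
import math

# Hypothetical massive photon: E^2 = (p c)^2 + (m c^2)^2, with v = p c^2 / E.
# Assumes the Coulomb-law upper limit m c^2 ~ 1e-14 eV quoted above;
# a 2.33 eV (green light) photon energy is an arbitrary illustrative choice.
m_c2_eV = 1e-14            # hypothetical photon rest energy, eV
E_eV = 2.33                # photon energy, eV

ratio = m_c2_eV / E_eV
fractional_deficit = 0.5 * ratio**2      # (c - v)/c in the limit E >> m c^2
v_over_c = math.sqrt(1.0 - ratio**2)     # rounds to 1.0 in double precision

print(f"v/c ~ {v_over_c}")
print(f"fractional speed deficit ~ {fractional_deficit:.1e}")
```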
Historical development

Thomas Young's double-slit experiment in 1805 showed that light can act as a wave, helping to defeat early particle theories of light. In most theories up to the eighteenth century, light was pictured as being made up of particles. Since particle models cannot easily account for the refraction, diffraction and birefringence of light, wave theories of light were proposed by René Descartes (1637), Robert Hooke (1665), and Christiaan Huygens (1678); however, particle models remained dominant, chiefly due to the influence of Isaac Newton. In the early nineteenth century, Thomas Young and Augustin-Jean Fresnel clearly demonstrated the interference and diffraction of light, and by 1850 wave models were generally accepted. In 1865, James Clerk Maxwell predicted that light was an electromagnetic wave, a prediction confirmed experimentally in 1888 by Heinrich Hertz's detection of radio waves; this seemed to be the final blow to particle models of light. In 1900, Maxwell's theoretical model of light as oscillating electric and magnetic fields seemed complete. However, several observations could not be explained by any wave model of electromagnetic radiation, leading to the idea that light energy was packaged into quanta described by E = hν. Later experiments showed that these light quanta also carry momentum and can thus be considered particles: the photon concept was born, leading to a deeper understanding of the electric and magnetic fields themselves.

The Maxwell wave theory, however, does not account for all properties of light. The Maxwell theory predicts that the energy of a light wave depends only on its intensity, not on its frequency; nevertheless, several independent types of experiments show that the energy imparted by light to atoms depends only on the light's frequency, not on its intensity. For example, some chemical reactions are provoked only by light of frequency higher than a certain threshold; light of frequency lower than the threshold, no matter how intense, does not initiate the reaction. Similarly, electrons can be ejected from a metal plate by shining light of sufficiently high frequency on it (the photoelectric effect); the energy of the ejected electron is related only to the light's frequency, not to its intensity.

At the same time, investigations of black-body radiation carried out over four decades (1860–1900) by various researchers culminated in Max Planck's hypothesis that the energy of any system that absorbs or emits electromagnetic radiation of frequency ν is an integer multiple of an energy quantum E = hν. As shown by Albert Einstein, some form of energy quantization must be assumed to account for the thermal equilibrium observed between matter and electromagnetic radiation; for his explanation of the photoelectric effect, Einstein received the 1921 Nobel Prize in physics. Since the Maxwell theory of light allows for all possible energies of electromagnetic radiation, most physicists assumed initially that the energy quantization resulted from some unknown constraint on the matter that absorbs or emits the radiation. In 1905, Einstein was the first to propose that energy quantization was a property of electromagnetic radiation itself. Although he accepted the validity of Maxwell's theory, Einstein pointed out that many anomalous experiments could be explained if the energy of a Maxwellian light wave were localized into point-like quanta that move independently of one another, even if the wave itself is spread continuously over space.
In 1909 and 1916, Einstein showed that, if Planck's law of black-body radiation is accepted, the energy quanta must also carry momentum p = h/λ, making them full-fledged particles. This photon momentum was observed experimentally by Arthur Compton, for which he received the Nobel Prize in 1927. The pivotal question was then: how can Maxwell's wave theory of light be unified with its experimentally observed particle nature? The answer to this question occupied Albert Einstein for the rest of his life, and was solved in quantum electrodynamics and its successor, the Standard Model.
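Compton's measurement of photon momentum is usually summarized by the Compton-shift formula Δλ = (h / m_e c)(1 − cos θ), a standard textbook relation that is not quoted in the text above. The Python sketch below is a minimal numerical check of that relation, using an illustrative X-ray wavelength and scattering angle.

```python
import math

# Compton scattering: the photon's wavelength shifts by
#   delta_lambda = (h / (m_e * c)) * (1 - cos(theta)).
# The 71 pm X-ray wavelength and 90 degree angle are illustrative choices.
h = 6.626e-34        # J*s
m_e = 9.109e-31      # electron mass, kg
c = 2.998e8          # m/s

compton_wavelength = h / (m_e * c)          # ~2.43e-12 m
theta = math.radians(90.0)
delta_lambda = compton_wavelength * (1 - math.cos(theta))

lambda_in = 71e-12                          # incident X-ray wavelength, m
lambda_out = lambda_in + delta_lambda

print(f"Compton wavelength  = {compton_wavelength:.3e} m")
print(f"scattered wavelength = {lambda_out:.3e} m (shift {delta_lambda:.3e} m)")
```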
Early objections

Up to 1923, most physicists were reluctant to accept that light itself was quantized. Instead, they tried to explain photon behavior by quantizing only matter, as in the Bohr model of the hydrogen atom. Even though these semiclassical models were only a first approximation, they were accurate for simple systems and they led to quantum mechanics. Einstein's 1905 predictions were verified experimentally in several ways in the first two decades of the 20th century, as recounted in Robert Millikan's Nobel lecture. However, before Compton's 1922 experiment showing that photons carried momentum proportional to their wave number (or frequency), most physicists were reluctant to believe that electromagnetic radiation itself might be particulate. Instead, there was a widespread belief that energy quantization resulted from some unknown constraint on the matter that absorbs or emits radiation. Attitudes changed over time. In part, the change can be traced to experiments such as Compton scattering, where it was much more difficult not to ascribe quantization to light itself in order to explain the observed results.

Even after Compton's experiment, Bohr, Hendrik Kramers and John Slater made one last attempt to preserve the Maxwellian continuous electromagnetic field model of light, the so-called BKS model. To account for the then-available data, two drastic hypotheses had to be made. First, energy and momentum are conserved only on average in interactions between matter and radiation, not in elementary processes such as absorption and emission; this allows one to reconcile the (then supposed) discontinuously changing energy of the atom, jumping between energy states, with the continuous release of energy into radiation. (It is now known that this is actually a continuous process, the combined atom-field system evolving in time according to Schrödinger's equation.) Second, causality is abandoned; for example, spontaneous emissions are merely emissions induced by a "virtual" electromagnetic field. However, refined Compton experiments showed that energy and momentum are conserved extraordinarily well in elementary processes, and also that the jolting of the electron and the generation of a new photon in Compton scattering obey causality to within 10 ps. Accordingly, Bohr and his coworkers gave their model "as honorable a funeral as possible". Nevertheless, the failures of the BKS model inspired Werner Heisenberg in his development of matrix mechanics.

A few physicists persisted in developing semiclassical models in which electromagnetic radiation is not quantized, but matter appears to obey the laws of quantum mechanics. Although the evidence for photons from chemical and physical experiments was overwhelming by the 1970s, this evidence could not be considered absolutely definitive, since it relied on the interaction of light with matter, and a sufficiently complicated theory of matter could in principle account for the evidence. Nevertheless, all semiclassical theories were refuted definitively in the 1970s and 1980s by photon-correlation experiments. Hence, Einstein's hypothesis that quantization is a property of light itself is considered to be proven.
Wave–particle duality and uncertainty principles

Photons, like all quantum objects, exhibit both wave-like and particle-like properties. Their dual wave–particle nature can be difficult to visualize. The photon displays clearly wave-like phenomena such as diffraction and interference on the length scale of its wavelength. For example, a single photon passing through a double-slit experiment lands on the screen with a probability distribution given by its interference pattern, but only if no measurement is made of which slit it passes through; this probability distribution, which supplies the particle interpretation, nevertheless behaves according to Maxwell's equations. However, experiments confirm that the photon is not a short pulse of electromagnetic radiation; it does not spread out as it propagates, nor does it divide when it encounters a beam splitter. Rather, the photon seems to be a point-like particle, since it is absorbed or emitted as a whole by arbitrarily small systems, systems much smaller than its wavelength, such as an atomic nucleus (≈10⁻¹⁵ m across) or even the point-like electron. Nevertheless, the photon is not a point-like particle whose trajectory is shaped probabilistically by the electromagnetic field, as conceived by Einstein and others; that hypothesis was also refuted by the photon-correlation experiments cited above. According to our present understanding, the electromagnetic field itself is produced by photons, which in turn result from a local gauge symmetry and the laws of quantum field theory.

A key element of quantum mechanics is Heisenberg's uncertainty principle, which forbids the simultaneous measurement of the position and momentum of a particle along the same direction. Remarkably, the uncertainty principle for charged, material particles requires the quantization of light into photons, and even the frequency dependence of the photon's energy and momentum. An elegant illustration is Heisenberg's thought experiment for locating an electron with an ideal microscope. The position of the electron can be determined to within the resolving power of the microscope. The momentum of the electron is uncertain, since it receives a "kick" Δp from the light scattering from it into the microscope. If light were not quantized into photons, the uncertainty Δp could be made arbitrarily small by reducing the light's intensity. In that case, since the wavelength and intensity of light can be varied independently, one could simultaneously determine the position and momentum to arbitrarily high accuracy, violating the uncertainty principle.

Both photons and material particles such as electrons create analogous interference patterns when passing through a double-slit experiment. For photons, this corresponds to the interference of a Maxwell light wave, whereas for material particles it corresponds to the interference of the Schrödinger wave equation. Although this similarity might suggest that Maxwell's equations are simply Schrödinger's equation for photons, most physicists do not agree. For one thing, they are mathematically different; most obviously, Schrödinger's one equation solves for a complex field, whereas Maxwell's four equations solve for real fields. More generally, the normal concept of a Schrödinger probability wave function cannot be applied to photons. Being massless, they cannot be localized without being destroyed; technically, photons cannot have a position eigenstate.
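Heisenberg's microscope argument can be made rough-and-numerical. In the standard order-of-magnitude version (not spelled out above), the position resolution is Δx ≈ λ/(2 sin θ) while each scattered photon delivers a momentum kick Δp ≈ 2h sin θ/λ, so the product is of order h regardless of the wavelength chosen. The Python sketch below uses illustrative wavelengths and an assumed aperture half-angle.

```python
import math

# Heisenberg microscope, order-of-magnitude version (illustrative values):
#   position resolution  dx ~ lambda / (2 sin(theta))
#   momentum kick        dp ~ 2 h sin(theta) / lambda
# so dx * dp ~ h independently of the wavelength used.
h = 6.626e-34           # Planck's constant, J*s

def uncertainty_product(wavelength_m, half_angle_deg):
    s = math.sin(math.radians(half_angle_deg))
    dx = wavelength_m / (2 * s)
    dp = 2 * h * s / wavelength_m
    return dx, dp, dx * dp

for wavelength in (500e-9, 1e-12):          # visible light vs. gamma ray
    dx, dp, product = uncertainty_product(wavelength, 30.0)
    print(f"lambda = {wavelength:.0e} m: dx = {dx:.2e} m, dp = {dp:.2e} kg*m/s, "
          f"dx*dp = {product:.2e} (of order h = {h:.2e})")
```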
Bose–Einstein model of a photon gas

In 1924, Satyendra Nath Bose derived Planck's law of black-body radiation without using any electromagnetism, but rather a modification of coarse-grained counting of phase space. Einstein showed that this modification is equivalent to assuming that photons are rigorously identical and that it implied a "mysterious non-local interaction", now understood as the requirement for a symmetric quantum mechanical state. This work led to the concept of coherent states and the development of the laser. In the same papers, Einstein extended Bose's formalism to material particles (bosons) and predicted that they would condense into their lowest quantum state at low enough temperatures; this Bose–Einstein condensation was observed experimentally in 1995. The modern view is that photons are, by virtue of their integer spin, bosons (as opposed to fermions, which have half-integer spin). By the spin–statistics theorem, all bosons obey Bose–Einstein statistics (whereas all fermions obey Fermi–Dirac statistics).
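Bose–Einstein counting leads to the familiar mean photon occupation number per mode, n̄(ν, T) = 1/(e^{hν/kT} − 1), from which Planck's spectral energy density follows; these are standard textbook results rather than formulas quoted above. The sketch below evaluates them for a few illustrative frequencies at room temperature.

```python
import math

# Mean photon number per mode in thermal equilibrium (Bose-Einstein statistics):
#   n(nu, T) = 1 / (exp(h*nu / (k*T)) - 1)
# Planck's spectral energy density: rho(nu) = (8*pi*h*nu^3 / c^3) * n(nu, T).
h = 6.626e-34      # J*s
k = 1.381e-23      # Boltzmann constant, J/K
c = 2.998e8        # m/s

def mean_occupation(nu, T):
    return 1.0 / math.expm1(h * nu / (k * T))

def planck_density(nu, T):
    return (8 * math.pi * h * nu**3 / c**3) * mean_occupation(nu, T)

T = 300.0                                   # room temperature, K
for nu in (1e12, 1e13, 5e14):               # far-IR, mid-IR, visible
    print(f"nu = {nu:.0e} Hz: n = {mean_occupation(nu, T):.3e}, "
          f"rho = {planck_density(nu, T):.3e} J*s/m^3")
```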
Stimulated and spontaneous emission

Stimulated emission (in which photons "clone" themselves) was predicted by Einstein in his kinetic analysis and led to the development of the laser. Einstein's derivation inspired further developments in the quantum treatment of light, which led to the statistical interpretation of quantum mechanics. In 1916, Einstein showed that Planck's radiation law could be derived from a semiclassical, statistical treatment of photons and atoms, which implies a relation between the rates at which atoms emit and absorb photons. The condition follows from the assumption that light is emitted and absorbed by atoms independently, and that the thermal equilibrium is preserved by interaction with atoms. Consider a cavity in thermal equilibrium, filled with electromagnetic radiation and with atoms that can emit and absorb that radiation. Thermal equilibrium requires that the energy density ρ(ν) of photons with frequency ν (which is proportional to their number density) is, on average, constant in time; hence, the rate at which photons of any particular frequency are emitted must equal the rate at which they are absorbed. Einstein began by postulating simple proportionality relations for the different reaction rates involved. Einstein could not fully justify his rate equations, but claimed that it should be possible to calculate the coefficients Aij, Bji and Bij once physicists had obtained "mechanics and electrodynamics modified to accommodate the quantum hypothesis". In fact, in 1926, Paul Dirac derived the Bij rate constants by using a semiclassical approach, and, in 1927, succeeded in deriving all the rate constants from first principles within the framework of quantum theory. Dirac's work was the foundation of quantum electrodynamics, i.e., the quantization of the electromagnetic field itself. Dirac's approach is also called second quantization or quantum field theory; earlier quantum mechanical treatments treat only material particles as quantum mechanical, not the electromagnetic field.
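The logic of Einstein's rate argument can be written out in a few lines. The LaTeX sketch below follows the standard textbook form of the detailed-balance condition, using the coefficient names A and B from the paragraph above and assuming equal degeneracies of the two atomic levels for simplicity.

```latex
% Detailed balance between absorption and (spontaneous + stimulated) emission:
%   N_j B_{ji} \rho(\nu) = N_i \left[ A_{ij} + B_{ij}\,\rho(\nu) \right].
% With the thermal population ratio N_i / N_j = e^{-h\nu/kT}
% (equal degeneracies assumed), solving for the energy density gives
\[
  \rho(\nu) \;=\; \frac{A_{ij}/B_{ij}}
                       {\frac{B_{ji}}{B_{ij}}\, e^{h\nu/kT} \;-\; 1},
\]
% which reproduces Planck's law provided that
\[
  B_{ji} = B_{ij},
  \qquad
  \frac{A_{ij}}{B_{ij}} = \frac{8\pi h \nu^{3}}{c^{3}}.
\]
```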
Einstein was troubled by the fact that his theory seemed incomplete, since it did not determine the direction of a spontaneously emitted photon. A probabilistic nature of light-particle motion was first considered by Newton in his treatment of birefringence and, more generally, of the splitting of light beams at interfaces into a transmitted beam and a reflected beam. Newton hypothesized that hidden variables in the light particle determined which path it would follow. Similarly, Einstein hoped for a more complete theory that would leave nothing to chance, beginning his separation from quantum mechanics. Ironically, Max Born's probabilistic interpretation of the wave function was inspired by Einstein's later work searching for a more complete theory.
Second quantization

In 1910, Peter Debye derived Planck's law of black-body radiation from a relatively simple assumption. He correctly decomposed the electromagnetic field in a cavity into its Fourier modes, and assumed that the energy in any mode was an integer multiple of hν, where ν is the frequency of the electromagnetic mode. Planck's law of black-body radiation follows immediately as a geometric sum. However, Debye's approach failed to give the correct formula for the energy fluctuations of black-body radiation, which were derived by Einstein in 1909.

In quantum field theory, the probability of an event is computed by summing the probability amplitude (a complex number) for all possible ways in which the event can occur. Dirac was able to derive Einstein's Aij and Bij coefficients from first principles, and showed that the Bose–Einstein statistics of photons is a natural consequence of quantizing the electromagnetic field correctly (Bose's reasoning went in the opposite direction: he derived Planck's law of black-body radiation by assuming Bose–Einstein statistics). In Dirac's time, it was not yet known that all bosons, including photons, must obey Bose–Einstein statistics.

Dirac's second-order perturbation theory can involve virtual photons, transient intermediate states of the electromagnetic field; the static electric and magnetic interactions are mediated by such virtual photons. In such quantum field theories, the probability amplitude of observable events is calculated by summing over all possible intermediate steps, even ones that are unphysical; hence, virtual photons are not constrained to satisfy E = pc, and may have extra polarization states; depending on the gauge used, virtual photons may have three or four polarization states, instead of the two states of real photons. Although these transient virtual photons can never be observed, they contribute measurably to the probabilities of observable events. Indeed, such second-order and higher-order perturbation calculations can give apparently infinite contributions to the sum. Such unphysical results are corrected for using the technique of renormalization. Other virtual particles may contribute to the summation as well; for example, two photons may interact indirectly through virtual electron–positron pairs. In fact, such photon–photon scattering, as well as electron–photon scattering, is intended to be one of the modes of operation of the planned particle accelerator, the International Linear Collider.
In modern physics notation, the quantum state of the electromagnetic field is written as a Fock state, a tensor product of the states for each electromagnetic mode.
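In this notation, the state takes the schematic form sketched below; the mode labels k_i and occupation numbers n_{k_i} are generic placeholders rather than anything specific from the text.

```latex
% Fock state of the electromagnetic field: a tensor product over modes,
% where n_{k_i} is the number of photons occupying mode k_i.
\[
  |n_{k_1}\rangle \otimes |n_{k_2}\rangle \otimes \cdots
  \otimes |n_{k_m}\rangle \otimes \cdots
\]
```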
The photon as a gauge boson

The electromagnetic field can be understood as a gauge field, i.e., as a field that results from requiring that a gauge symmetry holds independently at every position in spacetime. For the electromagnetic field, this gauge symmetry is the Abelian U(1) symmetry of a complex number, which reflects the ability to vary the phase of a complex number without affecting observables or real-valued functions made from it, such as the energy or the Lagrangian. The quanta of an Abelian gauge field must be massless, uncharged bosons as long as the symmetry is not broken; hence, the photon is predicted to be massless, to have zero electric charge, and to have integer spin. The particular form of the electromagnetic interaction specifies that the photon must have spin ±1; thus, its helicity must be ±ħ. These two spin components correspond to the classical concepts of right-handed and left-handed circularly polarized light. However, the transient virtual photons of quantum electrodynamics may also adopt unphysical polarization states.

In the prevailing Standard Model of physics, the photon is one of four gauge bosons in the electroweak interaction; the other three are denoted W⁺, W⁻ and Z⁰ and are responsible for the weak interaction. Unlike the photon, these gauge bosons have mass, owing to a mechanism that breaks their SU(2) gauge symmetry. The unification of the photon with the W and Z gauge bosons in the electroweak interaction was accomplished by Sheldon Glashow, Abdus Salam and Steven Weinberg, for which they were awarded the 1979 Nobel Prize in physics. Physicists continue to hypothesize grand unified theories that connect these four gauge bosons with the eight gluon gauge bosons of quantum chromodynamics; however, key predictions of these theories, such as proton decay, have not been observed experimentally.
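For readers who want the symmetry spelled out, the local U(1) transformation takes the standard textbook form below; ψ denotes a charged matter field and A_μ the photon field, and requiring invariance of the Lagrangian under this position-dependent phase change is what forces the gauge field to be massless.

```latex
% Local U(1) gauge transformation of electrodynamics (standard textbook form):
\[
  \psi(x) \;\to\; e^{\,i q \alpha(x)}\,\psi(x),
  \qquad
  A_\mu(x) \;\to\; A_\mu(x) - \partial_\mu \alpha(x).
\]
% A photon mass term of the form m^2 A_\mu A^\mu would not be invariant under
% this transformation, which is why the unbroken symmetry requires a massless photon.
```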
Contributions to the mass of a system

The energy of a system that emits a photon is decreased by the energy E of the photon as measured in the rest frame of the emitting system, which may result in a reduction in mass by the amount E/c². Similarly, the mass of a system that absorbs a photon is increased by a corresponding amount. As an application, the energy balance of nuclear reactions involving photons is commonly written in terms of the masses of the nuclei involved, and terms of the form E/c² for the gamma photons (and for other relevant energies, such as the recoil energy of nuclei).

This concept is applied in key predictions of quantum electrodynamics. In that theory, the mass of electrons (or, more generally, leptons) is modified by including the mass contributions of virtual photons, in a technique known as renormalization. Such "radiative corrections" contribute to a number of predictions of QED, such as the magnetic dipole moment of leptons, the Lamb shift, and the hyperfine structure of bound lepton pairs, such as muonium and positronium.

Since photons contribute to the stress–energy tensor, they exert a gravitational attraction on other objects, according to the theory of general relativity. Conversely, photons are themselves affected by gravity; their normally straight trajectories may be bent by warped spacetime, as in gravitational lensing, and their frequencies may be lowered by moving to a higher gravitational potential, as in the Pound–Rebka experiment. However, these effects are not specific to photons; exactly the same effects would be predicted for classical electromagnetic waves.
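As a small numerical illustration of the E/c² bookkeeping described at the start of this section, the sketch below converts a photon energy into the corresponding mass change of the emitting system. The 1 MeV gamma-ray energy is an arbitrary illustrative choice, and nuclear recoil is neglected.

```python
# Mass carried away by an emitted photon: delta_m = E / c^2.
# A 1 MeV gamma ray is an arbitrary illustrative choice; recoil is neglected.
c = 2.998e8             # m/s
MeV_to_J = 1.602e-13    # joules per MeV
u_in_kg = 1.6605e-27    # atomic mass unit, kg

E_gamma = 1.0 * MeV_to_J             # photon energy, J
delta_m = E_gamma / c**2             # mass decrease of the emitter, kg

print(f"delta_m = {delta_m:.3e} kg = {delta_m / u_in_kg:.3e} u")
```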
Photons in matter

(Visible) light that travels through transparent matter does so at a lower speed than c, the speed of light in vacuum. X-rays, on the other hand, usually have a phase velocity above c, as evidenced by total external reflection. In addition, light can also undergo scattering and absorption. There are circumstances in which heat transfer through a material is mostly radiative, involving the emission and absorption of photons within it. An example is the core of the Sun, where energy can take about a million years to reach the surface. However, this phenomenon is distinct from scattered radiation passing diffusely through matter, as it involves local equilibration between the radiation and the temperature. Thus, the time is how long it takes the energy to be transferred, not the photons themselves. Once in open space, a photon from the Sun takes only 8.3 minutes to reach Earth.

The factor by which the speed of light is decreased in a material is called the refractive index of the material. In a classical wave picture, the slowing can be explained by the light inducing electric polarization in the matter, the polarized matter radiating new light, and the new light interfering with the original light wave to form a delayed wave. In a particle picture, the slowing can instead be described as a blending of the photon with quantum excitations of the matter (quasiparticles such as phonons and excitons) to form a polariton; this polariton has a nonzero effective mass, which means that it cannot travel at c. Alternatively, photons may be viewed as always traveling at c, even in matter, but having their phase shifted (delayed or advanced) upon interaction with atomic scatterers; this modifies their wavelength and momentum, but not their speed. A light wave made up of these photons does travel slower than the speed of light. In this view the photons are "bare", and are scattered and phase-shifted, while in the view of the preceding paragraph the photons are "dressed" by their interaction with matter, and move without scattering or phase shifting, but at a lower speed. Light of different frequencies may travel through matter at different speeds; this is called dispersion. In some cases, it can result in extremely slow speeds of light in matter. The effects of photon interactions with other quasiparticles may be observed directly in Raman scattering and Brillouin scattering.

Photons can also be absorbed by nuclei, atoms or molecules, provoking transitions between their energy levels. A classic example is the molecular transition of retinal (C20H28O), which is responsible for vision, as discovered in 1958 by Nobel laureate biochemist George Wald and co-workers. The absorption provokes a cis–trans isomerization that, in combination with other such transitions, is transduced into nerve impulses. The absorption of photons can even break chemical bonds, as in the photodissociation of chlorine; this is the subject of photochemistry. Analogously, gamma rays can in some circumstances dissociate atomic nuclei in a process called photodisintegration.
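Two of the relations above lend themselves to a quick numerical sketch: the reduced speed v = c/n in a medium, and the longest photon wavelength able to break a chemical bond, λ_max = hc/E_bond. The refractive index of water and the Cl–Cl bond energy used below are standard reference values quoted only for illustration.

```python
# 1) Speed of light in a medium: v = c / n (water, n ~ 1.33, as an example).
# 2) Longest wavelength able to photodissociate a bond: lambda_max = h*c / E_bond.
#    The Cl-Cl bond energy (~243 kJ/mol) is a standard reference value used
#    here purely for illustration.
h = 6.626e-34          # J*s
c = 2.998e8            # m/s
N_A = 6.022e23         # Avogadro's number, 1/mol

n_water = 1.33
v_in_water = c / n_water

E_bond_per_mol = 243e3                       # J/mol, Cl-Cl bond
E_bond = E_bond_per_mol / N_A                # J per molecule
lambda_max = h * c / E_bond                  # m

print(f"speed of light in water ~ {v_in_water:.3e} m/s")
print(f"photodissociation threshold for Cl2 ~ {lambda_max * 1e9:.0f} nm")
```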
Technological applications

Photons have many applications in technology. Individual photons can be detected by several methods. The classic photomultiplier tube exploits the photoelectric effect: a photon landing on a metal plate ejects an electron, initiating an ever-amplifying avalanche of electrons. Charge-coupled device chips use a similar effect in semiconductors: an incident photon generates a charge on a microscopic capacitor that can be detected. Other detectors, such as Geiger counters, use the ability of photons to ionize gas molecules, causing a detectable change in conductivity.

Planck's energy formula E = hν is often used by engineers and chemists in design, both to compute the change in energy resulting from a photon absorption and to predict the frequency of the light emitted for a given energy transition. For example, the emission spectrum of a fluorescent light bulb can be designed using gas molecules with different electronic energy levels and adjusting the typical energy with which an electron hits the gas molecules within the bulb. Under some conditions, an energy transition can be excited by two photons that individually would be insufficient. This allows for higher-resolution microscopy, because the sample absorbs energy only in the region where two beams of different colors overlap significantly, which can be made much smaller than the excitation volume of a single beam. Moreover, these photons cause less damage to the sample, since they are of lower energy. In some cases, two energy transitions can be coupled so that, as one system absorbs a photon, another nearby system "steals" its energy and re-emits a photon of a different frequency. This is the basis of fluorescence resonance energy transfer, a technique that is used in molecular biology to study the interaction of suitable proteins.

Several different kinds of hardware random number generators involve the detection of single photons. In one example, for each bit in the random sequence that is to be produced, a photon is sent to a beam splitter. In such a situation, there are two possible outcomes of equal probability. The actual outcome is used to determine whether the next bit in the sequence is a "0" or a "1".

Recent research

Much research has been devoted to applications of photons in the field of quantum optics. Photons seem well suited to be elements of an extremely fast quantum computer, and the quantum entanglement of photons is a focus of research. Nonlinear optical processes are another active research area, with topics such as two-photon absorption, self-phase modulation, modulational instability and optical parametric oscillators. However, such processes generally do not require the assumption of photons per se; they may often be modeled by treating atoms as nonlinear oscillators. The nonlinear process of spontaneous parametric down-conversion is often used to produce single-photon states. Finally, photons are essential in some aspects of optical communication, especially for quantum cryptography.
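The beam-splitter random-number scheme described in the technological-applications paragraphs above can be mimicked with a toy simulation. Real devices detect which output port a single photon takes, whereas the sketch below simply stands in a pseudorandom choice for the genuinely random quantum outcome, so it is a conceptual illustration rather than a cryptographic-quality generator.

```python
import random

# Toy model of a beam-splitter quantum random number generator:
# each photon exits through one of two ports with equal probability,
# and the detected port is recorded as one bit.  A pseudorandom choice
# stands in for the genuinely random quantum outcome.
def photon_bit():
    return random.choice((0, 1))       # 0 = transmitted port, 1 = reflected port

def random_bits(n):
    return [photon_bit() for _ in range(n)]

bits = random_bits(32)
print("bits:", "".join(str(b) for b in bits))
print("fraction of ones:", sum(bits) / len(bits))
```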
There's More:
What is a Particle Accelerator?
What is the Schrödinger's Cat Paradox?
What is Heisenberg's Uncertainty Principle?
This material is from Wikipedia, used with permission, and may also be used by others. For terms of use, go to Creativecommons.org