Professor Dan Akerib discusses his research on dark matter
In the twentieth century, and even as early as the late nineteenth, astronomers gathered and eventually confirmed evidence that the visible mass of stars in galaxies did not match the mass implied by the galaxies' rotation rates. In particular, galaxies were observed to rotate faster than would be possible if their only mass came from visible matter; if that were the case, the galaxies would fly apart. Additionally, gravitational lensing measurements of the masses of galaxy clusters revealed significant mass unaccounted for by the visible matter we can detect. Researchers therefore hypothesized the existence of "dark matter": a form of unseen, hard-to-detect matter that provides the gravity needed to hold these fast-rotating galaxies together.
Dan Akerib, a professor of particle physics and astrophysics at Stanford University and a researcher at the SLAC National Accelerator Laboratory (originally named the Stanford Linear Accelerator Center), was at Carnegie Mellon on Oct. 26 to give a lecture on his years of research into the nature of dark matter, which remains one of astrophysics' greatest unsolved mysteries.
In an interview with The Tartan, Akerib explained how dark matter came to captivate him. “I think that the idea that most of the matter in the universe is something that is different from us [is very fascinating]... someone described it as the ultimate Copernican revolution,” he reflected. He added, “not only are we not at the center of the universe, we’re not even made of the same stuff.”
Akerib started the lecture with a brief historical summary of the reasoning behind the dark matter hypothesis and noted the work of Vera Rubin, who pioneered the idea of graphing galactic rotation rates against distance from the center of the galaxy. At large distances, the measured curves deviated significantly from the theoretical prediction. Her work formed the foundation of the dark matter hypothesis and the basis of today's research on the subject.
Akerib then summarized the cosmological methods astronomers and astrophysicists use to weigh the universe, in particular to find the percentage composition of visible matter, dark matter, and dark energy. He explained that this ratio is significant because matter tends to attract gravitationally while dark energy tends to drive the universe apart. The three main methods cosmologists use to find these ratios are supernova standard candles, the 3K cosmic microwave background, and the distribution of matter in the early and current universe. The results yield a dark energy density of approximately 70 percent of the universe's total content.
Currently, one of the leading theories on the nature of dark matter is the WIMP theory, short for "weakly interacting massive particles." WIMPs are called "weakly interacting" because they have very small cross sections and do not form stars or interact much with normal matter, and "massive" because of the gravitational pull they exert. They are hypothesized to have formed in the Big Bang; because of those small cross sections, as the universe expanded it would have been probabilistically difficult for WIMPs and anti-WIMPs to find each other and annihilate, leaving a relic population behind. Akerib described WIMPs as "similar to very heavy neutrinos."
The next step for any theory is to find evidence for it, but so far detection of WIMPs has proved elusive. Akerib explained that WIMPs are most likely electrically neutral and not elements of the periodic table; "ordinary matter is virtually invisible to this stuff." As an example of just how little they interact with ordinary matter, he stated that a WIMP on average scatters once in a light-year of lead. WIMPs are, by definition, elusive to our eyes and instruments.
Luckily for WIMP researchers like Akerib, WIMP events occur on the order of 10 to the 16th power times annually, so Akerib and his team have developed a detector to try to catch some of them. The detector-building process faced a slew of technical challenges: electronic noise, the inability to detect signals below a certain threshold, and the natural radioactivity scattered throughout the Earth's crust that could interfere with measurements. Eventually, the detector was designed as a chamber of liquid xenon, which keeps radioactivity as low as possible at the center of the chamber; the kinetic energy of a passing WIMP would ionize the xenon, producing a signal that can be detected. The detector was housed in a large water pool deep underground to minimize interference from cosmic rays and other extraneous signals.
While the initial runs have not caught any events, Akerib and his team plan to continue their research by building a detector fifty times larger than the original. Much of their everyday work consists of making and testing the wire meshes that go into the detector, which is sensitive to the smallest imperfections, and purifying xenon gas samples using gas charcoal chromatography.
He shared an anecdote from the building process: locals near the site of the upcoming detector were worried about radiation from the xenon, which turned out to be roughly equal to 20 bananas' worth. For reference, people receive about 100 bananas' worth of radiation daily from natural background sources. "You can no longer buy bananas in the SLAC cafeteria because they're radioactive [and would interfere with the sensitive detector]," he added.
The plan is to run the new detector for 1,000 days.