There is a realm the laws of physics forbid us from accessing, below the resolving power of our most powerful microscopes and beyond the reach of our most sensitive telescopes. There’s no telling what might exist there—perhaps entire universes.
Since the beginning of human inquiry, there have been limits to our observing abilities. Worldviews were restricted by the availability of tools and our own creativity. Over time, the size of our observable universe grew as our knowledge grew—we saw planets beyond Earth, stars beyond the Sun, and galaxies beyond our own, while we peered deeper into cells and atoms. And then, during the 20th century, mathematics emerged that can explain, shockingly well—and, to a point, predict—the world we live in. The theories of special and general relativity precisely describe the motion of the planets, stars, and galaxies. Quantum mechanics and the Standard Model of particle physics have worked wonders at clarifying what goes on inside of atoms.
However, with each of these successful theories come hard-and-fast limits to our observing abilities. Today, these limits seem to define true boundaries to our knowledge.
That Which We Can’t Know
On the large end, there is a speed limit that caps what we can see. It dashes any hope of observing most of our universe first-hand.
The speed of light is approximately 300,000,000 meters per second (or 671,000,000 miles per hour, if that’s how your brain works). The theory of special relativity, proposed by Albert Einstein in 1905, forbids anything from traveling faster than that. Massless things always travel at this speed in a vacuum. Accelerating massive objects to this speed essentially introduces a divide-by-zero in one of special relativity’s equations; it would take infinite energy to accelerate something with mass to the speed of light.
If, as a child, you hopped on a spaceship traveling out of the solar system at 99% of the speed of light, you might be able to explore other parts of the galaxy before succumbing to age, but because time is relative, your friends and family would likely be long gone before you could report your observations back to Earth. But you’d still have your limits—the Milky Way galaxy is 105,700 light-years across, our neighboring galaxy Andromeda is 2.5 million light-years away, and the observable universe is around 93 billion light-years across. Any hope of exploring more distant reaches would require multigenerational missions or, if using a remote probe, accepting that you’ll be dead and humanity may be very different by the time the probe’s data returns to Earth.
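To make those numbers concrete, here is a minimal, illustrative sketch of the special-relativity arithmetic behind both claims: the Lorentz factor blows up as a ship approaches light speed (the “divide-by-zero” above), and at 99% of light speed it stretches time aboard the ship relative to Earth. The 50-light-year destination is a made-up example, not a figure from the article.

```python
import math

def lorentz_factor(v_fraction_of_c: float) -> float:
    """Lorentz factor gamma = 1 / sqrt(1 - (v/c)^2), with v given as a fraction of c."""
    return 1.0 / math.sqrt(1.0 - v_fraction_of_c**2)

# At 99% of light speed, every year aboard the ship corresponds to about 7 years on Earth.
gamma = lorentz_factor(0.99)
print(f"gamma at 0.99c: {gamma:.2f}")  # ~7.09

# Illustrative destination (made-up figure): a round trip to a star 50 light-years away.
earth_years = 2 * 50 / 0.99            # ~101 years pass on Earth
ship_years = earth_years / gamma       # ~14 years pass for the traveler
print(f"Earth time: {earth_years:.1f} yr, ship time: {ship_years:.1f} yr")

# The "divide-by-zero": gamma diverges as v -> c, so the energy needed to reach
# light speed, (gamma - 1) * m * c^2, grows without bound for anything with mass.
for v in (0.9, 0.99, 0.999, 0.9999):
    print(f"v = {v}c -> gamma = {lorentz_factor(v):.1f}")
```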
The speed of light is more than just a speed limit, however. Because the light we see needs travel time to arrive at Earth, we must contend with several horizons beyond which we can never see or interact, which exist due to Einstein’s theory of general relativity. There is an event horizon, a moving boundary in space and time beyond which light and particles emitted now will never reach Earth, no matter how much time passes—those events we will never see. There is also the particle horizon, a boundary beyond which we cannot observe light arriving from the past—this defines the observable universe.
There’s a second kind of event horizon, one surrounding a black hole. Gravity is an effect caused by the presence of massive objects warping the shape of space, like a bowling ball on a trampoline. A massive-enough object might warp space such that no light or information can escape from within a certain boundary.
These limits aren’t static. “We will see further and further as time goes on, because the distance light travels outward gets bigger and bigger,” said Tamara Davis, an astrophysics professor who studies cosmology at the University of Queensland. But this expanding view won’t last forever, because our universe is also expanding, and that expansion is accelerating. “If you fast-forward 100 billion years into the future, all of the galaxies that we can currently see will be so far, and accelerating so quickly away from us, that the light they emitted in the past will have faded from view.” At that point, our observable universe would be just those nearby galaxies gravitationally bound to our own.
Another boundary lives on the other end of the scale. Zoom in between molecules, into the center of atoms, deep into their nuclei and into the quarks that make up their protons and neutrons. Here, another set of rules, mostly devised in the 20th century, governs how things work. In the rules of quantum mechanics, everything is “quantized,” meaning particles’ properties (their energy or their location around an atomic nucleus, for example) can only take on distinct values, like steps on a ladder, rather than a continuum, like places on a slide. However, quantum mechanics also demonstrates that particles aren’t just dots; they simultaneously act like waves, meaning that they can take on multiple values at the same time and experience a host of other wave-like effects, such as interference. Essentially, the quantum world is a noisy place, and our understanding of it is innately tied to probability and uncertainty.
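As a concrete example of that “ladder,” here is a small sketch (not from the article) using the textbook formula for the allowed energy levels of the electron in a hydrogen atom; the electron can occupy any of these discrete rungs but nothing in between.

```python
# Illustrative "steps on a ladder": the allowed energies of an electron in a hydrogen
# atom take only discrete values, E_n = -13.6 eV / n^2, rather than a smooth continuum.
RYDBERG_EV = 13.6  # approximate ionization energy of hydrogen, in electron-volts

def hydrogen_level_eV(n: int) -> float:
    """Energy of the n-th allowed level (n = 1, 2, 3, ...) of the hydrogen atom."""
    return -RYDBERG_EV / n**2

for n in range(1, 5):
    print(f"n = {n}: {hydrogen_level_eV(n):+.2f} eV")
# An electron can sit on any of these rungs, but never at an energy in between.
```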
This quantum-ness means that if you try to peer too closely, you’ll run into the observer effect: Attempting to see things this small requires bouncing light off of them, and the energy from this interaction can fundamentally change that which you’re attempting to observe.
But there’s an even more fundamental limit to what we can see. Werner Heisenberg discovered that the wonkiness of quantum mechanics imposes a minimum uncertainty on measurements of certain pairs of mathematically related properties, such as a particle’s position and momentum. The more accurately you can measure one, the less accurately you can measure the other. And finally, even attempting to measure just one of those properties becomes impossible at a small enough scale, called the Planck scale, which comes with a shortest meaningful length, around 10^-35 meters, and a shortest time interval, around 5 x 10^-44 seconds.
“You take the constant numbers that describe nature—a gravitational constant, the speed of light, and Planck’s constant, and if I put these constants together, I get the Planck length,” said James Beacham, physicist at the ATLAS experiment of the Large Hadron Collider. “Mathematically, it’s nothing special—I can write down a smaller number like 10^-36 meters… But quantum mechanics says that if I have a prediction to my theory that says structure exists at a smaller scale, then quantum has built-in uncertainty for it. It’s a built-in limit to our understanding of the universe—these are the smallest meaningful numbers that quantum mechanics allows us to define.”
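As a rough illustration of the dimensional analysis Beacham describes, the sketch below combines the gravitational constant, the speed of light, and the (reduced) Planck constant to reproduce the Planck length and Planck time quoted above; the constant values are standard CODATA figures.

```python
import math

# Standard CODATA values (SI units); precision here is only illustrative.
G    = 6.67430e-11      # gravitational constant, m^3 kg^-1 s^-2
c    = 2.99792458e8     # speed of light, m/s
hbar = 1.054571817e-34  # reduced Planck constant, J*s

# "Put these constants together," as Beacham describes:
planck_length = math.sqrt(hbar * G / c**3)   # ~1.6e-35 meters
planck_time   = math.sqrt(hbar * G / c**5)   # ~5.4e-44 seconds

print(f"Planck length: {planck_length:.2e} m")
print(f"Planck time:   {planck_time:.2e} s")
```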
This is assuming that quantum mechanics is the correct way to think about the universe, of course. But time and time again, experiments have demonstrated there’s no reason to think otherwise.
Probing the Unknown
These fundamental limits, large and small, present clear barriers to our knowledge. Our theories tell us that we will never directly observe what lies beyond these cosmic horizons or what structures exist smaller than the Planck scale. However, the answers to some of the grandest questions we ask ourselves might exist beyond those very walls. Why and how did the universe begin? What lies beyond our universe? Why do things look and act the way that they do? Why do things exist?
The unobservable and untestable exist beyond the scope of scientific inquiry. “All’s well and good to write down the math and say you can explain the universe, but if you have no way of testing the hypothesis, then that’s getting outside the realm of what we consider science,” said Nathan Musoke, a computational cosmologist at the University of New Hampshire. Exploring the unanswerable belongs to philosophy or religion. It’s possible, however, that science-derived answers to these questions exist as visible imprints on these horizons that the scientific method can uncover.
That imprinting is literal. Ralph Alpher and Robert Herman first predicted in 1948 that some light left over from an early epoch in the universe’s history might still be observable here on Earth. Then, in 1964, Arno Penzias and Robert Wilson were working as radio astronomers at Bell Labs in New Jersey when they noticed a strange signal in their radio telescope. They chased down every possible source of the noise—perhaps it was radio interference from New York City, or even poop from the pigeons nesting in the antenna? But they soon realized that the data matched Alpher and Herman’s prediction.
Penzias and Wilson had spotted microwave radiation from just 400,000 years after the Big Bang, known as the cosmic microwave background, the oldest and most distant radiation observable to today’s telescopes. During this era in the universe’s history, electrons and atomic nuclei combined into neutral atoms, allowing light to travel uninhibited through the previously opaque universe. This light, stretched out by the expanding universe, now appears as faint microwave radiation coming from all directions in the sky.
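To make that “stretching” concrete, here is a tiny illustrative calculation. It assumes the standard textbook figure of a redshift of roughly 1,100 for the cosmic microwave background (a number not quoted in this article): the radiation measured today at 2.725 kelvin was emitted when the universe glowed at roughly 3,000 kelvin.

```python
# Illustrative: how cosmic expansion stretches and cools the cosmic microwave background.
# The temperature scales as T_emitted = T_today * (1 + z), where z is the redshift.
T_today = 2.725   # kelvin, the CMB temperature measured today
z_cmb   = 1100    # approximate CMB redshift (textbook value, not from the article)

T_emitted = T_today * (1 + z_cmb)
stretch_factor = 1 + z_cmb   # every wavelength has been stretched by this factor

print(f"Emitted at ~{T_emitted:.0f} K; wavelengths stretched ~{stretch_factor}x since then")
```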
Astronomers’ experiments since then, such as the Cosmic Background Explorer (COBE), the Wilkinson Microwave Anisotropy Probe (WMAP), and the Planck space observatory, have mapped this cosmic microwave background in ever finer detail, revealing several key takeaways. First, the temperature of these microwaves is eerily uniform across the sky—around 2.725 degrees above absolute zero, the universe’s minimum temperature. Second, despite its uniformity, there are small, direction-dependent temperature fluctuations: patches where the radiation is slightly warmer and patches where it’s slightly cooler. These fluctuations, produced by sound waves pulsing through the early universe and by its gravitational wells, are a remnant of the universe’s structure before it became transparent, revealing how the earliest structures may have formed.
At least one theory has allowed for a scientific approach to probing this structure, with hypotheses that have been tested and supported by further observations of these fluctuations. This theory is called inflation. Inflation posits that the observable universe as we see it today would have once been contained in a space smaller than any known particle. Then, it underwent a burst of unthinkable expansion lasting just a small fraction of a second, governed by a field with dynamics determined by quantum mechanics. This era magnified tiny quantum-scale fluctuations into wells of gravity that eventually governed the large-scale structure of the observable universe, with those wells written into the cosmic microwave background data. You can think of inflation as part of the “bang” in the Big Bang theory.
It’s a nice thought, that we can pull knowledge from beyond the cosmic microwave background. But this knowledge leads to more questions. “I think there’s a pretty broad consensus that inflation probably occurred,” said Katie Mack, theoretical astrophysicist at North Carolina State University. “There’s very little consensus as to how or why it occurred, what caused it, or what physics it obeyed when it happened.”
Some of these new questions may be unanswerable. “What happens at the very beginning, that information is obscured from us,” said Mack. “I find it frustrating that we’re always going to be lacking information. We can come up with models that explain what we see, and models that do better than others, but in terms of validating them, at some point we’re going to have to just accept that there’s some unknowability.”
At the cosmic microwave background and beyond, the large and the small intersect; the early universe seems to reflect quantum behaviors. Similar conversations are happening on the other end of the size spectrum, as physicists attempt to reconcile the behavior of the universe on the largest scale with the rules of quantum mechanics. Black holes exist in this scientific space, where gravity and quantum physics must play together, and where physical descriptions of what’s going on sit below the Planck scale.
Here, physicists are also working to devise a mathematical theory that, while describing structures too small to observe directly, produces observable effects. Perhaps most famous among these ideas is string theory, which isn’t really a theory but a mathematical framework based on the idea that fundamental particles like quarks and electrons aren’t just specks but one-dimensional strings whose behavior governs those particles’ properties. The framework attempts to explain the various forces of nature that particles experience, and gravity emerges as a natural result of thinking about the problem in this way. Like those studying any theory, string theorists hope that their framework will put forth testable predictions.
Finding ways to test these theories is a work in progress. “There’s faith that one way or another we should be able to test these ideas,” said David Gross, professor at the Kavli Institute for Theoretical Physics, University of California, Santa Barbara and winner of the 2004 Nobel Prize in Physics. “It might be very indirect—but that’s not something that’s a pressing issue.”
Searching for indirect ways to test string theory (and other theories of quantum gravity) is part of the search for the theory itself. Perhaps experiments producing small black holes could provide a laboratory to explore this domain, or perhaps string theory calculations will require particles that a particle accelerator could locate.
At these tiny scales, our notions of what space and time really are might break down in profound ways, said Gross. “The way physicists formulate questions in general often assumes various givens, like spacetime exists as a smooth, continuous manifold,” he said. “Those questions might be ill formulated. Often, very difficult problems in physics require profound jumps, revolutions, or different ways of thinking, and it’s only afterward when we realize that we were asking the question in the wrong way.”
For example, some hope to know what happened at the beginning of the universe—and what happened before time began. “That, I believe, isn’t the right way to ask the question,” said Gross, as asking such a question might mean relying on an incorrect understanding of the nature of space and time. Not that we know the correct way, yet.
That Which We Can Know
Walls that stop us from easily answering our deepest questions about the universe… well, they don’t feel very nice to think about. But offering some comfort is the fact that 93 billion light-years is very big, and 10^-35 meters is very small. Between the largest and the smallest is a staggering space full of things we don’t yet know but theoretically can.
Today’s best telescopes can look far into the distance (and remember, looking into the distance also means looking back in time). Hubble can see objects as they were just a few hundred million years after the Big Bang, and its successor, the Webb Space Telescope, will look farther still, perhaps to just 150 million years after the Big Bang. Existing galactic surveys like the Sloan Digital Sky Survey and the Dark Energy Survey have collected data on millions of galaxies, the latter having recently released a 3D map of the universe containing 300 million galaxies. The upcoming Vera C. Rubin Observatory in Chile will survey up to 10 billion galaxies across the sky.
“From an astronomy point of view, we have so much data that we don’t have enough people to analyze it,” said Mikhail Ivanov, NASA Einstein Fellow at the Institute for Advanced Study. “There are so many things we don’t understand in astrophysics—and we’re overwhelmed with data. To question whether we’re hitting a limit is like trolling.” Even so, these mind-boggling surveys cover only a small fraction of the universe’s estimated 200 billion galaxies, which future telescopes might one day be able to map.
But as scientists attempt to play in these theoretically accessible spaces, some wonder whether the true limit is us.
Today, particle physics seems to be up against an issue of its own: Despite plenty of outstanding mysteries in need of answers, physicists at the Large Hadron Collider have found no new fundamental particles since the Higgs boson in 2012. This lack of discovery has physicists scratching their heads; it has ruled out the simplest versions of some theories that had been guiding particle physicists previously, with few obvious signposts about where to look next (though there are some!).
Beacham thinks that these problems could be solved by searching for phenomena all the way down to the Planck scale. A vast, unknown chasm exists between the scale of today’s particle physics experiments and the Planck scale, and there’s no guarantee of anything new to discover in that space. Exploring the entirety of that chasm would take an immense amount of energy and increasingly powerful colliders. Quantum mechanics says that higher-momentum particles have smaller wavelengths, and thus are needed to probe smaller length scales. However, actually exploring the Planck scale may require a particle accelerator big enough to circle the Sun—maybe even one the size of the solar system.
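As a rough, back-of-the-envelope sketch of why that is: quantum mechanics ties the length scale a collision can resolve to the probe’s energy, roughly length ≈ ħc / energy for ultra-relativistic particles. The LHC energy below is approximate and the relation ignores order-one factors, but it shows the roughly fifteen-orders-of-magnitude gap between today’s colliders and the Planck scale.

```python
# Rough sketch: the length scale a collider can resolve shrinks as its energy grows,
# following the de Broglie-style relation  length ~ hbar * c / energy  (order-one factors ignored).
hbar_c_GeV_m = 1.9733e-16   # hbar * c expressed in GeV * meters

def resolvable_length_m(energy_GeV: float) -> float:
    """Approximate length scale (meters) resolvable with a given collision energy."""
    return hbar_c_GeV_m / energy_GeV

def required_energy_GeV(length_m: float) -> float:
    """Approximate collision energy (GeV) needed to resolve a given length scale."""
    return hbar_c_GeV_m / length_m

lhc_energy_GeV = 1.3e4   # ~13 TeV, roughly the LHC's collision energy
print(f"LHC (~13 TeV) resolves roughly {resolvable_length_m(lhc_energy_GeV):.1e} m")

planck_length_m = 1.6e-35
needed = required_energy_GeV(planck_length_m)
print(f"The Planck length needs ~{needed:.1e} GeV, "
      f"about {needed / lhc_energy_GeV:.0e} times the LHC's energy")
```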
“Maybe it’s daunting to think of such a collider, but it’s inspiration for a way to get to the scale—and inspiration to figure out how to get there with a smaller device,” he said. Beacham views it as particle physicists’ duty to explore whether any new physical phenomena might exist all the way down to the Planck scale, even if there currently isn’t evidence there’s anything to find. “We need to think about going as high in energy as we can, building larger and larger colliders until we hit the limit. We don’t get to choose what the discoveries are,” he said.
Or, perhaps we can use artificial intelligence to create models that perfectly explain the behavior of our universe. Zooming back out, Fermilab and University of Chicago scientist Brian Nord has dreamed up a system that could model the universe with the help of artificial intelligence, constantly and automatically updating its mathematical model with new observations. Such a model could grow arbitrarily close to the model that actually describes our universe—it could generate a theory of everything. But, as with other AI algorithms, it would be a black box to humans.
Such issues are already cropping up in fields where we use software-based tools to make accurate models, explained Taner Edis, physicist at Truman State University. Some software tools—machine learning models, for example—may accurately describe the world we live in but are too complex for any individual to completely understand. In other words, we know that these tools work, but not necessarily how. Maybe AI will take us farther down this path, where the knowledge we create will exist spread over a civilization and its technology, owned in bits and pieces by humanity and the algorithms we create to understand the universe. Together, we’d have generated a complete picture, but one inaccessible to any single person.
Finally, these sorts of models may provide supreme predictive power, but they wouldn’t necessarily offer comfortable answers to questions about why things work the way they do. Perhaps this sets up a dichotomy between what scientists can do—make predictions based on initial conditions—and what they hope these predictions will allow them to do—lead us to a better understanding of the universe we live in.
“I have a hunch that we’ll be able to effectively achieve full knowledge of the universe, but what form will it come in?” said Nord. “Will we be able to fully understand that knowledge, or will it be used merely as a tool to make predictions without caring about the meaning?”
Thinking realistically, today’s physicists are forced to think about what society cares about most, and whether our systems and funding models permit us to fully examine what we can explore, before we can begin to worry about what we can’t. U.S. legislators often discuss basic science research in the language of applied science and positive outcomes; the Department of Energy, for example, funds much particle physics research. The National Science Foundation’s mission is “To promote the progress of science; to advance the national health, prosperity, and welfare; and to secure the national defense; and for other purposes.”
Physicists hoping to receive funding must compete for resources in order to do research that promotes the missions of these organizations. While many labs, such as CERN, exist solely to conduct peaceful research with no military applications, most still tout the new technology their research indirectly produces, such as the World Wide Web or advances in data handling and AI. Private funding organizations exist, but they, too, are either limited in their resources, driven by a mission, or both.
But what if answering these deep questions requires thinking that isn’t driven by… anything? How can scientists convince funders that we should build experiments, not with the hope of producing new technology or advancing society, but merely with the hope of answering deep questions? Echoing a sentiment expressed in an article by Vanessa A. Bee, what if our systems today (sorry, folks, I’m talking about capitalism) are actually stifling innovation in favor of producing some short-term gain? What if answering these questions would require social policy and international collaboration deemed unacceptable by governments?
If this is indeed the world we live in, then the unknowable barrier is far closer than the limits of light speed and the Planck scale. It would exist because we collectively—the governments we vote for, the institutions they fund—don’t deem answering those questions important enough to devote resources to.
Unknown Unknowns
Prior to the 1500s, the universe was simply Earth; the Sun, Moon, and stars were small satellites that orbited us. In 1543, Nicolaus Copernicus proposed a heliocentric model of the universe—the Sun sat at the center, and Earth orbited it. It was only in the 1920s that Edwin Hubble calculated the distance to Andromeda and proved the Milky Way wasn’t the whole universe; it was just one of many, many galaxies in a larger universe. Scientists discovered most of the particles that make up today’s Standard Model of particle physics in the second half of the 20th century. Sure, relativity and quantum theory seem to have established the size of the sandbox we have to play in—but precedent would suggest there’s more to the sandbox, or even beyond the sandbox, that we haven’t considered. But then, maybe there isn’t.
There are things that we’ll never know, but that’s not the right way to think about scientific discovery. We won’t know unless we attempt to know, by asking questions, crafting hypotheses, and testing them with experiments. The vast unknown, both leading up to and beyond our boundaries, presents limitless opportunities to ask questions, uncover more knowledge, and even render previous limits obsolete. We cannot truly know what is unknowable, then, since the unknowable is just whatever remains when we can no longer hypothesize and experiment. The unknowable isn’t fact—it’s something we decide.