Expert Voices

World's Most Advanced Computers Unravel the Universe's Most Primitive Processes (Op-Ed)

The visualization, "Magnetic Fields in Core-Collapse Supernovae," depicts the magnetic field inside the shock surface of a supernova. It was created using the GenASIS code on the Oak Ridge Leadership Computing Facility petascale computer, Jaguar, work that continues on Titan. Credit: Eirik Endeve, Christian Cardall, Reuben Budiardja, Anthony Mezzacappa, Dave Pugmire.

Gregory Scott Jones is a writer who covers supercomputing. He contributed this article to Live Science's Expert Voices: Op-Ed & Insights.

There is an idea, popular in New Age circles, that humans represent the universe's primary self-awareness.

In other words, our consciousness is the cosmos realizing it exists. Is mankind the only creature ever to look up at the sky and grasp the vast distances to the stars, or the fact that we are physically the product of their demise? This is, I imagine, the sort of thing Carl Sagan had in mind when he said "humans are the stuff of the cosmos examining itself." Far out, for sure.

But this self-awareness, if it's indeed real, presents many questions. Big ones. And we're getting answers thanks to those primitively simulated brains we call computers. Big ones. The Universe, it seems, has begun to write its autobiography.

The irony is hard to ignore. The idea that some of the most advanced machines in the modern world will piece together the most basic processes in all of time is rapidly becoming a reality.

Today's supercomputers are necessary for solving an entire range of complex scientific challenges, from the complexities of climate change to the properties of new materials to the ideal aerodynamics of vehicle design. But few problems require such massive computing power as do those born in the heavens.

Unfortunately, recreating the Big Bang and watching the universe unfold in a laboratory is out of the question for obvious reasons. But with empirical data from satellites, probes and seriously powerful telescopes, and the simulation potential of computers pushing 30 petaflops — or 30 thousand trillion (quadrillion) calculations per second — scientists are getting a much clearer picture of how this whole universe thing unfolded, and how we came to be.

Observation reveals what was created in the early moments of the universe: The cosmic microwave background, or CMB, represents the dawn of time just after (well, about 378,000 years after) the Big Bang. Its current geography is the result of roughly 14 billion years of formation, plenty of history for researchers to piece together.

But we're getting there, one step at a time. For example, thanks to decades of observation and extremely sophisticated applications running across many thousands of processors, a team of researchers led by Salman Habib is using Argonne National Laboratory's Mira and Oak Ridge National Laboratory's Titan supercomputers to see how tiny variations in the Big Bang can grow to form enormous clumps that now host stars and galaxies.

The simulations take place across billions of light years of space in thousands of time steps with the potential to squash or validate theories and confirm or disprove much of what we thought we knew about how the universe behaves, including the elusive "dark energy," the reigning champion in our quest to explain how the universe expands, and why the expansion rate is currently accelerating.

Supercomputers are necessary for such complex simulations, Habib has said, due to their sheer speed, massive amounts of memory and their communication-oriented architectures. His team's application achieved a sustained performance in excess of ten petaflops, completely out of reach just a few years ago, allowing the researchers to witness the evolution of the universe from the largest scales down to those characteristic of galaxies.

"In a way supercomputers compress the enormous reaches of space and time that are characteristic of the cosmos, and allow us to interact with them on the — by comparison — incredibly short scales of human perception," said Habib.

If the Universe is self-aware, simulating its creation is akin to forcing it to watch embarrassing home movies of its childhood.

But what about us? After all, if we are in fact the latest and greatest universal incarnation, where is our birth story? Consider core-collapse supernovas (CCSNs), which arise from stars greater than eight times the mass of our sun, but no greater than around 40 times.

These massive elemental factories self-implode, a violent act that leaves in its place all the elements up to iron, i.e., all of the necessary ingredients for life. When Crosby, Stills, Nash & Young sang "we are stardust, we are golden, we are billion-year-old carbon," they had CCSNs in mind, whether they knew it or not.

Researchers can now simulate many of the implosions that created us in three dimensions, a feat impossible just a couple of years ago. We now know that neutrinos play a significant, if not dominant, role in these massive element-creation events: a team of researchers using the Titan supercomputer at Oak Ridge National Laboratory is achieving neutrino-driven explosions across a range of stellar masses in two dimensions, lending credibility to their model.

The same team used Jaguar, Titan's predecessor, to explain how a neutron star could become a more rapidly rotating pulsar, a problem featured on the cover of the June 1, 2012, issue of Science, which explored the top unsolved problems in astrophysics. Thanks to a mechanism known as the standing accretion shock instability, or SASI, researchers now have a relevant description of how a rotating neutron star picks up steam, work that was recently validated by observation in the February 20, 2014, issue of Nature.


Forget about home movies. This is the universe staring at its reflection in the mirror.

These monumental developments are occurring across a wide range of astrophysics and cosmology, from black hole accretion to the formation of individual planets and stars, fields with concepts so vast that it is difficult, if not impossible, to imagine a computer powerful enough to ever resolve them completely. Nevertheless, the potential for the world's latest and greatest calculators to solve the biggest questions, both metaphorically and literally, is limitless, as are the questions themselves. The universe is, after all, a very old and very large place.

Our best account of the structure of the universe, the standard model, covers roughly 5 percent of its total mass-energy; the rest we embarrassingly refer to as "dark matter" and "dark energy." We can't see them, can't feel them, can only infer them. Resolving and defining dark matter and dark energy would rank among the most significant scientific achievements of all time, and simulations on the world's most powerful computers will doubtless play a large role.

But problems of this magnitude will no doubt require technologies more powerful than today's leading systems. Luckily for us, the next era is unfolding before our very eyes. The world's fastest computers may soon approach the exascale, capable of crunching quintillions of calculations per second, more than an order of magnitude faster than current systems. And once again the most advanced machines on the planet will be called upon to answer the most fundamental questions: Who are we? And where do we come from?
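For a sense of these scales, here is a back-of-the-envelope sketch. The round figures are the article's (a roughly 30-petaflops machine versus the exascale threshold), not benchmark results:

```python
# Back-of-the-envelope comparison of supercomputer performance scales.
# These are the article's round numbers, not measured benchmarks.

PETA = 10**15  # 1 petaflops = one quadrillion floating-point operations/second
EXA = 10**18   # 1 exaflops  = one quintillion floating-point operations/second

petascale_machine = 30 * PETA  # ~30 petaflops, the scale cited for systems like Titan
exascale_machine = 1 * EXA     # the exascale threshold

speedup = exascale_machine / petascale_machine
print(f"Exascale is roughly {speedup:.0f}x a 30-petaflops machine")
```

Even against a 30-petaflops system, crossing into the exascale means a jump of dozens of times in raw calculating speed.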

Our earliest history is intimately connected with our near future. The universe must think it's pretty smart.

The views expressed are those of the author and do not necessarily reflect the views of the publisher. This article was originally published on Live Science.
