Universe Today Feed
Space and astronomy news
Updated: 11 hours 51 min ago

Can We Survive in Space? It Might Depend on How Our Gut Microbiome Adapts

Wed, 02/21/2024 - 2:13pm

For over a century, people have dreamed of the day when humanity (as a species) would venture into space. In recent decades, that dream has moved much closer to realization, thanks to the rise of the commercial space industry (NewSpace), renewed interest in space exploration, and long-term plans to establish habitats in Low Earth Orbit (LEO), on the lunar surface, and on Mars. Based on this progression, it is clear that space exploration will not be reserved for astronauts and government space agencies much longer.

But before the “Great Migration” can begin, there are a lot of questions that need to be addressed. Namely, how will prolonged exposure to microgravity and space radiation affect human health? These include the well-studied aspects of muscle and bone density loss and how time in space can impact our organ function and cardiovascular and psychological health. In a recent study, an international team of scientists considered an often-overlooked aspect of human health: our microbiome. In short, how will time in space affect our gut bacteria, which are crucial to our well-being?

The team consisted of biomedical researchers from the Ionizing and Non-ionizing Radiation Protection Research Center (INIRPRC) at the Shiraz University of Medical Sciences (SUMS), the Lebanese International University, the International University of Beirut, the MVLS College at The University of Glasgow, the Center for Applied Mathematics and Bioinformatics (CAMB) at Gulf University in Kuwait, the Nuclear Physics Institute (NPI) of the Czech Academy of Sciences (CAS), and the Technische Universität Wien Atominstitut in Vienna. The paper that describes their findings recently appeared in Frontiers in Microbiology.

Artist’s impression of the Space Launch System (SLS) taking off. Credit: NASA

A microbiome is the collection of all microbes that live on and within our bodies, including bacteria, fungi, viruses, and their respective genes. These microbes are key to how our body interacts with the surrounding environment since they can affect how we respond to the presence of foreign bodies and substances. In particular, some microbes alter foreign bodies in ways that make them more harmful, while others act as a buffer that mitigates the effects of toxins. As they note in their study, the microbiota of astronauts will encounter elevated stress from microgravity and space radiation, including Galactic Cosmic Rays (GCR).

Cosmic rays are a high-energy form of radiation that consists primarily of protons and atomic nuclei stripped of their electrons that have been accelerated to close to the speed of light. When these rays are generated from elements heavier than hydrogen or helium, their high-energy nuclei components are known as HZE ions, which are particularly hazardous. When these impact our atmosphere or protective shielding aboard spacecraft or the International Space Station (ISS), they result in showers of secondary particles.

While Earth’s protective magnetosphere and atmosphere prevent most of these particles from reaching the surface, astronauts in space are exposed to them regularly. As the authors noted, previous research has shown how this exposure could potentially enhance astronaut resilience to radiation, a process known as radio-adaptation. However, they also noted that the extent to which astronauts adapted varied from one astronaut to the next, with some experiencing adverse biological effects before embarking on a deep space mission.

For this reason, they recommend further research into the risks posed by the space radiation environment, which consists mostly of protons that astronauts will be exposed to before encountering HZE particles. In addition, NASA’s Multi-Mission Model suggests that an astronaut’s first mission can serve as an adapting dose. However, the team notes that current research suggests that a second spaceflight does not necessarily increase the chances of genetic abnormalities as much as expected. This could mean that the body may have a natural radio-adaptive defense mechanism.

Making medical diagnoses aboard the International Space Station can be a tricky business Credit: NASA

In terms of recommendations, the team lauded the ISS as the ideal environment for testing the human microbiome response to space radiation and microgravity. They also address the shortage of research in this area and how the long-term effects of radiation on microbiomes and environmental bacteria are poorly understood:

“The International Space Station (ISS) is a unique and controlled system to study the interplay between the human microbiome and the microbiome of their habitats. The ISS is a hermetically sealed closed system, yet it harbors many microorganisms… In this context, NASA scientists did not consider that adaptation is not limited to astronauts and radiation exposure to bacteria inside an astronaut’s body or that bacteria inside the space station could induce resistance not only to high levels of DNA damage caused by HZEs but also to other bacterial activity-threatening factors such as antibiotics.”

Increased resistance to antibiotics could be life-threatening for astronauts, who face risks of injury and infection during long-duration missions. Furthermore, they emphasize how space travel and prolonged exposure to microgravity can weaken the immune system, reducing astronauts’ natural resistance to microbes – especially those with high resistance to radiation, heat, UV, and desiccation, which can therefore survive in the space environment. As they summarize it:

“In a competition between astronauts and their microbiomes to adapt to the harsh space environment, microorganisms may emerge as the winners because they can evolve and adapt more quickly than humans by rapid acquisition of microbial genes. Microorganisms have a much shorter generation time, enabling them to produce many more offspring, each with unique genetic mutations that can help them survive in the space environment.”

Flight Engineer Anne McClain in the cupola holding biomedical gear for MARROW. Credit: NASA

For this reason, the research team stresses that additional research is needed to estimate the magnitude of adaptation in microorganisms before missions are mounted. This could be crucial for identifying potential risks and developing mitigation strategies, novel therapies, and interventions. They also recommend that astronauts undergo regular cytogenetic tests to measure their adaptive response and that only those who show a high adaptive response to low doses of radiation be selected for missions where they would be exposed to higher doses.

They also acknowledge that studying astronaut microbiomes in space presents several challenges. These include the difficulty of conducting experiments in the microgravity environment, which can affect the growth and behavior of microorganisms, making it challenging to obtain accurate and reliable data. There’s also the potential hazard of spreading pathogens in a closed environment with recycled air systems. However, this is research that needs to be conducted before crewed deep-space exploration can be realized, as it could identify potential pathogens and help develop strategies to prevent their spread during missions.

Further Reading: Frontiers in Microbiology

The post Can We Survive in Space? It Might Depend on How Our Gut Microbiome Adapts appeared first on Universe Today.

Categories: Science

A New, More Accurate Measurement for the Clumpiness of the Universe

Wed, 02/21/2024 - 8:45am

Cosmologists are wrestling with an interesting question: how much clumpiness does the Universe have? There are competing, incompatible measurements of cosmic clumpiness, and that discrepancy introduces a “tension” between them. It involves the amount and distribution of matter in the Universe. However, dark energy and neutrinos are also in the mix. Now, results from a recent large X-ray survey of galaxy clusters may help “ease the tension”.

The eROSITA X-ray instrument orbiting beyond Earth performed an extensive sky survey of galaxy clusters to measure matter distribution (clumpiness) in the Universe. Scientists at the Max Planck Institute for Extraterrestrial Physics recently shared their analysis of its cosmologically important data.

“eROSITA has now brought cluster evolution measurement as a tool for precision cosmology to the next level,” said Dr. Esra Bulbul (MPE), the lead scientist for eROSITA’s clusters and cosmology team. “The cosmological parameters that we measure from galaxy clusters are consistent with state-of-the-art cosmic microwave background, showing that the same cosmological model holds from soon after the Big Bang to today.”

eROSITA, the Standard Cosmological Model, and Clumpiness

To get a better feel for what this means, let’s look at what the team is trying to confirm. The idea is to figure out just what the Universe has been like through time. That means understanding matter, its distribution (or clumpiness), and what role dark matter and dark energy have played. It all began just after the Big Bang when the Universe was in a hot, dense state. The only things existing were photons and particles. The Universe expanded and began to condense into regions of higher density. Think of these as density variations, or areas of more or less clumpiness in the primordial soup. As things cooled and expanded, the denser clumps in the soup became galaxies and eventually galaxy clusters. The clumpiness turned out to be smoother (more uniform) than expected. That raises questions about the role of dark matter and dark energy, among other things.

eROSITA’s observations of galaxy clusters and the distribution of matter showed several interesting results. First, dark matter and visible (baryonic) matter together make up about 29 percent of the total energy density of the Universe. Presumably, the rest consists of dark energy, which we don’t know much about yet. Energy density is the amount of energy stored in a region of space as a function of volume. In cosmology, it also includes any mass in that volume of space.
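As a rough back-of-the-envelope check (assuming a spatially flat Universe, where the density fractions sum to one), the quoted matter figure implies a dark energy share of about 71 percent:

```python
# Assuming a flat Universe, the density fractions (Omegas) sum to 1.
omega_matter = 0.29                      # dark + baryonic matter, per the eROSITA result
omega_dark_energy = 1.0 - omega_matter   # everything left over

print(f"Dark energy fraction: {omega_dark_energy:.2f}")  # ~0.71
```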

This plot shows the constraints put on the total matter density in the Universe and the S8 “tension”. Constraints from eROSITA galaxy clusters are in orange, from the Cosmic Microwave Background (Planck) in blue, from weak lensing (DES+KiDS) in grey, and from cluster number counts (SPT) in black. Credit: MPE, V. Ghirardini for the eROSITA consortium

The measurement of energy density agrees with measurements of the cosmic microwave background radiation—also known as the CMB. Think of that as a map of the density variations in the early Universe. It’s made up of microwave radiation that permeates the Universe. That radiation is not completely smooth or uniform. That’s the variability in density that eventually became the seeds of the first galaxies.

Measuring Clumpiness

eROSITA’s goal is to measure the assembly of galaxy clusters over cosmic time. By tracing their evolution via the X-rays emitted by hot gas, the instrument traced both the total amount of matter in the Universe and its clumpiness. Those measurements help resolve the “tension,” or discrepancy, between past clumpiness measurements that used different techniques. Those included the CMB and observations of weak gravitational lensing.

A computer simulation of what gas and stars in a galaxy cluster look like, and how they look embedded in the cosmic web. The assembly of galaxy clusters has implications for the clumpiness of the Universe throughout time. Credit: Yannick Bahé.

The eROSITA data shows the distribution of matter is actually in good agreement with previous measurements of the CMB. That’s good news because cosmologists were afraid they’d have to invoke “new physics” to explain the tension between measurements. “eROSITA tells us that the Universe behaved as expected throughout cosmic history,” says Dr. Vittorio Ghirardini, a postdoctoral researcher at MPE who led the cosmology study. “There’s no tension with the CMB. Maybe the cosmologists can relax a bit now.”

But Wait, There are Neutrinos to Worry About!

Interestingly, the eROSITA measurements of galaxy clusters and other large structures also provide information about neutrinos. They’re the most abundant particles with mass that we know of in the Universe. They come from the Sun and supernovae (for example), but also originated in the Big Bang. eROSITA’s survey offers new information about the mass of neutrinos and their prevalence. “We have obtained tight constraints on the mass of the lightest known particles from the abundance of the largest objects in the Universe,” said Ghirardini.

Computer simulations show how neutrinos can form cosmic clumpiness. Credit: Yoshikawa, Kohji, et al

Neutrinos may be small and tough to “see”, but they have mass that contributes to the total density of matter in the Universe. Cosmologists describe them as “hot”, which means they travel at nearly the speed of light. Therefore, they tend to smooth out the distribution of matter—which can be probed by analyzing the evolution of galaxy clusters in the Universe. And, there’s a good chance that eROSITA may help solve the mystery of neutrino mass. “We are even on the brink of a breakthrough to measure the total mass of neutrinos when combined with ground-based neutrino experiments,” added Ghirardini.

How eROSITA Did It

There’s a lot more to explore in the eROSITA data, but it’s also fascinating to look at the extent of the survey data. It comprises one of the most extensive catalogs of clusters of galaxies done so far. The so-called “Western Galactic half” of the all-sky survey contains 12,247 optically identified X-ray galaxy clusters. “Of these, 8,361 represent new discoveries – almost 70%,” said Matthias Kluge, a postdoctoral researcher at MPE who is responsible for the optical identification of the detected clusters. “This shows the huge discovery potential of eROSITA.”

All that data can be charted in three dimensions, and when scientists do that, galaxy clusters show up as intersections of the cosmic web. In addition, there’s a supercluster catalog, which also shows connected clusters and the filaments of matter between them. “We find more than 1,300 supercluster systems, which makes this the largest-ever X-ray supercluster sample,” said Ang Liu, a postdoctoral researcher at MPE.

This new look at clumpiness in the Universe comes from the first release of data from eROSITA. The instrument completed additional surveys in early 2022. Once those data are analyzed, astronomers expect to be probing even deeper into the distribution of matter in the Universe and testing their models against reality. “When the full data are analyzed,” said Esra Bulbul, “eROSITA will again put our cosmological models to the most stringent test ever conducted through a cluster survey.”

For More Information

eROSITA Relaxes Cosmological Tension
The SRG/eROSITA All-Sky Survey: Cosmology Constraints from Cluster Abundances in the Western Galactic Hemisphere

About eROSITA

The post A New, More Accurate Measurement for the Clumpiness of the Universe appeared first on Universe Today.

Categories: Science

Scientists Track How a Giant Wave Moved Through Our Galactic Backyard

Tue, 02/20/2024 - 3:55pm

Astronomers say there’s a wave rippling through our galactic neighborhood that’s playing a part in the birth and death of stars — and perhaps in Earth’s history as well.

The cosmic ripple, known as the Radcliffe Wave, was identified in astronomical data four years ago — but in a follow-up study published today by the journal Nature, a research team lays out fresh evidence that the wave is actually waving, like the wave that fans in a sports stadium create by taking turns standing up and sitting down.

“Similar to how fans in a stadium are being pulled back to their seats by the Earth’s gravity, the Radcliffe Wave oscillates due to the gravity of the Milky Way,” study lead author Ralf Konietzka, a researcher at Harvard and the Harvard-Smithsonian Center for Astrophysics, or CfA, said in a news release.

The wave — which is named in honor of Harvard Radcliffe Institute, where the undulation was discovered — consists of a string of star clusters spread out over a stretch of the Milky Way measuring about 9,000 light-years in length.

Astronomers reported in 2020 that they identified the wavy pattern by correlating the 3-D locations of the clusters in data from the European Space Agency’s Gaia space telescope, plus observations of dust and gas clouds in the same region.

“It’s the largest coherent structure that we know of, and it’s really, really close to us,” said study co-author Catherine Zucker, an astrophysicist with the Smithsonian Astrophysical Observatory at the CfA. “It’s been there the whole time. We just didn’t know about it, because we couldn’t build these high-resolution models of the distribution of gaseous clouds near the sun, in 3-D.”

At the time, the astronomers didn’t have enough data to determine whether the peak of the wave was rolling down the line. That’s what’s known as a traveling wave, as opposed to a stationary wave — the kind of wave that’s set off, for example, by a vibrating guitar string.

Since then, additional readings about the motion of the star clusters have led the astronomers to conclude that the Radcliffe Wave is indeed a traveling wave that rises to a maximum height of more than 700 light-years and has a mean wavelength of roughly 6,500 light-years.
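For illustration only, the reported numbers can be plugged into an idealized sinusoidal traveling wave. The amplitude and wavelength below come from the study; the sinusoidal form and the `phase_speed` parameter are simplifying assumptions of this sketch, not values from the paper:

```python
import math

A_LY = 700.0             # approximate amplitude (maximum height), in light-years
WAVELENGTH_LY = 6500.0   # mean wavelength, in light-years

def wave_height(x_ly, t, phase_speed=0.0):
    """Height of an idealized traveling sine wave at position x and time t.

    The sinusoidal form and phase_speed are illustrative assumptions."""
    return A_LY * math.sin(2 * math.pi * (x_ly - phase_speed * t) / WAVELENGTH_LY)

# A quarter wavelength from a node, the displacement reaches the full amplitude:
print(round(wave_height(WAVELENGTH_LY / 4, 0.0)))  # 700
```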

“Now we can go and test all these different theories for why the wave formed in the first place,” Zucker said.

Konietzka said the potential explanations range from “explosions of massive stars, called supernovae, to out-of-galaxy disturbances like a dwarf satellite galaxy colliding with our Milky Way.”

Astronomers say the wave’s rippling effect could in turn trigger bursts of supernovae and swarms of star formation within the gas and dust clouds of the interstellar medium. In earlier research, Zucker and other astronomers suggested that sometime around 14 million years ago, just such a burst gave rise to the “Local Bubble,” a star-forming shell that surrounds our own solar system. 

Other researchers have proposed that the long-lasting fallout from all those supernovae could have affected Earth’s geology and climate — for example, by showering our planet with radioactive dust or perhaps even triggering an ice age.

The Radcliffe Wave is currently about 980 light-years away from our own solar system, and appears to be drifting outward at a speed of about 11,000 mph (5 km/sec). “The measured drift of the Radcliffe Wave radially outward from the galactic center suggests that the cluster whose supernovae ultimately created today’s expanding Local Bubble may have been born in the Radcliffe Wave,” authors of the newly published paper say.
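The two quoted drift speeds are the same figure in different units; a quick conversion (using the exact definition of the international mile, 1.609344 km) confirms that 5 km/s is the “about 11,000 mph” in the text:

```python
KM_PER_MILE = 1.609344   # exact definition of the international mile

speed_km_s = 5.0
speed_mph = speed_km_s / KM_PER_MILE * 3600.0   # km/s -> miles/s -> miles/hour

print(round(speed_mph))  # 11185, i.e. "about 11,000 mph"
```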

Study co-author Alyssa Goodman, an astronomer at the CfA, said the evidence supports the case for claiming that the Radcliffe Wave had an effect on Earth and its cosmic neighborhood. 

“Passage of the sun through over-dense material like the Radcliffe Wave and the Local Bubble does affect the heliosphere,” she wrote in an email, “and the timing does work out that some of the peaks in radioactivity on Earth (e.g., iron-60) line up time-wise with when the sun would have crossed the RadWave, Local Bubble surface, and other ‘Local Fluff’ clouds as well.”

Now the study’s authors are wondering whether the Radcliffe Wave is merely a local phenomenon. Could such waves be common? “The question is, what caused the displacement giving rise to the waving we see?” Goodman said. “And does it happen all over the galaxy? In all galaxies? Does it happen occasionally? Does it happen all the time?”

In addition to Konietzka, Goodman and Zucker, authors of the Nature paper, titled “The Radcliffe Wave Is Oscillating,” include Andreas Burkert, João Alves, Michael Foley, Cameren Swiggum, Maria Koller and Núria Miret-Roig. The research is the focus of a BornCurious podcast titled “Riding the Radcliffe Wave,” as well as an online 3-D interactive presented by Cosmic Data Stories and WorldWide Telescope.

The post Scientists Track How a Giant Wave Moved Through Our Galactic Backyard appeared first on Universe Today.

Categories: Science

JWST Sees a Milky Way-Like Galaxy Coming Together in the Early Universe

Tue, 02/20/2024 - 2:22pm

The gigantic galaxies we see in the Universe today, including our own Milky Way galaxy, started out far smaller. Mergers throughout the Universe’s 13.7 billion years gradually assembled today’s massive galaxies. But they may have begun as mere star clusters.

In an effort to understand the earliest galaxies, the JWST has examined their ancient light for clues as to how they became so massive.

The JWST can effectively see back in time to when the Universe was only about 5% as old as it is now. In that distant past, structures that would eventually become as massive as the Milky Way, and even larger, were only about 1/10,000th as massive as they are now. What clues can the powerful infrared space telescope uncover that show us how galaxies grew so large?

A new paper presents JWST observations of a galaxy at redshift z~8.3. At that redshift, the light has been travelling for over 13 billion years and began its journey only 600 million years after the Big Bang. The galaxy, called the Firefly Sparkle, contains a network of massive star clusters that are evidence of how galaxies grow.
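To see why such a galaxy is a natural target for the infrared JWST, note that redshift stretches all wavelengths by a factor of 1 + z. As a hypothetical example (hydrogen’s Lyman-alpha line is our illustration here, not a measurement from the paper):

```python
z = 8.3
stretch = 1 + z   # observed wavelength = rest wavelength * (1 + z)

lyman_alpha_rest_nm = 121.6            # ultraviolet, in the galaxy's rest frame
observed_nm = lyman_alpha_rest_nm * stretch

print(round(observed_nm))  # ~1131 nm: redshifted well into the near-infrared
```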

The paper is “The Firefly Sparkle: The Earliest Stages of the Assembly of A Milky Way-type Galaxy in a 600 Myr Old Universe.” The lead author is Lamiya Mowla, an observational astronomer and assistant professor of Physics and Astronomy at Wellesley College. The paper is in pre-print and hasn’t yet been peer-reviewed.

“The Firefly Sparkle provides an unprecedented case study of a Milky Way-like galaxy in the earliest stages of its assembly in only a 600 million-year-old Universe,”

From Mowla et al. 2024

Despite the JWST’s power, this distant, ancient galaxy is only visible through the gravitational lensing of a massive cluster of foreground galaxies. The lensing makes the Firefly Sparkle appear as an arc. Two other galaxies are also in the vicinity, called Firefly BF (Best Friend) and Firefly NBF (New Best Friend).

This image shows the Firefly Sparkle galaxy and its two neighbours, BF and NBF. The Firefly Sparkle’s mass is concentrated in 10 clusters that contain up to 57% of its entire mass. Image Credit: Mowla et al. 2024.

“The Firefly Sparkle exhibits the hallmarks expected of a future Milky Way-type galaxy captured during its earliest and most gas-rich stage of formation,” the authors write. The young galaxy’s mass is concentrated in 10 clusters, which range from about 200,000 solar masses to 630,000 solar masses. According to the authors, these clusters “straddle the boundary between low-mass galaxies and high-mass globular clusters.”

These clusters are significant because they’re clues to how the galaxy is growing. The researchers were able to gauge the ages of the clusters and their star formation histories. They found that they experienced a burst of star formation at around the same time. “The cluster ages suggest that they are gravitationally bound with star formation histories showing a recent starburst possibly triggered by the interaction with a companion galaxy at the same redshift at a projected distance of ~2 kpc away from the Firefly Sparkle.”

There are two candidates for the interacting galaxy: Firefly Best Friend (BF) and Firefly New Best Friend (NBF). But NBF is about 13 kpc away, while BF is about 2 kpc away, making BF the likely interactor. “Faint low-surface brightness features are visible at the corners of the arc close to the neighbour, hinting at a possible interaction between the two galaxies [FS and BF] which may have triggered a burst of star formation in both of them,” explain the researchers.

This figure from the study illustrates the star formation histories of each cluster, as well as each galaxy. In the top right, “The Firefly Sparkle and FF-BF both show a recent burst of star formation in the last ~ 50 Myr indicative of recent interactions,” the authors explain. Image Credit: Mowla et al. 2024.

The researchers paid special attention to the central cluster. They found that its temperature is extremely high at about 40,000 Kelvin (roughly 40,000 °C; 72,000 °F). It also has a top-heavy initial mass function, a signal that it formed in a very metal-poor environment. These observations and other evidence show that Firefly Sparkle is very likely a progenitor of galaxies like ours. For these reasons, “… the Firefly Sparkle provides an unprecedented case study of a Milky Way-like galaxy in the earliest stages of its assembly in only a 600 million-year-old Universe,” the authors write.

Fortunately, the researchers behind these results have a powerful supercomputer simulation to compare observations with. It’s called IllustrisTNG, a massive cosmological magnetohydrodynamical simulation based on a comprehensive physical model of the Universe. IllustrisTNG has made three runs, called TNG50, TNG100, and TNG300. The researchers compared their results with TNG50.

This figure compares Firefly Sparkle’s current mass with the TNG50 simulations of galaxy growth and with the growth rate of the Milky Way, according to an upcoming paper. Image Credit: Mowla et al. 2024.

Finding these ancient star clusters is intriguing, but we can’t assume they’ll survive intact. There are tidal and evaporative forces at work. The authors examined the stability of the individual star clusters and how they’ll fare over time.

“Most of these star clusters are expected to survive to the present-day universe and will expand and then get ripped apart to form the stellar disk and the halo of the galaxy,” the authors explain. “The only way they survive is to get kicked out to large distances, away from the dense tidal field of the galaxy.” The ones that get kicked out may persist as globular clusters.

One of the JWST’s primary science goals is to study how galaxies formed and evolved in the early Universe. By finding one in which clusters are still forming, the space telescope is reaching its goal.

“The Firefly Sparkle represents one of JWST’s first spectrophotometric observations of an extremely lensed galaxy assembling at high redshifts, with clusters that are in the process of formation instead of seen at later epochs,” the authors conclude.

The post JWST Sees a Milky Way-Like Galaxy Coming Together in the Early Universe appeared first on Universe Today.

Categories: Science

The Brightest Object Ever Seen in the Universe

Tue, 02/20/2024 - 12:19pm

It’s an exciting time in astronomy today, where records are being broken and reset regularly. We are barely two months into 2024, and already new records have been set for the farthest black hole yet observed, the brightest supernova, and the highest-energy gamma rays from our Sun. Most recently, an international team of astronomers using the ESO’s Very Large Telescope in Chile reportedly saw the brightest object ever observed in the Universe: a quasar (J0529-4351) located about 12 billion light-years away that has the fastest-growing supermassive black hole (SMBH) at its center.

The international team responsible for the discovery consisted of astrophysicists from the Research School of Astronomy and Astrophysics (RSAA) and the Center for Gravitational Astrophysics (CGA) at the Australian National University (ANU). They were joined by researchers from the University of Melbourne, the Paris Institute of Astrophysics (IAP), and the European Southern Observatory (ESO). The paper that describes their findings, titled “The accretion of a solar mass per day by a 17-billion solar mass black hole,” recently appeared online and will be published in the journal Nature Astronomy.

First observed in 1963 by Dutch-American astronomer Maarten Schmidt, quasars (short for “quasi-stellar objects”) are the bright cores of galaxies powered by SMBHs. These black holes collect matter from their surroundings and accelerate it to near the speed of light, which releases tremendous amounts of energy across the electromagnetic spectrum. Quasars become so bright that their cores will outshine all the stars in their disk, making them the brightest objects in the sky and visible from billions of light-years away.

As a general rule, astronomers gauge the growth rate of SMBHs based on the luminosity of their galaxy’s core region – the brighter the quasar, the faster the black hole is accreting matter. In this case, the SMBH at the core of J0529-4351 is growing by the equivalent of one Solar mass a day, making it the fastest-growing black hole yet observed. In the process, the accretion disk alone releases a radiative energy of 2 × 10⁴¹ watts, more than 500 trillion times the luminous energy emitted by the Sun. Christian Wolf, an ANU astronomer and lead author of the study, characterized the discovery in a recent ESO press release:
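Those two figures are mutually consistent, as a quick sanity check shows (using the IAU nominal solar luminosity of 3.828 × 10²⁶ W):

```python
L_SUN_W = 3.828e26     # IAU nominal solar luminosity, in watts
disk_output_w = 2e41   # radiative energy release reported for J0529-4351's disk

ratio = disk_output_w / L_SUN_W
print(f"{ratio:.2e} Suns")  # ~5.2e14, i.e. roughly 500 trillion solar luminosities
```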

“We have discovered the fastest-growing black hole known to date. It has a mass of 17 billion Suns, and eats just over a Sun per day. This makes it the most luminous object in the known Universe. Personally, I simply like the chase. For a few minutes a day, I get to feel like a child again, playing treasure hunt, and now I bring everything to the table that I have learned since.”

But what was most surprising was that this quasar was hiding in plain sight. “All this light comes from a hot accretion disc that measures seven light-years in diameter — this must be the largest accretion disc in the Universe,” said ANU Ph.D. student and co-author Samuel Lai. “It is a surprise that it has remained unknown until today, when we already know about a million less impressive quasars. It has literally been staring us in the face until now,” added co-author Christopher Onken, who is also an astronomer at ANU.

As Onken explained, J0529-4351 showed up in images taken by the ESO Schmidt Southern Sky Survey dating back to 1980. It was only in recent years that it was recognized as a quasar, thanks to improved instruments and measurements. Finding quasars requires precise observations from large areas of the sky, resulting in massive datasets that often require machine learning algorithms to analyze them. However, these models are somewhat limited because they are trained on existing data, meaning candidates are selected based on previously observed objects.

This image shows the region of the sky in which the record-breaking quasar J0529-4351 is situated. Credit: ESO/Digitized Sky Survey 2/Dark Energy Survey

Since J0529-4351 is so luminous, it was dismissed by the ESA’s Gaia Observatory as being too bright to be a quasar and was ruled to be a bright star. Last year, the ANU-led team identified it as a distant quasar based on observations using the 2.3-meter telescope at the Siding Spring Observatory in Australia. They then conducted follow-up observations using the X-shooter spectrograph on the ESO’s VLT telescope to confirm their results. The quasar is also an ideal target for the GRAVITY+ upgrade on ESO’s Very Large Telescope Interferometer (VLTI), designed to accurately measure the mass of black holes.

In addition, astronomers look forward to making observations with next-generation telescopes like the ESO’s Extremely Large Telescope (ELT). This 39-meter telescope, currently under construction in the Atacama Desert in Chile, will make identifying and characterizing distant quasars easier. Studying these objects and their central black holes could reveal vital details about how SMBHs and galaxies co-evolved during the early Universe.

Further Reading: ESO, ESO Science Papers

The post The Brightest Object Ever Seen in the Universe appeared first on Universe Today.

Categories: Science

Japan's New H3 Rocket Successfully Blasts Off

Mon, 02/19/2024 - 11:36am

Japan successfully tested its new flagship H3 rocket after an earlier version failed last year. The rocket lifted off from the Tanegashima Space Center on Saturday, February 17, reaching an orbital altitude of about 670 kilometers (420 miles). It deployed a set of micro-satellites and a dummy satellite designed to simulate a realistic payload.

With the successful launch of the H3, Japan will begin transitioning away from the previous H-2A rocket, which has been in service since 2001 and is set to be retired after two more launches. Several upcoming missions depend on the H3, so this successful test was vital.

The launch came after two days of delays because of bad weather. The H3 rocket, built by Mitsubishi Heavy Industries, is now set to become the main launch vehicle of Japan’s space program. The rocket’s first flight in March 2023 failed to reach orbit, which resulted in the loss of an Earth imaging satellite.

The successful launch and deployment of the satellites was a relief for JAXA and members of the project. A livestream of the launch and subsequent successful orbit insertion showed those in the JAXA command cheering and hugging each other.

“I now feel a heavy load taken off my shoulders,” said JAXA H3 project manager Masashi Okada, speaking at a press briefing after the launch. “But now is the real start for H3, and we will work to steadily improve it.”

H3 stands about 57 meters (187 feet) tall and is designed to carry larger payloads. The two microsatellites were deployed approximately 16 minutes and 43 seconds after liftoff. They included CE-SAT-IE, an Earth observation satellite developed by Canon Electronics, and TIRSAT, an infrared instrument that will measure the temperature of the Earth's surface and seawater.

“We feel so relieved to be able to announce the good results,” JAXA President Hiroshi Yamakawa said at the briefing. Yamakawa added that the main goals of H3 are to secure independent access to space and allow Japan to be competitive as international demand for satellite launches continues to grow. “We made a big first step today toward achieving that goal,” he said.

An image sent back by a mini-probe shows Japan’s SLIM lander on its side on the lunar surface. (JAXA / Takara Tomy / Sony Group / Doshisha Univ.)

The successful launch follows two other recent successes for JAXA last month: an H-2A rocket placed a spy satellite into orbit, and just days later JAXA's robotic SLIM (Smart Lander for Investigating Moon) made the first-ever precise "pinpoint" Moon landing, although the lander unfortunately came down on its side. During the final stages of the descent, however, two autonomous probes were successfully deployed: a tiny hopping robot and another designed to roll about the surface. Both have sent back pictures and can continue exploring and returning data even if SLIM cannot be operated.

The post Japan's New H3 Rocket Successfully Blasts Off appeared first on Universe Today.

Categories: Science

Gravastars are an Alternative Theory to Black Holes. Here's What They'd Look Like

Mon, 02/19/2024 - 9:17am

One of the central predictions of general relativity is that in the end, gravity wins. Stars will fuse hydrogen into new elements to fight gravity and can oppose it for a time. Electrons and neutrons exert pressure to counter gravity, but their stability against that constant pull limits the amount of mass a white dwarf or neutron star can have. All of this can be countered by gathering more mass together. Beyond about 3 solar masses, give or take, gravity will overpower all other forces and collapse the mass into a black hole.

While black holes have a great deal of theoretical and observational evidence supporting their existence, the theory of black holes is not without issues. For one, general relativity predicts that the mass compresses to an infinitely dense singularity where the laws of physics break down. This singularity is shrouded by an event horizon, which serves as a point of no return for anything devoured by the black hole. Both of these are problematic, so there has been a long history of trying to find some alternative: a mechanism that prevents singularities and event horizons from forming.

One alternative is the gravitational vacuum star, or gravitational condensate star, commonly called a gravastar. First proposed in 2001, it takes advantage of the fact that most of the energy in the universe is not regular matter or even dark matter, but dark energy. Dark energy drives cosmic expansion, so perhaps it could oppose gravitational collapse at high densities.

Illustration of a hypothetical gravastar. Credit: Daniel Jampolski and Luciano Rezzolla, Goethe University Frankfurt

The original gravastar model proposed a kind of Bose-Einstein condensate of dark energy surrounded by a thin shell of regular matter. The internal condensate ensures that the gravastar has no singularity, while the dense shell of matter ensures that the gravastar appears similar to a black hole from the outside. Interesting idea, but there are two central problems. One is that the shell is unstable, particularly if the gravastar is rotating. There are ways to tweak things just so to make it stable, but such ideal conditions aren’t likely to occur in nature. The second problem is that gravitational wave observations of large body mergers confirm the standard black hole model. But a new gravastar model might solve some of those problems.

The new model essentially nests multiple gravastars together, somewhat like nested Matryoshka dolls. Rather than a single shell enclosing exotic dark energy, the model has a series of nested shells with dark energy between the layers. The authors refer to this model as a nestar, or nested gravastar. This alternative model makes the gravastar more stable, since the tension of dark energy is better balanced by the weight of the shells. The interior structure of the nestar also means that the gravitational waves of a nestar and a black hole are more similar, meaning that technically their existence can't be ruled out.

That said, even the authors note that there is no likely scenario that could produce nestars. They likely don’t exist, and it’s almost certain that what we observe as black holes are true black holes. But studies such as this one are great for testing the limits of general relativity. They help us understand what is possible within the framework of the theory, which in turn helps us better understand gravitational physics.

Reference: Jampolski, Daniel and Rezzolla, Luciano. “Nested solutions of gravitational condensate stars.” Classical and Quantum Gravity 41 (2024): 065014.

The post Gravastars are an Alternative Theory to Black Holes. Here's What They'd Look Like appeared first on Universe Today.

Categories: Science

European Satellite ERS-2 to Reenter Earth’s Atmosphere This Week

Mon, 02/19/2024 - 9:11am

One of the largest reentries in recent years, ESA’s ERS-2 satellite is coming down this week.

After almost three decades in orbit, an early Earth-observation satellite is finally coming down this week. The European Space Agency’s (ESA) European Remote Sensing satellite ERS-2 is set to reenter the Earth’s atmosphere on or around Wednesday, February 21st.

A Trail Blazing Mission

Launched atop an Ariane-4 rocket from the Kourou Space Center in French Guiana on April 21st, 1995, ERS-2 was one of ESA’s first Earth observation satellites. ERS-2 monitored land masses, oceans, rivers, vegetation and the polar regions of the Earth using visible light and ultraviolet sensors. The mission was on hand for several natural disasters, including the flood of the Elbe River across Germany in 2006. ERS-2 ceased operations in September 2011.

Anatomy of the reentry of ERS-2. ESA

ERS-2 was placed in a retrograde, Sun-synchronous low Earth orbit, inclined 98.5 degrees relative to the equator. This orbit is typical for Earth-observing and clandestine spy satellites, as it allows the mission to image key target sites at the same relative Sun angle, an attribute handy for image interpretation.
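As a quick sanity check on that 98.5-degree figure, the inclination a Sun-synchronous orbit requires can be derived from Earth's J2 oblateness term, which makes the orbital plane precess. The sketch below, which assumes a circular orbit at ERS-2's roughly 785 km operational altitude (a round figure, not from this article), solves for the inclination at which that precession completes exactly one revolution per year:

```python
import math

# What inclination keeps an orbit Sun-synchronous at ~785 km altitude?
# The orbital plane must precess 360 degrees per year; Earth's J2
# oblateness supplies that precession for slightly retrograde orbits.
J2 = 1.08263e-3        # Earth's oblateness coefficient
R_E = 6378.137         # Earth's equatorial radius, km
MU = 398600.4418       # Earth's gravitational parameter, km^3/s^2

a = R_E + 785.0                                 # circular-orbit semi-major axis, km
n = math.sqrt(MU / a**3)                        # mean motion, rad/s
target_rate = 2 * math.pi / (365.2422 * 86400)  # one revolution per year, rad/s

# J2 nodal precession: dOmega/dt = -1.5 * J2 * (R_E/a)^2 * n * cos(i)
cos_i = -target_rate / (1.5 * J2 * (R_E / a)**2 * n)
inclination = math.degrees(math.acos(cos_i))
print(round(inclination, 1))  # ~98.5 degrees, matching the article
```

The negative cosine is why Sun-synchronous orbits are retrograde: only an inclination above 90 degrees makes the plane precess eastward at the required rate.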

ERS-2 tracks an ice floe. ESA

The Last Days of ERS-2

Reentry predictions for the satellite are centered on February 21st at 00:19 Universal Time (UT), +/- 25 hours. As we get closer, expect that window to narrow. The mass of ERS-2 at launch (including fuel) was 2,516 kilograms. Expect most of the satellite to burn up on reentry.

The orbital path of ERS-2. Orbitron

For context, recent high profile reentries include the UARS satellite (6.5 tons, in 2011), and the massive Long March-5B booster that launched the core module for China’s Tiangong Space Station in late 2022 (weighing in at 23 tons).

ERS-2 in the clean room on Earth prior to launch. ESA

ESA passed its first space debris mitigation policy in 2008, 13 years after ERS-2 was launched. In 2011, ESA decided to passively reenter the satellite, and began a series of 66 deorbiting maneuvers to bring its orbit down from 785 kilometers to 573 kilometers. Its fuel drained and batteries exhausted, ERS-2 is now succumbing to the increased drag of the Earth’s atmosphere as we near the peak of the current solar cycle.

Flooding in Prague, seen by ERS-2. ESA

Tracking the Reentry

Tracking the satellite is as simple as knowing where and when to look. The ID number for ERS-2 is 1995-021A/23560. ESA has a site set up dedicated to tracking the decay of ERS-2. Aerospace.Org, Space-Track and Heavens-Above are other good sites to follow the end of ERS-2.

Expect the satellite to be a real ‘fast mover’ on its final passes. We saw UARS on its final orbit, flashing as it tumbled swiftly across the sky.

The demise of ERS-2 highlights the growing dilemma posed by space junk. There are currently over 25,800 tracked objects in Earth orbit, and that number is growing rapidly as the annual launch cadence increases. 2023 saw a record 212 launches reach orbit, and 2024 is on track to break that number. The rise of mega-constellations such as SpaceX's Starlink is adding to this burden.

The Age of Space Debris

Space junk adds to the number of artificial 'stars' seen whizzing across the night sky, impacts astronomical sky surveys, and poses a hazard to crewed missions aboard the International Space Station and the Tiangong Space Station. Reentries also contaminate the atmosphere, and a recent study suggests that mega-constellations may even impact the Earth's magnetic field. And while it's mainly wealthier countries in the northern hemisphere that are launching satellites, the global south disproportionately bears the brunt of uncontrolled reentries.

Finally, all of these are consequences we don’t fully understand and are worthy of further study. For now, you can still track the demise of ERS-2, as it comes to a fiery end this week.

The post European Satellite ERS-2 to Reenter Earth’s Atmosphere This Week appeared first on Universe Today.

Categories: Science

Look at How Much the Sun Has Changed in Just Two Years

Mon, 02/19/2024 - 3:18am

The solar cycle has been reasonably well understood since 1843, when Samuel Schwabe spent 17 years observing the variation of sunspots. Since then, we have regularly observed the ebb and flow of the sunspot cycle every 11 years. More recently, ESA's Solar Orbiter has taken regular images of the Sun to track its progress as we head towards the peak of the current solar cycle. Two recently released images, from February 2021 and October 2023, show how things are really picking up as we head toward solar maximum.

The Sun is a great big ball of plasma, an electrically charged gas that carries along any magnetic field embedded within it. As the Sun rotates, the magnetic field gets dragged around with it, but because the Sun rotates faster at the equator than at the poles, the field lines get wound up tighter and tighter.

Under this immense stress, the field lines occasionally snap or burst through the surface of the Sun, and when they do, we see a sunspot. These dark patches on the visible surface of the Sun are regions where concentrated magnetic fields inhibit the flow of heat to the visible surface, giving rise to slightly cooler, and therefore darker, patches on the Sun.

A collage of new solar images captured by the Inouye Solar Telescope, which is a small amount of solar data obtained during the Inouye’s first year of operations throughout its commissioning phase. Images include sunspots and quiet regions of the Sun, known as convection cells. (Credit: NSF/AURA/NSO)

The slow but continuous winding up of the field lines means that sunspots become more and more numerous as the field gets more distorted. Observed over a period of years, the spots seem to slowly migrate from higher latitudes toward the equator as the solar cycle progresses.
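To get a feel for how quickly the winding accumulates, here is a rough back-of-the-envelope sketch. The ~25-day equatorial and ~36-day polar rotation periods are approximate textbook values, not figures from this article:

```python
# How many extra turns does the solar equator complete, relative to the
# poles, over one 11-year cycle? (Rough differential-rotation estimate.)
EQUATOR_PERIOD = 25.4     # approximate sidereal rotation at the equator, days
POLAR_PERIOD = 36.0       # approximate sidereal rotation near the poles, days
CYCLE_DAYS = 11 * 365.25  # one solar cycle, in days

extra_windings = CYCLE_DAYS / EQUATOR_PERIOD - CYCLE_DAYS / POLAR_PERIOD
print(round(extra_windings))  # roughly 47 extra equatorial turns
```

Dozens of extra windings per cycle is why the field ends up so stressed and distorted by the time solar maximum arrives.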

To help understand this complex cycle and unlock other mysteries of the Sun, the European Space Agency launched its Solar Orbiter on 10 February 2020. Its mission is to explore the Sun's polar regions, understand what drives the 11-year solar cycle, and determine what heats the corona, the outer layer of the Sun's atmosphere.

Solar Orbiter

Images from Solar Orbiter have been released that show close-ups of the Sun's visible surface, the photosphere, as it nears the peak of solar activity. At the beginning of the cycle, at solar minimum in 2019, there was relatively little activity and only a few sunspots. Since then, things have been slowly increasing. The image from February 2021 showed a reasonably quiet Sun, but an image taken in October last year shows that things are, dare I say, hotting up! The maximum of this cycle was expected to occur in 2025, but the surge in activity supports predictions that it could arrive a year earlier.

Understanding the cycle is not just of whimsical scientific interest; it is vital for minimising damage to ground-based and orbiting systems and, crucially, for understanding the impact on life on Earth.

Source : Sun’s surprising activity surge in Solar Orbiter snapshot

The post Look at How Much the Sun Has Changed in Just Two Years appeared first on Universe Today.

Categories: Science

What are the Differences Between Quasars and Microquasars?

Mon, 02/19/2024 - 2:31am

Quasars are fascinating objects: supermassive black holes that are actively feasting on material from their accretion disks. The result is a jet that can outshine the combined light of the entire host galaxy! There are smaller black holes too, formed by the deaths of massive stars, and these sometimes host accretion disks and jets just like their larger cousins. We call these microquasars and, while there are similarities between the two, there are differences too.

The term quasar gives a clue to their nature; it is an abbreviated version of 'quasi-stellar radio source', which is exactly what they are: a source of radio energy that appears as point-like as a star. The first quasar to be discovered was given the rather unimaginative name 3C 273, and it was found in the constellation Virgo. Most objects of this nature tend to have catalogue numbers rather than common names; in the case of 3C 273, the designation tells us it is the 273rd object in the Third Cambridge Catalogue of Radio Sources.

It was in 1964 that we started to understand the nature of quasars and their incredible luminosity, which is the result of the accretion of material onto a supermassive black hole. The accretion process seems to drive twin radio lobes that appear as opposing jets along the rotational axis. Microquasars seem to be scaled-down versions.

In a paper recently published by J. I. Katz of Washington University, the differences between the two are explored. Despite quasars being common across the Universe, to date only 19 microquasars have been discovered, and one key difference is emerging.

It seems that the radio lobes are the key. In quasars, a significant proportion of the power appears to come from particle acceleration along their polar jets, driving the energy release from the radio lobes. In microquasars, the opposite seems to hold, with thermal emission from the accretion disk more prominent. In quasars, for some as yet unknown reason, the accretion of material onto the supermassive black hole drives particle acceleration along the jet rather than thermal radiation, yet this is not the case for the smaller microquasars.

Supermassive black holes, the powerhouses of quasars, seem to offer a more favourable environment for the accretion and acceleration of energetic particles. Katz proposes that a lower electron density in the accretion disks of supermassive black holes allows quasars to accelerate much larger quantities of relativistic particles than their stellar-mass equivalents.

Source : Quasars vs. Microquasars

The post What are the Differences Between Quasars and Microquasars? appeared first on Universe Today.

Categories: Science

Odysseus Moon Lander Sends Back Selfies With Earth in the Picture

Sun, 02/18/2024 - 10:03pm

Intuitive Machines’ Odysseus lander has beamed back a series of snapshots that were captured as it headed out from the Earth toward the moon, and one of the pictures features Australia front and center. The shots also show the second stage of the SpaceX Falcon 9 rocket that launched the spacecraft, floating away as Odysseus pushed onward.

Intuitive Machines successfully transmitted its first IM-1 mission images to Earth on February 16, 2024. The images were captured shortly after separation from @SpaceX's second stage on Intuitive Machines’ first journey to the Moon under @NASA's CLPS initiative. pic.twitter.com/9LccL6q5tF

— Intuitive Machines (@Int_Machines) February 17, 2024

The pictures were taken on Feb. 16, the day of the launch.

“Payload integration managers programmed the lander’s wide and narrow field-of-view cameras to take five quick images every five minutes for two hours, starting 100 seconds after separating from SpaceX’s second stage,” Houston-based Intuitive Machines explained in a posting to X / Twitter. “Out of all the images collected, Intuitive Machines chose to show humanity’s place in the universe with four wonderful images we hope to inspire the next generation of risk-takers.”

If Intuitive Machines’ IM-1 mission is successful, Odysseus is due to become the first commercial spacecraft to make a soft landing on the moon, and the first U.S. spacecraft to do so since NASA’s Apollo 17 crewed mission in 1972.

The lander, which is about the size of an old-fashioned telephone booth, is carrying six science payloads for NASA, plus six commercial payloads — including a miniaturized camera system that would be dropped off just before landing to record the touchdown.

Odysseus is scheduled to reach lunar orbit on Feb. 21 and descend to Malapert A crater, near the moon’s south pole, on the 22nd. The mission’s objective is to test out spacecraft systems and assess the environment in the south polar region, in advance of a crewed landing that could take place as early as 2026.

Assuming all goes well, Intuitive Machines is in line to receive $118 million from NASA through the Commercial Lunar Payload Services program, which was created to take advantage of private-sector innovation and reduce NASA’s costs.

In a Feb. 18 mission update, Intuitive Machines reported that Odysseus “continues to be in excellent health, and flight controllers are preparing planned trajectory correction maneuvers to prepare the lander for lunar orbit insertion.”

Odysseus continues to be in excellent health, and flight controllers are preparing planned trajectory correction maneuvers to prepare the lander for lunar orbit insertion.
(18FEB2024 1745 CST) 1/5 pic.twitter.com/vp6PV5hqGU

— Intuitive Machines (@Int_Machines) February 18, 2024

Success isn’t guaranteed: Just last month, a NASA-supported commercial lander built by Astrobotic fell back to Earth after missing its chance to make a moon landing due to a propellant leak. Over the past few years, other robotic moon landing missions planned by Israel’s SpaceIL team, Japan’s iSpace and the Russian Space Agency have also ended in failure.

That being said, failure isn’t inevitable: In the past year, India and the Japan Aerospace Exploration Agency have successfully put landers on the lunar surface to send back science data.

If Odysseus survives its landing attempt, Intuitive Machines expects the solar-powered robot to be in operation for seven days. The mission is expected to end when the sun sinks below the lunar horizon.

The post Odysseus Moon Lander Sends Back Selfies With Earth in the Picture appeared first on Universe Today.

Categories: Science

Solar Eclipses Provide a Rare Way to Study Cloud Formation

Sun, 02/18/2024 - 10:50am

April 8’s North American solar eclipse is just around the corner, and it has astronomy fans and weather aficionados alike preparing for an incredible show. But it’s not just fun and games. Eclipses are rare opportunities for scientists to study phenomena that only come around once in a while.

Last week, a team of meteorological experts from the Netherlands released a paper describing how eclipses can disrupt the formation of certain types of clouds. Their findings have implications for futuristic geoengineering schemes that propose to artificially block sunlight to combat climate change.

Published in Nature Communications Earth & Environment, the paper examines satellite imagery of cloud cover during three solar eclipses between 2005 and 2016.

They found that in the wake of an eclipse, shallow cumulus clouds tend to disappear. It doesn't even need to be a total eclipse for this to occur; the effect kicks in when just 15% of the Sun is obscured.

The effect isn’t immediate. There’s a delay of about 20 minutes. That’s because the eclipse isn’t destroying the clouds directly. Instead, it’s cooling the land beneath, interrupting packets of warm air that race upwards in updrafts to condense into clouds. By suppressing the updrafts, the eclipse puts a pause on cumulus cloud formation.

Proposals to reduce climate change by artificially blocking the Sun work on a similar principle to an eclipse. A swarm of sun-shade spacecraft, or an injection of light-absorbing aerosols into the atmosphere, could reduce the amount of solar energy reaching the surface of the Earth, cooling the temperature back to historical norms. For a project like this to work, about 3.5% to 5% of sunlight would have to be blocked.
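A simple zero-dimensional energy-balance model illustrates why blocking just a few percent of sunlight matters. The sketch below uses standard textbook values for the solar constant and Earth's albedo (not figures from the paper) to estimate the cooling from blocking 4% of incoming sunlight:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def equilibrium_temp(solar_constant, albedo=0.3):
    """Effective radiating temperature of a planet in radiative balance."""
    absorbed = solar_constant * (1 - albedo) / 4  # averaged over the sphere
    return (absorbed / SIGMA) ** 0.25

S = 1361.0  # present-day solar constant, W/m^2
cooling = equilibrium_temp(S) - equilibrium_temp(S * (1 - 0.04))
print(round(cooling, 1))  # blocking 4% of sunlight cools Earth by ~2.6 K
```

A drop of roughly 2.6 K is comparable to the warming such schemes aim to offset, which is why figures in the 3.5% to 5% range are quoted. The catch, as the cloud study shows, is that real-world feedbacks like reduced cloud cover don't obey this simple model.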

The cloud modeling data from this paper indicates reasons to be cautious, however. First and foremost, it suggests that blocking sunlight isn’t as effective as you might think, because while it does cool the ground initially, it also reduces cloud cover, which once again increases the amount of solar energy reaching the Earth.

The decrease in cloud cover would also have an effect on precipitation – fewer clouds means less rain – which might result in regional increases in drought and desertification.

It’s unclear whether the reduction in cumulus clouds would persist with a more permanent, artificially constructed eclipse – true solar eclipses only last a few minutes locally, after all. But the authors say the data ought to influence the design of any serious geoengineering proposals going forward. A solar shade stationed between the Sun and Earth, at Lagrange point 1, for example, might not block the Sun uniformly. If it caused either partial or intermittent local eclipses, it would be more likely to feature these cloud-destroying effects.

Atmospheric aerosol injection might seem like a more uniform method of blocking sunlight, but large-scale weather patterns actually make these methods potentially even more variable, blocking up to 45% of sunlight locally on occasion (well beyond the 15% needed to see a reduction in cloud formation).

These geoengineering projects, in other words, might solve climate change only to introduce new, unexpected challenges, and the costs might not be borne equitably across the globe.

So what’s the lesson? Well, if you’re going out to see the eclipse on April 8, and you feel a little chill in the air, you’re not imagining it. The Earth around you is cooling – and it might also get a little sunnier after it’s over, as cumulus cloud formation gets interrupted. These effects are tangible reminders that the relationship between Earth’s climate and the Sun is complex – and tinkering with it comes with a high chance of unintended consequences.

Read the Paper:

Victor Trees et al. “Clouds dissipate quickly during solar eclipses as the land surface cools.” Communications Earth and Environment. February 12, 2024.

The post Solar Eclipses Provide a Rare Way to Study Cloud Formation appeared first on Universe Today.

Categories: Science

Even Eris and Makemake Could Have Geothermal Activity

Sun, 02/18/2024 - 10:39am

Whether or not you agree that Pluto isn’t a planet, in many ways, Pluto is quite different from the classical planets. It’s smaller than the Moon, has an elliptical orbit that brings it closer to the Sun than Neptune at times, and is part of a collection of icy bodies on the edge of our solar system. It was also thought to be a cold dead world until the flyby of New Horizons proved otherwise. The plucky little spacecraft showed us that Pluto was geologically active, with a thin atmosphere and mountains that rise above icy plains. Geologically, Pluto is more similar to Earth than the Moon, a fact that has led some to reconsider Pluto’s designation as a dwarf planet.

Astronomers still aren’t sure how Pluto has remained geologically active. Perhaps the gravitational interactions with its moon Charon, or perhaps interior radioactive decay. But regardless of the cause, the general thought has been that Pluto is an exception, not a rule. Other outer worlds of similar size and composition are likely dead worlds. But a new study shows that isn’t the case for at least two dwarf planets, Eris and Makemake.

This new study doesn’t rely on high-resolution images like we have for Pluto. Our current observations of Eris and Makemake show them only as small, blurry dots. But we do have spectral observations of these worlds, which is where this study comes in.

The team looked at the spectral lines of molecules on the surface of these worlds, most specifically methane. Methane, or CH4, has two important variants. One is composed of standard hydrogen atoms, while the other contains one or more atoms of a heavier form of hydrogen known as deuterium. Deuterium has a nucleus containing a proton and a neutron rather than just a proton, and this skews the spectrum of methane slightly. From the spectral observations, the team could measure the D/H ratio for methane on both worlds.

How D/H ratios compare to possible origins. Credit: Glein, et al

This ratio is determined by the source of the methane. If Eris and Makemake are dead worlds, then the methane they have stems from their origin more than 4 billion years ago, and the D/H level should be on the higher end. On the other hand, if the surface methane was generated through an interior process and vented through active geological processes, then the D/H ratio should be lower. The team found that the ratio is most consistent with thermogenic and abiotic mechanisms, suggesting that both Eris and Makemake are active worlds, or at least were active in geologically recent times.

Eris is about the same size as Pluto, so it isn’t too surprising that it’s a geologically active world given what we now know about Pluto. But Makemake is much smaller, about 60% the size of Pluto. If Makemake is an active world, then it is likely that other dwarf planets such as Haumea are as well. If that’s the case, then most if not all dwarf planets are geologically active. As the authors suggest, it might be worth sending a probe or two to the outer worlds for more study.

Reference: Glein, Christopher R., et al. “Moderate D/H ratios in methane ice on Eris and Makemake as evidence of hydrothermal or metamorphic processes in their interiors: Geochemical analysis.” Icarus (2024): 115999.

The post Even Eris and Makemake Could Have Geothermal Activity appeared first on Universe Today.

Categories: Science

There’s One Last Place Planet 9 Could Be Hiding

Sat, 02/17/2024 - 4:30pm

A study recently submitted to The Astronomical Journal continues the search for the elusive Planet Nine (also called Planet X), a hypothetical planet thought to orbit in the outer reaches of the solar system, well beyond the orbit of the dwarf planet Pluto. The goal of this study was to narrow down the possible locations of Planet Nine, which could help researchers better understand the makeup of our solar system, along with its formation and evolutionary processes. So, what was the motivation behind this study?

Dr. Mike Brown, who is a Richard and Barbara Rosenberg Professor of Astronomy at Caltech and lead author of the study, tells Universe Today, “We are continuing to try to systematically cover all of the regions of the sky where we predict Planet Nine to be. Using data from Pan-STARRS allowed us to cover the largest region to date.”

Pan-STARRS, which stands for Panoramic Survey Telescope and Rapid Response System, is a collaborative astronomical observation system located at Haleakala Observatory and operated by the University of Hawai'i Institute for Astronomy, with telescope construction funded by the U.S. Air Force. For the study, the researchers used data from Pan-STARRS Data Release 2 (DR2) with the goal of narrowing down the possible location of Planet Nine based on findings from past studies.

In the end, the team narrowed down the possible locations of Planet Nine by eliminating approximately 78 percent of the locations calculated from previous studies. Additionally, the researchers provided new estimates for Planet Nine: a semimajor axis of approximately 500 astronomical units (AU) and a mass of about 6.6 Earth masses. So, what are the most significant results from this study, and what follow-up studies are currently being conducted or planned?
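For a sense of scale, Kepler's third law turns that 500 AU semimajor axis estimate into an orbital period. This is a quick sketch using the simple P² = a³ relation for a body orbiting the Sun, with the planet's own mass neglected:

```python
def orbital_period_years(semimajor_axis_au):
    # Kepler's third law for a small body orbiting the Sun:
    # P^2 = a^3, with P in years and a in astronomical units
    return semimajor_axis_au ** 1.5

print(round(orbital_period_years(30.07)))  # Neptune: ~165 years
print(round(orbital_period_years(500)))    # Planet Nine estimate: ~11,180 years
```

An orbit taking more than ten thousand years, spending most of that time near a distant aphelion, helps explain why such a planet could evade detection for so long.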

“While I would love to say that the most significant result was finding Planet Nine, we didn’t,” Dr. Brown tells Universe Today. “So instead, it means that we have significantly narrowed the search area. We’ve now surveyed approximately 80% of the regions where we think Planet Nine might be.”

In terms of follow-up studies, Dr. Brown tells Universe Today, “I think that the LSST is the most likely survey to find Planet Nine. When it comes online in a year or two it will quickly cover much of the search space and, if Planet Nine is there, find it.”

LSST stands for Legacy Survey of Space and Time, an astronomical survey currently scheduled as a 10-year program to study the southern sky from the Vera C. Rubin Observatory in Chile, which is presently under construction. Objectives for LSST include identifying near-Earth asteroids (NEAs) and small planetary bodies within our solar system, as well as deep-space studies, including investigating the properties of dark matter and dark energy and the evolution of the Milky Way Galaxy. But what is the importance of finding Planet Nine?

Dr. Brown tells Universe Today, “This would be the 5th largest planet of our solar system and the only one with a mass between Earth and Uranus. Such planets are common around other stars, and we would suddenly have a chance to study one in our own solar system.”

Scientists began hypothesizing the existence of a planet beyond Neptune shortly after Neptune’s discovery in 1846, including an 1880 memoir authored by D. Kirkwood and later a 1946 paper by American astronomer Clyde Tombaugh, who discovered Pluto in 1930. More recent studies from 2016 and 2017 presented evidence for the existence of Planet Nine, the former of which was co-authored by Dr. Brown. This most recent study marks the most complete effort yet to narrow down the location of Planet Nine, which Dr. Brown has long believed exists, telling Universe Today, “There are too many separate signs that Planet Nine is there. The solar system is very difficult to understand without Planet Nine.”

He continues by telling Universe Today that “…Planet Nine explains many things about orbits of objects in the outer solar system that would be otherwise unexplainable and would each need some sort of separate explanation. The cluster of the directions of the orbits is the best known, but there is also the large perihelion distances of many objects, existence of highly inclined and even retrograde objects, and the high abundance of very eccentric orbits which cross inside the orbit of Neptune. None of these should happen in the solar system, but all are easily explainable as an effect of Planet Nine.”

Does Planet Nine exist and where will we find it in the coming years and decades? Only time will tell, and this is why we science!

As always, keep doing science & keep looking up!

The post There’s One Last Place Planet 9 Could Be Hiding appeared first on Universe Today.

Categories: Science

China's Chang'e-8 Mission Will Try to Make Bricks on the Moon

Sat, 02/17/2024 - 11:58am

The China National Space Administration (CNSA) has put out a call for international and industry partners to contribute science payloads to its Chang’e-8 lunar lander, set for launch to the Moon in 2028. The mission, which will involve a lander, a rover, and a utility robot, will be China’s first attempt at in-situ resource utilization on the Moon, using lunar regolith to produce brick-like building materials.

Just like NASA’s Artemis plans, the CNSA’s plans for the Moon are targeted at the lunar south pole, which is expected to be rich in usable resources, especially water. The presence of these resources will be vital for long-term human activity on the lunar surface.

Possible landing sites for Chang’e-8 include Leibnitz Beta, Amundsen crater, Cabeus crater, and the ridge connecting the Shackleton and de Gerlache craters, according to a presentation given by the Chang’e-8 chief deputy designer in October 2023.

Chang’e-8 will be the last CNSA robotic mission to be launched before construction begins on the International Lunar Research Station, China’s crewed moonbase being planned in collaboration with Russia’s Roscosmos. That makes Chang’e-8’s attempt to create building materials out of regolith a vital proof-of-concept for their lunar aspirations.

In order to make moon-bricks, the lander will carry an instrument that uses solar energy to melt lunar soil and turn it into usable components at a rate of 40 cubic centimeters per hour.
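To put that rate in perspective, here is a quick bit of arithmetic. The 40 cm³/hour figure comes from the mission description, but the brick volume below is purely our illustrative assumption, since CNSA has not published the size of the bricks:

```python
# Rough production-rate arithmetic for the regolith-melting instrument.
# The melt rate is from the mission description; the brick volume is a
# hypothetical figure chosen only for illustration.
MELT_RATE_CM3_PER_HOUR = 40
BRICK_VOLUME_CM3 = 1_000  # assumed brick size, not a CNSA figure

hours_per_brick = BRICK_VOLUME_CM3 / MELT_RATE_CM3_PER_HOUR
print(f"{hours_per_brick:.0f} hours per brick")  # -> 25 hours per brick
```

Even under generous assumptions, that pace makes Chang’e-8 a proof-of-concept rather than a construction machine.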

Alongside the regolith processing equipment, the lander will be equipped with an array of science instruments, including cameras, a seismometer to detect moonquakes, and an x-ray telescope. Part of the mission will focus on moon-based Earth observation, with several instruments designed to monitor Earth’s atmosphere and magnetosphere.

The rover, meanwhile, will carry ground penetrating radar, cameras, a mineral analyzer, and tools for collecting and storing samples (leaving open the possibility of future missions to retrieve the samples).

The utility robot is a key piece of the mission, but CNSA isn’t developing it in-house. Instead, the space agency is seeking proposals from partners interested in developing it as a piggyback payload to ride alongside the rest of Chang’e-8.

According to the call for proposals, the 100kg, battery-powered robot will need to be able to “capture, carry and place items, shovel, and transfer lunar soil.” It will also need to be able to travel at 400m per hour.

There is room for an additional 100kg of piggyback payloads besides the robot, for which full proposals are expected to be submitted later this year.

While planning for Chang’e-8 is ongoing, the CNSA has two additional robotic Moon missions in the works for the near future. The first, Chang’e-6, will launch this spring and aims to return a regolith sample from the lunar far side, a feat never before accomplished. The next mission is planned for 2026, when Chang’e-7 will carry out a geological examination of the permanently shadowed craters scattered around the Moon’s south pole.

The post China's Chang'e-8 Mission Will Try to Make Bricks on the Moon appeared first on Universe Today.

Categories: Science

Can the Gaia Hypothesis Be Tested in the Lab?

Sat, 02/17/2024 - 10:27am

During the 1970s, inventor/environmentalist James Lovelock and evolutionary biologist Lynn Margulis proposed the Gaia Hypothesis. This theory posits that Earth is a single, self-regulating system where the atmosphere, hydrosphere, all life, and their inorganic surroundings work together to maintain the conditions for life on the planet. This theory was largely inspired by Lovelock’s work with NASA during the 1960s, where the skilled inventor designed instruments for modeling the climate of Mars and other planets in the Solar System.

According to this theory, planets like Earth would slowly grow warmer and their oceans more acidic without a biosphere that regulates temperature and ensures climate stability. While the theory was readily accepted among environmentalists and climatologists, many in the scientific community have remained skeptical since it was proposed. Until now, it has been impossible to test this theory because it involves forces that work on a planetary scale. But in a recent paper, a team of Spanish scientists proposed an experimental system incorporating synthetic biology that could test the theory on a small scale.

The team included researchers from the Catalan Institution for Research and Advanced Studies (ICREA), the Universitat Pompeu Fabra’s Complex Systems Lab (UPE-CSL), the European Molecular Biology Laboratory (EMBL), and the Santa Fe Institute (SFI). Their paper, “A Synthetic Microbial Daisyworld: Planetary Regulation in the Test Tube,” recently appeared in the Journal of the Royal Society Interface. As they describe, their proposed test consists of two engineered micro-organisms in a self-contained system to see if they can achieve a stable equilibrium.

An image of Earth taken by the Galileo spacecraft in 1990. Credit: NASA/JPL

In response to challenges, Lovelock and British marine and atmospheric scientist Andrew Watson (a postgrad student of Lovelock’s) created a computer model named Daisyworld in 1983. The model consisted of an imaginary planet orbiting a star whose radiant energy slowly increases or decreases (aka. stellar flux). In the first (biological) case, the planet has a simple biosphere consisting of two species of daisies with different colors (black and white) that cause them to absorb different amounts of solar radiation.

The black and white daisy populations grow or shrink based on how much solar energy the planet receives, and changes in their relative populations stabilize the planet’s climate over time despite fluctuations in energy from the star. In the second (non-biological) case, the planet’s temperature is directly related to the amount of energy it receives from the star. Previously, no means existed to test this model since it was planetary in scale. The proposed test was inspired by recent research in fermentation, which typically requires finely tuned external controls to maintain stable, regulated conditions.
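For readers who want to see the feedback loop in action, here is a minimal Daisyworld sketch using the constants commonly quoted for the 1983 Watson and Lovelock model. It is an illustration of the mechanism, not the original authors’ code: black daisies run warmer than white ones, and the two populations adjust until the planet settles near the daisies’ growth optimum.

```python
# Minimal Daisyworld (after Watson & Lovelock, 1983). Constants are the
# commonly published values for the model; this is an illustration of
# the self-regulation mechanism, not the original code.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant (W m^-2 K^-4)
FLUX = 917.0      # solar flux at the planet (W m^-2)
ALBEDO = {"bare": 0.50, "white": 0.75, "black": 0.25}
Q = 2.06e9        # heat-redistribution parameter (K^4)
DEATH_RATE = 0.3  # daisy death rate per unit time

def growth_rate(temp_k):
    """Parabolic growth response, peaking at 295.5 K (22.5 C)."""
    return max(0.0, 1.0 - 0.003265 * (295.5 - temp_k) ** 2)

def step(white, black, luminosity, dt=0.01):
    """Advance both daisy populations by one Euler step."""
    bare = 1.0 - white - black
    planet_albedo = (bare * ALBEDO["bare"] + white * ALBEDO["white"]
                     + black * ALBEDO["black"])
    # Effective planetary temperature from radiative balance.
    te4 = FLUX * luminosity * (1.0 - planet_albedo) / SIGMA
    # Local temperatures: darker patches run warmer than lighter ones.
    t_white = (Q * (planet_albedo - ALBEDO["white"]) + te4) ** 0.25
    t_black = (Q * (planet_albedo - ALBEDO["black"]) + te4) ** 0.25
    white += dt * white * (bare * growth_rate(t_white) - DEATH_RATE)
    black += dt * black * (bare * growth_rate(t_black) - DEATH_RATE)
    return white, black, te4 ** 0.25

# Let the populations equilibrate at a fixed stellar luminosity.
white, black = 0.2, 0.2
for _ in range(5000):
    white, black, temp = step(white, black, luminosity=1.0)
print(f"white={white:.2f} black={black:.2f} T={temp:.1f} K")
```

Sweeping the `luminosity` argument upward mimics the brightening star in the original paper: white daisies take over as the star brightens, holding the planetary temperature far steadier than the lifeless case would allow.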

In this experimental setup, one strain senses if the environment is becoming too acidic and counteracts it, while the other strain senses if the environment is becoming too basic and acts to increase acidity. Ricard Solé, an ICREA research professor, head of the Complex Systems Lab, and an external professor at the SFI, was a co-author of the paper. As he explained in a recent SFI news release:

“There’s been recent work in trying to see if you can engineer microorganisms for fermentation so that they can self-regulate. That was the key inspiration. Because these strains act on the environment, and the environment affects them, this creates a closed causal loop. The idea is to show that under very broad conditions, they will stabilize to a constant pH level, as predicted by the original theory.”

Artist’s impression of an Earth-like exoplanet orbiting Gliese 667 C, part of a triple star system. Credit: ESO

Solé and several of his students developed the experiment during a visit to SFI. It has the potential to answer long-standing questions regarding planet-wide regulatory systems. In short, it offers the first possible means for testing the Gaia Hypothesis and demonstrating the vital role life plays in regulating biospheres and maintaining habitable conditions. In addition to Earth’s climate, this research could have significant implications for measuring habitability and climate stability on other planets, particularly exoplanets.

Further Reading: Santa Fe Institute, Journal of the Royal Society Interface

The post Can the Gaia Hypothesis Be Tested in the Lab? appeared first on Universe Today.

Categories: Science

New NASA Report Suggests We Could See Space-Based Power After 2050

Fri, 02/16/2024 - 5:25pm

Space-based solar power (SBSP) has been in the news recently, with the successful test of a solar power demonstrator in space taking place last summer. While the concept is fundamentally sound, there are plenty of hurdles to overcome if the technology is to be widely adopted – not the least of which is cost. NASA is no stranger to costly projects, though, and they recently commissioned a study from their internal Office of Technology, Policy, and Strategy that suggests how NASA could continue to support this budding idea. Most interestingly, if the technological cards are played right, SBSP could be the most carbon-efficient, lowest-cost power source for humanity by 2050.

To be clear, there are a lot of hurdles to overcome to get to that point, but first, let’s start with what the report looked at. Its primary concern was two-fold – how expensive the electricity from a power satellite is and how high its lifecycle carbon emissions are, including those introduced to the atmosphere to get it into space in the first place.

Those two data points were analyzed for two different systems, one modular one called the SPS-Alpha Mark-III suggested by prolific inventor John Mankins, which is a little more theoretical, and another by a group of Japanese researchers called the Tethered-SPS that uses a more traditional design. In most of the calculations the report provides, the SPS-Alpha Mark-III outperforms the more conventional system. Still, there are some technical hurdles to its implementation – though nothing so complicated as some of the others discussed therein.

Fraser interviews an expert on space power – Prof Stephen Sweeney

The results the report presents are not pretty for SBSP. Given their current levels of technical maturity, both solutions produce electricity that is more expensive than any existing technology. Not only that, even the more climate-friendly SPS-Alpha Mark-III is merely comparable to solar power in terms of climate impact and is beaten out by alternatives like hydropower or even nuclear fission. So, some work needs to happen before there is any commercial incentive to adopt this technology.

Let’s tackle cost first – two big sources are the cost of getting the satellite into orbit and maintaining it when it’s up there, known as in-space assembly and maintenance (ISAM). The report even provides some allowances for the launch cost to be lower than it currently is (without fully functional Starships). But even with that lower cost, 863 launches to geosynchronous orbit for the smaller of the two systems will likely not allow any system to be cost-competitive with terrestrial alternatives.

Also, as of right now, no ISAM infrastructure could support such a massive satellite. So if any part of the system fails while in space, which, given the nature of the environment, is inevitable, there wouldn’t be any feasible way to fix it. Like lowering launch costs, this, too, is being worked on by several commercial entities. However, the inability to maintain infrastructure in orbit inexpensively will plague cost assessments of any large project in the near future.

Isaac Arthur describes how useful beaming power in space can be.
Credit – SFIA YouTube Channel

As for greenhouse emissions, most of those are caused by the launches required to get the hardware into space. Few studies have examined the environmental effects of emitting combustion products into the upper atmosphere, but it would be surprising if they were benign. Even setting that aside, the sheer quantity of greenhouse gases that must be emitted to lift the mass of these systems into orbit would make it hard to compete with low-carbon alternatives.

These difficulties might sound like a death knell for building an SBSP system in the near future. But there is a silver lining. Using a sensitivity analysis, the report’s writers constructed a scenario in which SBSP is the most cost-effective energy source, with the lowest greenhouse gas emissions, of any available in 2050.

Getting there requires some great leaps in technology. In particular, combining improved ISAM with ion drives to move components from low Earth orbit (where Starships can be reused) to geostationary orbit would dramatically cut the number of launches needed. Other improvements involve optimistic assumptions in the cost analysis, such as lower launch costs (though the $500/kg the study uses is far below even the more optimistic estimates of what Starship can achieve) and a longer operating lifetime for the equipment itself.

Even the Financial Times is interested enough to take a look at the underlying SBSP idea.
Credit – Financial Times YouTube Channel

Ultimately, this analysis shows that, with a bit more development, SBSP could be not only cost-competitive in 25 years but also the best option for low-cost, environmentally friendly power. However, the report’s purpose was to suggest potential action items to NASA’s leadership, and its outcome was an underwhelming “keep an eye on it.” It rightfully points out that many of the activities that would make SBSP viable, such as lower launch costs, ion drives, and improved ISAM systems, are already on NASA’s radar and under active development, with varying levels of support.

The authors suggest revisiting the technology every few years, as NASA has done for decades now, to check whether any specific technical hurdles are going unaddressed by other projects. For now, they didn’t find any. But plenty of technologies the report didn’t even mention, such as asteroid mining or deployable lightweight structures, could also fundamentally change the economic calculations. One thing is for sure: whatever future reports assess the viability of SBSP, they will have plenty of new advances to consider.

Learn More:
NASA OTPS – New Study Updates NASA on Space-Based Solar Power
NASA OTPS – Space-Based Solar Power
UT – New Satellite Successfully Beams Power From Space
UT – Could Space-based Satellites Power Remote Mines?

Lead Image:
DALL-E’s interpretation of a SBSP system.

The post New NASA Report Suggests We Could See Space-Based Power After 2050 appeared first on Universe Today.

Categories: Science

NASA is Done Setting Fires Inside its Doomed Cargo Spacecraft

Fri, 02/16/2024 - 12:57pm

Fire on a spacecraft can be catastrophic. It can spread quickly in a confined space, and for trapped astronauts, there may be no escape. It’s fading in time now, but Apollo 1, which was to be the first crewed Apollo mission, never got off the ground because of a fire that killed the crew. There’ve been other dangerous spacecraft fires too, like the one onboard the Russian Mir space station in 1997.

In an effort to understand how fire behaves in spacecraft, NASA began its Saffire (Spacecraft Fire Safety Experiment) in 2016. Saffire was an eight-year, six-mission effort to study how fire behaves in space. The final Saffire test was completed on January 9th.

Fire behaviour in buildings here on Earth is well-studied and well-understood. Fire prevention and suppression are important components in building design. It makes sense to bring that same level of understanding to space travel and even to surpass it.

“How big a fire does it take for things to get bad for a crew?” asked Dr. David Urban, Saffire principal investigator at NASA’s Glenn Research Centre. “This kind of work is done for every other inhabited structure here on Earth – buildings, planes, trains, automobiles, mines, submarines, ships – but we hadn’t done this research for spacecraft until Saffire.”

NASA has conducted six experiments under Saffire, and each one was conducted in an uncrewed Cygnus cargo vehicle after it completed its supply mission to the ISS. The vehicles are sent into the atmosphere to burn up, and the experiments are run prior to the vehicle’s destruction. Saffire 1 ran in 2016 inside an avionics bay with an airflow duct. The bay contained a cotton and fibreglass burn blend, which was ignited remotely with a hot wire.

Subsequent Saffire experiments tested how different materials burned, including the fire-resistant fabric Nomex and even acrylic spacecraft windows. Tests also included varied oxygen flows, different atmospheric pressures, and different oxygen levels. Each Saffire test generated important data on how fire behaves inside spacecraft.

The final segment of the Saffire program, Saffire-VI, was conducted on January 9th, 2024, before the uncrewed Northrop Grumman Cygnus spacecraft carrying the experiment burned up during re-entry. Saffire-VI differed from its predecessors in the program because the experiment ran at a higher oxygen content and lower pressure, conditions similar to those inside actual spacecraft.

“The Saffire flow unit is a wind tunnel,” said Dr. Gary Ruff, Saffire project manager at NASA’s Glenn Research Center. “We’re pushing air through it. Once test conditions are set, we run an electrical current through a thin wire, and the materials ignite.”

Fire in a confined environment does more than just damage things and burn people. It also generates harmful combustion by-products. Alongside the predictable carbon monoxide and carbon dioxide, a fire onboard a spacecraft can generate trace amounts of hydrogen fluoride, hydrogen chloride, and hydrogen cyanide. Hydrogen fluoride is a very toxic chemical, and exposure requires immediate medical attention. Hydrogen chloride is an irritant that can become fatal, and hydrogen cyanide can damage the brain, heart, and lungs and can also be fatal. A piece of equipment called the Combustion Product Monitor (CPM) instrument uses laser spectroscopy to analyze the contents of the smoke and detect these hazardous chemicals.

The Combustion Product Monitor uses laser spectroscopy to detect hazardous chemicals created by fire. Image Credit: NASA/JPL/Microdevices Laboratory.

Cameras inside the experiment record what happens, while other instruments outside collect data. After the experiments collect their data, it’s downloaded before the Cygnus vehicle is sent plummeting toward its atmospheric destruction. By altering variables like oxygen content and flow and atmospheric pressure, the experiments gather data that the researchers use to build a predictive model of fire behaviour aboard a spacecraft.

“You’ve got a heat release rate and a rate of release of combustion products,” Ruff said. “You can take those as model input and predict what will happen in a vehicle.”

At this point in time, humans are poised for a big leap. We’re working towards establishing a presence on the Moon. From there, future crewed missions to Mars beckon. Researchers are studying how to protect astronauts’ health during those flights by understanding how their bodies respond to extended time in microgravity, exposure to radiation, and other hazards. Preventing fires and extinguishing them quickly are critical issues in spaceflight and astronaut safety, especially when astronauts are so far away there’s no chance of any assistance.

The models built on Saffire data will help missions succeed and help everyone get home safely.

The post NASA is Done Setting Fires Inside its Doomed Cargo Spacecraft appeared first on Universe Today.

Categories: Science

Euclid Begins its 6-Year Survey of the Dark Universe

Fri, 02/16/2024 - 9:50am

On July 1, 2023, the Euclid spacecraft launched with a clear mission: to map the dark and distant Universe. To achieve that goal, over the next six years, Euclid will make 40,000 observations of the sky beyond the Milky Way. From this data, astronomers will be able to map the positions of billions of galaxies and observe the effects of dark matter.

There have been several galactic sky surveys before, but Euclid’s mission will take them to the next level. Euclid is equipped with a wide-field imaging system. With each 70-minute exposure, it will capture the images and spectra of more than 50,000 galaxies. When complete, the Euclid survey will be the most detailed survey of galactic positions and distances yet made. The mission will also make several deep-sky observations, focusing on the most distant and dim galaxies.
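The scale of the resulting catalog follows directly from those two numbers. A quick bit of arithmetic using the figures quoted in this article (a rough sketch, not an official mission total):

```python
# Rough scale of the survey: planned pointings times the (minimum)
# number of galaxies captured per 70-minute exposure.
pointings = 40_000
galaxies_per_exposure = 50_000  # "more than 50,000", so a lower bound

total_galaxies = pointings * galaxies_per_exposure
print(f"~{total_galaxies / 1e9:.0f} billion galaxies")  # -> ~2 billion
```

That back-of-the-envelope total of around two billion galaxies is what makes the phrase “positions of billions of galaxies” above more than a figure of speech.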

Euclid’s field of view compared to the Moon. Credit: ESA/ESA/Euclid/Euclid Consortium/NASA, S. Brunier

One of the mysteries Euclid could answer is the nature of dark energy. The standard model of cosmology describes dark energy as a property of space and time: a cosmological constant that drives cosmic expansion. But some theories of dark energy argue that it’s an energy field within space and time, and that cosmic expansion isn’t constant. Euclid will study whether cosmic expansion varies, allowing astronomers to constrain or rule out certain models. The mission will also look at how dark matter distorts galaxies, allowing us to learn more about the properties of dark matter and how it interacts with regular matter.

The Euclid mission officially began its survey on Valentine’s Day and will complete about 15% of its survey this year. An initial deep sky data set will be released in Spring 2025, and data from the first year of the general survey will be released in Summer 2026.

You can read more about the Euclid Mission on ESA’s website.

The post Euclid Begins its 6-Year Survey of the Dark Universe appeared first on Universe Today.

Categories: Science

OSIRIS-REx’s Final Haul: 121.6 Grams from Asteroid Bennu

Thu, 02/15/2024 - 1:33pm

After several months of meticulous, careful work, NASA has the final total for its haul of asteroid material from the OSIRIS-REx mission to Bennu. The highly successful mission collected 121.6 grams, or almost 4.3 ounces, of rock and dust. It won’t be long before scientists get their hands on these samples and start analyzing them.

These samples have been a long time coming. The OSIRIS-REx (Origins, Spectral Interpretation, Resource Identification, and Security-Regolith Explorer) mission was approved by NASA back in 2011 and launched in September 2016. It reached its target, the carbonaceous Apollo-group asteroid 101955 Bennu, in December 2018. After spending months studying the asteroid and reconnoitring for a suitable sampling location, the team selected one in December 2019. After two sampling rehearsals, the spacecraft gathered its sample on October 20th, 2020.

In September 2023, the sample finally returned to Earth.

There was some serendipity in the way the final total was reached. Some of it hitched a ride outside of the main sample container. There was some drama, too, as stubborn bolts on the TAGSAM head resisted removal and delayed access to the sample contained inside. Personnel from NASA’s Astromaterials Research and Exploration Science (ARES) had to design, build, and test new tools that they used to finally open the TAGSAM head and access the sample.

For OSIRIS-REx to be successful, it had to collect at least 60 grams of material. With a final total that is double that, it should open up more research opportunities and allow more of the material to be held untouched for future research. NASA says they will preserve 70% of the sample for the future, including for future generations.
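Taking the article’s percentages against the 121.6-gram total, the allocation works out roughly as follows (a quick sketch of the arithmetic, not official NASA figures):

```python
# Sample allocation arithmetic, applying the stated percentages to the
# 121.6 g total. Both shares are described as fractions of the sample.
total_g = 121.6
required_minimum_g = 60.0        # mission success threshold

preserved_g = 0.70 * total_g     # held back for future research
canada_g = 0.04 * total_g        # the Canadian Space Agency's share

print(f"margin over minimum: {total_g / required_minimum_g:.1f}x")
print(f"preserved: {preserved_g:.1f} g, Canada: {canada_g:.2f} g")
# -> margin over minimum: 2.0x
# -> preserved: 85.1 g, Canada: 4.86 g
```

Even after preserving 70% and distributing international shares, tens of grams remain available for the 200-plus researchers receiving material now.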

OSIRIS-REx astromaterials processors, from left, Rachel Funk, Julia Plummer, and Jannatul Ferdous, prepare to lift the top plate of the Touch-and-Go Sample Acquisition Mechanism (TAGSAM) head and pour the final portion of asteroid rocks and dust into sample trays below. Credit: NASA/Robert Markowitz

The next step is for the material to be put into containers and sent to researchers. More than 200 researchers around the world will receive samples. Many of the samples will find their way to scientists at NASA and institutions in the US, while others will go to researchers at institutions associated with the Canadian Space Agency, JAXA, and other partner nations. Canada will receive 4% of the sample, the first time that Canada’s scientific community will have direct access to a returned asteroid sample.

Asteroid Bennu was chosen because it’s close to Earth and has been observed extensively. It’s a carbonaceous asteroid, a class that makes up about 75% of asteroids. But it also belongs to a sub-type of carbonaceous asteroids called B-types. These are much less common than other carbonaceous asteroids, and scientists think they’re very primitive and contain volatiles that date back to the early Solar System. Researchers around the world have been eagerly waiting for these samples.

Bennu is a natural time capsule that holds clues to how the Solar System formed, including Earth. It’s also a rubble pile asteroid, and OSIRIS-REx showed that Bennu has over 200 boulders on its surface that are larger than 10 meters. Some of these boulders have veins of carbonate minerals that predate the formation of the asteroid.

Bennu’s boulder-strewn surface. Bennu is a rubble pile asteroid that was likely part of a much larger parent body at one time in the distant past. Image Credit: NASA/University of Arizona.

The “O” in OSIRIS-REx stands for Origins, and that’s one of the things scientists hope to learn more about from Bennu. Will the sample contain any organic compounds that could’ve played a role in the appearance of life? If so, it would bolster the idea that asteroids delivered some of life’s chemical precursors to the early Earth.

Laboratory testing will also show how accurate the spacecraft’s instruments were by comparing the samples to what the instruments told us from orbit around the asteroid. This is invaluable feedback for future missions.

But the main scientific value in the Bennu sample concerns what the samples will tell us about the asteroid’s origins. Scientists think that Bennu broke off from a much larger parent body before migrating to the inner Solar System. It could hold clues to that journey and how it changed over time. Astronomers suspect that Bennu is actually older than the Solar System itself. It could hold important clues to the gas and dust in the solar nebula that eventually formed the Sun and all the planets.

We already have some early results from the Bennu sample. Initial observations showed that the asteroid contains carbon and water. Carbon wasn’t unexpected since the asteroid is a carbonaceous one. Neither was water surprising since scientists have long thought that asteroids were one of the main ways that Earth got its water.

While the OSIRIS-REx sampling mission is over, the spacecraft is still going. It’s in its extended mission now, called OSIRIS-APEX (Origins, Spectral Interpretation, Resource Identification and Security – Apophis Explorer). Its target is the asteroid Apophis, which will have a close encounter with Earth in 2029. The mission will study how the close encounter affects the asteroid, including its orbit and trajectory, and any surface changes that Earth’s gravity might trigger, like landslides.

These are images of the asteroid Apophis captured in 2012. Apophis was considered at risk of impacting Earth, but now astronomers are confident it will pass by. (NASA / JPL-Caltech)

The OSIRIS-REx mission is an impressive display of human ingenuity and cooperation. Once scientists get their hands on the samples, we can expect a stream of fascinating results. Who knows which of our ideas about the Solar System will be confirmed and which ones will be discarded? No matter what we learn, it’s guaranteed to be interesting.

The post OSIRIS-REx’s Final Haul: 121.6 Grams from Asteroid Bennu appeared first on Universe Today.

Categories: Science
