Since the beginning of the Space Age, humans have relied on chemical rockets to get into space. While this method is certainly effective, it is also very expensive and requires a considerable amount of resources. As we look to more efficient means of getting out into space, one has to wonder whether similarly advanced species on other planets (where conditions would be different) would rely on similar methods.
Harvard Professor Abraham Loeb and Michael Hippke, an independent researcher affiliated with the Sonneberg Observatory, addressed this question in two recently released papers. Whereas Prof. Loeb looks at the challenges extra-terrestrials would face launching rockets from Proxima b, Hippke considers whether aliens living on a Super-Earth would be able to get into space.
Artist’s impression of Proxima b, which was discovered using the Radial Velocity method. Credit: ESO/M. Kornmesser
For the sake of his study, Loeb considered how we humans are fortunate enough to live on a planet that is well-suited for space launches. Essentially, if a rocket is to escape from the Earth’s surface and reach space, it needs to achieve an escape velocity of 11.186 km/s (40,270 km/h; 25,020 mph). Similarly, the escape velocity needed to get away from the location of the Earth around the Sun is about 42 km/s (151,200 km/h; 93,951 mph).
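Both figures fall out of the standard escape-velocity formula, v = √(2GM/r). A quick sanity check in Python, with rounded physical constants:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # Earth's mass, kg
R_EARTH = 6.371e6    # Earth's mean radius, m
M_SUN = 1.989e30     # Sun's mass, kg
AU = 1.496e11        # Earth-Sun distance, m

def escape_velocity(mass_kg, distance_m):
    """Speed needed to escape a body's gravity, starting from a given distance."""
    return math.sqrt(2 * G * mass_kg / distance_m)

v_surface = escape_velocity(M_EARTH, R_EARTH)  # from Earth's surface
v_orbit = escape_velocity(M_SUN, AU)           # from Earth's orbit around the Sun

print(f"{v_surface / 1000:.2f} km/s")  # ≈ 11.19 km/s
print(f"{v_orbit / 1000:.1f} km/s")    # ≈ 42.1 km/s
```

With these rounded constants, the script reproduces the article's 11.186 km/s and ~42 km/s figures to within a fraction of a percent.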
As Prof. Loeb told Universe Today via email:
“Chemical propulsion requires a fuel mass that grows exponentially with terminal speed. By a fortunate coincidence the escape speed from the orbit of the Earth around the Sun is at the limit of attainable speed by chemical rockets. But the habitable zone around fainter stars is closer in, making it much more challenging for chemical rockets to escape from the deeper gravitational pit there.”
As Loeb indicates in his essay, the escape speed scales as the square root of the stellar mass over the distance from the star, which implies that the escape speed from the habitable zone scales inversely with stellar mass to the power of one quarter. For planets like Earth, orbiting within the habitable zone of a G-type (yellow dwarf) star like our Sun, this works out quite well.
This infographic compares the orbit of the planet around Proxima Centauri (Proxima b) with the same region of the Solar System. Credit: Pale Red Dot
Unfortunately, this does not work well for terrestrial planets that orbit lower-mass M-type (red dwarf) stars. These stars are the most common type in the Universe, accounting for 75% of stars in the Milky Way Galaxy alone. In addition, recent exoplanet surveys have discovered a plethora of rocky planets orbiting red dwarf systems, with some scientists venturing that they are the most likely place to find potentially habitable rocky planets.
Using the nearest star to our own as an example (Proxima Centauri), Loeb explains how a rocket using chemical propellant would have a much harder time achieving escape velocity from a planet located within its habitable zone.
“The nearest star to the Sun, Proxima Centauri, is an example for a faint star with only 12% of the mass of the Sun,” he said. “A couple of years ago, it was discovered that this star has an Earth-size planet, Proxima b, in its habitable zone, which is 20 times closer than the separation of the Earth from the Sun. At that location, the escape speed is 50% larger than from the orbit of the Earth around the Sun. A civilization on Proxima b will find it difficult to escape from their location to interstellar space with chemical rockets.”
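Loeb's "50% larger" figure is easy to verify with the scaling he describes: escape speed goes as the square root of stellar mass over orbital distance. A one-line check in Python, using the rounded values from the quote:

```python
import math

# Loeb's scaling: escape speed at distance d from a star of mass M goes as sqrt(M / d).
# Rounded values quoted in the article:
m_star = 0.12  # Proxima Centauri's mass, in solar masses
d = 1 / 20     # Proxima b's orbital distance, in AU (20 times closer than Earth)

ratio = math.sqrt(m_star / d)  # escape speed relative to escaping from Earth's orbit
print(f"{ratio:.2f}x")         # ≈ 1.55x, i.e. roughly 50% larger
```

Given the rounded inputs, the ~1.55x result is consistent with the "50% larger" figure in the quote.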
Hippke’s paper, on the other hand, begins by considering that Earth may in fact not be the most habitable type of planet in our Universe. For instance, planets that are more massive than Earth would have higher surface gravity, which means they would be able to hold onto a thicker atmosphere, which would provide greater shielding against harmful cosmic rays and solar radiation.
Artist’s impression of a Super-Earth, a class of planet that has many times the mass of Earth, but less than a Uranus or Neptune-sized planet. Credit: NASA/Ames/JPL-Caltech
In addition, a planet with higher gravity would have a flatter topography, resulting in archipelagos instead of continents and shallower oceans – an ideal situation where biodiversity is concerned. However, when it comes to rocket launches, increased surface gravity would also mean a higher escape velocity. As Hippke told Universe Today via email:
“Conventional (chemical) rocket flight needs more fuel the heavier the planet is which it launches from. The relation is a so-called “exponential function”. So you have gravity in the exponent. For example, Earth gravity = 1. Then, the fuel goes as “something to the first power”, which is e.g. 2^1 = 2. On another planet with ten times the gravity, g=10, you have 10 in the exponent. This is extremely much more: “something” to the 10th power. For example, 2^10 = 1024. So in this toy example, the Super-Earth with 10x the mass needs a rocket with 1000x the fuel, to launch the same payload.”
For comparison, Hippke uses Kepler-20 b, a Super-Earth located 950 light years away that is 1.6 times Earth’s radius and 9.7 times its mass. Whereas escape velocity from Earth is roughly 11 km/s, a rocket attempting to leave a Super-Earth similar to Kepler-20 b would need to achieve an escape velocity of ~27.1 km/s. As a result, a single-stage rocket on Kepler-20 b would have to burn 104 times as much fuel as a rocket on Earth to get into orbit.
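That comparison follows from the Tsiolkovsky rocket equation: the fuel mass per unit payload for a single stage is e^(Δv/v_ex) − 1, where v_ex is the exhaust velocity. A toy sketch in Python, assuming a kerosene/oxygen exhaust velocity of about 3.4 km/s (my assumption, not a figure from Hippke's paper) and ignoring staging, gravity and drag losses:

```python
import math

V_EX = 3430.0  # m/s, assumed exhaust velocity (~350 s specific impulse, kerosene/LOX)

def fuel_per_payload(delta_v):
    """Tsiolkovsky rocket equation: fuel mass per unit payload for a single stage."""
    return math.exp(delta_v / V_EX) - 1

earth = fuel_per_payload(11_200)      # Earth escape velocity, ~11.2 km/s
kepler20b = fuel_per_payload(27_100)  # Kepler-20 b escape velocity, ~27.1 km/s

print(f"Earth: {earth:.0f}x payload in fuel")       # ≈ 25x
print(f"Kepler-20 b: {kepler20b:.0f}x payload in fuel")  # ≈ 2700x
print(f"Penalty: {kepler20b / earth:.0f}x more fuel")    # ≈ 107x
```

With these assumed values, the penalty comes out near a hundredfold, in the same ballpark as the factor of 104 quoted above; the exact number depends on the propellant chosen and the losses neglected here.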
To put it into perspective, Hippke considers specific payloads being launched from Earth. “To lift a more useful payload of 6.2 t as required for the James Webb Space Telescope on Kepler-20 b, the fuel mass would increase to 55,000 t, about the mass of the largest ocean battleships,” he writes. “For a classical Apollo moon mission (45 t), the rocket would need to be considerably larger, ~400,000 t.”
Project Starshot, an initiative sponsored by the Breakthrough Foundation, is intended to be humanity’s first interstellar voyage. Credit: breakthroughinitiatives.org
While Hippke’s analysis concludes that chemical rockets could still achieve escape velocity from Super-Earths of up to 10 Earth masses, the amount of propellant needed makes this method impractical. As Hippke pointed out, this could have a serious effect on an alien civilization’s development.
“I am surprised to see how close we as humans are to end up on a planet which is still reasonably lightweight to conduct space flight,” he said. “Other civilizations, if they exist, might not be as lucky. On more massive planets, space flight would be exponentially more expensive. Such civilizations would not have satellite TV, a moon mission, or a Hubble Space Telescope. This should alter their way of development in certain ways we can now analyze in more detail.”
Both of these papers present some clear implications when it comes to the search for extra-terrestrial intelligence (SETI). For starters, it means that civilizations on planets that orbit red dwarf stars or Super-Earths are less likely to be space-faring, which would make detecting them more difficult. It also indicates that when it comes to the kinds of propulsion humanity is familiar with, we may be in the minority.
“The above results imply that chemical propulsion has a limited utility, so it would make sense to search for signals associated with lightsails or nuclear engines, especially near dwarf stars,” said Loeb. “But there are also interesting implications for the future of our own civilization.”
Artist’s concept of a bimodal nuclear rocket making the journey to the Moon, Mars, and other destinations in the Solar System. Credit: NASA
“One consequence of the paper is for space colonization and SETI,” added Hippke. “Civs from Super-Earths are much less likely to explore the stars. Instead, they would be (to some extent) “arrested” on their home planet, and e.g. make more use of lasers or radio telescopes for interstellar communication instead of sending probes or spaceships.”
However, both Loeb and Hippke also note that extra-terrestrial civilizations could address these challenges by adopting other methods of propulsion. In the end, chemical propulsion may be something that few technologically-advanced species would adopt because it is simply not practical for them. As Loeb explained:
“An advanced extraterrestrial civilization could use other propulsion methods, such as nuclear engines or lightsails, which are not constrained by the same limitations as chemical propulsion and can reach speeds as high as a tenth of the speed of light. Our civilization is currently developing these alternative propulsion technologies, but these efforts are still in their infancy.”
One such example is Breakthrough Starshot, which is currently being developed by the Breakthrough Prize Foundation (of which Loeb is the chair of the Advisory Committee). This initiative aims to use a laser-driven lightsail to accelerate a nanocraft up to 20% the speed of light, which would allow it to reach Proxima Centauri in just 20 years’ time.
Artist’s impression of rocky exoplanets orbiting Gliese 832, a red dwarf star just 16 light-years from Earth. Credit: ESO/M. Kornmesser/N. Risinger (skysurvey.org).
Hippke similarly considers nuclear rockets as a viable possibility, since increased surface gravity would also mean that space elevators would be impractical. Loeb also indicated that the limitations imposed by planets around low mass stars could have repercussions for when humans try to colonize the known Universe:
“When the sun will heat up enough to boil all water off the face of the Earth, we could relocate to a new home by then. Some of the most desirable destinations would be systems of multiple planets around low mass stars, such as the nearby dwarf star TRAPPIST-1 which weighs 9% of a solar mass and hosts seven Earth-size planets. Once we get to the habitable zone of TRAPPIST-1, however, there would be no rush to escape. Such stars burn hydrogen so slowly that they could keep us warm for ten trillion years, about a thousand times longer than the lifetime of the sun.”
But in the meantime, we can rest easy in the knowledge that we live on a habitable planet around a yellow dwarf star, which affords us not only life, but the ability to get out into space and explore. As always, when it comes to searching for signs of extra-terrestrial life in our Universe, we humans are forced to take the “low-hanging fruit” approach.
Basically, the only planet we know of that supports life is Earth, and the only means of space exploration we know how to look for are the ones we ourselves have tried and tested. As a result, we are somewhat limited when it comes to looking for biosignatures (i.e. planets with liquid water, oxygen and nitrogen atmospheres, etc.) or technosignatures (i.e. radio transmissions, chemical rockets, etc.).
As our understanding of the conditions under which life can emerge improves, and our own technology advances, we’ll have more to be on the lookout for. And hopefully, despite the additional challenges it may be facing, extra-terrestrial life will be looking for us!
What if our Solar System had another generation of planets that formed before, or alongside, the planets we have today? A new study published in Nature Communications on April 17th, 2018, presents evidence that this is exactly what happened. The first-generation planets, or planet, would have been destroyed during collisions in the earlier days of the Solar System, with much of the debris swept up in the formation of new bodies.
This is not a new theory, but a new study brings new evidence to support it.
The evidence is in the form of a meteorite that crashed into Sudan’s Nubian Desert in 2008. The parent asteroid was designated 2008 TC3, and the recovered fragments are collectively known as the Almahata Sitta meteorite. Inside the meteorite are tiny crystals called nanodiamonds that, according to this study, could only have formed in the high-pressure conditions inside a growing planet. This contrasts with previous thinking, which suggested the diamonds formed as a result of powerful shockwaves created in collisions between parent bodies.
“We demonstrate that these large diamonds cannot be the result of a shock but rather of growth that has taken place within a planet.” – study co-author Philippe Gillet
Models of planetary formation show that terrestrial planets are formed by the accretion of smaller bodies into larger and larger bodies. Follow the process long enough, and you end up with planets like Earth. The smaller bodies that join together are typically between the size of the Moon and Mars. But evidence of these smaller bodies is hard to find.
One type of unique and rare meteorite, called a ureilite, could provide the evidence to back up the models, and that’s what fell to Earth in the Nubian Desert in 2008. Ureilites are thought to be the remnants of a lost planet that was formed in the first 10 million years of the Solar System, and then was destroyed in a collision.
Ureilites are different from other stony meteorites. They have a higher proportion of carbon than other meteorites, mostly in the form of the aforementioned nanodiamonds. Researchers from Switzerland, France and Germany examined the diamonds inside 2008 TC3 and determined that they probably formed in a small proto-planet about 4.55 billion years ago.
Philippe Gillet, one of the study’s co-authors, had this to say in an interview with Associated Press: “We demonstrate that these large diamonds cannot be the result of a shock but rather of growth that has taken place within a planet.”
According to the research presented in this paper, these nanodiamonds were formed under pressures of 200,000 bar (2.9 million psi). This means the mystery parent-planet would have to have been as big as Mercury, or even Mars.
The key to the study is the size of the nanodiamonds. The team’s results show the presence of diamond crystals as large as 100 micrometers. Though the nanodiamonds have since been segmented by a process called graphitization, the team is confident that these larger crystals are there. And they could only have been formed by static high-pressure growth in the interior of a planet. A collision shock wave couldn’t have done it.
This is what’s called a High-Angle Annular Dark-Field (HAADF) Scanning Transmission Electron Microscopy (STEM) image. The image on the left shows diamond segments with similar crystal orientations. The image on the right is a magnification of the green square area. The orange lines highlight the inclusion trails, which match between the diamond segments. But those same trails are absent from the intersecting graphite. Image: Farhang Nabiei, Philippe Gillet, et al.
But the parent body of the ureilite meteorite in the study must have been subjected to collisions; otherwise, where is it? In the case of this meteorite, a collision and the resulting shock wave still played a role.
The study goes on to say that a collision took place some time after the parent body’s formation. And this collision would have produced the shock wave that caused the graphitization of the nanodiamonds.
The key evidence is in what are called High-Angle Annular Dark-Field (HAADF) Scanning Transmission Electron Microscopy (STEM) images, as seen above. The image is two images in one, with the one on the right being a magnification of a part of the image on the left. On the left, dotted yellow lines indicate areas of diamond crystals separate from areas of graphite. On the right is a magnification of the green square.
The inclusion trails are what’s important here. On the right, the inclusion trails are highlighted with the orange lines. They clearly indicate inclusion lines that match between adjacent diamond segments. But the inclusion lines aren’t present in the intervening graphite. In the study, the researchers say this is “undeniable morphological evidence that the inclusions existed in diamond before these were broken into smaller pieces by graphitization.”
To summarize, this supports the idea that a small planet between the size of Mercury and Mars was formed in the first 10 million years of the Solar System. Inside that body, large nanodiamonds were formed by high-pressure growth. Eventually, that parent body was involved in a collision, which produced a shock wave. The shock wave then caused the graphitization of the nanodiamonds.
It’s an intriguing piece of evidence, and fits with what we know about the formation and evolution of our Solar System.
At 6:51 EDT on Wednesday, April 18th, a SpaceX Falcon 9 rocket blasted off from Florida’s Cape Canaveral. It was carrying NASA’s TESS: the Transiting Exoplanet Survey Satellite. From what we can tell, the mission went without a hitch, with the first stage returning to land on its floating barge in the Atlantic Ocean, and stage 2 carrying on to send TESS into its final orbit.
This is a changing of the guard, as we’re now entering the final days for NASA’s Kepler Space Telescope. It’s running out of fuel and is already hobbled by the loss of two of its reaction wheels. In just a few months NASA will shut it down for good.
That is sad, but don’t worry, with TESS on its way, the exoplanet science journey continues: searching for Earth-sized worlds in the Milky Way.
It’s hard to believe that we’ve only known about planets orbiting other stars for just over 20 years now. The first extrasolar planet found around a Sun-like star was the hot Jupiter 51 Pegasi b, discovered in 1995 by a team of Swiss astronomers.
They found this world using the radial velocity method, where the gravity of the planet pulls its star back and forth, changing the wavelength of the light we see ever so slightly. This technique has been refined and used to discover many more planets orbiting many more stars.
But another technique has been even more successful: the transit technique. This is where the light from the star is carefully measured over time, watching for any dip in brightness as a planet passes in front.
At the time that I’m writing this article in April, 2018, there are 3,708 confirmed planets with several thousand more candidates that need additional confirmation.
Planets are everywhere, in all shapes and sizes: from the familiar gas giants, rocky worlds and ice giants we have in the Solar System, to unusual hot Jupiters and super-Earths. Astronomers have even found comets in other star systems, and planets like Saturn but with ring systems that dwarf those of our neighbouring planet. The hunt is even on for exomoons: moons orbiting planets that orbit other stars.
NASA’s Kepler Space Telescope was the most productive planet hunting instrument ever built. Of those 3,708 planets discovered so far, Kepler turned up 2,342 worlds.
Artist’s concept of the Kepler mission with Earth in the background. Credit: NASA/JPL-Caltech
Kepler was launched back in March 2009, and began operations on May 12, 2009. It used its 1.4-meter primary mirror to observe a region of the sky about 12 degrees across. Just for comparison, the Moon takes up about half a degree, so Kepler’s field of view covered an area hundreds of times the apparent size of the Moon.
Kepler was placed into an Earth-trailing orbit around the Sun, with a period of 372.5 days. With a longer year, the telescope slowly drifts behind the Earth by about 25 million km per year.
As I mentioned earlier, Kepler was designed to use the transit technique, searching for planets passing in front of their stars in this very specific region of the sky. While previous exoplanet surveys had only found the more massive planets, Kepler was sensitive enough to see worlds half the size of Earth orbiting other stars.
The number of confirmed exoplanets, by year. Credit: NASA
And everything was going great until July 14, 2012, when one of the spacecraft’s four reaction wheels failed. These are spinning wheels that let the spacecraft change its orientation without expending propellant. No problem: Kepler was designed to need only three. Then a second wheel failed on May 11, 2013, bringing an end to its main mission.
What the Kepler engineers came up with is one of the most ingenious spacecraft rescues in the history of spaceflight. They realized that they could use light pressure from the Sun to perfectly stabilize the telescope and keep it pointed at a region of the sky.
How the K2 mission rescued Kepler. Image credit: NASA
This allowed Kepler to keep working, observing even larger portions of the sky, but its orbit around the Sun would only let it watch one region for a shorter period of time. Instead of scanning Sun-like stars, Kepler focused its attention on red dwarf stars, which can have Earth sized worlds orbiting them every few days.
This was known as the K2 era, and during this time it turned up an additional 307 confirmed, and 480 unconfirmed planets.
But Kepler is running out of time now. About a month ago, NASA announced that Kepler is almost out of fuel. That fuel matters because one crucial maneuver the spacecraft must make is to point itself back at Earth to upload all the data it has been gathering. NASA figures that’s just a few months away now, and when it happens, they’ll instruct the telescope to point at Earth one last time, transmit its final data, and then shut down forever.
And today TESS blasted off successfully, making its way to take over where Kepler leaves off.
The TESS mission has been around in some form since 2006 when it was originally conceived as a privately funded mission by Google, the Kavli Foundation and MIT.
Over the years, it was proposed to NASA, and in 2013, it was accepted as one of NASA’s Explorer Missions. These are missions with a budget of $200 million or less. WISE and WMAP are other examples of Explorer Missions.
But there are a bunch of differences between Kepler and TESS.
It’ll be capable of surveying the entire sky over the course of two years, which is an area 400 times larger than Kepler observed. And astronomers are expecting that the mission will turn up thousands of extrasolar planets, 500 of which will be Earth-sized or super-Earth-sized.
Illustration of the TESS field of view. Credit: NASA/MIT
By performing this wide survey of the sky with bright stars, TESS will be finding the close extrasolar planets. If a bright star has planets passing in front of it from our perspective, TESS will find it. It will create the definitive catalog of nearby planets.
Since these stars are much brighter in the sky, it’ll be easier for the world’s ground and space-based observatories to do follow-up observations. Astronomers will be able to measure the size, mass, density and even the atmospheres of extrasolar worlds. Just wait until James Webb gets its detectors on some of these worlds.
In addition to its primary job of finding planets, NASA has invited Guest Investigators to use the spacecraft for other science research, such as finding quasars, tracking stellar rotation, and observing the variations of dwarf stars. Anything that changes in brightness will be a great target for TESS.
One interesting feature of the TESS mission will be its orbit, which takes it on a path no other mission has ever used. It’s called a “P/2 lunar-resonant” orbit: an elliptical trajectory whose period is half the time the Moon takes to orbit the Earth, or 13.7 days.
Simulation of the TESS orbit. Credit: NASA/MIT
At its closest point to Earth, it’ll be 35,785 km above the surface and take three hours to transmit all its data to ground stations. Then it’ll fly out to the highest point, at an altitude of 373,300 km, out of the hazards of the Van Allen Belts.
By the time the TESS mission wraps up, we’re going to know a lot about the extrasolar planets in our nearby neighborhood. Well, a lot about the planets that perfectly line up with their stars from our perspective. And sadly, this is only a couple of percent of the star systems out there.
We’re going to need other techniques to find the rest, which I’m sure we’ll be covering in future articles.
When Elon Musk of SpaceX tweets something interesting, it generates a wave of excitement. So when he tweeted recently that SpaceX might be working on a way to retrieve upper stages of their rockets, it set off a chain of intrigued responses.
SpaceX will try to bring rocket upper stage back from orbital velocity using a giant party balloon
SpaceX has been retrieving and reusing their lower stages for some time now, and it’s lowered the cost of launching payloads into space. But this is the first hint that they may try to do the same with upper stages.
Twitter responders wanted to know exactly what SpaceX has in mind, and what a “giant party balloon” might be. Musk hasn’t elaborated yet, but one of his Twitter followers had something interesting to add.
If you're proposing what I think you are, an ultra low ballistic entry coefficient decelerator, then you and @SpaceX should come see what we have at the @UofMaryland . We've been working on this for awhile and just finished some testing pic.twitter.com/nJBvyUnzaK
Universe Today contacted the tweet’s author, Quinn Kupec of the University of Maryland, to see if he could help us understand what Musk may have been getting at. But first, a little background.
An “ultra low ballistic entry coefficient decelerator” is a bit of a mouthful. The ballistic coefficient measures how well a vehicle can overcome air resistance in flight. A high ballistic coefficient means a re-entry vehicle would not lose velocity quickly, and would reach Earth at high speeds. An ultra low ballistic entry coefficient decelerator would lose speed quickly, meaning that a vehicle would be travelling at low, subsonic speeds before reaching the ground.
To recover an upper stage booster, low speeds are desirable, since they generate less heat. But according to Kupec, there’s another problem that must be overcome.
“What happens when these things slow down to landing velocities? If your center of gravity is offset significantly behind your center of drag, as would be the case with a returning upper stage, it can get unstable. If the center of gravity of the re-entry vehicle is too high, it can become inverted, which is obviously not desirable.”
So the trick is to lower the speed of the re-entry vehicle to the point where the heat generated by reentry isn’t damaging the booster, and to do it without causing the vehicle to invert or otherwise become unstable. This isn’t a problem for the main stage boosters that SpaceX now routinely recovers; they have their own retro-rockets to guide their descent and landing. But for the upper stage boosters, which reach orbital velocities, it’s an obstacle that has to be overcome.
“My research is specifically focused on how high you can push the center of gravity and still maintain the proper flight configuration,” said Kupec.
But what about the “giant party balloon” that Musk tweeted about?
Musk could be referring, in colorful terms, to what’s called a ballute, a portmanteau of “balloon” and “parachute”. Ballutes were invented in the 1950s by Goodyear Aerospace. They can arrest the descent of re-entry vehicles and provide stability during the descent.
“…the balloon would have to be 120 ft. in diameter, and made of a high-temperature fabric…” – Professor Dave Akin, University of Maryland
Universe Today contacted Professor Dave Akin of the University of Maryland for some insight into Musk’s tweet. Professor Akin has been working on reentry systems for over two decades.
In an e-mail exchange, Professor Akin told us, “There have been concepts proposed for deploying a large balloon on a cable that is towed behind you on entry. The balloon lowers your ballistic coefficient, which means you decelerate higher in the atmosphere and the heat load is less.” So the key is to scrub your speed before you get closer to Earth, where the atmosphere is thicker and generates more heat.
But according to Professor Akin, this won’t necessarily be easy to do. “To get the two orders of magnitude reduction in ballistic coefficient that Elon has been talking about the balloon would have to be 120 ft. in diameter, and made of a high-temperature fabric, so it’s not going to be all that easy.”
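Akin’s two-orders-of-magnitude figure is straightforward to sanity-check. The ballistic coefficient is β = m / (C_d·A), so for a fixed mass it drops in proportion to frontal area. A rough Python sketch using assumed, illustrative numbers for an upper stage (none of these values come from SpaceX):

```python
import math

def ballistic_coefficient(mass_kg, drag_coeff, diameter_m):
    """beta = m / (Cd * A); a lower beta means deceleration happens higher up."""
    area = math.pi * (diameter_m / 2) ** 2
    return mass_kg / (drag_coeff * area)

# Assumed values: a ~4 t dry upper stage, same drag coefficient for both shapes.
MASS = 4000.0  # kg
CD = 1.3       # drag coefficient, assumed

bare = ballistic_coefficient(MASS, CD, 3.7)      # stage body, ~3.7 m across
balloon = ballistic_coefficient(MASS, CD, 36.6)  # towing a 120 ft (~36.6 m) ballute

print(f"Bare stage: {bare:.0f} kg/m^2")       # ≈ 286 kg/m^2
print(f"With ballute: {balloon:.1f} kg/m^2")  # ≈ 2.9 kg/m^2
print(f"Reduction: {bare / balloon:.0f}x")    # ≈ 98x
```

Because mass and drag coefficient cancel in the ratio, the reduction depends only on the area change, and a 120 ft balloon behind a ~3.7 m stage indeed lands close to a factor of 100: the two orders of magnitude Akin mentions.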
But Musk’s track record shows he doesn’t shy away from things that aren’t easy.
Retrieving upper rocket stages isn’t all about lowering launch costs, it’s also about space junk. The European Space Agency estimates that there are over 29,000 pieces of space junk orbiting Earth, and some of that junk is spent upper stage boosters. There have been some collisions and accidents already, with some satellites being pushed into different orbits. In 2009, the Iridium 33 communications satellite and the defunct Russian Cosmos 2251 communications satellite collided with each other, destroying both. If SpaceX can develop a way to retrieve its upper stage boosters, that means less space junk, and fewer potential collisions.
There’s a clear precedent for using balloons to manage reentry. With people like Professor Akin and Quinn Kupec working on it, SpaceX won’t have to reinvent the wheel. But they’ll still have a lot of work to do.
The first-ever detection of gravitational waves (which took place in September of 2015) triggered a revolution in astronomy. Not only did this event confirm a theory predicted by Einstein’s Theory of General Relativity a century before, it also ushered in a new era where the mergers of distant black holes, supernovae, and neutron stars could be studied by examining their resulting waves.
In addition, scientists have theorized that black hole mergers could actually be a lot more common than previously thought. According to a new study conducted by a pair of researchers from Monash University, these mergers happen once every few minutes. By listening to the background noise of the Universe, they claim, we could find evidence of thousands of previously undetected events.
Drs. Eric Thrane and Rory Smith. Credit: Monash University
As they state in their study, every 2 to 10 minutes, a pair of stellar-mass black holes merge somewhere in the Universe. A small fraction of these are large enough that the resulting gravitational wave event can be detected by advanced instruments like the Laser Interferometer Gravitational-Wave Observatory (LIGO) and the Virgo observatory. The rest, however, contribute to a sort of stochastic background noise.
By measuring this noise, scientists may be able to study much more in the way of events and learn a great deal more about gravitational waves. As Dr Thrane explained in a Monash University press statement:
“Measuring the gravitational-wave background will allow us to study populations of black holes at vast distances. Someday, the technique may enable us to see gravitational waves from the Big Bang, hidden behind gravitational waves from black holes and neutron stars.”
Drs Smith and Thrane are no amateurs when it comes to the study of gravitational waves. Last year, they were both involved in a major breakthrough, where researchers from LIGO Scientific Collaboration (LSC) and the Virgo Collaboration measured gravitational waves from a pair of merging neutron stars. This was the first time that a neutron star merger (aka. a kilonova) was observed in both gravitational waves and visible light.
The pair were also part of the Advanced LIGO team that made the first detection of gravitational waves in September 2015. To date, six gravitational wave events have been confirmed by the LIGO and Virgo Collaborations. But according to Drs Thrane and Smith, there could be as many as 100,000 events happening every year that are too faint for these detectors to pick out individually.
Artist’s impression of merging binary black holes. Credit: LIGO/A. Simonnet.
These faint signals combine to create the gravitational wave background; and while the individual events are too subtle to be detected, researchers have spent years trying to develop a method for detecting the collective noise. Relying on a combination of computer simulations of faint black hole signals and masses of data from known events, Drs. Thrane and Smith claim to have done just that.
From this, the pair were able to recover a signal from within the simulated data that they believe is evidence of faint black hole mergers. Looking ahead, Drs Thrane and Smith hope to apply their new method to real data, and are optimistic it will yield results. The researchers will also have access to the new OzSTAR supercomputer, which was installed last month at the Swinburne University of Technology to help scientists look for gravitational waves in LIGO data.
This computer is different from those used by the LIGO community, which includes the supercomputers at Caltech and MIT. Rather than relying on more traditional central processing units (CPUs), OzSTAR uses graphics processing units (GPUs), which can be hundreds of times faster for some applications. According to Professor Matthew Bailes, the Director of OzGrav:
“It is 125,000 times more powerful than the first supercomputer I built at the institution in 1998… By harnessing the power of GPUs, OzStar has the potential to make big discoveries in gravitational-wave astronomy.”
What has been especially impressive about the study of gravitational waves is how it has progressed so quickly. From the initial detection in 2015, scientists from Advanced LIGO and Virgo have now confirmed six different events and anticipate detecting many more. On top of that, astrophysicists are even coming up with ways to use gravitational waves to learn more about the astronomical phenomena that cause them.
All of this was made possible thanks to improvements in instrumentation and growing collaboration between observatories. And with more sophisticated methods designed to sift through archival data for additional signals and background noise, we stand to learn a great deal more about this mysterious cosmic force.
One of the defining characteristics of the modern era of space exploration is the way the public and private aerospace companies (colloquially referred to as the NewSpace industry) are taking part like never before. Thanks to cheaper launch services and the development of small satellites that can be built using off-the-shelf electronics (aka. CubeSats and microsats), universities and research institutions are also able to conduct research in space.
Looking to the future, there are those who want to take public involvement in space exploration to a whole new level. This includes the California-based aerospace company Space Fab that wants to make space accessible to everyone through the development of the Waypoint Space Telescope – the first space telescope that people will be able to access through their smartphones to take pictures of Earth and space.
The company was founded in 2016 by Randy Chung and Sean League with the vision of creating a future where anything could be manufactured in space. Chung began his career developing communications satellites and has a background in integrated circuit design, digital signal processing, CMOS imager design, and software development. He holds sixteen patents in the fields of computer peripherals, imagers, and digital communications.
SpaceFab.US - About Us - YouTube
League, meanwhile, is an astrophysicist who has spent the past few decades developing optics, building and designing remote telescopes, solid state lasers, and has lots of experience with startups, fundraising, computer-aided design (CAD) and machining. Between the two of them, they are ideally suited to creating a new generation of publicly-accessible telescopes. As League told Universe Today via email:
“We have studied over 200 papers on the design of small satellite structures, electronics, navigation, and attitude control. We are rethinking satellite design, not tied down by legacy approaches. That fresh approach leads us to use a corrected Dall-Kirkham telescope design, rather than the standard Ritchey-Chrétien design, an extending secondary mirror, rather than a fixed telescope structure, and a multi-purpose and multi-directional telescope, not a single purpose telescope just for Earth observation or just for astronomy.”
Together, League and Chung launched Space Fab in the hopes of spurring the development of the space industry, where asteroid mining and space manufacturing will provide cheap and abundant resources for all and allow for further exploration of our Solar System. The first step in this long-term plan is to build a profitable business by creating the first commercial, multipurpose space telescope.
“SpaceFab’s primary long term objective is to accelerate man’s access to space and to make the human race a multi-planet species,” said League. “This not only safeguards the human race, but all life that is brought along. We intend to make space resources readily available and dramatically less expensive than today, without environmental impact on Earth.”
SpaceFab: A Space Telescope for Everyone - YouTube
What makes the Waypoint Space Telescope especially unique is the way it combines off-the-shelf components with revolutionary instruments. The design is based on a standard 12U CubeSat satellite, which contains the Waypoint telescope. This telescope has extendable optics that consist of a 21 cm silicon carbide primary mirror, a deployable secondary mirror, a 48 Megapixel imager for visible and near-infrared wavelengths, an 8 Megapixel image intensified camera for ultraviolet and visible wavelengths and a 150 band hyper-spectral imager.
“Waypoint’s astronomical capabilities are impressive,” says League. “Without the distorting effects of Earth’s atmosphere, our 48 megapixel imager can take perfect high resolution images every time. We can reach the maximum theoretical resolution for our main mirror at .6 arc seconds per pixel on a single image, and higher resolution is possible through multiple exposures. Contrast will be fantastic, with the blackness of background space not being washed out by Earth’s atmosphere, clouds, moisture, city lights, or the day/night cycle. The Waypoint satellite also includes a complete set of astronomical and earth observations filters.”
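League's quoted figure of roughly 0.6 arc seconds per pixel is consistent with the diffraction limit of a 21 cm aperture. A rough sketch, assuming the Rayleigh criterion and a representative visible wavelength of 550 nm (our assumptions, not figures stated by Space Fab):

```python
import math

# Rayleigh criterion: theta = 1.22 * wavelength / aperture (in radians)
RAD_TO_ARCSEC = 180 / math.pi * 3600  # ~206,265 arcseconds per radian

def diffraction_limit_arcsec(wavelength_m: float, aperture_m: float) -> float:
    """Angular resolution of a circular aperture, in arcseconds."""
    return 1.22 * wavelength_m / aperture_m * RAD_TO_ARCSEC

# Waypoint's 21 cm primary mirror at an assumed visible wavelength of 550 nm
theta = diffraction_limit_arcsec(550e-9, 0.21)
print(f"{theta:.2f} arcseconds")
```

This comes out to roughly 0.66 arcseconds, in line with the quoted maximum theoretical resolution.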
The Waypoint Space Telescope will be ready to launch as a secondary payload by the end of 2019 on a rocket like the SpaceX Falcon 9. The company has also completed its first seed round of investment and is currently crowdfunding through a Kickstarter campaign.
Those who pledge their money will have the honor of getting a “space selfie”, where a favorite photo will be paired with a backdrop of Earth, pictured from orbit. In addition, Space Fab is building its own custom laser communications systems for the telescope optimized for low power, small size, and high speed.
Once deployed, this communication system will allow the telescope to download data back to Earth twice a day using optical ground stations. These images will then be available for upload via smartphone, tablet, computer or other devices. Chung and League’s efforts to create the first publicly-accessible space telescope are already drawing their share of admirers. One such person is Dustin Gibson, one of the owners of OPT Telescopes. As he told Universe Today via email:
“So far, the company is on the fast track to success with its first round of investing completed and over target, and the second round just getting started. It looks like this thing is going to fly in 2019! For an astrophotography lover like myself, I can’t think of anything more ground breaking than a consumer controlled space telescope.
“What Space Fab is doing is rewriting not just how we think about ways in which to do land surveys or deep space imaging, but actually redefining the way we are able to interact with satellites by giving the common user a level of control over the movements and functionality of the unit itself with something as simple as a cell phone.”
Looking ahead, Space Fab is also busy developing the technology that will allow them to mine asteroids and tap the abundant resources of the Solar System. The company recently filed a patent for their ion accelerator, which is designed to augment the thrust from existing cubesat-sized ion engines.
The company is also focused on creating advanced robotic arms that will be able to wrestle with space debris and repair themselves in the event of mechanical failure or damage. In the meantime, the Waypoint is the first of several space telescopes that Space Fab hopes to deploy in order to generate revenue for these ventures.
“Our space telescopes will be open to everyone, so that is the beginning,” said League. “The revenue these satellites will generate provides us with the funds and knowledge base to conduct metal asteroid mining and manufacturing on a large scale. This will allow the manufacture of large structures, spacecraft, tools or anything else that is needed in space. With these available resources, our hope is to accelerate the space economy and colonization.”
In this respect, Space Fab is in good company when it comes to the age of NewSpace. Alongside big-names like SpaceX, Blue Origin, Planetary Resources, and Deep Space Industries, they are part of a constellation of companies that are looking to make space accessible and usher in an age of post-scarcity. And with the help of the general public, they just might succeed!
Are you ready for a luxury hotel in space? We all knew it was coming, even though it seems impossibly futuristic. But this time it’s not just science fiction; somebody actually has a plan.
The space hotel will be called “Aurora Station” and the company behind it is Orion Span, a Silicon Valley and Houston-based firm. Orion Span aims to deliver the astronaut experience to people, by delivering the people into space. The catch?
“We developed Aurora Station to provide a turnkey destination in space. Upon launch, Aurora Station goes into service immediately, bringing travelers into space quicker and at a lower price point than ever seen before, while still providing an unforgettable experience” – Frank Bunger, CEO and founder of Orion Span.
First of all, a 12 day stay aboard Aurora Station for two people will cost $19 million US, or $9.5 million per person. Even so, you can’t just buy a ticket and hop on board. Guests must also sign up for three months of Orion Span Astronaut Certification (OSAC). Then they’ll be trained at a facility in Houston, Texas.
So once their cheque has cleared, and once they’re trained, what awaits guests on Aurora Station?
Aurora Station will orbit Earth at 320 km (200 miles) and will make the trip around Earth every 90 minutes. If you do the math, that’s 16 sunrises and 16 sunsets each day, and guests will enjoy this slideshow for 12 days, for a total of 192 sunrises and 192 sunsets during their stay. Beyond this compressed day/night cycle, guests will also be treated to stunning views of the Earth rolling by underneath them, thanks to the unprecedented number of windows Aurora Station will have.
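The arithmetic behind that sunrise count is simple orbital bookkeeping, sketched here:

```python
# A 90-minute orbit means 16 trips around Earth per 24-hour day,
# and each orbit brings one sunrise and one sunset.
HOURS_PER_DAY = 24
ORBIT_PERIOD_HOURS = 1.5  # 90 minutes

orbits_per_day = HOURS_PER_DAY / ORBIT_PERIOD_HOURS  # 16 orbits per day
stay_days = 12
sunrises = orbits_per_day * stay_days  # 192 sunrises (and 192 sunsets)
print(f"{orbits_per_day:.0f} orbits/day, {sunrises:.0f} sunrises over {stay_days} days")
```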
Aurora Station will have 5600 square feet of living space which can be configured as 2 or 4 suites. Image: Orion Span
Aurora Station is the brain-child of Orion Span’s CEO, Frank Bunger.
Guests won’t be alone on the station, of course. The space hotel will have room for 6 people in total, meaning 4 guests and 2 crew. (You didn’t think you’d be alone up there, did you?) Each pair of guests will still have some alone time though, in what Orion Span calls luxurious private suites for two.
There’s no doubt that staying on a space hotel for 12 days will be the experience of a lifetime, but still, 12 days is a long time. The space station itself will be 5600 square feet, with two suites that can be configured to four. Each suite will be about the size of a small bedroom. Once you’ve gotten used to seeing Earth below you, and you’re used to your suite, what will you do?
Well, there’ll be Wi-Fi of course. So if you’re the type of person who gets bored of orbiting the only planet that we know of that hosts life, and the only planet on which every human civilization has lived and died, you can always surf the web or watch videos. Aurora Station will also have a virtual-reality holodeck, the cherry on top for this science-fiction-come-to-life space resort.
But apparently, boredom won’t be a problem. In an interview with the Globe and Mail, Orion Span CEO Frank Bunger said, “We talked to previous space tourists; they said 10 days aboard the space station was not enough.” Maybe the extra 2 days in space that Aurora Station guests will enjoy will be just the right amount.
As far as getting guests to the station, that will be up to other private space companies like SpaceX. SpaceX has plans to send tourists on trips around the Moon, and they have experience docking with the International Space Station, so they should be able to transport guests to and from a space hotel.
Aurora Station will also host micro-gravity research and in-situ manufacturing. Image: Orion Span
It doesn’t seem like there’s any shortage of customers. Aurora Station was introduced on April 5th 2018, and the first four months of reservations sold out within 72 hours, with each guest paying a deposit of $80,000 US.
There’s another side to Aurora Station, though. Other than just a nice get-away for people who can afford it, there’s a research aspect to it. Orion Span will offer Aurora Station as a platform for micro-gravity research on a pay-as-you-go basis. It will also lease capacity for in-situ manufacturing and 3D printing research.
But Aurora Station would hardly be in the news if it was only a research endeavour. What’s got people excited is the ability to visit space. And maybe to own some real estate there.
Orion Span is designing Aurora Station to be expandable. They can attach more stations to the original without disrupting anything. And this leads us to Orion Span’s next goal: space condos.
As it says on Orion Span’s website, “Like a city rising from the ground, this unique architecture enables us to build up Aurora Station in orbit dynamically – on the fly – and with no impact to the remainder of Aurora Station. As we add capacity, we will design in condos available for purchase.”
I think we all knew this would happen eventually. If you have the money, you can visit space, and even own a condo there.
And if you’re interested in looking back, here’s an archive to all the past Carnivals of Space. If you’ve got a space-related blog, you should really join the carnival. Just email an entry to email@example.com, and the next host will link to it. It will help get awareness out there about your writing, help you meet others in the space community – and community is what blogging is all about. And if you really want to help out, sign up to be a host. Send an email to the above address.
As a species, we humans tend to take it for granted that we are the only ones that live in sedentary communities, use tools, and alter our landscape to meet our needs. It is also a foregone conclusion that in the history of planet Earth, humans are the only species to develop machinery, automation, electricity, and mass communications – the hallmarks of industrial civilization.
But what if another industrial civilization existed on Earth millions of years ago? Would we be able to find evidence of it within the geological record today? By examining the impact human industrial civilization has had on Earth, a pair of researchers – Gavin Schmidt and Adam Frank – conducted a study that considers how such a civilization could be found and what implications this could have for the search for extra-terrestrial life.
Carbon dioxide in Earth’s atmosphere if half of global-warming emissions are not absorbed. Credit: NASA/JPL/GSFC
As they indicate in their study, the search for life on other planets has often involved looking to Earth-analogues to see what kind of conditions life could exist under. However, this pursuit also entails the search for extra-terrestrial intelligence (SETI) that would be capable of communicating with us. Naturally, it is assumed that any such civilization would need to develop an industrial base first.
This, in turn, raises the question of how often an industrial civilization might develop – what Schmidt and Frank refer to as the “Silurian Hypothesis”. Naturally, this raises some complications since humanity is the only example of an industrialized species that we know of. In addition, humanity has only been an industrial civilization for the past few centuries – a mere fraction of its existence as a species and a tiny fraction of the time that complex life has existed on Earth.
For the sake of their study, the team first noted the importance of this question to the Drake Equation. To recap, this equation states that the number of civilizations (N) in our galaxy with which we might be able to communicate is equal to the average rate of star formation (R*), multiplied by the fraction of those stars that have planets (fp), the average number of planets per star that can support life (ne), the fraction of those planets that will develop life (fl), the fraction that will develop intelligent life (fi), the fraction of civilizations that will develop transmission technologies (fc), and the length of time these civilizations will have to transmit signals into space (L).
This can be expressed mathematically as: N = R* × fp × ne × fl × fi × fc × L
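Because the equation is a simple product of terms, even small changes in one factor shift the result dramatically. A minimal sketch, using purely illustrative values (these are assumptions for demonstration, not measurements):

```python
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """Drake Equation: expected number of communicating civilizations in the galaxy."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Illustrative inputs: 1.5 stars formed per year, all with planets, 0.2 habitable
# planets per star, half developing life, 10% of those intelligence, half of those
# radio technology, each transmitting for 1,000 years.
N = drake(R_star=1.5, f_p=1.0, n_e=0.2, f_l=0.5, f_i=0.1, f_c=0.5, L=1000)
print(f"N = {N:.1f} civilizations")
```

With these inputs N comes out to 7.5; doubling fc (as the Silurian Hypothesis suggests might be warranted) would double N as well.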
The Drake Equation, a mathematical formula for the probability of finding life or advanced civilizations in the universe. Credit: University of Rochester
As they indicate in their study, the parameters of this equation may change thanks to the addition of the Silurian Hypothesis, as well as recent exoplanets surveys:
“If over the course of a planet’s existence, multiple industrial civilizations can arise over the span of time that life exists at all, the value of fc may in fact be greater than one. This is a particularly cogent issue in light of recent developments in astrobiology in which the first three terms, which all involve purely astronomical observations, have now been fully determined. It is now apparent that most stars harbor families of planets. Indeed, many of those planets will be in the star’s habitable zones.”
In short, thanks to improvements in instrumentation and methodology, scientists have been able to determine the rate at which stars form in our galaxy. Furthermore, recent surveys for extra-solar planets have led some astronomers to estimate that our galaxy could contain as many as 100 billion potentially-habitable planets. If evidence could be found of another civilization in Earth’s history, it would further constrain the Drake Equation.
They then address the likely geologic consequences of human industrial civilization and then compare that fingerprint to potentially similar events in the geologic record. These include the release of isotope anomalies of carbon, oxygen, hydrogen and nitrogen, which are a result of greenhouse gas emissions and nitrogen fertilizers. As they indicate in their study:
“Since the mid-18th Century, humans have released over 0.5 trillion tons of fossil carbon via the burning of coal, oil and natural gas, at a rate orders of magnitude faster than natural long-term sources or sinks. In addition, there has been widespread deforestation and addition of carbon dioxide into the air via biomass burning.”
Based on fossil records, 250 million years ago over 90% of all species on Earth died out, effectively resetting evolution. Credit: Lunar and Planetary Institute
They also consider increased rates of sediment flow in rivers and its deposition in coastal environments, as a result of agricultural processes, deforestation, and the digging of canals. The spread of domesticated animals, rodents and other small animals are also considered – as are the extinction of certain species of animals – as a direct result of industrialization and the growth of cities.
The presence of synthetic materials, plastics, and radioactive elements (caused by nuclear power or nuclear testing) will also leave a mark on the geological record – in the case of radioactive isotopes, sometimes for millions of years. Finally, they compare past extinction level events to determine how they would compare to a hypothetical event where human civilization collapsed. As they state:
“The clearest class of event with such similarities are the hyperthermals, most notably the Paleocene-Eocene Thermal Maximum (56 Ma), but this also includes smaller hyperthermal events, ocean anoxic events in the Cretaceous and Jurassic, and significant (if less well characterized) events of the Paleozoic.”
These events were specifically considered because they coincided with rises in temperatures, increases in carbon and oxygen isotopes, increased sediment, and depletions of oceanic oxygen. Events that had a very clear and distinct cause, such as the Cretaceous-Paleogene extinction event (caused by an asteroid impact and massive volcanism) or the Eocene-Oligocene boundary (the onset of Antarctic glaciation) were not considered.
Artistic rendition of the Chicxulub impactor striking ancient Earth, with Pterosaur observing. Credit: NASA
According to the team, the events they did consider (known as “hyperthermals”) show similarities to the Anthropocene fingerprint that they identified. In particular, according to research cited by the authors, the Paleocene-Eocene Thermal Maximum (PETM) shows signs that could be consistent with anthropogenic climate change. These include:
“[A] fascinating sequence of events lasting 100–200 kyr and involving a rapid input (in perhaps less than 5 kyr) of exogenous carbon into the system, possibly related to the intrusion of the North American Igneous Province into organic sediments. Temperatures rose 5–7 °C (derived from multiple proxies), and there was a negative spike in carbon isotopes (>3‰), and decreased ocean carbonate preservation in the upper ocean.”
Finally, the team addressed some possible research directions that might improve the constraints on this question. This, they claim, could consist of a “deeper exploration of elemental and compositional anomalies in extant sediments spanning previous events”. In other words, the geological record for these extinction events should be examined more closely for anomalies that could be associated with industrial civilization.
If any anomalies are found, they further recommend that the fossil record could be examined for candidate species, which would raise questions about their ultimate fate. Of course, they also acknowledge that more evidence is necessary before the Silurian Hypothesis can be considered viable. For instance, many past events where abrupt climate change took place have been linked to changes in volcanic/tectonic activity.
Scientists were able to gauge the rate of water loss on Mars by measuring the ratio of water and HDO from today and 4.3 billion years ago. Credit: Kevin Gill
Second, there is the fact that current changes in our climate are happening faster than in any other geological period. However, this is difficult to say for certain since there are limits when it comes to the chronology of the geological record. In the end, more research will be necessary to determine how long previous extinction events (those that were not due to impacts) took as well.
Beyond Earth, this study may also have implications for the study of past life on planets like Mars and Venus. Here too, the authors suggest how explorations of both could reveal the existence of past civilizations, and maybe even bolster the possibility of finding evidence of past civilizations on Earth.
“We note here that abundant evidence exists of surface water in ancient Martian climates (3.8 Ga), and speculation that early Venus (2 Ga to 0.7 Ga) was habitable (due to a dimmer sun and lower CO2 atmosphere) has been supported by recent modeling studies,” they state. “Conceivably, deep drilling operations could be carried out on either planet in future to assess their geological history. This would constrain consideration of what the fingerprint might be of life, and even organized civilization.”
Two key aspects of the Drake Equation, which addresses the probability of finding life elsewhere in the galaxy, are the sheer number of stars and planets out there and the amount of time life has had to evolve. Until now, it has been assumed that one planet would give rise to one intelligent species capable of advanced technology and communications.
But if this number should prove to be more, we may find a galaxy filled with civilizations, both past and present. And who knows? The remains of a once advanced and great non-human civilization may very well be right beneath us!