Sunday 21 October 2007

Integral's five-year journey



Integral was launched on 17 October 2002. Since then, the satellite has helped scientists make great strides in understanding the gamma-ray universe - from the atoms that make up all matter and the densest objects in the universe to giant black holes and mysterious gamma-ray bursts.

Surveying the entire galaxy for the radioactive isotope aluminium-26 with Integral, scientists have been able to calculate that a supernova goes off in our galaxy roughly once every 50 years.

According to Integral, something is creating a lot of gamma rays at the centre of our galaxy - the suspects are positrons, the antimatter counterparts of electrons. Scientists have been baffled as to how vast numbers of such particles can be generated every second, and how their sources would have to be distributed over the sky to match the gamma-ray map.

Within months of operation, Integral solved a thirty-year-old mystery by showing that the broadband gamma-ray emission observed towards the centre of the galaxy was produced by a hundred individual sources. A catalogue of close to 500 gamma-ray sources from all over the sky, most of them new, was then compiled.

Scientists now know that a rare class of anomalous X-ray pulsars, or magnetars, generates magnetic fields a thousand million times stronger than the strongest steady magnetic field achievable in a laboratory on Earth. These sources show unexpectedly strong emission in the Integral energy range.

Integral revealed that a sub-class of X-ray binary stars, called super-giant fast X-ray transients, previously thought to be extremely rare, is actually common in our galaxy. The satellite has also discovered a completely new class of high-mass X-ray binaries, called highly absorbed X-ray binaries.

Integral has seen about 100 of the brightest supermassive black holes, the main producers of gamma radiation in our universe, in other galaxies. Yet when it looked for them in nearby galaxies, surprisingly few were found.

They are either too well hidden or are only present in the younger galaxies that populate the more distant universe.

Galaxies throughout the universe are believed to be responsible for creating the diffuse background glow of gamma rays, observed over the entire sky. Integral used the Earth as a giant shield to disentangle this faint glow. Making the measurements possible was a technological and operational feat.

On 27 December 2004 Integral was hit by the strongest flux of gamma rays ever measured by any spacecraft and it even measured radiation that bounced off the Moon. The culprit was a magnetar, SGR 1806-20, located 50 000 light years away on the other side of our Milky Way. Thanks to this outburst, astronomers now think that some gamma-ray bursts might come from similar magnetars in other galaxies.

Integral has also been able to take images of gamma-ray bursts even when the telescope was not pointed in their direction. This was done using radiation that passed through the side of Integral's imaging telescope and struck the detector.

Integral has indeed played a major role in modern gamma-ray astronomy. So much has happened in the span of five years, but much more is still to come.

Source: Christoph Winkler, ESA Integral Project Scientist
_________________________________________________________
_________________________________________________________
The Genesis Probe - Clues to the Origins of the Solar System @ The Daily Galaxy
_________________________________________________________
_________________________________________________________

Tuesday 16 October 2007

Future Space Ships



The risks from radiation in space, and the need to keep the crew safe on long flights, may influence the shape of future spaceships.

The major radiation sources are galactic cosmic rays (charged particles ranging from electrons up to heavy-metal nuclei) and solar particle events, which throw out protons and helium nuclei.

Exposure to severe space radiation on long-duration deep-space missions is 'the show stopper'. Protecting crews from it is of paramount importance to NASA's new vision of reaching the Moon, Mars and beyond.

Electrons, protons and heavy-metal ions such as iron and uranium whiz through the void and can all cause cancers. But aluminium shielding capable of staving off this radiation on extended journeys would be prohibitively heavy, burning too much fuel.

The ideal form, according to Ram Tripathi, a spaceflight engineer at NASA, is a grapefruit spiked with cherries on sticks. Positively and negatively charged metal spheres would be arranged on struts jutting out of the crew capsule, in carefully controlled directions, to give the crew a high degree of electrostatic radiation cover.

Tripathi calculates the "cherries" would need to be between 10 and 20 metres in diameter and would be stationed about 50 metres from the crew capsule – the "grapefruit". These spheres would protect the crew by deflecting charged particles away from the central habitat. Spheres give you more volume and less mass, and evenly distribute the deflecting charges over their surface.
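
As a rough scoping sketch (using illustrative numbers, not figures from Tripathi's actual design), the potential a positively charged sphere would need in order to turn back a proton of a given energy, and the charge that implies for a sphere of a given size, follow from basic electrostatics:

    # Illustrative electrostatics sketch; the 100 MeV proton energy and 10 m sphere
    # radius are assumptions, not design values from the NASA concept.
    K_COULOMB = 8.988e9                      # Coulomb constant, N m^2 / C^2

    def stopping_potential_volts(proton_energy_MeV):
        # A proton is turned back head-on only if the potential barrier (in volts)
        # matches its kinetic energy (in eV).
        return proton_energy_MeV * 1e6

    def sphere_charge_coulombs(potential_volts, radius_m):
        # Charge needed to hold an isolated conducting sphere at the given potential.
        return potential_volts * radius_m / K_COULOMB

    V = stopping_potential_volts(100)        # 1e8 volts for a 100 MeV solar proton
    Q = sphere_charge_coulombs(V, 10.0)      # ~0.11 coulombs on a 10 m radius sphere
    print(V, Q)

Galactic cosmic rays at GeV energies push the required potentials roughly another order of magnitude higher, which is why the geometry and the mix of positive and negative spheres matter so much in this concept.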

The spheres would be made of lightweight hollow aluminium, while the material shielding the crew capsule would incorporate carbon nanotubes in a novel composite with aluminium. The nanotubes are light, and they can take a pounding from heavy incoming ions.

Or we could have spaceships with a more conventional shape, like a submarine, the Starship Enterprise, the Space Shuttle or NERVA, with a false skin filled with smaller spheres (or even tubes) having the same desired effect: deflecting radiation and adding volume without overwhelmingly increasing the mass.

Laser power stations, drawing energy from the local environment, might one day propel spacecraft throughout the solar system. NASA studies of advanced planetary missions have ranged from small robotic probes to multiple-spacecraft human exploration missions.

The completed International Space Station will have a mass of about 1,040,000 pounds. It will measure 356 feet across and 290 feet in length, with almost an acre of solar panels to provide electrical power to six laboratories.

The assembled space station will provide the first lab complex where gravity, a fundamental force on Earth, can be controlled for extended periods. This control of gravity opens up an unimaginable world where almost everything grows differently than on Earth. For example, purer protein crystals can be grown in space than on Earth. By analyzing crystals grown on the ISS, scientists may be able to develop medicines that target particular disease-causing proteins.

Such crystals for research into cancer, diabetes, emphysema and immune disorders grown on the space station have already shown promise. New drugs to fight influenza and post-surgery inflammation are already in clinical trials, and future research will benefit from the extended exposure to weightlessness available on the station.

Many of the changes in the human body that result from space flight mimic those seen on Earth as a result of aging. Understanding the causes of these changes may lead to the development of countermeasures against bone loss, muscle atrophy, balance disorders and other symptoms common in an aging population.

The Johnson Space Center, together with scientists and researchers at NASA's other field centers, is working on the technologies that will be required for further exploration of the universe in the years ahead. For example, a new rocket team at Marshall is developing revolutionary technologies intended to make space transportation as safe, reliable and affordable as today's airline transportation.

Hospitals, business parks and solar electric power stations that beam clean, inexpensive energy back to Earth are likely to dot the "space-scape" 40 years from now. Space adventure tourism and travel, orbiting movie studios, and worldwide, two-hour express package delivery also appear just over the horizon.

By 2040, it's expected to cost only tens of dollars per pound to launch humans or cargo to space; today, it costs as much as $10,000 per pound. Bridging that gap requires intense research and technology development focused on accelerating breakthroughs that will serve as keys to open the space frontier for business and pleasure.

Space transportation technology breakthroughs will launch a new age of space exploration, just as the silicon chip revolutionized the computer industry and made desktop computers commonplace.
_________________________________________________________
_________________________________________________________
The New Space Race by Brian Appleyard @ The Sunday Times
The Johnson Space Centre Celebrates 40 Years of Human Space Flight
_________________________________________________________
_________________________________________________________

Tuesday 9 October 2007

Nuclear Space Travel


Image Credit: Project Orion

Compared with the best chemical rockets, nuclear propulsion systems (NPS's) are more reliable and flexible for long-distance missions, and can achieve a desired space mission at a lower cost. The reason for these advantages, in a nutshell, is that NPS's can get "more miles per gallon" than chemical rockets.

For any space mission, basic questions must be answered:

1 - What is the destination?
2 - What is the trip time?
3 - Do we want to return?
4 - What is the mass of the payload we want to send there and bring back?

In chemical rocket engines such as the Space Shuttle Main Engine (SSME), the chemical reaction between hydrogen and oxygen releases heat which raises the combustion gases (steam and excess hydrogen gas) to high temperatures (3000-4000 K). These hot gases are then accelerated through a thermodynamic nozzle, which converts thermal energy into kinetic energy and hence provides thrust. The propellant and the heat source are one and the same.

Because chemical reactions release only a limited amount of energy, and because the thermodynamic nozzle accelerates combustion gases that do not have the minimum possible molecular weight, there is a limit on the exhaust velocity that can be achieved.

The maximum specific impulse (Isp) that can be achieved with chemical engines is in the range of 400 to 500 s. So, for example, if we have an Isp of 450 s and a mission delta-V of 10 km/s (typical for launching into low Earth orbit (LEO)), then the mass ratio will be 9.63. The problem here is that most of the vehicle mass is propellant, and due to limitations in the strength of materials, it may be impossible to build such a vehicle just to ascend into orbit.

Early rocket scientists got around this problem by building a rocket in stages, throwing away the structural mass of the lower stages once the propellant was consumed. This effectively allowed higher mass ratios to be achieved, and hence a space mission could be achieved with low-Isp engines. This is what all rockets do today, even the Space Shuttle. In spite of the relatively low Isp, chemical engines do have a relatively high thrust-to-weight ratio (T/W).
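
To see where the 9.63 figure and the case for staging come from, here is a minimal sketch of the Tsiolkovsky rocket equation, assuming standard gravity g0 = 9.81 m/s^2:

    import math

    G0 = 9.81                              # standard gravity, m/s^2

    def mass_ratio(delta_v_m_s, isp_s):
        # Tsiolkovsky rocket equation: initial-to-final mass ratio needed for a
        # given delta-V and specific impulse.
        return math.exp(delta_v_m_s / (isp_s * G0))

    # One stage supplying the full 10 km/s versus two stages supplying 5 km/s each:
    print(mass_ratio(10_000, 450))         # ~9.6 -> roughly 90% of lift-off mass is propellant
    print(mass_ratio(5_000, 450))          # ~3.1 -> each stage is only ~68% propellant

With a single stage, barely 10% of the lift-off mass is left for tanks, engines, structure and payload combined; splitting the delta-V across two stages, and dropping the spent lower stage, leaves each stage a far more buildable ~32%.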

A high T/W (50-75) is necessary for a rocket vehicle to overcome the force of gravity on Earth and accelerate into space. The thrust of the rocket engines must compensate for the weight of the rocket engines, the propellant, the structural mass and the payload. Although it is not always necessary, a high T/W engine will allow orbital and interplanetary space vehicles to accelerate quickly and reach their destinations in shorter time periods.

Nuclear propulsion systems have the ability to overcome the Isp limitations of chemical rockets because the source of energy and the propellant are independent of each other. The energy comes from a critical nuclear reactor in which neutrons split fissile isotopes, such as 92-U-235 (Uranium) or 94-Pu-239 (Plutonium), and release energetic fission products, gamma rays, and enough extra neutrons to keep the reactor operating.

The energy density of nuclear fuel is enormous. The heat energy released from the reactor can then be used to heat up a low-molecular-weight propellant (such as hydrogen), which is then accelerated through a thermodynamic nozzle in the same way that chemical rockets do. This is how nuclear thermal rockets (NTR's) work.

Solid-core NTR's have a solid reactor core with cooling channels through which the propellant is heated to high temperatures (2500-3000 K). Although solid-core NTR's don't operate at temperatures as high as some chemical engines (due to material limitations), they can use pure hydrogen propellant, which allows higher Isp's to be achieved (up to 1000 s).
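
As a rough illustration of why pure hydrogen buys so much specific impulse: for an ideal, fully expanded nozzle the exhaust velocity scales roughly as the square root of (temperature / molecular weight), so a cooler hydrogen NTR can still out-perform a hotter chemical engine whose exhaust is mostly heavy steam. The temperatures and gas properties below are illustrative assumptions, not design data:

    import math

    R  = 8.314                 # universal gas constant, J/(mol K)
    G0 = 9.81                  # standard gravity, m/s^2

    def ideal_exhaust_velocity(T_kelvin, molar_mass_kg_per_mol, gamma):
        # Ideal fully expanded exhaust velocity; ignores nozzle and frozen-flow losses.
        return math.sqrt(2.0 * gamma / (gamma - 1.0) * (R / molar_mass_kg_per_mol) * T_kelvin)

    v_ntr  = ideal_exhaust_velocity(2800, 2.016e-3, 1.40)   # pure H2 at ~2800 K
    v_chem = ideal_exhaust_velocity(3500, 18.0e-3, 1.20)    # steam at ~3500 K
    print(v_ntr / G0)          # Isp ~ 920 s for the solid-core NTR
    print(v_chem / G0)         # Isp ~ 450 s for the chemical engine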

In gas-core NTR's, the nuclear fuel is in gaseous form and is inter-mixed with the hydrogen propellant. Gas core nuclear rockets (GCNR) can operate at much higher temperatures (5000 - 20000 K), and thus achieve much higher Isp's (up to 6000 s).

Of course, there is a problem in that some radioactive fission products will end up in the exhaust, but other concepts such as the nuclear light bulb (NLB) can contain the uranium plasma within a fused silica vessel that easily transfers heat to a surrounding blanket of propellant. At such high temperatures, whether an open-cycle GCNR, or a closed-cycle NLB, the propellants will dissociate and become partially ionized.

In this situation, a standard thermodynamic nozzle must be replaced by a magnetic nozzle which uses magnetic fields to insulate the solid wall from the partially-ionized gaseous exhaust.

NTR's give a significant performance improvement over chemical engines, and are desirable for interplanetary missions. It may also be possible that solid-core NTR's could be used in a future launch vehicle to supplement or replace chemical engines altogether. Advances in metallurgy and materials science would be required to improve the durability and T/W ratio of NTR's for launch vehicle applications.

An alternative approach to NTR's is to use the heat from a nuclear reactor to generate electrical power through a converter, and then use the electrical power to operate various types of electrical thrusters (ion, Hall-type, or magneto-plasma-dynamic (MPD)) that operate on a wide variety of propellants (hydrogen, hydrazine, ammonia, argon, xenon, fullerenes). This is how nuclear-electric propulsion (NEP) systems work.

To convert the reactor heat into electricity, thermoelectric or thermionic devices could be used, but these have low efficiencies and low power-to-weight ratios. The alternative is to use a thermodynamic cycle with either a liquid-metal (sodium, potassium) or a gaseous (helium) working fluid. These thermodynamic cycles can achieve higher efficiencies and power-to-weight ratios.

No matter what type of power converter is used, a heat rejection system is needed, meaning that simple radiators, heat pipes, or liquid-droplet radiators would be required to get rid of the waste heat. Unlike ground-based reactors, space reactors cannot dump the waste heat into a lake or into the air with cooling towers.
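
A back-of-the-envelope sketch of why heat rejection drives NEP mass: a radiator in space can shed heat only by thermal radiation, so the required area follows from the Stefan-Boltzmann law. The power level, temperatures and emissivity below are illustrative assumptions:

    STEFAN_BOLTZMANN = 5.67e-8            # W / (m^2 K^4)

    def radiator_area_m2(waste_heat_watts, temperature_K, emissivity=0.85):
        # One-sided radiating area needed to reject the given waste heat,
        # ignoring view-factor losses and any absorbed sunlight.
        return waste_heat_watts / (emissivity * STEFAN_BOLTZMANN * temperature_K ** 4)

    # Example: 1 MW of electrical output at ~25% conversion efficiency means
    # roughly 3 MW of waste heat to reject.
    print(radiator_area_m2(3e6, 900))     # ~95 m^2 with hot 900 K radiators
    print(radiator_area_m2(3e6, 400))     # ~2400 m^2 at 400 K: cool radiators get enormous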

The electricity from the space nuclear reactor can be used to operate a variety of thrusters. Ion thrusters use electric fields to accelerate ions to high velocities. In principle, the only limit on the Isp that can be achieved with ion thrusters is the operating voltage and the power supply. Hall thrusters use a combination of magnetic fields to ionize the propellant gas and create a net axial electric field which accelerates ions in the thrust direction. MPD thrusters use either steady-state or pulsed electromagnetic fields to accelerate plasma (a mixture of ions and electrons) in the thrust direction. To get a high thrust density, ion thrusters typically use xenon, while Hall thrusters and MPD thrusters can operate quite well with argon or hydrogen.
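
A minimal sketch of the voltage-to-Isp relationship for a gridded ion thruster: an ion falling through an accelerating potential V gains kinetic energy qV, so the ideal exhaust velocity is sqrt(2qV/m). The 1200 V grid voltage below is an illustrative assumption, and real thrusters lose some performance to beam divergence and incomplete propellant ionisation:

    import math

    E_CHARGE = 1.602e-19      # elementary charge, C
    AMU      = 1.661e-27      # atomic mass unit, kg
    G0       = 9.81           # standard gravity, m/s^2

    def ion_exhaust_velocity(grid_voltage, ion_mass_amu, charge_state=1):
        # Ideal exhaust velocity of an ion accelerated through the given potential.
        return math.sqrt(2.0 * charge_state * E_CHARGE * grid_voltage / (ion_mass_amu * AMU))

    v = ion_exhaust_velocity(1200, 131.3)     # singly charged xenon, 131.3 amu
    print(v / 1000)                           # ~42 km/s
    print(v / G0)                             # Isp ~ 4300 s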

Compared with NTR's, NEP systems can achieve much higher Isp's. Their main problem is that they have a low power to weight ratio, a low thrust density, and hence a very low T/W ratio. This is due to the mass of the reactor, the heat rejection system, and the low-pressure operating regime of electrical thrusters.

This makes NEP systems unfeasible for launch vehicle applications and mission scenarios where high accelerations are required; however, they can operate successfully in low-gravity environments such as LEO and interplanetary space.

In contrast to a chemical rocket or an NTR which may operate only for several minutes to less than an hour at a time, an NEP system might operate continuously for days, weeks, perhaps even months, as the space vehicle slowly accelerates to meet its mission delta-V. An NEP system is well suited for unmanned cargo missions between the Earth, Moon and other planets.
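
To get a feel for those burn times: at the very low accelerations typical of NEP, the time to build up a given delta-V is simply the delta-V divided by the acceleration. The vehicle mass and thrust below are illustrative assumptions, not mission figures, and the vehicle mass is treated as constant:

    # Illustrative only: a 100-tonne cargo vehicle with 50 N of total electric thrust
    vehicle_mass_kg = 100_000
    thrust_newtons  = 50.0
    delta_v_m_s     = 10_000

    accel = thrust_newtons / vehicle_mass_kg      # 5e-4 m/s^2, about 50 millionths of a g
    burn_time_s = delta_v_m_s / accel             # 2e7 seconds
    print(burn_time_s / 86400)                    # roughly 230 days of continuous thrusting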

For manned missions to the outer planets, there would be a close competition between gas-core NTR's and high-thrust NEP systems.

The performance gain of nuclear propulsion systems over chemical propulsion systems is overwhelming. Nuclear systems can achieve space missions at a significantly lower cost due to the reduction in propellant requirements.

When humanity gains the will to explore and develop space more ambitiously, nuclear propulsion will be an attractive choice.

Source: Nuclear Propulsion from Astro Digital. - Quasar9
_______________________________________________________
_______________________________________________________
Innovative Nuclear Space Power and Propulsion Institute University of Florida
_______________________________________________________
_______________________________________________________

Tuesday 25 September 2007

Fluid theory confirmed


The Foton M-3. Credit ESA

The Foton M-3 capsule carries a 400 kg European payload, with experiments in a range of scientific disciplines including fluid physics, biology, crystal growth, radiation exposure and exobiology.

The capsule spends 12 days orbiting the Earth, exposing the experiments to microgravity and, in the case of a handful of experiments, also exposing them to the harsh environment of open space, before re-entering the atmosphere and landing in the border zone between Russia and Kazakhstan.

All liquids experience minute fluctuations in temperature or concentration as a result of the different velocities of individual molecules. These fluctuations are usually so small that they are extremely difficult to observe.

In the 1990s, scientists discovered that these tiny fluctuations in fluids and gases can increase in size, and even be made visible to the naked eye, if a strong gradient is introduced. One way to achieve this is to increase the temperature at the bottom of a thin liquid layer, though not quite enough to cause convection. Alternatively, by heating the fluid from above, convection is suppressed, making it possible to achieve more accurate measurements.
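
The "not quite enough to cause convection" condition can be made precise with the Rayleigh number: for a thin layer heated from below, convection sets in only above a critical value of roughly 1700. A small sketch using typical room-temperature water properties (illustrative values, not the actual experiment fluid):

    def rayleigh_number(g, alpha, delta_T, depth, nu, kappa):
        # g: gravity (m/s^2), alpha: thermal expansion (1/K), delta_T: temperature
        # difference (K), depth: layer thickness (m), nu: kinematic viscosity (m^2/s),
        # kappa: thermal diffusivity (m^2/s).
        return g * alpha * delta_T * depth ** 3 / (nu * kappa)

    # A 1 mm water layer with a 1 K difference, on Earth:
    print(rayleigh_number(9.81, 2.1e-4, 1.0, 1e-3, 1.0e-6, 1.4e-7))   # ~15, well below ~1700

In orbit the effective gravity is several orders of magnitude smaller, so the Rayleigh number collapses towards zero and buoyancy-driven convection disappears entirely.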

It was suggested that the fluctuations would become much more noticeable in a weightless environment. Now, thanks to the Foton mission, the opportunity to test this prediction has come about, and the results completely support the earlier forecast.

To the delight of the science team, the images visually support the theoretical predictions by showing a very large increase in the size of the fluctuations. Data analysis has also shown that the amplitude of the fluctuations in temperature and concentration greatly increased.

It may be that the results will influence other types of microgravity research, such as the growth of crystals. This research may even lead to some new technological spin-offs.

Read more: Fluid Theory confirmed by Foton
_________________________________________________________
_________________________________________________________

Thursday 30 August 2007

Hayabusa limps home


A third ion engine is now running on Japan's problem-plagued Hayabusa spacecraft. Having another working engine increases the chances that the spacecraft will be able to limp back to Earth.

If the craft does return as planned in 2010, researchers will finally find out whether it collected the first-ever samples from an asteroid during its two landings on the tiny space rock Itokawa.

In late 2005, the spacecraft lost all the fuel for its chemical thrusters because of a leak, so mission managers have been trying to get Hayabusa home using its ion engines instead. These engines ionise xenon gas and then use electric fields to accelerate the ions, providing a steady – though weak – thrust. They were meant to be used only for the outward journey to the asteroid.

Two of the four ion engines were tested in mid 2006 and found to be in working order, and Hayabusa began its return journey in April 2007. But these engines are in danger of failing – one of them has been firing for a total of 13,500 hours, close to its design lifetime of 14,000 hours.

Now, spacecraft operators have coaxed a third engine back to life. The engine started firing ions on 28 July after several days had been spent warming up the engine's power supply, a statement on the Japan Aerospace Exploration Agency (JAXA) website said.

This third engine has been fired for only 7,000 hours, leaving it with more expected lifetime than either of the others. The fourth engine is being reserved as a spare in case the others fail.

Hayabusa was meant to collect samples from Itokawa by firing pellets into the surface of the 535-metre-long rock and scooping up the resulting debris. But data from two landings in November 2005 suggest that the pellets never fired because the craft's onboard computer sent conflicting signals to its collection instruments.

Still, mission officials hope to bring the spacecraft back to Earth in case some asteroid dust slipped into its collection chamber by chance. If it completes the trip, it is expected to drop a capsule in the Australian outback in June 2010.
________________________________________________________
________________________________________________________
Mini-Mag Orion: A Near-Term Starship? from Centauri Dreams
________________________________________________________
________________________________________________________