Welcome to the hub of knowledge. Here you can learn about the details of our developing world.

Tuesday 5 April 2011

NASA Satellites Detect Extensive Drought Impact On Amazon Forests


WASHINGTON -- A new NASA-funded study has revealed widespread reductions in the greenness of Amazon forests caused by last year's record-breaking drought.

"The greenness levels of Amazonian vegetation -- a measure of its health -- decreased dramatically over an area more than three and one-half times the size of Texas," said Liang Xu, the study's lead author from Boston University. "It did not recover to normal levels, even after the drought ended in late October 2010."

The drought sensitivity of Amazon rainforests is a subject of intense study. Computer models predict that a changing climate, with warmer temperatures and altered rainfall patterns, could cause moisture stress severe enough for rainforests to be replaced by grasslands or woody savannas. This would release the carbon stored in rotting wood into the atmosphere, which could accelerate global warming. The United Nations' Intergovernmental Panel on Climate Change has warned similar droughts could be more frequent in the Amazon region in the future.

The comprehensive study was prepared by an international team of scientists using more than a decade's worth of satellite data from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) and Tropical Rainfall Measuring Mission (TRMM). Analysis of these data produced detailed maps of vegetation greenness declines from the 2010 drought. The study has been accepted for publication in Geophysical Research Letters, a journal of the American Geophysical Union.

The authors first developed maps of drought-affected areas using thresholds of below-average rainfall as a guide. Next, they identified affected vegetation using two different greenness indexes as surrogates for green leaf area and physiological functioning.
The maps show the 2010 drought reduced the greenness of approximately 965,000 square miles of vegetation in the Amazon -- more than four times the area affected by the last severe drought in 2005.
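The two-step method described above can be illustrated as a simple raster computation over gridded rainfall and greenness data. This is a minimal sketch, not the study's actual code: the function name, array layout, and the one-standard-deviation drought threshold are all illustrative assumptions.

```python
import numpy as np

def drought_greenness_anomaly(rain, greenness, year_idx, rain_thresh_sd=-1.0):
    """Illustrative sketch of the study's two-step approach.

    rain, greenness: arrays of shape (years, rows, cols), e.g. gridded
    TRMM rainfall and a MODIS greenness index such as NDVI or EVI.
    year_idx: index of the drought year to evaluate.
    """
    # Step 1: flag drought-affected pixels where the year's rainfall
    # falls well below the multi-year average.
    rain_mean = rain.mean(axis=0)
    rain_sd = rain.std(axis=0)
    drought_mask = (rain[year_idx] - rain_mean) / rain_sd < rain_thresh_sd

    # Step 2: standardized greenness anomaly for the same year,
    # reported only over the drought-affected area.
    g_mean = greenness.mean(axis=0)
    g_sd = greenness.std(axis=0)
    anomaly = (greenness[year_idx] - g_mean) / g_sd
    return np.where(drought_mask, anomaly, np.nan)
```

Summing the area of pixels with strongly negative anomalies over such a map is what yields an aggregate figure like the 965,000 square miles reported.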

"The MODIS vegetation greenness data suggest a more widespread, severe and long-lasting impact to Amazonian vegetation than what can be inferred based solely on rainfall data," said Arindam Samanta, a co-lead author from Atmospheric and Environmental Research Inc. in Lexington, Mass.

The severity of the 2010 drought also was seen in records of water levels in rivers across the Amazon basin, including the Rio Negro, which reflects rainfall levels over the entire western Amazon. Water levels started to fall in August 2010, reaching record low levels in late October. Water levels only began to rise with the arrival of rains later in the year.

"Last year was the driest year on record based on 109 years of Rio Negro water level data at the Manaus harbor," said Marcos Costa, co-author from the Federal University of Viçosa, Brazil. "For comparison, the lowest level during the so-called once-in-a-century drought in 2005 was only eighth lowest."

As anecdotal reports of a severe drought began to appear in the news media last summer, the authors started near-real time processing of massive amounts of satellite data. They used a new capability, the NASA Earth Exchange (NEX), built for the NASA Advanced Supercomputer facility at the agency's Ames Research Center in Moffett Field, Calif. NEX is a collaborative supercomputing environment that brings together data, models and computing resources.

Space Station Crew Launches from Birthplace of Human Spaceflight


WASHINGTON -- One week shy of the 50th anniversary of the first human spaceflight, NASA astronaut Ron Garan and Russian cosmonauts Andrey Borisenko and Alexander Samokutyaev launched to the International Space Station at 6:18 p.m. EDT Monday (4:18 a.m. local time, April 5) from the Baikonur Cosmodrome in Kazakhstan.

The Soyuz rocket that lifted Garan, Borisenko and Samokutyaev into orbit was decorated with Yuri Gagarin's name. The mission lifted off from the same launch pad used April 12, 1961, when Gagarin became the first human to journey into space.

The crew is scheduled to dock its Soyuz TMA-21 spacecraft to the station's Poisk port at 7:18 p.m. on Wednesday, April 6. The crew members will join Expedition 27 Commander Dmitry Kondratyev and Flight Engineers Cady Coleman of NASA and Paolo Nespoli of the European Space Agency, who have been aboard the orbiting laboratory since December 2010.

Scientists Find New Type Of Mineral In Historic Meteorite


HOUSTON -- NASA and co-researchers from the United States, South Korea and Japan have found a new mineral named "Wassonite" in one of the most historically significant meteorites recovered in Antarctica in December 1969.

The new mineral was discovered within the meteorite officially designated Yamato 691, an enstatite chondrite. Its discovery came in the same year as those of the landmark meteorites Allende and Murchison, and as the return of the first Apollo lunar samples. The study of meteorites helps define our understanding of the formation and history of the solar system.

The meteorite likely originated from an asteroid orbiting between Mars and Jupiter. Wassonite is among the tiniest, yet most important, minerals identified in the 4.5-billion-year-old sample. The research team, headed by NASA space scientist Keiko Nakamura-Messenger, added the mineral to the list of 4,500 officially approved by the International Mineralogical Association.

"Wassonite is a mineral formed from only two elements, sulfur and titanium, yet it possesses a unique crystal structure that has not been previously observed in nature," said Nakamura-Messenger.

In 1969, members of the Japanese Antarctic Research Expedition discovered nine meteorites on the blue ice field of the Yamato Mountains in Antarctica. This was the first significant recovery of Antarctic meteorites and represented samples of several different types. As a result, the United States and Japan conducted systematic follow-up searches for meteorites in Antarctica that recovered more than 40,000 specimens, including extremely rare Martian and lunar meteorites.

Researchers found Wassonite surrounded by additional unknown minerals that are being investigated. The mineral measures just 50 by 450 nanometers -- less than one-hundredth the width of a human hair. It would have been impossible to discover without NASA's transmission electron microscope, which can isolate the Wassonite grains and determine their chemical composition and atomic structure.

"More secrets of the universe can be revealed from these specimens using 21st century nano-technology," said Nakamura-Messenger.

The new mineral's name honors John T. Wasson, professor at the University of California, Los Angeles (UCLA). Wasson is known for his achievements across a broad swath of meteorite and impact research, including the use of neutron activation data to classify meteorites and to formulate models for the chemical makeup of bulk chondrites.

"Meteorites, and the minerals within them, are windows to the formation of our solar system," said Lindsay Keller, space scientist at NASA's Johnson Space Center in Houston. Keller is the co-discoverer and principal investigator of the microscope used to analyze the Wassonite crystals. "Through these kinds of studies we can learn about the conditions that existed and the processes that were occurring then."

Johnson's advanced work in nanotechnology is part of the center's Astromaterials Research and Exploration Science Directorate, which currently curates celestial materials returned to Earth by spacecraft. The facility collaborates with industry, academic and international organizations.

"The beauty of this research is that it really demonstrates how the Johnson Space Center has become a pre-eminent leader in the field of nanoscale analysis," said Simon Clemett, a space scientist at Johnson and co-discoverer of the new mineral. "In the words of the great English poet William Blake, we are now able 'to see the world in a grain of sand'."

Collaborators in the discovery of the new mineral include Clemett, Keller and Zia Rahman in the Astromaterials Research and Exploration Science Directorate at Johnson; Alan Rubin from UCLA; Byeon-Gak Choi from Seoul National University, South Korea; Shouliang Zhang from the Lunar and Planetary Institute in Houston; and Katsunari Oikawa from Tohoku University, Japan.

NASA Kepler Mission Update

University of Sydney astrophysicists are behind a major breakthrough in the study of the senior citizens of our galaxy: stars known as red giants. Using high-precision brightness measurements taken by the Kepler spacecraft, scientists have been able to distinguish profound differences inside the cores of stars that otherwise look the same on the surface.

The discovery, published in the latest edition of the journal Nature and made possible by observations using NASA's powerful Kepler space telescope, is shedding new light on the evolution of stars, including our own sun.

The paper's lead author, the University of Sydney's Professor Tim Bedding, explains, "Red giants are evolved stars that have exhausted the supply of hydrogen in their cores that powers nuclear fusion, and instead burn hydrogen in a surrounding shell. Towards the end of their lives, red giants begin burning the helium in their cores."

The Kepler space telescope has allowed Professor Bedding and colleagues to continuously study starlight from hundreds of red giants at an unprecedented level of precision for nearly a year, opening up a window into the stars' cores.

"The changes in brightness at a star's surface are a result of turbulent motions inside that cause continuous star-quakes, creating sound waves that travel down through the interior and back to the surface," Professor Bedding said.

"Under the right conditions, these waves interact with other waves trapped inside the star's helium core. It is these 'mixed' oscillation modes that are the key to understanding a star's particular life stage. By carefully measuring very subtle features of the oscillations in a star's brightness, we can see that some stars have run out of hydrogen in the center and are now burning helium, and are therefore at a later stage of life."
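The diagnostic described above boils down to a single number: the typical spacing between the periods of consecutive mixed modes, which is small for hydrogen-shell-burning giants and larger for helium-burning ones. A toy classifier along those lines, assuming a list of observed mode periods in seconds (the function name and the 75-second dividing line are illustrative choices, not taken from the paper):

```python
import numpy as np

def classify_red_giant(mode_periods, threshold=75.0):
    """Toy evolutionary-stage classifier from mixed-mode periods.

    mode_periods: observed dipole mixed-mode periods in seconds.
    Returns the median period spacing and an inferred life stage.
    """
    periods = np.sort(np.asarray(mode_periods, dtype=float))
    spacings = np.diff(periods)  # gaps between consecutive mode periods
    dp = float(np.median(spacings))
    stage = "helium-core burning" if dp > threshold else "hydrogen-shell burning"
    return dp, stage
```

In practice the measurement is far subtler -- the observed spacings are distorted by the coupling between sound waves and core oscillations -- but the principle is the same: a larger period spacing signals a star at a later stage of life.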

Astronomer Travis Metcalfe of the US National Center for Atmospheric Research, in a companion piece in the same Nature issue which highlights the discovery's significance, compares red giants to Hollywood stars, whose age is not always obvious from the surface. "During certain phases in a star's life, its size and brightness are remarkably constant, even while profound transformations are taking place deep inside."

Professor Bedding and his colleagues work in an expanding field called asteroseismology. "In the same way that geologists use earthquakes to explore Earth's interior, we use star quakes to explore the internal structure of stars," he explained.

Professor Bedding said: "We are very excited about the results. We had some idea from theoretical models that these subtle oscillation patterns would be there, but this confirms our models. It allows us to tell red giants apart, and we will be able to compare the fraction of stars that are at the different stages of evolution in a way that we couldn't before."

Daniel Huber, a PhD student working with Professor Bedding, added: "This shows how wonderful the Kepler satellite really is. The main aim of the telescope was to find Earth-sized planets that could be habitable, but it has also provided us with a great opportunity to improve our understanding of stars."

Hydrogen -- Green Revolution

Host Lisa Van Pay meets with NSF-funded scientists Yang Shao-Horn and Yogi Surendranath at the Massachusetts Institute of Technology as they take on the hydrogen energy challenge. The chemical bonds in hydrogen gas are an extremely efficient way to store energy, and scientists would like to capture this energy to power all sorts of things--from cars to laptops.

Lord of the Tree Rings

Trees are outstanding historians. In fact, scientists dating back to Leonardo da Vinci recognized the value of trees. While others had figured out that you could determine the age of a tree by counting its growth rings, da Vinci went beyond that basic knowledge.
"He was a genius and realized also that the width of those growth rings carried information about the environmental conditions during each year the rings were formed," says David Stahle, director of the Tree Ring Laboratory at the University of Arkansas.
"So, he really anticipated the entire science of dendrochronology using annual growth rings from trees to infer past environmental variability, especially climate variability," continues Stahle, a professor in the university's Geosciences Department. "The time series of fat rings versus skinny rings is telling you about the history of wet years versus dry years."
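The fat-ring/skinny-ring time series can be sketched in a few lines: remove the tree's age-related growth trend from the raw ring widths, then correlate the resulting index with rainfall. This is a deliberate simplification -- the linear detrending and function names here are illustrative; real dendrochronology uses more flexible growth curves and averages many trees into a chronology.

```python
import numpy as np

def ring_width_index(widths):
    """Standardize raw ring widths (mm) by dividing out a fitted
    long-term growth trend. Index > 1 = fat ring (wet year),
    index < 1 = skinny ring (dry year)."""
    widths = np.asarray(widths, dtype=float)
    years = np.arange(len(widths))
    trend = np.polyval(np.polyfit(years, widths, 1), years)
    return widths / trend

def climate_correlation(widths, precip):
    """Correlate the detrended ring-width index with precipitation."""
    index = ring_width_index(widths)
    return float(np.corrcoef(index, precip)[0, 1])
```

A strong positive correlation over a calibration period is what justifies reading prehistoric rings as a rainfall record.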
Along with colleagues from the National Laboratory of Dendrochronology at the Mexican Forest Research Institute, Stahle collects tree-ring samples from remote forests, far from human influence. With support from the National Science Foundation (NSF), Stahle is now developing tree-ring records of Mexico's climate variability.
"Mexico has suffered persistent drought and we've done research on this using both the instrumental record and tree-ring reconstructions. One notion is that this 21st century drought may be being aggravated by human activity, both at the global scale and at the regional scale due to land surface changes," explains Stahle. "Not only is Mexico vulnerable to water availability, but her hydroelectric power supply system is also vulnerable to climate variations and drought. Mexico has had a notorious history of drought that has interacted with food supply availability, famine and disease, and has resulted in catastrophic population loss in the colonial history of that country."
So how do scientists extract a tree-ring core as thick as a pencil without harming the tree?
The key is a tool called a Swedish increment borer, invented more than 100 years ago to test the growth rate of living trees. It's basically a long, hollow steel auger. "This increment borer can be screwed into the center of the tree and it extrudes a core inside the long drill bit. You then remove the core from the auger with a long thin steel foil called an extraction spoon," Stahle says.
It does take practice and a little elbow grease! But, if done properly, there's no permanent damage to the tree. In addition to offering a history of weather, tree rings also offer insight into how our ancestors lived, such as the climate extremes they suffered and the construction and abandonment of their settlements. In fact, under a microscope, experts can determine exactly what year a barbed-wire fence wounded a tree, helping to settle modern property disputes.
Tree rings may also help solve some of history's mysteries. For example, Stahle believes drought may have played a part in the fate of the New World's "Lost Colony" at Roanoke. "This is the drought of 1587, '88, and '89. That was the most severe drought of 800 years in this part of the United States, and 1587 was a particularly significant year because Virginia Dare, the first English baby born in the New World, and the other colonists at the Roanoke colony in North Carolina, were last seen in the summer of 1587," says Stahle, pointing to some extremely skinny rings on a piece of bald cypress from Blackwater River, Virginia.
So what trees do these experts like to study most?
"In the pantheon of tree species for dendrochronology, there are a few that are the crème de la crème, if you will--the very best species in the world. There are a limited number of them and really, in North America, it would be the Douglas fir, especially grown on arid sites in the interior of the continent," says Stahle.
Also among his favorites: ponderosa pine, the southern bald cypress in the United States, and the Montezuma bald cypress in Mexico.
His work also involves the bigger picture of protecting the world's forests. "These forests are being cleared and cut even today because progress marches on. So it's kind of a burden on the dendrochronological community to try to identify these relic old-growth forests that are still found and still threatened in many parts of our country," notes Stahle. "But, it's a great pleasure to travel to remote areas in the United States or Mexico to original forests, even virgin forests, with old growth trees, and there are precious few of these locations left. I think they're aesthetically beautiful; these old growth forests are important from an ecological perspective, and for the climate histories they preserve in their annual rings."

Monday 4 April 2011

"Epidemiological" Study Demonstrates Climate Change Effects on Forests

An 18-year study of 27,000 individual trees by National Science Foundation (NSF)-funded scientists finds that tree growth and fecundity--the ability to produce viable seeds--are more sensitive to climate change than previously thought.
The results, published tomorrow in the journal Global Change Biology, identify earlier spring warming as one of several factors that affect tree reproduction and growth. 
They also show summer drought as an important but overlooked risk factor for tree survival, and that species in four types of trees--pine, elm, beech, and magnolia--are especially vulnerable to climate change.
The findings may help scientists and policymakers better predict which species are vulnerable to climate change and why.
"In a sense, what we've done is an epidemiological study on trees to better understand how and why certain species, or demographics, are sensitive to variation and in what ways," says James Clark of Duke University, lead author of the paper.
To conduct the study, Clark and colleagues measured and recorded the growth, mortality and fecundity of each of the 27,000 trees at least once every three years, ultimately compiling an archive of more than 280,000 tree-years of data.
Using a specially designed bioinformatic analysis, they quantified the effects of climate change on tree species over time.
"This work demonstrates the limitations of current modeling approaches to predict which species are vulnerable to climate change and illustrates the importance of incorporating ecological factors such as species competition," says Alan Tessier, program director in NSF's Division of Environmental Biology, which funded the research.
The approach allowed the scientists to calculate the relative importance of various factors, alone and in combination, including the effects of localized variables such as competition with other trees for light, or the impact of summer drought.
"As climate continues to change, we know forests will respond," says Clark.
"The problem is, the models scientists have used to predict forest responses focus almost solely on spatial variation in tree species abundance--their distribution and density over geographic range."
If all trees of a species grew in the same conditions--the same light, moisture, soil and competition for resources--this generalized, species-wide spatial analysis might suffice, Clark says.
Then scientists wouldn't need to worry about demographic variables and risk factors when trying to predict biodiversity losses due to climate change.
"But in the real world, we do," Clark says. "That's where the new concept of climate and resource tracking of demographic rates comes in.
"Trees are much more sensitive to climate variation than can be interpreted from regional climate averages."
The trees studied included 40 species, located in eleven different forest stands in three geographic regions of the Southeast--the southern Appalachians, the Piedmont and the coastal plain.
They were subjected to both natural and experimental variations.
"By quantifying the effects and relative importance of competition [between species] and climate variables," says Clark, "including impacts on fecundity, over both time and space, the model we've developed addresses this need and can be used to guide planning."

NASA Spacecraft Reveal Mysteries Of Jupiter And Saturn Rings

Scientists studying data from NASA's Cassini, Galileo and New Horizons spacecraft have found that subtle ripples in the rings of Jupiter and Saturn record comet impacts from recent decades. Jupiter's ripple-producing culprit was comet Shoemaker-Levy 9. The comet's debris cloud hurtled through the thin Jupiter ring system on a collision course into the planet in July 1994. Scientists attribute Saturn's ripples to a similar object -- likely another cloud of comet debris -- plunging through the inner rings in 1983. The findings are detailed in two papers published Thursday in the journal Science.

"We're finding evidence that a planet's rings can be affected by specific, traceable events that happened in the last 30 years, rather than a hundred million years ago," said Matthew Hedman, a Cassini imaging team associate, lead author on one of the papers, and a research associate at Cornell University in Ithaca, N.Y. "The solar system is a much more dynamic place than we gave it credit for."

Scientists learned about the patchy patterns in Jupiter's rings in the late 1990s from Galileo's visit to Jupiter. Unfortunately, the images from that mission were fuzzy, and scientists didn't understand why such patterns would occur. Not until Cassini entered orbit around Saturn in 2004 and started sending back thousands of images did scientists have a better picture of the activity. A 2007 science paper by Hedman and colleagues first noted corrugations in Saturn's innermost ring, dubbed the D ring.

A group including Hedman and Mark Showalter, a Cassini co-investigator based at the SETI Institute in Mountain View, Calif., saw that the grooves in the D ring appeared to wind together more tightly over time. Playing the process backward, Hedman demonstrated the pattern originated when something tilted the D ring off its axis by about 300 feet (100 meters) in late 1983. The scientists found Saturn's gravity on the tilted area warped the ring into a tightening spiral.
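"Playing the process backward" exploits the fact that differential orbital precession winds the corrugation ever more tightly, so its radial wavenumber grows roughly linearly with time since the tilt. A sketch of that dating step, with a made-up winding rate and epochs (the function and numbers are illustrative, not the authors' code):

```python
import numpy as np

def date_tilt_event(obs_times, wavenumbers):
    """Date a ring-tilting event from the winding of its spiral.

    obs_times: observation epochs (decimal years).
    wavenumbers: measured radial wavenumber of the corrugation at
    each epoch (arbitrary units). Fits k(t) = c * (t - t0) and
    returns t0, the epoch when the pattern was unwound (k = 0).
    """
    c, b = np.polyfit(obs_times, wavenumbers, 1)
    return -b / c
```

Applied to Cassini's D-ring measurements, this kind of extrapolation is what placed the tilting event in late 1983; the same math applied to Galileo images of Jupiter's ring pointed to mid-1994, matching the Shoemaker-Levy 9 impact.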

Cassini imaging scientists received another clue around August 2009 when the sun shone directly along Saturn's equator and lit the rings edge-on. The unique lighting conditions highlighted ripples not previously seen in another part of the ring system. Whatever happened in 1983 was big - not a small, localized event.

The collision tilted a region more than 12,000 miles (19,000 kilometers) wide, covering part of the D ring and the next outermost ring, called the C ring. Unfortunately, spacecraft were not visiting Saturn at that time, and the planet was on the far side of the sun out of sight from ground or space-based telescopes.

Hedman and Showalter, the lead author on the second paper, wondered whether the long-forgotten pattern in Jupiter's ring system might illuminate the mystery. Using Galileo images from 1996 and 2000, Showalter confirmed a similar winding spiral pattern by applying the same math they had applied to Saturn and factoring in Jupiter's gravitational influence. Galileo was launched on a space shuttle in 1989 and studied Jupiter until 2003.

Unwinding the spiral pinpointed the date when Jupiter's ring was tilted off its axis between June and September 1994. Shoemaker-Levy plunged into the Jovian atmosphere in late July. The Galileo images also revealed a second spiral, which was calculated to have originated in 1990. Images taken by New Horizons in 2007, when the spacecraft flew by Jupiter on its way to Pluto, showed two newer ripple patterns, in addition to the fading echo of the Shoemaker-Levy impact.

"We now know that collisions into the rings are very common – a few times per decade for Jupiter and a few times per century for Saturn," Showalter said. "Now scientists know that the rings record these impacts like grooves in a vinyl record, and we can play back their history later."

Launched on Oct. 15, 1997, Cassini began orbiting Saturn in 2004 and sends back data daily.

"Finding these fingerprints still in the rings is amazing and helps us better understand impact processes in our solar system," said Linda Spilker, Cassini project scientist, based at NASA's Jet Propulsion Laboratory in Pasadena, Calif. "Cassini's long sojourn around Saturn has helped us tease out subtle clues that tell us about the history of our origins."

Open-source Software Designed to Minimize Synthetic Biology Risks

Virginia Tech has licensed GenoTHREAT, a software tool that helps detect the use of synthetic DNA as bioterrorism agents. Developed as an open-source project by a team led by Jean Peccoud, associate professor at Virginia Bioinformatics Institute at Virginia Tech, it is being released using the Apache License Version 2.0 to ensure broad accessibility.
GenoTHREAT implements the “best match” screening protocol method recommended by the federal government to minimize the risk that unauthorized individuals or those with malicious intent will obtain toxins and other potentially dangerous materials from DNA synthesis providers. The process of developing GenoTHREAT allowed Peccoud’s team to conduct a rigorous bioinformatic analysis of the strengths and limitations of the best match method which was published in the March issue of Nature Biotechnology.
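The "best match" idea can be illustrated with a toy screener: slide over an ordered sequence in 200-nucleotide windows and flag any window whose best ungapped match against a database of sequences of concern meets an identity threshold. Everything below is a simplified stand-in -- GenoTHREAT's actual pipeline is built on BLAST searches against curated databases, and the 90% threshold here is an arbitrary illustrative value.

```python
def best_match_identity(window, reference):
    """Best ungapped alignment identity of `window` against
    `reference` (a crude stand-in for BLAST's best match)."""
    best = 0.0
    n = len(window)
    for start in range(len(reference) - n + 1):
        matches = sum(a == b for a, b in zip(window, reference[start:start + n]))
        best = max(best, matches / n)
    return best

def screen_order(order, concern_db, window=200, threshold=0.9):
    """Return start positions of windows in the ordered sequence
    whose best match to any sequence of concern meets the
    identity threshold."""
    hits = []
    for i in range(0, len(order) - window + 1, window):
        w = order[i:i + window]
        if any(best_match_identity(w, ref) >= threshold for ref in concern_db):
            hits.append(i)
    return hits
```

A flagged window would trigger follow-up review of the order and the customer rather than automatic rejection, mirroring the intent of the federal guidance.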
“It was natural to start developing GenoTHREAT around the federal guidance on synthetic DNA,” said Peccoud. “Since this regulation is only one of many regulations and policies that providers of synthetic DNA need to comply with, our current efforts aim at developing a more comprehensive biosecurity solution that can be customized for a variety of users.”
Five of the report’s co-authors – Arunima Srivastava of Delhi, India, a sophomore majoring in computer science in the College of Science; Michael Kozar of Harwich, Mass., a junior majoring in biology in the College of Science and French in the College of Liberal Arts and Human Sciences; Tyler Stewart of Springfield, Va., a junior majoring in biological sciences and biochemistry in the College of Science; and Gaelle Letort and Olivier Mirat, visiting students from ENSIMAG, an engineering school in Grenoble, France – are undergraduates who worked with Peccoud as part of a team enrolled in the 2010 International Genetically Engineered Machines (iGEM) competition. The other two authors are Laura Adam, graduate research assistant, and Mandy Wilson, database administrator with Peccoud’s group.
“This project exemplifies how it is possible to train students to use interdisciplinary strategies to confront today’s most important scientific problems,” said Virginia Tech Vice President and Dean for Undergraduate Education Daniel Wubah. “By breaking down the separation of basic and applied research, and by combining engineering and life science expertise, this team has made a valuable contribution to a real-world problem directly related to the security of our nation.”
Since the creation of the first synthetic cell in May 2010, the security and ethical aspects of synthetic biology have been debated in congressional hearings and by the Presidential Commission for the Study of Bioethical Issues and the National Academy of Sciences, among others.
“While the U.S. government does not endorse any particular tools, as stated in the Guidance, the U.S. government is supportive of efforts to develop new and improved methods to screen double-stranded DNA sequences for biosecurity purposes,” explained Capt. Theresa Lawrence of the U.S. Department of Health and Human Services. “We applaud efforts to advance this important field of science and enhance security.”
The Virginia Bioinformatics Institute at Virginia Tech is a premier bioinformatics, computational biology, and systems biology research facility that uses transdisciplinary approaches to science combining information technology, biology, and medicine. These approaches are used to interpret and apply vast amounts of biological data generated from basic research to some of today’s key challenges in the biomedical, environmental, and agricultural sciences. With more than 240 highly trained multidisciplinary, international personnel, research at the institute involves collaboration in diverse disciplines such as mathematics, computer science, biology, plant pathology, biochemistry, systems biology, statistics, economics, synthetic biology, and medicine. The large amounts of data generated by this approach are analyzed and interpreted to create new knowledge that is disseminated to the world’s scientific, governmental, and wider communities.

Hurricane Season 2011: Tropical Depression 91S (Southern Indian Ocean)

Australia's Northern Territory has been experiencing rainfall and winds from the low pressure system called "System 91S" for several days this week. Today, NASA's Tropical Rainfall Measuring Mission satellite spotted light to moderate rainfall in the system as it continues tracking southwest, bringing rains and winds to the northern coast of Western Australia this weekend.

System 91S was located in the Timor Sea, west-southwest of Darwin, Australia (Northern Territory) and was moving in a west-southwesterly direction. System 91S is forecast to continue traveling in that direction and its center is expected to remain at sea over the next several days as it heads toward the Southern Indian Ocean.

As it continues to track along coastal areas of the Northern Territory and Western Australia, cyclone warnings are in effect. A Cyclone Warning is in effect for Western Australian coastal and island areas from Kuri Bay to the Western Australia/Northern Territory border, including Kalumburu and Wyndham. In addition, a cyclone watch is in effect from Kuri Bay to Cape Leveque, not including Derby.

The TRMM satellite, operated by both NASA and the Japanese Space Agency, passed over the system at 1451 UTC on April 1. The U.S. Navy and Naval Research Laboratory's Marine Meteorology Division in Monterey, Calif., overlaid TRMM rainfall rate imagery on top of Japan's MTSAT-2 infrared imagery to provide a complete picture of the low pressure area's cloud extent and rainfall rates. TRMM's precipitation radar instrument measured rainfall rates close to 1 inch (25 mm) per hour.

Surface winds on April 1 were estimated between 25 and 30 knots (29 to 34 mph / 46 to 55 kmh). Satellite imagery indicated that the center of System 91S' circulation was located near 13.2 South latitude and 128.9 East longitude. It is moving in a southwesterly direction near 4 knots (5 mph/7 kmh).

Infrared satellite imagery shows that the low-level center of circulation appears to be consolidating, and there is improved banding of thunderstorms wrapping into the center of the storm. Both of those factors hint at a storm organizing and strengthening. The only challenge in the forecast is moderate vertical wind shear battering the storm, which is inhibiting further strengthening into a tropical storm today. That wind shear is expected to decrease as System 91S tracks along the northern coast this weekend.

There is a good chance that System 91S could become a tropical storm over the weekend. Regardless, residents of the northern coastal areas of Western Australia should expect some heavy rainfall, gusty winds and rough surf at the beaches this weekend.

Wednesday 30 March 2011

NASA Satellites Eyeing 4 Tropical Systems Around the World For Possible Development

There are four low pressure areas in the tropics today that NASA satellites are keeping an eye on for possible development: Systems 90S, 91S and 99S in the Southern Indian Ocean, and System 93B in the Northern Indian Ocean. Despite a poor chance for development in all of them, one has triggered warnings in northern Australia because of its proximity to land.

NASA and the Japanese Space Agency manage the Tropical Rainfall Measuring Mission (TRMM) satellite, which passed over two of those four systems today. TRMM captured light to moderate rainfall in the low pressure area called "System 90S" on March 30 at 01:49 UTC. Rainfall rates were between 5 and 20 millimeters (0.2 and 0.8 inches) per hour within the storm. System 90S is located 500 miles north-northwest of Port Hedland, Australia, near 12.0 South latitude and 116.0 East longitude.

Infrared satellite imagery from the Atmospheric Infrared Sounder (AIRS) instrument aboard NASA's Aqua satellite revealed that the low has consolidated during the morning hours, while the Advanced Microwave Scanning Radiometer-E instrument showed deep convection on the north and south sides of the center of circulation. Despite these developments, atmospheric dynamics are not currently favorable, so the Joint Typhoon Warning Center gives this low a poor chance for development.

The second tropical low pressure area NASA satellites are watching is also 500 miles away from land: System 99S, located 500 miles north of the Cocos Islands today, near 9.8 South and 99.4 East. The TRMM satellite measured rainfall rates between 5 and 20 millimeters (0.2 and 0.8 inches) per hour within System 99S early today. The AIRS infrared imagery captured from NASA's Aqua satellite shows that areas of deep convection exist on all sides of the low pressure center, but the coverage is not uniform. Vertical wind shear is currently light and sea surface temperatures are warm enough to support development; however, the chance that it will develop into a tropical storm in the next 24 hours is poor. As the week progresses, the chance may improve along with the environmental conditions.

The third tropical low pressure area isn't a tropical storm, but it has triggered a watch for Australia's Northern Territory. Because of System 91S' location, about 200 miles northeast of Darwin, Australia (near 10.0 South and 133.1 East), a tropical cyclone watch has been issued for coastal communities between Cape Hotham and Port Keats, including Darwin and the Tiwi Islands. The Tiwi Islands include Melville and Bathurst Islands and are part of Australia's Northern Territory, 25 miles (40 km) north of Darwin where the Arafura Sea joins the Timor Sea. In addition, a strong wind warning has been issued from Milingimbi to Troughton Island.

System 91S is expected to move in a southwesterly direction over the next several days and track over Snake Bay on Melville Island and Cape Fourcroy on Bathurst Island.

NASA AIRS infrared imagery revealed today that the convection (rapidly rising air that produces the thunderstorms that power a tropical cyclone) is intensifying and expanding around System 91S' center. The convection, however, appears disorganized in the low, and the maximum sustained winds are between 15 and 20 knots (17-23 mph/27-37 km/h). The chance for development into a tropical storm in the next 24 hours remains poor, but the areas under the watch may feel System 91S' rains and some gusty winds.

The fourth area is in the northern hemisphere and in a different ocean. Tropical low 93B is located in the Northern Indian Ocean. Last night it was only 50 miles north of Phuket, Thailand near 9.0 North latitude and 98.7 East longitude. However, today, infrared satellite imagery from NASA's AIRS instrument showed that the low level circulation center has drifted inland. When a low is inland, its center of circulation is cut off from the warm waters that power the tropical cyclone. Because of weak steering winds, however, the low may move back seaward and redevelop in the warm waters offshore.

Currently the system's maximum sustained winds are between 15 and 20 knots (17-23 mph/27-37 km/h). The chance for development into a tropical storm in the next 24 hours remains poor, but coastal areas of Thailand are already experiencing rains and some gusty winds from System 93B.

NASA's TRMM and Aqua satellites continue to provide data to forecasters who are keeping a watchful eye on all of these tropical low pressure areas.

Evolutionary 'Winners' and 'Losers' Revealed in Collaborative Study

March 22, 2011-Houston-
In a study that analyzed competing bacteria fighting it out to the death, a University of Houston (UH) researcher and his colleagues identified evolutionary ‘winners’ and ‘losers.’ Continuing research to understand the basis of these fates may become a useful tool in designing roadblocks to antibiotic resistance.

In collaboration with scientists at Michigan State University (MSU), UH evolutionary biologist Timothy Cooper and his graduate student Utpala Shrestha were co-authors on a paper titled “Second-Order Selection for Evolvability in a Large Escherichia coli Population.” The report appeared March 18 in Science, the world’s leading journal of original scientific research, global news and commentary.

“The project found that bacteria growing for thousands of generations in an environment containing glucose as the only food had evolved to be better at getting better,” Cooper said. “We found that two lineages of bacteria arose and competed in a single experimental population. The lineage that initially grew more slowly, yet had the potential to evolve more rapidly, was the evolutionary ‘winner.’ This is surprising because it’s usually thought that competition is decided by what competitors can do now and not what they are capable of in the future.”

As genetic changes occurred, making some individuals better competitors on the glucose food, other individuals that did not quickly get their own beneficial mutations were outcompeted and went extinct. Down the line, understanding the benefits of evolving quickly like this will be a useful tool to predict such things as antibiotic resistance and the evolution of infectious disease. Cooper said this knowledge may one day help scientists design intervention strategies that make the evolution of these traits less likely to occur.
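The dynamic described above, in which a lineage that competes worse now but improves faster wins in the long run, can be illustrated with a deterministic toy model. The starting fitnesses and per-generation gains below are invented for illustration; they are not measurements from the experiment:

```python
# Toy model of "second-order selection": two asexual lineages compete.
# Lineage A starts with lower fitness but gains fitness faster each
# generation (it is more evolvable); lineage B starts ahead but
# improves slowly. All numbers are invented for illustration.

def compete(generations=500):
    fitness_a, fitness_b = 1.00, 1.05   # B is the better competitor now
    gain_a, gain_b = 0.0004, 0.0001     # A improves faster over time
    freq_a = 0.5                        # A's share of the population
    history = []
    for _ in range(generations):
        wa = freq_a * fitness_a
        wb = (1 - freq_a) * fitness_b
        freq_a = wa / (wa + wb)         # selection shifts frequencies
        fitness_a += gain_a             # lineages keep adapting
        fitness_b += gain_b
        history.append(freq_a)
    return history

traj = compete()
print(f"Lineage A after 100 generations: {traj[99]:.3f}")   # still behind
print(f"Lineage A after 500 generations: {traj[-1]:.3f}")   # has taken over
```

In this sketch lineage A initially declines, then overtakes B once its accumulated gains cancel B's head start, mirroring the "better at getting better" outcome described in the paper.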

The work done by Cooper and Shrestha at UH established the specific genetic changes occurring during this bacterial evolution experiment that caused the change in their ability to evolve further. They discovered the genetic change that was important for determining which bacteria would prevail and which were destined to become extinct.

“Our collaborators isolated individual bacteria from a population that had evolved for 500 generations and sequenced their entire DNA genome to determine all the changes that had occurred,” Cooper said. “By isolating these changes and adding them in defined combinations back into the original ancestral strain, we were able to determine their individual effects.”

Reminiscent of Aesop’s lesson that ‘slow and steady wins the race,’ Cooper adds that even bacteria can benefit from a long-term view, with their experiment showing that bacteria that adapted, slowly but consistently, outcompeted those that initially grew quickly but then ran out of ways to improve.

With much of his work based on experimental evolution, which is the lab-based study of evolving populations, Cooper’s motivation for this experiment comes from wanting to understand the factors involved in evolution of organisms to better fit their environments. Using bacterial and computational experimental systems he aims to identify and integrate these mechanisms and examine how they depend on genetic and environmental factors.

“Bacteria provide an ideal model system to address these questions, because they evolve so quickly, undergoing thousands of generations in only a few years,” Cooper said. “Additionally, we can now sequence their entire genomes and determine the genetic changes that lead to improvements in their ability to grow.”

Funded by the National Science Foundation and the Defense Advanced Research Projects Agency, this work was a multidisciplinary effort done in collaboration with researchers in zoology, microbiology and molecular genetics at MSU. In addition to UH’s Cooper and Shrestha, the MSU team consisted of Richard Lenski, Jeffrey Barrick, Robert Woods and Mark Kauth. Woods has since moved on to the University of Michigan and Barrick is currently at the University of Texas at Austin.

iMobot Rolls, Crawls and Creeps

An intelligent, reconfigurable modular robot invented by a UC Davis alumnus and a professor of mechanical and aerospace engineering is headed for commercial development with the help of a grant from the National Science Foundation.
Graham Ryland and Professor Harry Cheng hope their “iMobot” will be a useful research and teaching tool. They also say the technology could be used in industrial applications for rapidly prototyping complex robotics — and may eventually form the basis of robots for search-and-rescue operations in difficult terrain. The university has filed a patent on the robot.
Ryland and Cheng developed the iMobot while Ryland was studying for his master's degree in mechanical engineering and conducting research in Cheng’s Integration Engineering Laboratory at UC Davis.
A single iMobot module has four controllable degrees of freedom, with two joints in the center section and two wheels, one on each end. An individual module can drive on its wheels, crawl like an inchworm, or raise one end of its body and pan around as a camera platform.
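To make the four degrees of freedom concrete, here is a hypothetical sketch of how one module's state might be represented in software. The class, field names, and joint angles are all invented for illustration; this is not Barobo's actual API:

```python
# Hypothetical representation of an iMobot-like module's four
# controllable degrees of freedom: two center-section joints and
# two end wheels. Names and angles are invented for illustration.
from dataclasses import dataclass

@dataclass
class ModuleState:
    joint_1: float = 0.0    # center-section joint, degrees
    joint_2: float = 0.0    # center-section joint, degrees
    wheel_l: float = 0.0    # left end wheel rotation, degrees
    wheel_r: float = 0.0    # right end wheel rotation, degrees

def inchworm_gait(steps):
    """Alternate the two center joints to sketch an inchworm-style crawl."""
    return [
        ModuleState(joint_1=30.0 if i % 2 == 0 else -30.0,
                    joint_2=-30.0 if i % 2 == 0 else 30.0)
        for i in range(steps)
    ]

poses = inchworm_gait(4)
print(len(poses), poses[0].joint_1)  # 4 30.0
```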
Individual modules could be assembled into larger robots for particular tasks, such as a snakelike robot that could get into confined spaces, or a larger, wheeled robot for smoother terrain.
"We wanted to create a robot that was modular and could be assembled together, but was also mobile and useful by itself. We feel this hardware platform could drastically speed up university and industry research in the field of robotics," Ryland said.
Commercial robots are usually built for a specific application. But there is a lot of interest in modular robots -- machines made up of durable subunits that can function alone or be configured for a specific task.
The iMobot could be used as a testbed tool for engineers studying control systems for individual robots or groups of robots, Cheng said.
"It's very difficult to build the kind of robot with flexibility, modularity, and reconfigurability that people want to use for research and teaching," he said.
By using an off-the-shelf commercial robot like iMobot, researchers can focus on solving problems in areas such as artificial intelligence, robot collaboration, and reconfigurable and adaptive systems, without having to first develop the hardware part of the robot.
Currently, there are no commercial research-grade modular robots available, Ryland said.
Cheng and Ryland have formed a company, Barobo Inc., to develop the robot commercially. Ryland is the company’s president. Barobo recently received a small-business innovation research grant from the National Science Foundation to begin commercial development. The initial grant is for $150,000 over six months, with the opportunity to apply for another $500,000. The inventors hope to have the robot on the market by the end of this year.

Epigenomic Findings Illuminate Veiled Variants

Genes make up only a tiny percentage of the human genome. The rest, which has remained measurable but mysterious, may hold vital clues about the genetic origins of disease. Using a new mapping strategy, a collaborative team led by researchers at the Broad Institute of MIT and Harvard, Massachusetts General Hospital (MGH), and MIT has begun to assign meaning to the regions beyond our genes and has revealed how minute changes in these regions might be connected to common diseases. The researchers’ findings appear in the March 23 advance online issue of Nature.
The results have implications for interpreting genome-wide association studies – large-scale studies of hundreds or thousands of people in which scientists look across the genome for single “letter” changes or SNPs (single nucleotide polymorphisms) that influence the risk of developing a particular disease. The majority of SNPs associated with disease reside outside of genes and until now, very little was known about the functions of most of them.
“Our ultimate goal is to figure out how our genome dictates our biology,” said co-senior author Manolis Kellis, a Broad associate member and associate professor of computer science at MIT. “But 98.5 percent of the genome is non-protein coding, and those non-coding regions are generally devoid of annotation.”
The term “epigenome” refers to a layer of chemical information on top of the genetic code, which helps determine when and where (and in what types of cells) genes will be active. This layer of information consists of chemical modifications, or “chromatin marks,” that appear across the genetic landscape of every cell, and can differ dramatically between cell types.
In a previous study, the authors showed that specific combinations of these chromatin marks (known as “chromatin states”) can be used to annotate parts of the genome – namely to attach biological meaning to the stretches of As, Cs, Ts, and Gs that compose our DNA. However, many questions remained about how these annotations differ between cell types, and what these differences can reveal about human biology.
In the current study, the researchers mapped chromatin marks in nine different kinds of cells, including blood cells, liver cancer cells, skin cells, and embryonic cells. By looking at the chemical marks, the researchers were able to create maps showing the locations of key control elements in each cell type. The researchers then asked how chromatin marks change across cell types, and looked for matching patterns of activity between controlling elements and the expression of neighboring genes.
“We first annotated the elements and figured out which cell types they are active in,” said co-senior author Bradley Bernstein, a Broad senior associate member and Harvard Medical School (HMS) associate professor at Massachusetts General Hospital (MGH). “We could then begin to link the elements and put together a regulatory network.”
Having pieced together these networks connecting non-coding regions of the genome to the genes they control, the researchers could begin to interpret data from disease studies. The team studied a large compendium of genome-wide association studies (GWAS), looking to characterize non-coding SNPs associated with control regions in specific cell types.
“Across 10 association studies of various human diseases, we found a striking overlap between previously uncharacterized SNPs and the control region annotations in specific cell types,” said Kellis. “This suggests that these DNA changes are disrupting important regulatory elements and thus play a role in disease biology.”
The researchers confirmed the reliability of their approach by showing that SNPs were associated with the appropriate cell types. For example, SNPs from autoimmune diseases such as rheumatoid arthritis and lupus sit in regions that are only active in immune cells, and SNPs associated with cholesterol and metabolic disease sit in regions active in liver cells. While more in-depth, follow-up studies will be needed to confirm the biological significance of these connections, the current study can help guide the direction of these investigations.
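At its core, the overlap analysis described above asks whether each disease-associated SNP position falls inside a control region annotated in a given cell type, which is an interval-membership check. A minimal sketch with made-up coordinates (the real study used genome-wide chromatin-state maps):

```python
import bisect

# Hypothetical enhancer intervals for one cell type on a single
# chromosome, sorted by start. Coordinates are invented for illustration.
enhancers = [(1000, 1500), (4200, 4800), (9000, 9400)]
starts = [s for s, _ in enhancers]

def in_enhancer(pos):
    """True if a SNP position falls inside any enhancer interval."""
    i = bisect.bisect_right(starts, pos) - 1
    return i >= 0 and starts[i] <= pos <= enhancers[i][1]

snps = [1250, 3000, 4500, 9401]      # toy disease-associated SNPs
hits = [pos for pos in snps if in_enhancer(pos)]
print(hits)  # [1250, 4500]
```

In the study itself, an overlap like this only becomes evidence once the hit rate is compared against matched background SNPs, showing enrichment in the disease-relevant cell type.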
“GWAS has identified hundreds of non-coding regions of the genome that influence human disease, but a major barrier to progress is that we remain quite ignorant of the functions of these non-coding regions,” said David Altshuler, deputy director at the Broad and an HMS professor at MGH, who was not involved in the study. “This remarkable and much-needed resource is a major step forward in helping researchers address that challenge.”
SNPs in the non-coding regions of the genome may have subtler biological effects than their counterparts that arise in genes because they can influence how much protein is produced. The researchers mainly focused on SNPs in enhancer regions, which help boost a gene’s expression, and their network connections to regulators that control them and genes that they target. Follow-up efforts can then focus on specific pieces of this network that could be targeted with drugs.
The team involved in this study hopes to expand its analysis to include many other cell types and map additional marks to expand their networks beyond enhancer regions. In the meantime, researchers involved in genome-wide association studies will be able to use the maps from this project to analyze non-coding SNPs in a new light.
“These maps can be used to come up with hypotheses about how the variants themselves are working and which ones are causal,” said Bernstein. “This resource now goes back to the GWAS community, which can use the maps to form and test new functional models.”
This research was supported by the National Human Genome Research Institute under an ENCODE grant, the Howard Hughes Medical Institute, the National Science Foundation, and the Sloan Foundation.
About the Broad Institute of Harvard and MIT
The Eli and Edythe L. Broad Institute of Harvard and MIT was launched in 2004 to empower this generation of creative scientists to transform medicine. The Broad Institute seeks to describe all the molecular components of life and their connections; discover the molecular basis of major human diseases; develop effective new approaches to diagnostics and therapeutics; and disseminate discoveries, tools, methods and data openly to the entire scientific community.
Founded by MIT, Harvard and its affiliated hospitals, and the visionary Los Angeles philanthropists Eli and Edythe L. Broad, the Broad Institute includes faculty, professional staff and students from throughout the MIT and Harvard biomedical research communities and beyond, with collaborations spanning over a hundred private and public institutions in more than 40 countries worldwide. For further information about the Broad Institute, go to http://www.broadinstitute.org.

Extra-Cold Winters in Northeastern North America and Northeastern Asia?

If you're sitting on a bench in New York City's Central Park in winter, you're probably freezing.
After all, the average temperature in January is 32 degrees Fahrenheit.
But if you were just across the pond in Porto, Portugal, which shares New York's latitude, you'd be much warmer--the average temperature is a balmy 48 degrees Fahrenheit.
Throughout northern Europe, average winter temperatures are at least 10 degrees Fahrenheit warmer than similar latitudes on the northeastern coast of the United States and the eastern coast of Canada.
The same phenomenon happens over the Pacific, where winters on the northeastern coast of Asia are colder than in the Pacific Northwest.
Researchers at the California Institute of Technology (Caltech) have now found a mechanism that helps explain these chillier winters--and the culprit is warm water off the eastern coasts of these continents.
"These warm ocean waters off the eastern coasts actually make it cold in winter--it's counterintuitive," says Tapio Schneider, a geoscientist at Caltech.
Schneider and Yohai Kaspi, a postdoctoral fellow at Caltech, describe their work in a paper published in the March 31 issue of the journal Nature.
"This research makes a novel contribution to an old problem," says Eric DeWeaver, program director in the National Science Foundation's (NSF) Division of Atmospheric and Geospace Sciences, which funded the research.
"Traditional wisdom has it that western Europe is warm because of the Gulf Stream, but this paper presents evidence that atmospheric circulation plays an important role in maintaining the colder temperatures found on the eastern boundaries of the mid-latitude continents."
Using computer simulations of the atmosphere, the researchers found that the warm water off an eastern coast will heat the air above it and lead to the formation of atmospheric waves, drawing cold air from the northern polar region.
The cold air forms a plume just to the west of the warm water. In the case of the Atlantic Ocean, this means the frigid air ends up right over the northeastern United States and eastern Canada.
For decades, the conventional explanation for the cross-oceanic temperature difference was that the Gulf Stream delivers warm water from the Gulf of Mexico to northern Europe.
But in 2002, research showed that ocean currents aren't capable of transporting that much heat, instead contributing only up to 10 percent of the warming.
Kaspi's and Schneider's work reveals a mechanism that helps create a temperature contrast not by warming Europe, but by cooling the eastern United States.
Surprisingly, it's the Gulf Stream that causes this cooling.
In the northern hemisphere, the subtropical ocean currents circulate in a clockwise direction, bringing an influx of warm water from low latitudes into the western part of the ocean.
These warm waters heat the air above them.
"It's not that the warm Gulf Stream waters substantially heat up Europe," Kaspi says. "But the existence of the Gulf Stream near the U.S. coast is causing the cooling of the northeastern United States."
The researchers' computer model simulates a simplified, ocean-covered Earth with a warm region to mimic the coastal reservoir of warm water in the Gulf Stream.
The simulations show that such a warm spot produces so-called Rossby waves.
Rossby waves are large atmospheric waves--with wavelengths that stretch for more than 1,609 kilometers (1,000 miles).
They form when the path of moving air is deflected due to Earth's rotation, a phenomenon known as the Coriolis effect.
Just as gravity provides the restoring force for water waves on the surface of a pond, the Coriolis effect provides the restoring force for Rossby waves.
In the simulations, the warm water produces stationary Rossby waves, in which the peaks and valleys of the waves don't move, but the waves still transfer energy.
In the northern hemisphere, the stationary Rossby waves cause air to circulate in a clockwise direction just to the west of the warm region.
To the east of the warm region, the air swirls in the counterclockwise direction. These motions draw in cold air from the north, balancing the heating over the warm ocean waters.
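The several-thousand-kilometer scale of these waves can be estimated from the standard barotropic Rossby-wave dispersion relation, in which a wave is stationary when the background wind speed u equals beta/K^2 (beta is the northward gradient of the Coriolis parameter, K the wavenumber). This is a textbook back-of-envelope estimate with assumed mid-latitude values, not a figure from the Nature paper:

```python
import math

# Stationary barotropic Rossby wave: phase speed c = u - beta / K**2,
# so the wave is stationary relative to the ground when K = sqrt(beta / u).

OMEGA = 7.292e-5        # Earth's rotation rate, rad/s
EARTH_RADIUS = 6.371e6  # m

def beta(lat_deg):
    """Northward gradient of the Coriolis parameter, 1/(m*s)."""
    return 2 * OMEGA * math.cos(math.radians(lat_deg)) / EARTH_RADIUS

def stationary_wavelength_km(u, lat_deg=45.0):
    """Wavelength (km) of a stationary Rossby wave in background wind u (m/s)."""
    k = math.sqrt(beta(lat_deg) / u)
    return 2 * math.pi / k / 1e3

print(f"{stationary_wavelength_km(10.0):.0f} km")  # roughly 5,000 km
```

With a 10 m/s westerly at 45 degrees latitude this gives a wavelength of a few thousand kilometers, consistent with the scale quoted above.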
To gain insight into the mechanisms that control the atmospheric dynamics, the researchers sped up Earth's rotation in the simulations.
In those cases, the plume of cold air gets bigger--consistent with it being a stationary Rossby-wave plume.  Most other atmospheric features would get smaller if the planet were to spin faster.
Although it's long been known that a heat source could produce Rossby waves, which can then form plumes, this is the first time scientists have shown how the mechanism causes cooling that extends west of the heat source.
According to the researchers, the cooling effect could account for 30 to 50 percent of the temperature difference across oceans.
It also explains why the cold region is just as big for both North America and Asia, despite the continents' differences in topography and size.
The Rossby-wave induced cooling depends on heating air over warm ocean water. Since the warm currents along western ocean boundaries in both the Pacific and Atlantic are similar, the resulting cold region to their west would be similar as well.
The next step, Schneider says, is to build simulations that more realistically reflect what happens on Earth. Future simulations would incorporate more complex features like continents and cloud feedbacks.

New Research Suggests Wild Birds May Play a Role in the Spread of Bird Flu

A study by the U.S. Geological Survey, the United Nations Food and Agriculture Organization and the Chinese Academy of Sciences used satellites, outbreak data and genetics to uncover an unknown link in Tibet among wild birds, poultry and the movement of the often-deadly virus.
Researchers attached GPS satellite transmitters to 29 bar-headed geese – a wild species that migrates across most of Asia and that died by the thousands in the 2005 bird flu outbreak at Qinghai Lake, China. GPS data showed that wild geese tagged at Qinghai Lake spend their winters in a region outside Lhasa, the capital of Tibet, near farms where H5N1 outbreaks have occurred in domestic geese and chickens.
This is the first evidence of a mechanism for transmission between domestic farms and wild birds, said Diann Prosser, a USGS biologist at the USGS Patuxent Wildlife Research Center. “Our research suggests initial outbreaks in poultry in winter, followed by outbreaks in wild birds in spring and in the breeding season. The telemetry data also show that during winter, wild geese use agricultural fields and wetlands near captive bar-headed geese and chicken farms where outbreaks have occurred.”
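Telemetry analyses like this one rest on computing great-circle distances between GPS fixes and sites of interest. A minimal haversine sketch; the coordinates are rough approximations for Qinghai Lake and Lhasa, chosen only to illustrate the calculation:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two points given in degrees."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = p2 - p1
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Rough, illustrative coordinates for Qinghai Lake and Lhasa.
qinghai = (36.9, 100.2)
lhasa = (29.65, 91.1)
print(f"{haversine_km(*qinghai, *lhasa):.0f} km")
```

The distance comes out to roughly a thousand kilometers, a plausible seasonal migration leg for bar-headed geese.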
The part that wild birds play in the spread of bird flu has been hotly debated since the 2005 outbreaks in Qinghai Lake. Bird flu that spread beyond Asia and into Europe and Africa was later found to have genetically originated in the Qinghai Lake area. Discovering the Tibet connection adds another significant link in the global transmission of bird flu.
From 2003 through 2009, the Qinghai-Tibet Plateau experienced 16 confirmed outbreaks of the virus in wild and domestic birds, most of them in the newly documented migratory pathway of the wild bar-headed geese. “Every summer, more than 150,000 migratory birds use Qinghai Lake, which sits within the eastern portion of the Central Asian Flyway, which extends from India to Russia,” said John Takekawa, a wildlife biologist at the USGS Western Ecological Research Center. 
The study also uncovered an undocumented migratory link between Qinghai Lake and Mongolia, further suggesting that Qinghai may be a pivotal point of H5N1 transmission. 
Scott Newman, Head of the EMPRES Wildlife Health and Ecology Unit of the FAO, noted that poultry production at the southern end of the Central Asian Flyway is extensive, which has resulted in many more HPAI H5N1 outbreaks there than in the northern end, where poultry production is more limited. “In general,” said Newman, “H5N1 outbreaks along this flyway mirror human and poultry densities, with domestic poultry the primary reservoir for this disease.”
This study is part of a global program to not only better understand the movement of avian influenza viruses and other diseases in the Central Asian Flyway, but also to improve the understanding of the ecological habits of waterbirds internationally, identify important conservation issues, and better define  interactions among wild and domestic birds.
The H5N1 virus continues to reemerge across much of Eurasia and Africa, with high fatality rates in people, and the threat of a possible global pandemic.  Since 2003, H5N1 has killed 300 people, including 18 in 2010, and has led to the culling of more than 250 million domestic poultry throughout Eurasia and Africa.  Sixteen countries reported H5N1 outbreaks in poultry in 2010.  No HPAI H5N1 has been detected in North America, despite extensive efforts to test migratory birds to provide early warnings should birds with the virus arrive in the country.
The study was funded by the USGS, FAO, National Science Foundation and the Chinese Academy of Sciences. It was published in PLoS ONE and authored by Diann J Prosser, Peng Cui, John Y. Takekawa, Scott Newman and others.

A bit of fiber makes for sudsier beer

This St. Patrick’s Day, raise a glass to the beer researchers of Ireland. By figuring out how bubbles form in stout beers, a team of mathematicians has come up with an idea that could better unleash the foamy head of Guinness: beer cans lined with a material similar to a coffee filter.
Stout beers foam less than other beers when poured, thanks to the nitrogen gas that brewers inject into the liquid before packaging. Nitrogen cuts down on acidic tastes and provides a longer-lasting head and a creamier mouthfeel, owing to the small size of its bubbles. But nitrogen doesn’t dissolve as well as other gases or form bubbles as easily. That’s why bars need special devices to force the bubbles out of tapped stout beers and why every can of Guinness contains a hollow ball filled with nitrogen gas that triggers foaming. It’s also why beer aficionados in search of a foamy head spend 30 seconds on the perfect pour.
Carbon dioxide, present only in small quantities in stouts, gives a quick fizz and foam to soft drinks, champagne and paler beers. Pour a bottle of one of these carbonated beverages, and the gas readily comes out of the liquid. Bubbles speedily form and grow larger while rising through the liquid, absorbing gas and releasing it at the surface.
To see if stout beers could learn a thing or two from lighter beers, mathematician William Lee of the University of Limerick and colleagues investigated the mechanism behind bubble formation.
In a glass of beer, bubbles form only at the sides and bottom — not within the beer itself. Trapped pockets of gas create steady streams of these bubbles much in the same way that dropping sugar crystals into an overly sweet — what chemists call supersaturated — sugar solution creates rock candy. Irregularities in the glass itself were once thought to be the origins of these pockets of gas.
But in 2002 Gérard Liger-Belair, a chemical physicist at the University of Reims Champagne-Ardenne in France, proposed that gas pockets in bits of fibrous material clinging to the sides of containers were actually driving the formation of bubbles in carbonated beverages. Observations of these bits of cellulose under a microscope led to a mathematical explanation of bubble nucleation in champagne.
“The cellulose fiber model does a better job than the previous one dealing with irregularities because it is more ‘stuck’ to reality,” Liger-Belair says. “Most of bubble nucleation sites in a glass are tiny cellulose fibers.” The fibers, he suggests, float into the glass from dust in the air or break off from cloths used to wipe down glasses.
By adapting Liger-Belair’s math, Lee showed that the same mechanism is at work in stout beer, only much more slowly. A cellulose fiber triggers the formation of a bubble in a stout beer in 1.28 seconds, as compared to the 0.079 seconds required in a typical carbonated liquid.
“I was surprised to find that it did work, albeit slowly, and even more surprised when a rough estimate suggested it could potentially be useful,” says Lee, who posted his results online at arXiv.org on March 3.
Adding a coating of dense fibers inside of a stout can’s upper lip should trigger the release of nitrogen bubbles during pouring without the need for a nitrogen gas widget now in use, he suggests. To demonstrate this concept, he poured stout beer onto a coffee filter and watched with satisfaction as it bubbled on the fibrous material — sacrificing a perfectly good beer to the pursuit of the perfect pour.

Missing bits of DNA may define humans

In evolution as on reality TV, sometimes the biggest loser is really a winner.
Losing 510 chunks of DNA may have enabled humans to develop bigger brains, spineless penises and other human traits, researchers from Stanford University and their colleagues report in the March 10 Nature.
The research is the latest attempt to find genetic factors that make humans human. Previously researchers have searched for genes that humans have but other species do not, but the new study turns that approach on its head, looking instead for pages redacted from the human genetic instruction book during the course of evolution.
“This is a clever thing to do and as so often with good ideas, seems almost obvious in hindsight,” says Svante Pääbo of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany.
Looking for missing DNA that could shape human traits was something of a no-brainer for Stanford researchers David Kingsley and Gill Bejerano. Kingsley, a developmental geneticist, had previously discovered that stickleback fish species can shed prominent pelvic spines by losing a bit of DNA involved in growing limbs and other appendages (SN: 1/31/09, p. 26).
The team looked at the genetic blueprints of humans, chimpanzees and macaques to see if humans were missing any pieces found in the other two species.
Indeed, humans lack many chunks of DNA that chimps, macaques and mice all seem to share — at least 510 different bits. Most are also missing from Neandertals, suggesting that the pieces were lost sometime between 500,000 and 6 million years ago.
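The screen described above is essentially a set operation: keep the elements conserved in chimp, macaque and mouse, then subtract whatever is present in human. A toy sketch with invented element identifiers (the real study worked from whole-genome sequence alignments, not ID sets):

```python
# Toy version of the deletion screen: elements conserved in chimp,
# macaque and mouse but absent from human are candidate losses.
# Element identifiers are invented for illustration.

chimp   = {"e1", "e2", "e3", "e4", "e5"}
macaque = {"e1", "e2", "e3", "e5"}
mouse   = {"e1", "e2", "e5"}
human   = {"e1", "e3"}

conserved_outgroup = chimp & macaque & mouse   # shared by all outgroups
human_losses = conserved_outgroup - human      # missing only in human
print(sorted(human_losses))  # ['e2', 'e5']
```

Requiring presence in all outgroups guards against calling an element "lost" in humans when it was simply never there in the common ancestor.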
Only one of the missing bits contained an actual gene. The rest of the absent genetic instructions may be important switches for turning on genes. Such switches, known as enhancers, can be located far from a gene but still govern when and where the gene is flipped on during development.
Humans and chimps have roughly the same set of genes, and yet have clear physical and behavioral differences. Some scientists have reasoned that changing the way genes are used — by turning a gene on or off in a particular tissue or during a phase of development — may be a way to evolve new characteristics without damaging important genes. Adding or deleting enhancers is one way to regulate gene activity.
“It’s probably the best way to develop new functionality in the short run,” says David Haussler, a Howard Hughes Medical Institute investigator at the University of California, Santa Cruz.
Because most of the missing chunks of DNA don’t contain genes, it is difficult to say exactly what the pieces are supposed to do. The Stanford team used “a very clever computational analysis” to tease out a couple of pieces of DNA that might have clear-cut functions, Haussler says.
One of the bits is near a gene that controls production of an androgen receptor, a protein that detects testosterone. The piece of DNA missing in humans is an enhancer for the gene, which controls the production of facial sensory whiskers and small spines on the penises of both chimps and mice, the researchers found. Since humans don’t have the enhancer, the gene is not turned on, and sensory whiskers and penile spines fail to develop.
In some mammals, penile spines are prominent and may help males achieve ejaculation quickly. “The key to reproductive success is impregnation, and the faster you can achieve that, the better,” says Owen Lovejoy, a biological anthropologist at Kent State University in Ohio. But even though the loss of the spines makes copulation last longer, it hasn’t hurt the reproductive success of humans, he says. Longer copulation times may help cement bonds between mating partners, which Lovejoy says has been a key to humans’ evolutionary success.
Another missing enhancer identified in the new study may help explain the evolution of human brain size. The enhancer lies next to a tumor-suppressing gene called GADD45G, which normally reins in cell growth so that cancer doesn’t develop. In mice and chimps, the enhancer DNA turns the gene on in the brain. Because humans lack the enhancer, the gene is not turned on in the brain and brain cells may proliferate, possibly contributing to bigger brains.
The enhancer probably isn’t the only thing that led to humans’ increased brain power, Kingsley says. “There’s no way changes in a single gene are going to produce all the interesting changes that led to the human brain.”
Removing the enhancer from mice may help researchers learn just how much brain growth the missing piece of DNA is responsible for, he says.

Who felt it not, smelt it not

People who feel no pain due to a particular rare genetic defect also can’t smell anything, a new study finds. The unexpected discovery shows that nerves that detect pain and odors rely on the same protein to transmit information to the brain.
Researchers examined three people who have mutations in the SCN9A gene and can’t feel pain. All of the people had broken multiple bones without feeling any pain, and two had given birth painlessly. But none were aware that they also couldn’t smell a thing.
None of the study participants could distinguish balsamic vinegar, orange, mint, coffee or perfume from plain water, even when researchers poured on so much perfume and vinegar that the scents were unbearable to people with a normal sense of smell, an international team of researchers reports online March 23 in Nature.
It may not be so strange that none of the people realized that they lacked a sense of smell. “If this was a genetic defect from birth they wouldn’t even know what they were missing,” says Graeme Lowe, a neurophysiologist at the Monell Chemical Senses Center in Philadelphia who was not involved in the study.
As oblivious as the patients were to their smell deficit, the scientists had been equally clueless that smell and pain shared a common communication gateway.
Researchers had previously shown that SCN9A controls pain sensitivity in people (SN Online: 3/8/10). The gene makes a protein, called a sodium channel, that lets sodium pass in through a nerve cell’s membrane when the nerve detects something painful. That flood of sodium sends an electrical signal racing down a long tendril, called an axon, toward the brain.
In the new study, the team discovered that odor-detecting nerve cells have the same sodium channel at the tips of their axons. In those cells, the channel controls the release of a chemical called glutamate, which in turn sparks electrical communication with odor-processing centers in the brain.
Because the sodium channel is missing in people with SCN9A mutations, the messages sent by pain- and odor-sensing nerves never actually make it to the brain.
“It was completely surprising that these two sensory systems would use the same sodium channel,” says Frank Zufall, a neurophysiologist at the University of Saarland School of Medicine in Homburg, Germany. “But it’s clearly not needed for all senses.” None of the people with the faulty gene had hearing or vision problems. The researchers plan to test whether those people have a sense of taste, and whether taste cells also use the sodium channel to communicate.
Although the people weren’t bothered by not being able to smell, mice engineered to lack the sodium channel in smell-sensing nerve cells had a tough time. Baby mice are blind and rely on their sense of smell to guide them to their mother’s milk. Without the sodium channel, baby mice were underweight and mother mice couldn’t locate babies that had gotten out of the nest. Adult mice lacking the protein also wandered into territory marked with the scent of foxes — a place normal mice strictly avoided. “These [mutant] mice have no chance in nature,” Zufall says.
Some pharmaceutical companies are already working on painkilling drugs that would block the sodium channel’s activity. But the new study suggests that such drugs could have the side effect of eliminating smell, Lowe says. Because odor is an important component in creating the flavor of a food, people’s ability to taste would also be compromised, he says.

When bacteria are flu fighters

Not all germs are created equal. Some invade the body and cause disease or harmful infections but others live peacefully in your intestines, helping your body run smoothly. Antibiotics can't tell the difference. They knock out good and harmful bacteria alike.
That's usually okay for people fending off bacterial infections, like the painful ones that can grow in your ear, but it might be a problem for fighting the flu. Researchers from Yale University say that when mice on antibiotics were infected with the flu virus, they had a harder time fighting off sickness than did mice that hadn't been taking antibiotics.
Helpful bacteria are called “commensal” bacteria — and you've got a lot of them. More than 1,000 different types of bacteria live in the human body. In the intestines, commensal bacteria help break down food. They can also prevent bad bacteria from launching an infection.
In earlier experiments, scientists had shown that commensal bacteria can help fend off bad bacteria in the intestines. But that's not the only place, say the Yale researchers. The scientists discovered that these friendly gut-dwelling germs can help the immune system find and fight invaders elsewhere in the body. When antibiotics knocked out the good bacteria in the mice intestines, the mice had trouble fighting off viral infections in their lungs.
This long-distance relationship shows that the good germs in the gut can make a difference all over the body, and that antibiotics may jeopardize those benefits. The gut bacteria help the immune system create a protein called interleukin-1 beta, or IL-1 beta. The body uses this molecule to fight off the flu and other viral infections.
When the scientists gave antibiotics to the mice, the antibiotics wiped out the good bacteria that helped make IL-1 beta. As a result, when those mice were given the flu, they had a harder time fighting the infection than did the mice that hadn't been given antibiotics. The scientists showed that antibiotics can wipe out important commensal bacteria, but a mystery remains. It’s not clear which bacteria are responsible for the IL-1 beta. But it’s possible to rule out some kinds.
“We know for sure that there are certain bacteria that can’t do it,” Akiko Iwasaki told Science News. Iwasaki, who led the study, is an immunologist at Yale. Immunologists study the body's immune system.
The mice had a hard time with the flu, but their immune systems still worked. Iwasaki and her colleagues found that even while on antibiotics, mice could stop an infection by the herpes virus. That's because the body doesn't need IL-1 beta to fight herpes.
The connection between the gut and the immune system could lead to new types of medicine. If scientists can find which bacteria help make IL-1 beta, for example, they might be able to find a way to boost production of those bacteria ― and thus boost the immune system at the same time.
The experiment is also a reminder: Next time you fight off the flu, thank your germs!
POWER WORDS (adapted from the New Oxford American Dictionary)
gut The intestines
bacteria Single-celled organisms that can live in soil, water and air, as well as inside and outside the human body. In the intestines, beneficial bacteria called commensal bacteria can help break down food and fight infection.
virus An organism that is made of genetic material surrounded by a protein shell. When a virus invades a host, it begins reproducing rapidly, causing infection.
proteins Large molecules that perform a wide variety of functions and are essential to life.
immunity The ability of a body to resist a particular infection.
immunology The branch of medicine and biology that focuses on immunity.

Club drug tied to out-of-body sensations

A popular “club drug” promises to open a scientific window on the strange world of out-of-body experiences, researchers say.
Recreational users of a substance called ketamine often report having felt like they left their bodies or underwent other bizarre physical transformations, according to an online survey conducted by psychologist Todd Girard of Ryerson University in Toronto and his colleagues.
Ketamine, an anesthetic known to interfere with memory and cause feelings of detachment from one’s self or body, reduces transmission of the brain chemical glutamate through a particular class of molecular gateways. Glutamate generally jacks up brain activity. Ketamine stimulates sensations of illusory movement or leaving one’s body by cutting glutamate’s ability to energize certain brain areas, the researchers propose in a paper published online February 15 in Consciousness and Cognition.
“Ketamine may disrupt patterns of brain activation that coalesce to represent an integrated body and self, leading to out-of-body experiences,” Girard says.
National surveys indicate that 1.6 percent of high school seniors in Canada and the United States have used ketamine at least once. An estimated 70 percent of Toronto rave-goers now report taking ketamine at these all-night parties, Girard notes.
In the new survey, use of marijuana, LSD and MDMA, also known as ecstasy, displayed modest links to volunteers’ reports of illusions of walking or moving rapidly up and down while actually remaining still. But only ketamine use exhibited a strong relationship with having had a range of out-of-body experiences, regardless of any other drugs ingested at the time of those sensations, researchers say.
Neuroscientist Olaf Blanke of the Swiss Federal Institute of Technology in Lausanne calls ketamine “an interesting candidate to further understand some of the brain mechanisms in out-of-body experiences.” Blanke, who like a growing number of scientists studies these phenomena in controlled experiments (SN: 6/5/10, p. 10), says that drugs such as ecstasy and amphetamines also deserve close scrutiny.
Blanke has linked out-of-body experiences to reduced activity in brain areas that integrate diverse sensations into a unified perception of one’s body and self. Ketamine and other recreational drugs act throughout the brain, making it difficult to explain how any one drug might specifically affect sensation-integrating tissue, Blanke says.
Girard’s team administered online surveys about drug use and drug-related experiences to 192 volunteers, ages 14 to 48. Almost half the sample reported having used marijuana, alcohol, ecstasy, ketamine and amphetamines. Roughly two-thirds had taken ketamine, and nearly everyone had used marijuana and alcohol.
Almost three-quarters of all participants reported having had a feeling of temporarily leaving their bodies, usually on several occasions. About 42 percent had experienced seeing their own bodies from an outside vantage point. Feelings of rapidly moving up and down, falling, flying or spinning had affected more than 60 percent of volunteers. Another 41 percent reported illusions of sitting up, moving a limb or walking around a room, only to realize that they had not moved.
Of those reporting feelings of leaving their bodies, 58 percent were under the influence of ketamine at the time. Ketamine use also displayed a close association with other unusual bodily sensations.
Apparent effects of drugs such as ecstasy on out-of-body experiences were largely explained by associated ketamine use, Girard says.

Half of adult males carry HPV

The virus notorious for causing cervical cancer in women also turns up frequently in men and can hang on unnoticed for months or even years, researchers report online March 1 in Lancet. The study solidifies earlier research indicating that human papillomavirus is highly prevalent in men and strengthens the case for vaccinating men and boys against it, the report’s authors say.

There are dozens of types of HPV, including more than 40 that can be transmitted sexually. Some can cause cancer. Two vaccines, Merck’s Gardasil and GlaxoSmithKline’s Cervarix, protect against two types of cancer-causing HPV. Both vaccines are approved and recommended for girls and young women. Gardasil is also recommended for boys up to age 18, since its protection extends to two additional types of HPV that cause genital warts in males and females.

It’s widely assumed that limiting the virus in men or women would diminish its spread in the whole population. But while HPV has been extensively studied in women, its prevalence is less well understood in men, says Joseph Monsonego of the Institute of the Cervix in Paris, writing in the same Lancet issue. For that reason, he says, the new study results “are of substantial interest.”

Starting in 2005, epidemiologist Anna Giuliano of the H. Lee Moffitt Cancer Center & Research Institute in Tampa, Fla., and an international team of researchers recruited more than 4,000 men living in Brazil, Mexico and Florida into a study of HPV. The new study reports on the first 1,159 of these volunteers. Their average age was 32 and none had been vaccinated against HPV. Swabs of the penis and genital area of each man revealed that 50 percent were infected with at least one HPV type upon enrollment.

The researchers repeated these exams every six months, and the men completed personal-history questionnaires. Over a median of 28 months, the group acquired 1,572 new HPV infections.

The human immune system can clear HPV out of the body, and the men wiped out most of their new infections during the study period. But it took a median 7.5 months. Median clearance times didn’t vary substantially among the countries, but did vary between HPV types. Some cases lingered as long as 24 months in the men.

HPV 16 is the type responsible for the most cervical cancers in women and is covered by both vaccines. It took a median of 12 months to clear. “It’s hanging around longer, and it’s completely asymptomatic,” Giuliano says. “You don’t even know you have it.” This silent infection means a person can transmit this HPV type for longer periods and “might help explain why HPV 16 is one of the most common types in both men and women,” she says.

The data also reveal that men who reported having 10 or more sexual partners in their lifetimes had roughly twice as many HPV infections as did men who had had one partner.

Giuliano says many insurance programs cover HPV vaccination in boys up to age 18.

Male circumcision and the use of condoms have shown little protection against HPV infection, Monsonego says. “HPV vaccination in men will protect not only them but will also have implications for their sexual partners,” he says.

Giuliano says she expects to have data on the full group in three years.

Brain chemical influences sexual preference in mice

When courting, male mice lacking the chemical messenger serotonin don’t seem to care whether the object of their affection is female. Mice without the neurotransmitter no longer eschew the smells of other males, wooing them instead with squeaky love songs and attempts to mount them, researchers report online March 23 in Nature.
Serotonin’s surprise role in mouse courtship may lead to a deeper understanding of how brain cells control a complex behavior.
“Nobody thought that serotonin could be involved in this kind of sexual preference,” says study coauthor Zhou-Feng Chen of Washington University School of Medicine in St. Louis.   
Scientists emphasize that the male-male courtship seen in the lab isn’t equivalent to human homosexuality. And what, if anything, serotonin has to do with human sexual behavior is still an open question.
“We have to be cautious because this is work done in mice,” Chen says. “I would be extremely careful to extrapolate these results into humans. We just don’t know much about this.”
In the study, male mice that were genetically engineered to lack serotonin-producing brain cells still courted females. But when given the choice between males or females, these mice no longer reliably chose females over males. In tests where both a male and a female mating partner were present, nearly half of the serotonin-lacking males mounted the male first, report researchers led by Yi Rao of the National Institute of Biological Sciences and Peking University in Beijing.
These mice were also more likely than control mice to emit ultrasonic squeaks — a type of mouse love song — toward other males. And although male mice usually spend more time sniffing the odor of female genitals, these mice spent equal time sniffing male and female odors. Some of these signs of altered sexual behaviors could be reversed by injecting a compound into the mice that restored brain serotonin, Rao and his colleagues found.
Psychiatrist and sexual-research scientist Milton Wainberg of Columbia University says that it’s too simplistic to apply the experimental results to human sexuality. “These mice are not gay,” Wainberg says. “These mice have a disease that makes them do one behavior, which happens to be a behavior that can be thought of as a homosexual behavior, but it’s not homosexuality. It’s not being gay.”
The researchers don’t yet know whether serotonin affects sexual behavior in female mice.
The new result “opens now quite a lot of fascinating avenues,” including questions about where, when and how serotonin exerts its control over mouse sexual behavior, says Catherine Dulac, a molecular neuroscientist and Howard Hughes Medical Institute investigator at Harvard University.

Serotonin performs a wide variety of jobs by carrying messages across brain cell connections. The neurotransmitter has been linked to behaviors including feeding, sleeping and aggression. Serotonin also regulates many cognitive processes, including mood.
In humans, antidepressant drugs that increase the amount of serotonin in the brain do have some sex-related side effects. Selective serotonin reuptake inhibitors, or SSRIs, for example, can decrease libido in people. Yet there’s absolutely no evidence that the neurotransmitter has any influence on sexual orientation, Wainberg says.

Jumping on the bandwagon brings rewards

When minds think alike great things can happen, even if the minds themselves are not so great. Stock day traders who act in sync — no matter the stock, or whether they are buying or selling — make more money at the end of the day than their out-of-sync peers, reports an analysis to appear in the Proceedings of the National Academy of Sciences.

The research adds to a growing body of evidence that a collective wisdom can emerge from the myopic views of individuals. Understanding how and when could lead to ways of tapping that wisdom, enabling trading firms to make more money. Mining collective wisdom could also lead to new approaches for combating terrorism or fighting flu outbreaks, says study leader Brian Uzzi of Northwestern University in Evanston, Ill.

Uzzi and his Northwestern colleagues analyzed a year and a half of trades — more than a million transactions — made by 66 day traders at a single firm. Parsing the trading behavior down to a scale of seconds revealed sweet spots of synchronization — seconds to minutes when many traders were engaged in frenetic activity. On average the traders made money on 55 percent of their trades, but those who were in sync with their peers profited 60 percent of the time.

“I love the counterintuitive nature of the finding,” says complex-networks expert Albert-László Barabási of Northeastern University in Boston. “The dogma is that the successful investors are the Buffetts — those who swim against the current. Yet this study shows that when it comes to day trading, going with the wave has real benefits.”

A peculiar aspect of the emergent intelligence is its lack of intentional coordination. There wasn’t a preestablished crowd for traders to glom onto, nor a leader to follow. The assessment suggests instead that instant messaging among traders helped couple their behavior. Instant messaging volume went up and down with trading volume, and the flow of instant messaging became less random as traders got more in sync, the researchers found.

Each trader uses whatever information is available — whether about housing market changes or Steve Jobs’ health — to figure out the best time to buy and sell for a particular specialty. “No single one can figure out the market,” says Uzzi. Yet somehow when many traders reach a decision to act at the same time, “It’s really super-special,” says Uzzi. “And by super-special, I mean really lucrative.”

It’s unlikely that an individual trader could take advantage of the phenomenon, given that the golden bandwagon emerges suddenly, as if from a mist. But someone with a global view of what’s happening might be able to jump on, or urge others to do so, says Uzzi. It may prove fruitful for government agencies, for example, to harness the collective wisdom of intelligence officers trying to figure out when to act on information about terrorist activity, or to keep an eye on specialists as they monitor for an infectious disease outbreak.

Of course synchronicity can also be neutral, or worse. And distinguishing which is which is much easier after the fact. “Sometimes synchronicity doesn’t lead down the path of collective wisdom,” says Uzzi, “but down the blind alley of mob madness.”

Two stars caught fusing into one

For stellar astronomers, “the two shall become one flesh” just took on a whole new meaning.
Scientists have directly observed for the first time the merger of two closely orbiting stars. Experts have suggested for decades that such stars — which whirl so close to each other that their outer layers actually touch — should ultimately commingle. The new work, by Romuald Tylenda of the Nicolaus Copernicus Astronomical Center in Toruń, Poland, and collaborators, catches the stars in the act.
The researchers’ claim of catching the stars in the act is “not just plausible; it's compelling,” says Robert Williams of the Space Telescope Science Institute in Baltimore, who was not involved in the study. The results, to appear in an upcoming Astronomy & Astrophysics, add to previous work by Williams and colleagues to understand the nature of the star pair, called V1309 Scorpii.
V1309 Sco was discovered in 2008 when it erupted in a bright flare. Astronomers have proposed various explanations for the burst since then without reaching a consensus.
The new work hinges on a piece of good luck: Tylenda realized that the Optical Gravitational Lensing Experiment, a Warsaw University Observatory project hunting for dark matter since the mid-1990s, had been pointing its telescope at V1309 Sco’s region of the sky for years. Trolling through more than 2,000 observations taken from 2002 to 2010, he and his colleagues found light variations that suggest V1309 Sco was originally a contact binary star, a just-touching pair of stars circling each other about every 1.4 days. Over time, this periodic variation shortened as the stars’ outer layers combined to cocoon both orbs in a single envelope.
At that point the object got brighter, its light doubling every 19 days until late August 2008, when it brightened by a factor of 300 over 10 days. V1309 Sco’s final burst occurred that month when the stars’ cores finally merged and energy from their combined spins erupted outward. It became 10,000 times brighter than its original luminosity and more than 30,000 times brighter than the sun, then quickly faded away over the course of a few months to roughly its original brightness.
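Those brightening figures imply exponential growth, which can be checked with some back-of-envelope arithmetic. The numbers below are my own illustrative calculations, not figures from the paper:

```python
import math

def factor(days, doubling_days):
    """Brightness multiplier after `days` of growth at a fixed doubling time."""
    return 2 ** (days / doubling_days)

# At a 19-day doubling time, 90 days of growth brightens the object ~27-fold.
print(round(factor(90, 19), 1))

# The final burst — a factor of 300 in 10 days — implies a far shorter doubling time.
t_double = 10 / math.log2(300)
print(round(t_double, 1))  # roughly 1.2 days
```

A 300-fold rise in 10 days corresponds to the brightness doubling roughly every 1.2 days, a dramatic speed-up over the preceding 19-day doubling phase.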
The best explanation for these variations is the merger of a contact binary system, Tylenda and his colleagues assert.
While the resulting object should be a star — albeit one with a weird gut structure and a quick spin — the material thrown off during coalescence still largely blocks V1309 Sco, so astronomers can’t see what it looks like. Astronomers have requested time on the Hubble Space Telescope to observe the object, says Williams, but it may take years for the disk to dissipate, notes Stefan Kimeswenger of the University of Innsbruck in Austria.
Kimeswenger fully agrees with the conclusion that V1309 Sco was a contact binary. But he is skeptical of the authors’ suggestion that a merger scenario could explain flares of a larger class of objects called V838 Mon-type eruptions, named after an object sighted in 2002 (SN: 10/14/06, p. 248). “They all had outbursts of unknown type and they moved quickly to a red cold shell,” he says. But with different burst energies and chemical compositions, “that's all they have in common.”
While Tylenda agrees that V838 Mon itself was “almost certainly not” a contact binary, he does think that stellar mergers of different types could explain these eruptions.
“The objects share the same crucial characteristics, that is they become extremely cool (for stars) at the end of the eruption,” he says. “This indicates that the energy source of the eruption quickly faded during the eruption. This is just what is expected to happen when the violent merger processes are over.”

Laser proposed to deflect space junk

It won’t prevent Armageddon, but a simple ground-based laser system could nudge small pieces of space junk away from satellites to prevent collisions, a new study suggests.
The proposed system uses photons generated by a medium-power laser and aimed into space through a 1.5-meter telescope. The photons exert pressure on space debris in low-Earth orbit, gently pushing the objects aside rather than vaporizing them. Researchers have applied the same idea, using the pressure from sunlight, to propel spacecraft (SN: 8/21/99, p. 120).
James Mason of the Universities Space Research Association and NASA’s Ames Research Center in Mountain View, Calif., and his colleagues describe their system online at arXiv.org on March 10. The proposed device, which would cost a little over $10 million, could be ready for testing next year and fully operational a few years later.
About 500,000 pieces of space debris a centimeter across or larger reside in low-Earth orbit. Most are fragments of abandoned spacecraft that have broken up or exploded. The number of cataloged space-debris pieces larger than 10 centimeters has risen dramatically in recent years, and most satellites don’t have shielding that would protect them from collisions with such debris, says Don Kessler, a retired NASA senior scientist and orbital debris expert.
If a piece of space debris had to be moved by about 200 meters a day to avoid a collision, a medium-power laser of about 5 kilowatts could provide the needed push — provided the debris had a large surface area and was no heavier than 50 to 100 kilograms, Mason calculates.
Such a laser couldn’t have prevented a 2009 collision between two satellites (SN: 3/14/09, p. 9), nor could it push aside an asteroid. But the system “could move light debris out of the way of a big object,” says Mason.
Mason’s team suggests that the laser facility be built at a near-polar, high-altitude site, such as the Plateau Observatory in Antarctica, because most debris passes over the polar regions many times a day.
Researchers have suggested using lasers to vaporize space debris for more than two decades, but those systems would require powerful devices that might be mistaken for weapons, notes Mason.
Using a laser to slightly alter the speed of small debris doesn’t take much energy, notes Kessler. And if the medium-power laser missed its target it would be unlikely to do much damage, he adds. Kessler notes, however, that scientists would need precise knowledge of the path of debris in order for the system to be effective.

Go east, ancient tool makers

Finds unearthed in southeastern India offer a cutting-edge revision of hominid migrations out of Africa more than 1 million years ago that spread pivotal tool-making methods.
Makers of a specific style of teardrop-shaped stone hand ax, flat-edged cleavers and other implements that originated in Africa around 1.6 million years ago (SN: 1/31/09, p. 11) reached South Asia not long afterward, between 1.5 million and 1 million years ago, say archaeologist Shanti Pappu of the Sharma Center for Heritage Education in Tamil Nadu, India, and her colleagues.
Rather than waiting until around 500,000 years ago to head into South Asia, as many researchers have thought, the African hand ax crowd wasted relatively little time before hightailing it to India, Pappu’s team concludes in the March 25 Science.
Archaeologists categorize stone hand axes and related implements as Acheulian tools. Most researchers regard Homo erectus, a species that originated around 2 million years ago, as the original brains behind Acheulian innovations.
“Acheulian tool makers were clearly present in South Asia more than 1 million years ago,” Pappu says. Several previous excavations in different parts of India have also yielded Acheulian tools, but these finds lack firm age estimates, she adds.
No fossils of members of the human evolutionary family, or hominids, turned up among the new tool finds.
H. erectus must have rapidly moved from East Africa to South Asia, proposes archaeologist Robin Dennell of the University of Sheffield in England. Pappu’s new finds raise the possibility that 800,000-year-old hand axes previously discovered in southeastern China (SN: 3/4/00, p. 148) indicate the presence of H. erectus groups that came from South Asia — or at least exposure of Chinese hominids to Acheulian techniques, Dennell suggests in a comment published in the same Science.
Prior finds point to a second migration of Acheulian-savvy hominids out of Africa, he says. Homo heidelbergensis — a species first identified in Europe that some researchers now suspect inhabited East Africa and possibly Asia — trekked northward to the Middle East and then westward into Europe by half a million years ago, Dennell hypothesizes.
Until now, scientific consensus held that Acheulian tool makers, presumably H. erectus, reached the Middle East at least twice, around 1.4 million and 800,000 years ago, but went no further. H. heidelbergensis then took Acheulian implements from Africa to both South Asia and Europe approximately 500,000 years ago in this scenario.
If that was the case, even older Chinese hand axes might represent a tool tradition that developed independently of outside influences.
Any relationship of those Chinese finds to tools unearthed by Pappu’s group remains unclear, comments Harvard anthropologist Philip Rightmire. But it’s not surprising that H. erectus inhabited South Asia sometime around 1.5 million years ago, he says. Other evidence suggests that H. erectus left Africa for several destinations throughout Asia beginning at least 1.8 million years ago, wielding simple chopping tools.
“For now, it’s enough to say that Homo erectus introduced Acheulian tools to India,” Rightmire says.
Pappu’s team excavated and dated stone tools at Attirampakkam, an Indian site discovered in 1863. Work since 1999 has produced more than 3,500 Acheulian artifacts, including 76 hand axes and cleavers.
Artifact-bearing soil contained signs of a reversal in Earth’s magnetic field, placing the finds at between 1.07 million and 1.77 million years old. Measurements of radioactive isotopes in six quartz tools unearthed at Attirampakkam indicated that these finds had been buried approximately 1.5 million years ago.