While several recent studies suggest that much of the world is likely to experience freshwater shortages as the population increases and temperatures rise, determining the relative impact of each has been difficult. A recent Oak Ridge National Laboratory paper outlines a process that might help.
Using a sophisticated weather model, environmental engineers at Stanford University have determined the optimal placement of a grid of four wind farms off the United States East Coast. The model successfully balances production at times of peak demand and significantly reduces costly spikes and zero-power events.
Climate is believed to be the driving force behind many of humanity's evolutionary processes, including changes in geographical range. According to a new paper, concepts such as "refugia", isolated areas to which populations retreated during harsh Ice Age climates, may explain the emergence of new species or subspecies.
In a challenge to current astrophysical models, researchers at Sandia National Laboratories and the University of Rostock in Germany have found that current calibrations of planetary interiors overstate water's compressibility by as much as 30%.
On the one-year anniversary of the devastating Japanese tsunami, engineers from the University of Southern California’s Viterbi School of Engineering Tsunami Research Center are working with the State of California to better understand the damaging currents that tsunamis generate within California ports and harbors.
In October 2010, a neutron star near the center of our galaxy erupted with hundreds of X-ray bursts that were powered by a barrage of thermonuclear explosions on the star's surface. NASA's Rossi X-ray Timing Explorer captured the month-long fusillade in high detail, identifying behavior not seen in observations of roughly 100 other neutron stars over the past 30 years.
Modeling biological systems can provide key insights for scientists and medical researchers, but periodic cycles that repeat themselves—so-called oscillatory systems—pose particular challenges. Researchers at North Carolina State University have developed a new method for estimating the parameters used in such models.
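The NC State method itself is not detailed here, but the core difficulty with oscillatory models can be illustrated with a minimal sketch: when fitting a sinusoidal signal, the least-squares error is highly multimodal in the frequency, so a local optimizer started far from the truth can stall. One common workaround, shown below with entirely illustrative parameter values, is to grid-search the frequency while solving the remaining (linear) parameters exactly at each candidate.

```python
import numpy as np

# Synthetic data from a known oscillator: y = A*sin(w*t + phi) + noise.
# All values here (true_w, true_A, true_phi, noise level) are illustrative.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 200)
true_w, true_A, true_phi = 2.5, 1.3, 0.4
y = true_A * np.sin(true_w * t + true_phi) + 0.05 * rng.standard_normal(t.size)

def fit_linear(w, t, y):
    """For a fixed frequency w, amplitude and phase enter linearly via
    y = a*sin(w t) + b*cos(w t); solve that part by ordinary least squares."""
    X = np.column_stack([np.sin(w * t), np.cos(w * t)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef, np.sum((y - X @ coef) ** 2)

# The error surface over w has many local minima, so a coarse grid search
# over the frequency is safer than a single local optimization.
grid = np.linspace(0.5, 5.0, 2000)
errors = [fit_linear(w, t, y)[1] for w in grid]
w_hat = grid[int(np.argmin(errors))]
(a, b), _ = fit_linear(w_hat, t, y)
A_hat = np.hypot(a, b)  # amplitude recovered from the two linear coefficients

print(f"estimated frequency {w_hat:.3f} (true {true_w})")
print(f"estimated amplitude {A_hat:.3f} (true {true_A})")
```

Separating the nonlinear parameter (frequency) from the linear ones (amplitude, phase) keeps the expensive search one-dimensional, which is why this hybrid strategy scales better than searching all parameters at once.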
Researchers at Oak Ridge National Laboratory are sharing computational resources and expertise to improve the detail and performance of the Community Earth System Model, a scientific application code that is the product of one of the world's largest collaborations of climate researchers.
In 2009, when the United States fell into economic recession, greenhouse gas emissions also fell, by 6.59% relative to 2008. In the power sector, however, the recession was not the main cause. Researchers at Harvard University have shown that the primary explanation for the reduction in carbon dioxide emissions from power generation that year was that a decrease in the price of natural gas reduced the industry's reliance on coal.
According to textbooks, arsenic occurs in gray, yellow, and black forms. However, the existence of black arsenic, which should be analogous to black phosphorus, has never been indisputably proven. Chemists seeking an answer recently combined quantum chemical computations with experimental investigations of phase formation to discover the stable form of black arsenic.
The least expensive way for the Western United States to reduce greenhouse gas emissions enough to help prevent the worst consequences of global warming is to replace coal with renewable and other sources of energy that may include nuclear power, according to a new study by University of California, Berkeley researchers.
Google experts this week used data from Scripps Institution of Oceanography, NOAA, and the University of California, San Diego to sharpen the resolution of seafloor maps in the popular Google Earth application. The original version of the program, according to a Scripps geophysicist, had high resolution but was full of thousands of blunders from old data.
Using models similar to those used in weapons research, scientists may soon know more about exoplanets, those objects beyond the realm of our solar system. In a new study, Lawrence Livermore National Laboratory scientists and collaborators developed new methods for deriving and testing the equation of state of matter in exoplanets and worked out the mass-radius and mass-pressure relations for materials relevant to planetary interiors.
Coinciding with a peak in solar activity, NASA Goddard Space Flight Center’s Space Weather Laboratory will soon produce as many as 100 computerized forecasts simultaneously by varying the model's input conditions, improving our ability to predict the impact of solar storms. Currently, just one set of conditions is used to anticipate solar-storm activity.
Every year, students studying aeronautical and astronautical design brace themselves for the time-consuming process of writing their own code to optimize aerospace designs. In search of a better way, a team of engineers at the Aerospace Design Lab at Stanford University has released SU2, an open-source application that models the effects of fluids moving over aerodynamic surfaces.
Givaudan has turned to researchers in the Massachusetts Institute of Technology's Computer Science and Artificial Intelligence Laboratory (CSAIL) for help analyzing taste-test results. The CSAIL researchers are using genetic programming, in which mathematical models compete with each other to fit the available data and then cross-pollinate to produce models that are more accurate.
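The compete-and-cross-pollinate idea can be sketched in a few lines. This is not the CSAIL system: full genetic programming evolves expression trees, whereas this toy evolves only coefficient vectors of a fixed quadratic form, and every value here (the hidden relationship, population size, mutation scale) is an illustrative assumption.

```python
import random

# Toy data from a hidden relationship the models must rediscover: y = 2x^2 + 3x + 1
data = [(x, 2 * x * x + 3 * x + 1) for x in range(-5, 6)]

def fitness(model):
    """Sum of squared errors of a quadratic-coefficient model over the data
    (lower is better)."""
    a, b, c = model
    return sum((a * x * x + b * x + c - y) ** 2 for x, y in data)

def crossover(p1, p2):
    """Mix coefficients from two parent models and add a small mutation."""
    return [random.choice(pair) + random.gauss(0, 0.1) for pair in zip(p1, p2)]

random.seed(1)
# Start from a random population of candidate models.
pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(60)]
for gen in range(300):
    pop.sort(key=fitness)
    survivors = pop[:20]  # the models that fit the data best "win"
    # Survivors cross-pollinate to produce the next generation of candidates.
    pop = survivors + [crossover(random.choice(survivors),
                                 random.choice(survivors))
                       for _ in range(40)]

best = min(pop, key=fitness)
print("best model coefficients:", [round(c, 2) for c in best])
```

After a few hundred generations the surviving coefficients approach the hidden relationship; real genetic programming applies the same select-and-recombine loop to whole symbolic expressions rather than fixed-form coefficients.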
Civil engineers at Syracuse University have developed various statistical prediction models using data obtained from the Metropolitan Sewer District of Greater Cincinnati, Ohio, to generate deterioration models for wastewater pipes. The models, when adapted to a given system, are intended to facilitate a proactive approach to pipeline replacement and maintenance.
A new University of Michigan computer model of disease transmission in space and time can predict cholera outbreaks in Bangladesh up to 11 months in advance, providing an early warning system that could help public health officials there.
Traditional motion capture technology works by attaching markers to a subject’s skin or clothing and tracking them as the subject moves. A new system of eight video cameras, shooting from different angles, can now quantify a person’s movements without the limitations imposed by markers and wiring attached to the subject.
Research into biofuel crops such as switchgrass and Miscanthus has focused mainly on how to grow these crops and convert them into fuels. But many steps lead from the farm to the biorefinery, and each could help or hinder the growth of this new industry. A new computer model developed at the University of Illinois can simplify this transition.
Agent-based computer models use fine-scale data from actual movements of individuals obtained by detailed video recordings, global positioning systems, or mobile phone tracking. Researchers say that these tools, which can help them simulate crowd movements, could also help them model the spread of infections in mass gatherings.
Addressing the complexity of the Domain Name System Security Extensions (DNSSEC), Sandia National Laboratories computer scientist Casey Deccio has developed a new visualization tool known as DNSViz. DNSSEC is a standard security feature at high-level government offices, but it is extremely complex, and Deccio’s tool helps simplify implementation.
A quad porosity model developed by Oklahoma State University researchers uses scanning electron microscopy to characterize up to four porosity systems for shale gas. The simulation model, which promises better forecasting and potential cost savings, will be field-tested in gas reservoirs over the next few months.
When geochemist David Valentine and colleagues published a study in early 2011 documenting how bacteria blooms had consumed almost all of the deepwater methane plumes following the Deepwater Horizon oil spill in 2010, some people were skeptical. A recent publication explains how the bacteria did it.
Konrad Juethner, a software engineering consultant, recently used Windows HPC Server to run cluster-based analysis with COMSOL Multiphysics using the hardware he had available at home. His successful setup highlights how accessible advanced high-performance computing has become.