Through advanced computer modeling of house fires, mechanical engineers at the University of New South Wales are giving firefighters a new suite of tools to investigate and battle dangerous blazes in time for the traditionally high-risk winter months. Beginning with an ignition point, the models map how fires behave as they grow, accurately predicting overall temperatures and pinpointing dangerous hotspots that responding personnel should avoid.
A Sandia National Laboratories modeling study contradicts a long-held belief among geologists that pore sizes and chemical compositions are uniform throughout a given stratum, a horizontal layer of sedimentary rock. By understanding the variety of pore sizes and spatial patterns within strata, geologists can help increase production from underground oil reservoirs and water aquifers.
New research from North Carolina State University shows that a wind-driven "tumbleweed" Mars rover would be capable of moving across rocky Martian terrain—findings that could also help the National Aeronautics and Space Administration (NASA) design the best possible vehicle.
A new study by civil engineers at Massachusetts Institute of Technology shows that using stiffer pavements on the nation's roads could reduce vehicle fuel consumption by as much as 3%—a savings that could add up to 273 million barrels of crude oil per year, or $15.6 billion at today's oil prices. This would result in an accompanying annual decrease in carbon dioxide emissions of 46.5 million metric tons.
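The figures above imply a few quantities that are easy to check. A quick back-of-the-envelope sketch in Python (the per-barrel values are derived from the numbers quoted here, not stated in the study itself):

```python
# Figures quoted in the MIT pavement study summary
barrels_saved = 273e6          # barrels of crude oil per year
total_savings_usd = 15.6e9     # dollars at "today's" oil prices
co2_reduction_t = 46.5e6       # metric tons of CO2 per year

# Implied oil price: ~$57 per barrel
price_per_barrel = total_savings_usd / barrels_saved
print(round(price_per_barrel, 2))   # → 57.14

# Implied CO2 avoided per barrel saved: ~0.17 metric tons
co2_per_barrel = co2_reduction_t / barrels_saved
print(round(co2_per_barrel, 2))     # → 0.17
```

These derived ratios simply confirm that the dollar and emissions figures are internally consistent with the stated barrel count.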
COMSOL Multiphysics, the company's software environment for modeling and simulating any physics-based system, recently received a major update. New capabilities in version 4.3 include three new discipline-specific add-on modules, fast and powerful meshing, a new "Double Dogleg" solver for mechanical contact and highly nonlinear simulations, and numerous user-inspired enhancements.
A collaboration between Lehigh University physicists and University of Miami biologists addresses an important fundamental question in basic cell biology: How do living cells figure out when and where to grow?
A multiyear collaboration among Stanford University engineering departments uses some of the world's fastest supercomputers to model the complexities of hypersonic flight. Someday, their work may lead to planes that fly at many times the speed of sound.
Modeling and simulation tools can help researchers understand and optimize the design of lithium-ion batteries.
For those who study earthquakes, one major challenge has been trying to understand all the physics of a fault—both during an earthquake and at times of "rest"—in order to know more about how a particular region may behave in the future. Now, researchers at the California Institute of Technology have developed the first computer model of an earthquake-producing fault segment that reproduces, in a single physical framework, the available observations of both the fault's seismic (fast) and aseismic (slow) behavior.
By developing software that uses 3D models of proteins involved in cystic fibrosis, a team of scientists at Duke University has identified several new molecules that may ease the symptoms of the disease.
Lockheed Martin extends 3D printing to manufacturing and custom vehicles.
Many simulations and experiments already generate petabytes of data—a single petabyte is 2,000 times more data than you can fit on a typical laptop—and they will soon be generating exabytes. The Department of Energy’s newly established Scalable Data Management, Analysis, and Visualization (SDAV) Institute is intended to help scientists deal with the deluge of data.
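The "2,000 times" comparison above follows directly from the SI definitions, assuming a typical laptop drive of about 500 GB (that capacity is an assumption for illustration, not a figure from the SDAV announcement):

```python
# SI byte units
petabyte = 10**15               # 1 PB in bytes
exabyte = 10**18                # 1 EB in bytes
laptop_drive = 500 * 10**9      # assumed typical 500 GB laptop drive

print(petabyte // laptop_drive)  # → 2000 laptops per petabyte
print(exabyte // petabyte)       # → 1000 petabytes per exabyte
```

An exabyte is another factor of 1,000 beyond a petabyte, which is why datasets at that scale demand purpose-built management and visualization tools.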
An air sampler the size of an earplug is expected to cheaply and easily collect atmospheric samples to improve computer climate models. The novel design of Sandia National Laboratories' phase-change microvalve sensor employs a commonly used alloy to house an inexpensive microvalve situated above the sample chamber.
A Massachusetts Institute of Technology researcher has come up with a model that predicts the flow of granular materials under a variety of conditions. The model improves on existing models by taking into account one important factor: how the size of a grain affects the entire flow.
Computer scientists and biologists in the Data Science Research Center at Rensselaer Polytechnic Institute have formed a rare collaboration between the two fields to pick apart a fundamental roadblock to progress in modern medicine. Their partnership has produced a new computational model called "cell graphs" that links the structure of human tissue to its corresponding biological function.
Intensive research around the world has focused on improving the performance of solar photovoltaic cells and bringing down their cost. But very little attention has been paid to the best ways of arranging those cells, which are typically placed flat on a rooftop or other surface. Now, a team of Massachusetts Institute of Technology researchers has come up with a very different approach.
While several recent studies suggest that much of the world is likely to experience freshwater shortages as the population increases and temperatures rise, determining the relative impact of each has been difficult. A recent Oak Ridge National Laboratory paper outlines a process that might help.
Using a sophisticated weather model, environmental engineers at Stanford University have defined optimal placement of a grid of four wind farms off the United States East Coast. The model successfully balances production at times of peak demand and significantly reduces costly spikes and zero-power events.
Climate is believed to be the driving force behind most of humanity's evolutionary processes, including changes in geographical range. According to a new paper, concepts such as "refugia", areas into which populations were forced by harsh Ice Age climates, may explain the emergence of new species or subspecies.
In a challenge to current astrophysical models, researchers at Sandia National Laboratories and the University of Rostock in Germany have found that current calibrations of planetary interiors overstate water's compressibility by as much as 30%.
On the one-year anniversary of the devastating Japanese tsunami, engineers from the University of Southern California's Viterbi School of Engineering Tsunami Research Center are working with the State of California to better understand the damaging currents caused by tsunamis, particularly the effects generated within California ports and harbors.
In October 2010, a neutron star near the center of our galaxy erupted with hundreds of X-ray bursts powered by a barrage of thermonuclear explosions on the star's surface. NASA's Rossi X-ray Timing Explorer captured the month-long fusillade in high detail, identifying behavior not seen in the 100 previous neutron star observations made over the past 30 years.
Modeling biological systems can provide key insights for scientists and medical researchers, but periodic cycles that repeat themselves—so-called oscillatory systems—pose some key challenges. Researchers at North Carolina State University have developed a new method for estimating the parameters used in such models.
Researchers at Oak Ridge National Laboratory are sharing computational resources and expertise to improve the detail and performance of the Community Earth System Model, a scientific application code that is the product of one of the world's largest collaborations of climate researchers.
In 2009, when the United States fell into economic recession, greenhouse gas emissions also fell, by 6.59% relative to 2008. In the power sector, however, the recession was not the main cause. Researchers at Harvard University have shown that the primary explanation for the reduction in carbon dioxide emissions from power generation that year was that a decrease in the price of natural gas reduced the industry's reliance on coal.