In 2009, when the United States fell into economic recession, greenhouse gas emissions also fell, by 6.59% relative to 2008. In the power sector, however, the recession was not the main cause. Researchers at Harvard University have shown that the primary explanation for the reduction in carbon dioxide emissions from power generation that year was that a decrease in the price of natural gas reduced the industry's reliance on coal.
According to textbooks, arsenic occurs in gray, yellow, and black forms. However, the existence of black arsenic, which should be analogous to black phosphorus, has never been indisputably proven. Chemists seeking an answer recently combined quantum chemical computations with experimental investigations of phase formation to discover the stable form of black arsenic.
The least expensive way for the Western United States to reduce greenhouse gas emissions enough to help prevent the worst consequences of global warming is to replace coal with renewable and other sources of energy that may include nuclear power, according to a new study by University of California, Berkeley researchers.
This week, Google engineers used data from Scripps Institution of Oceanography, NOAA, and the University of California, San Diego to sharpen the resolution of seafloor maps in the popular Google Earth application. The original version of the program, according to a Scripps geophysicist, had high resolution but was full of thousands of blunders from old data.
Using models similar to those used in weapons research, scientists may soon know more about exoplanets, those objects beyond the realm of our solar system. In a new study, Lawrence Livermore National Laboratory scientists and collaborators came up with new methods for deriving and testing the equation of state of matter in exoplanets and figured out the mass-radius and mass-pressure relations for materials relevant to planetary interiors.
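The mass-radius relations the LLNL team derives rest on far more sophisticated equations of state, but the basic construction can be sketched with a simple polytrope: assume a pressure-density relation P = K·ρ^γ, then integrate hydrostatic equilibrium outward from the planet's center until the pressure drops to zero. Everything below (the polytropic form, the constants K and γ, the step size) is illustrative, not the study's actual method:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def mass_radius(rho_c, K=1.7e3, gamma=2.0, dr=1e4):
    """Integrate mass continuity and hydrostatic equilibrium outward
    from the center for a polytropic equation of state P = K*rho**gamma.
    rho_c is the central density (kg/m^3); returns (radius_m, mass_kg)
    at the point where the pressure reaches zero."""
    r = dr
    m = (4.0 / 3.0) * math.pi * r ** 3 * rho_c   # small central sphere
    P = K * rho_c ** gamma
    while P > 0.0:
        rho = (P / K) ** (1.0 / gamma)
        P += -G * m * rho / r ** 2 * dr          # dP/dr = -G m rho / r^2
        m += 4.0 * math.pi * r ** 2 * rho * dr   # dm/dr = 4 pi r^2 rho
        r += dr
    return r, m
```

For γ = 2 this is the n = 1 polytrope, whose radius is independent of the central density; sweeping K, γ, or ρ_c traces out mass-radius curves, which is the kind of relation the study tabulates for planetary materials.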
Coinciding with a peak in solar activity, NASA Goddard Space Flight Center’s Space Weather Laboratory will soon simultaneously produce as many as 100 computerized forecasts by calculating multiple possible parameters, improving our ability to predict the impact of solar storms. Currently, just one set of conditions is used to anticipate solar-storm activity.
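The ensemble idea is straightforward: run the same model many times, each with slightly different input parameters, and read the forecast uncertainty off the spread of the results. A minimal sketch, assuming nothing about NASA's actual models, with a purely ballistic Sun-to-Earth propagation and a made-up observational uncertainty on the storm's launch speed:

```python
import random

AU_KM = 1.496e8  # Sun-Earth distance in km

def arrival_hours(speed_km_s):
    """Toy ballistic propagation: hours for a solar-storm front to reach Earth."""
    return AU_KM / speed_km_s / 3600.0

def ensemble_forecast(obs_speed, obs_sigma=100.0, members=100, seed=0):
    """Run the toy model once per ensemble member, each with the observed
    launch speed perturbed within its (assumed) uncertainty."""
    rng = random.Random(seed)
    times = sorted(arrival_hours(max(rng.gauss(obs_speed, obs_sigma), 200.0))
                   for _ in range(members))
    # earliest, median, and latest predicted arrival across the ensemble
    return times[0], times[len(times) // 2], times[-1]
```

The single-forecast approach the article contrasts this with corresponds to calling `arrival_hours` once with the best-estimate speed; the 100-member ensemble additionally yields an arrival-time window.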
Every year, students studying aeronautical and astronautical design brace themselves for the time-consuming process of writing their own code to optimize aerospace designs. In search of a better way, a team of engineers at the Aerospace Design Lab at Stanford University has released SU2, an open-source application that models the effects of fluids moving over aerodynamic surfaces.
Givaudan has turned to researchers in the Massachusetts Institute of Technology's Computer Science and Artificial Intelligence Laboratory (CSAIL) for help analyzing taste-test results. The CSAIL researchers are using genetic programming, in which mathematical models compete with each other to fit the available data and then cross-pollinate to produce models that are more accurate.
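Genetic programming proper evolves whole expression trees; the compete-and-cross-pollinate loop can be shown more compactly by evolving the coefficients of a fixed quadratic model instead. This sketch is not CSAIL's system, and the population size, mutation scale, and selection scheme are all arbitrary choices:

```python
import random

def fitness(coeffs, data):
    """Sum of squared errors of a quadratic model c0 + c1*x + c2*x^2."""
    return sum((coeffs[0] + coeffs[1] * x + coeffs[2] * x * x - y) ** 2
               for x, y in data)

def crossover(a, b, rng):
    """'Cross-pollinate' two parent models coefficient by coefficient."""
    return [ai if rng.random() < 0.5 else bi for ai, bi in zip(a, b)]

def mutate(c, rng, scale=0.1):
    """Perturb each coefficient slightly."""
    return [ci + rng.gauss(0.0, scale) for ci in c]

def evolve(data, pop_size=60, generations=200, seed=1):
    """Selection + crossover + mutation. The fittest quarter survives
    unchanged each generation, so the best model never gets worse."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(c, data))
        parents = pop[:pop_size // 4]
        children = [mutate(crossover(rng.choice(parents),
                                     rng.choice(parents), rng), rng)
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=lambda c: fitness(c, data))
```

Here "compete" is the sort by fitness and "cross-pollinate" is the crossover step; in a real taste-analysis setting the data would be panel scores and the models would be free-form expressions rather than a fixed quadratic.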
Civil engineers at Syracuse University have used data from the Metropolitan Sewer District of Greater Cincinnati, Ohio, to develop statistical deterioration models for wastewater pipes. The models, when adapted to a given system, are intended to facilitate a proactive approach to pipeline replacement and maintenance.
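The Syracuse models themselves are statistical fits to the Cincinnati data; one common way to formulate pipe deterioration, shown here purely as an illustration, is a Markov chain over condition states, where a transition matrix (hypothetical numbers in the usage below) propagates the fraction of pipes in each state forward year by year:

```python
def expected_condition(trans, dist, years):
    """Propagate a condition-state distribution through a Markov
    deterioration matrix, one transition per year.
    trans[i][j] is the probability a pipe in state i is in state j
    a year later; dist is the current distribution over states."""
    for _ in range(years):
        dist = [sum(dist[i] * trans[i][j] for i in range(len(dist)))
                for j in range(len(dist))]
    return dist
```

With a hypothetical three-state matrix in which pipes degrade one state at a time, the distribution drifts toward the worst state, and the fraction forecast to cross a condition threshold in a given year is what would drive a proactive replacement schedule.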
A new University of Michigan computer model of disease transmission in space and time can predict cholera outbreaks in Bangladesh up to 11 months in advance, providing an early warning system that could help public health officials there.
Traditional motion capture technology works by attaching markers to a subject’s skin or clothing and tracking them as the subject moves. A new system of eight video cameras, shooting from different angles, can now quantify a person’s movements without the limitations of markers or wires attached to the subject.
Research into biofuel crops such as switchgrass and Miscanthus has focused mainly on how to grow these crops and convert them into fuels. But many steps lead from the farm to the biorefinery, and each could help or hinder the growth of this new industry. A new computer model developed at the University of Illinois can simplify this transition.
Agent-based computer models use fine-scale data from actual movements of individuals obtained by detailed video recordings, global positioning systems, or mobile phone tracking. Researchers say that these tools, which can help them simulate crowd movements, could also help them model the spread of infections in mass gatherings.
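A minimal agent-based model of this kind tracks each individual's position and infection status separately. The sketch below, with invented parameters, has agents random-walking in a square while infection spreads by proximity, a susceptible-infected-recovered scheme in its crudest form:

```python
import random

def simulate_crowd(n=120, steps=120, size=30.0, infect_radius=2.0,
                   p_transmit=0.3, recover_after=30, seed=2):
    """Agents random-walk in a square; infection spreads by proximity
    (susceptible -> infected -> recovered). Returns final S, I, R counts."""
    rng = random.Random(seed)
    pos = [(rng.uniform(0, size), rng.uniform(0, size)) for _ in range(n)]
    state = ['S'] * n
    timer = [0] * n
    state[0] = 'I'  # patient zero
    for _ in range(steps):
        # each agent takes a small random step, clipped to the square
        pos = [(min(max(x + rng.uniform(-1, 1), 0.0), size),
                min(max(y + rng.uniform(-1, 1), 0.0), size))
               for x, y in pos]
        infected = [i for i in range(n) if state[i] == 'I']
        # transmission: each infected agent may infect nearby susceptibles
        for i in infected:
            xi, yi = pos[i]
            for j in range(n):
                if (state[j] == 'S'
                        and (xi - pos[j][0]) ** 2 + (yi - pos[j][1]) ** 2
                            <= infect_radius ** 2
                        and rng.random() < p_transmit):
                    state[j] = 'I'
        # recovery after a fixed infectious period
        for i in infected:
            timer[i] += 1
            if timer[i] >= recover_after:
                state[i] = 'R'
    return state.count('S'), state.count('I'), state.count('R')
```

The fine-scale data the article mentions would replace the random walk here: observed trajectories from video, GPS, or phone records drive the positions, and only the transmission process is simulated.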
Addressing the complexity of the Domain Name System Security Extensions (DNSSEC), Sandia National Laboratories computer scientist Casey Deccio has developed a new visualization tool known as DNSViz. DNSSEC is a standard security feature at high-level government offices, but it is extremely complex, and Deccio’s tool helps simplify its implementation.
A quad porosity model developed by Oklahoma State University researchers uses scanning electron microscopy to characterize up to four porosity systems for shale gas. The simulation model, which will offer better forecasting and potential cost savings, will be field-tested in gas reservoirs over the next few months.
When geochemist David Valentine and colleagues published a study in early 2011 documenting how bacteria blooms had consumed almost all of the deepwater methane plumes following the Deepwater Horizon oil spill in 2010, some people were skeptical. A recent publication explains how the bacteria did it.
Konrad Juethner, a software engineering consultant, recently used Windows HPC Server to run cluster-based analysis with COMSOL Multiphysics using the hardware he had available at home. His successful setup shows how accessible advanced high-performance computing has become.
Saturn's largest moon, Titan, is an intriguing, alien world that's covered in a thick atmosphere with abundant methane. With an average surface temperature of -300 F and a diameter just less than half of the Earth's, Titan boasts methane clouds and fog, as well as rainstorms and plentiful lakes of liquid methane. The origins of many of these features have remained puzzling to scientists. Until now.
For the first time, scientists have developed a method for generating accurate 3D models of the entire DNA strand of a cell, known as a genome. The genome plays a central role in the functions of almost all human cells, and flaws in its structure are thought to cause various disorders. The method brings scientists one step closer to understanding the genome's function as a whole.
A new automated software toolbox from the Vector Fields Software product line of Cobham Technical Services, called 3D Transformers Environment, delivers finite-element analysis for the rapid design of transformers and reactors.
The current fracture control plan for maintaining and inspecting bridges was developed in the 1960s. It is both costly and out of step with advances in materials and computerized system analysis. Virginia Tech civil engineer William Wright is working to update these standards.
Researchers at the Norwegian University of Life Sciences (UMB) and Forschungszentrum Jülich in Germany have conducted detailed analyses of electrical activity in the brain with the help of mathematical models that reveal the connection between nerve cell activity and the electrical signal recorded by an electrode.
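The simplest version of the nerve-cell-to-electrode connection treats a neuron's transmembrane current as a point source in a homogeneous conducting medium, giving the extracellular potential V = I / (4πσr). The models in the study are far more detailed (realistic morphologies, many current sources summed by superposition), so take this only as the core formula:

```python
import math

def extracellular_potential(I_nA, r_um, sigma=0.3):
    """Potential (in microvolts) at distance r_um (micrometers) from a
    point current source of I_nA (nanoamps) in a homogeneous medium:
    V = I / (4*pi*sigma*r), with sigma the extracellular conductivity
    in S/m (~0.3 is a commonly quoted value for brain tissue)."""
    I = I_nA * 1e-9   # amps
    r = r_um * 1e-6   # meters
    return I / (4.0 * math.pi * sigma * r) * 1e6  # volts -> microvolts
```

A multi-compartment model sums this expression over every compartment's current, which is how the connection between nerve-cell activity and the recorded electrode signal is computed in practice.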
In a new study, scientists at the RIKEN Brain Science Institute have uncovered the mechanisms that help our brain to focus, or lose focus. Computational models and advanced imaging methods have identified the filters that efficiently route only relevant information to perceptual brain regions.
Supercomputer simulations at Oak Ridge National Laboratory are giving scientists new insight into a key class of proteins involved in drug detoxification. Researchers have performed simulations to observe the motions of water molecules in a class of enzymes called P450s, which are responsible for processing a large fraction of the drugs taken by humans.
Physicists in Germany have significantly improved the calculation method for scattering experiments in particle physics. The new calculation method could be applied both to experiments under way at the Large Hadron Collider and to ones already completed at other colliders.