Modern research tools such as supercomputers, particle colliders, and telescopes are generating data so quickly that many scientists fear they will soon be unable to keep up with the deluge. A team of computer researchers from universities and national laboratories has taken up the challenge, recently developing a tool that can query a massive 32 TB dataset in just three seconds.
A new computational model developed by a team of Virginia Tech researchers provides a framework for better understanding the responses of macrophage cells in the human immune system. The team used the Metropolis algorithm, a computer simulation technique widely used in physics and chemistry, to enumerate possible molecular mechanisms giving rise to priming and tolerance.
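The Metropolis algorithm itself is simple to sketch. The sampler below is a minimal, generic illustration of the accept/reject rule at its core, not the Virginia Tech team's model; the target distribution and all parameters are made up for the example.

```python
import math
import random

def metropolis(log_p, x0, steps, step_size=0.5, seed=0):
    """Minimal Metropolis sampler: draws from the distribution
    proportional to exp(log_p(x)) using a symmetric random-walk
    proposal and the classic accept/reject rule."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(steps):
        proposal = x + rng.uniform(-step_size, step_size)
        # Accept with probability min(1, p(proposal) / p(x)).
        if math.log(rng.random() + 1e-300) < log_p(proposal) - log_p(x):
            x = proposal
        samples.append(x)
    return samples

# Illustrative target: a standard normal, log p(x) = -x^2 / 2 (up to a constant).
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, steps=20000)
mean = sum(draws) / len(draws)  # should land near 0
```

Because the accept/reject test needs only a *ratio* of probabilities, the method works even when the distribution's normalizing constant is unknown, which is exactly why it is so widely used in physics and chemistry.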
With the help of a $2 million grant from the U.S. Office of Naval Research, mechanical engineers at the University of Wisconsin-Madison will develop a tool to characterize the performance of a new class of alternative fuels that could be used in maritime vehicles such as submarines and aircraft carriers.
A group of Japanese scientists surprised themselves by predicting the box-office success or failure of blockbuster movies using a set of mathematical models. The researchers combined the effects of advertising and word-of-mouth communication into a model that successfully predicted how each movie fared once it hit the silver screen.
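The general flavor of such a model can be sketched in a few lines. The toy below is not the researchers' actual equations: it simply lets daily purchase intention grow with an advertising schedule and with word-of-mouth proportional to current interest, while overall interest decays. All parameter values are hypothetical.

```python
def box_office_intention(ads, c_wom, decay, days):
    """Toy purchase-intention model (not the published equations):
    intention rises with advertising spend and word-of-mouth,
    and decays as interest in the movie fades."""
    intention = 0.0
    history = []
    for t in range(days):
        intention += ads[t] + c_wom * intention - decay * intention
        history.append(intention)
    return history

# Hypothetical campaign: heavy advertising for 10 days, then none.
ads = [1.0] * 10 + [0.0] * 20
curve = box_office_intention(ads, c_wom=0.10, decay=0.25, days=30)
```

With these made-up parameters, intention peaks when the ad campaign ends and then tails off, the characteristic rise-and-decay shape of real box-office revenue curves.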
A new set of computer models has successfully predicted negative side effects in hundreds of current drugs based on the similarity between their chemical structures and those of molecules known to cause side effects, according to a recently published paper.
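Chemical-structure similarity of this kind is commonly scored with the Tanimoto (Jaccard) coefficient over molecular fingerprints. Here is a minimal sketch with made-up fingerprints; it illustrates the metric, not the paper's actual method or data.

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto (Jaccard) similarity between two molecular
    fingerprints, each represented as the set of 'on' bit
    positions encoding substructure features."""
    shared = len(fp_a & fp_b)
    return shared / (len(fp_a) + len(fp_b) - shared)

# Hypothetical fingerprints: which substructure bits are set.
candidate_drug = {1, 4, 7, 9, 12}
known_offender = {1, 4, 7, 13}
score = tanimoto(candidate_drug, known_offender)  # 3 shared of 6 total bits
```

A score near 1.0 flags a candidate as structurally close to a molecule with known side effects; real pipelines use fingerprints with thousands of bits rather than the toy sets above.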
New research by a team of Lawrence Livermore National Laboratory scientists and international collaborators shows that the observed ocean warming over the last 50 years is consistent with climate models only if the models include the impacts of observed increases in greenhouse gas during the 20th century.
Cutting-edge computer processors contain some 1.4 billion transistors. Such tiny structures, however, have a major drawback: the read-out process can influence their states in an uncontrolled way. A new model can detect and avoid these “back-action” effects, particularly at the quantum level.
In a significant departure from earlier models, neural engineers and neuroscientists working at Stanford University have developed a new model of the brain activity underlying arm movements. Motor neurons do not represent external-world parameters as previously thought, but rather send a few basic rhythmic patterns down the spinal cord to drive movement.
Scientists at Rice University and the University of Texas MD Anderson Cancer Center have successfully profiled protein pathways found to be distinctive to leukemia patients with particular variants of the disease. Their research involved the creation of a new computational approach to identifying complex networks in protein signaling.
When power plants begin capturing their carbon emissions to reduce greenhouse gases it will be an expensive undertaking. Current technologies would use about one-third of the energy generated by the plants and, as a result, substantially drive up the price of electricity. But a new computer model developed by University of California, Berkeley chemists shows that less expensive technologies are on the horizon.
Through advanced computer modeling of house fires, mechanical engineers at the University of New South Wales are giving fire fighters a new suite of tools to investigate and battle dangerous blazes in time for the traditionally high-risk winter months. Beginning with an ignition point, the models can map how fires behave as they grow, accurately predicting their overall temperature and pinpointing dangerous hotspots that responding personnel should avoid.
A Sandia National Laboratories modeling study contradicts geologists' long-held belief that pore sizes and chemical compositions are uniform throughout a given stratum, a horizontal slice of sedimentary rock. By understanding the variety of pore sizes and spatial patterns within strata, geologists can help achieve more production from underground oil reservoirs and water aquifers.
New research from North Carolina State University shows that a wind-driven "tumbleweed" Mars rover would be capable of moving across rocky Martian terrain—findings that could also help the National Aeronautics and Space Administration (NASA) design the best possible vehicle.
A new study by civil engineers at Massachusetts Institute of Technology shows that using stiffer pavements on the nation's roads could reduce vehicle fuel consumption by as much as 3%—a savings that could add up to 273 million barrels of crude oil per year, or $15.6 billion at today's oil prices. This would result in an accompanying annual decrease in carbon dioxide emissions of 46.5 million metric tons.
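For readers who want to sanity-check the quoted figures, the arithmetic is a one-liner; the per-barrel price and per-barrel CO2 figures below are implied by the article's numbers, not stated in it.

```python
# Back-of-envelope check of the figures quoted above.
barrels_saved = 273e6      # barrels of crude oil saved per year
dollar_savings = 15.6e9    # dollars saved per year
co2_reduction = 46.5e6     # metric tons of CO2 avoided per year

implied_price = dollar_savings / barrels_saved   # ~57 dollars per barrel
co2_per_barrel = co2_reduction / barrels_saved   # ~0.17 metric tons per barrel
```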
COMSOL Multiphysics, the company's software environment for modeling and simulating any physics-based system, recently received a major update. New capabilities in version 4.3 include three new discipline-specific add-on modules, fast and powerful meshing, a new "Double Dogleg" solver for mechanical contact and highly nonlinear simulations, and numerous user-inspired enhancements.
A collaboration between Lehigh University physicists and University of Miami biologists addresses an important fundamental question in basic cell biology: How do living cells figure out when and where to grow?
A multiyear collaboration among Stanford University engineering departments uses some of the world's fastest supercomputers to model the complexities of hypersonic flight. Someday, their work may lead to planes that fly at many times the speed of sound.
Modeling and simulation tools can help researchers understand and optimize the design of lithium-ion batteries.
For those who study earthquakes, one major challenge has been trying to understand all the physics of a fault—both during an earthquake and at times of "rest"—in order to know more about how a particular region may behave in the future. Now, researchers at the California Institute of Technology have developed the first computer model of an earthquake-producing fault segment that reproduces, in a single physical framework, the available observations of both the fault's seismic (fast) and aseismic (slow) behavior.
By developing software that uses 3D models of proteins involved in cystic fibrosis, a team of scientists at Duke University has identified several new molecules that may ease the symptoms of the disease.
Lockheed Martin extends 3D printing to manufacturing and custom vehicles.
Many simulations and experiments already generate petabytes of data—a single petabyte is 2,000 times more data than you can fit on a typical laptop—and they will soon be generating exabytes. The Department of Energy’s newly established Scalable Data Management, Analysis, and Visualization (SDAV) Institute is intended to help scientists deal with the deluge of data.
An air sampler the size of an earplug is expected to cheaply and easily collect atmospheric samples to improve computer climate models. The novel design of Sandia National Laboratories' phase-change microvalve sensor employs a commonly used alloy to house an inexpensive microvalve situated above the sample chamber.
A Massachusetts Institute of Technology researcher has developed a model that predicts the flow of granular materials under a variety of conditions. The model improves on existing models by taking into account one important factor: how the size of a grain affects the entire flow.
Computer scientists and biologists in the Data Science Research Center at Rensselaer Polytechnic Institute have formed a rare collaboration between the two fields to pick apart a fundamental roadblock to progress in modern medicine. Their partnership has produced a new computational model called "cell graphs" that links the structure of human tissue to its corresponding biological function.