Making choices involves the evaluation of an accumulation of facts. If a wrong choice is made, Princeton University researchers have recently found, the problem may lie in the facts, or information, rather than the brain's decision-making process. The researchers report that erroneous decisions tend to arise from errors, or "noise," in the information coming into the brain.
Sandia National Laboratories researchers Lisa Deibler and Arthur Brown had a ready-made problem for their computer modeling work when they partnered with the National Nuclear Security Administration’s Kansas City Plant to improve stainless steel tubing that was too hard to meet nuclear weapon requirements.
Researchers from North Carolina State University believe they have solved a puzzle that has vexed science since plants first appeared on Earth. In a paper published online in Proceedings of the National Academy of Sciences, the researchers provide the first 3D model of an enzyme that links a simple sugar, glucose, into long-chain cellulose, the basic building block within plant cell walls that gives plants structure.
For decades, scientists have used sophisticated instruments and computer models to predict the nature of droughts. The majority of these models have steadily predicted an increasingly frequent and severe global drought cycle. But a recent study from a team of researchers in the United States and Australia suggests that one of these widely used tools—the Palmer Drought Severity Index (PDSI)—may be incorrect.
According to a new study by scientists funded by the National Science Foundation (NSF) and the National Oceanic and Atmospheric Administration (NOAA), clouds over the central Greenland Ice Sheet last July were "just right" for driving surface temperatures there above the melting point. The 2012 melt illustrates the often-overlooked role that clouds play in climate change. Current models, the researchers say, do not do enough to account for their effects.
A team of researchers at the San Diego Supercomputer Center (SDSC) and the University of California, San Diego, has developed a highly scalable computer code that promises to dramatically cut both research times and energy costs in simulating seismic hazards throughout California and elsewhere. The accelerated code makes heavier use of graphics processing units (GPUs) than of CPUs.
Blue Waters, one of the most powerful supercomputers in the world, was recently declared available for use at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (UIUC). Capable at peak performance of nearly 12 quadrillion floating point operations per second, Blue Waters has, more importantly, demonstrated sustained system performance of more than one petaflop on a range of commonly used science and engineering applications.
Rice University researchers are developing a comprehensive model that will predict how brine, oil, and gas drawn from ultradeep wells react to everything encountered on the way up to the surface, and suggest strategies to maintain the flow.
Researchers from several universities, AT&T Labs, and the American Museum of Natural History have built new models that show a widespread redistribution of Arctic vegetation. They say their findings predict a massive "greening" of the Arctic, as much as 50% over the next few decades. This transition will help accelerate climate warming, they add.
A once-promising approach for using next-generation, ultra-intense lasers to help deliver commercially viable fusion energy has been brought into serious question by new experimental results and first-of-a-kind simulations of laser-plasma interaction. The process, known as fast ignition, involves the long-discussed possibility of using a hollow cone to help focus laser energy on the pellet core to induce fusion. Unfortunately, these cones appear to fail in that mission.
Lawrence Berkeley National Laboratory recently hosted an international workshop that brought together top climatologists, computer scientists, and engineers from Japan and the United States to exchange ideas for the next generation of climate models, as well as the high-performance computing environments that will be needed to process the data from those models. It was the 15th in a series of such workshops that have been taking place around the world since 1999.
Researchers at Massachusetts Institute of Technology have devised a model of granular flow in three dimensions. The team found the model accurately predicts the results of granular flow experiments, including a flow configuration that has long puzzled scientists. The model may also be useful for improving the flow of drug powders, tablets, and capsules in pharmaceutical manufacturing.
Researchers at Lawrence Livermore National Laboratory have recently performed a record number of simulations using all 1,572,864 cores of Sequoia, the largest supercomputer in the world. The simulations are the largest particle-in-cell (PIC) code simulations by number of cores ever performed. PIC simulations are used extensively in plasma physics to model the motion of charged particles.
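To illustrate the PIC method itself (not LLNL's production code), here is a minimal one-dimensional electrostatic sketch in normalized units: deposit particle charge onto a grid, solve Poisson's equation for the field, then push the particles in that field. All parameter values are illustrative assumptions.

```python
import numpy as np

# Minimal 1D electrostatic particle-in-cell (PIC) sketch -- illustrative only.
# Electrons move on a periodic grid over a fixed neutralizing ion background.

np.random.seed(0)
L = 2 * np.pi          # domain length (normalized units)
ng = 64                # grid cells
npart = 10000          # simulation particles
dx = L / ng
dt = 0.05
qm = -1.0              # electron charge-to-mass ratio (normalized)

x = np.random.uniform(0, L, npart)    # particle positions
v = np.random.normal(0, 0.1, npart)   # particle velocities

for step in range(100):
    # 1. Deposit charge to the grid (nearest-grid-point weighting);
    #    background ions contribute +1, electrons subtract their density.
    idx = (x / dx).astype(int) % ng
    rho = 1.0 - np.bincount(idx, minlength=ng) / (npart / ng)
    # 2. Solve Poisson's equation d^2(phi)/dx^2 = -rho in Fourier space.
    k = np.fft.fftfreq(ng, d=dx) * 2 * np.pi
    k[0] = 1.0                         # avoid divide-by-zero; mean mode removed below
    phi_k = np.fft.fft(rho) / k**2
    phi_k[0] = 0.0
    E = np.real(np.fft.ifft(-1j * k * phi_k))  # E = -d(phi)/dx
    # 3. Gather the field at particle positions and push the particles.
    v += qm * E[idx] * dt
    x = (x + v * dt) % L
```

Production PIC codes follow the same deposit-solve-push cycle in three dimensions with electromagnetic fields, which is why they parallelize well across millions of cores: particles and grid cells can be partitioned across processors with only boundary exchanges.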
Three-quarters of the DNA in evolved organisms is wrapped around proteins, forming the basic unit of DNA packaging called nucleosomes, like a thread around a spool. The problem lies in understanding how DNA can then be read by such proteins. Now physicists have created a model showing how proteins move along DNA, in a paper just published in EPJ E.
The drug-resistant bacteria known as MRSA, once confined to hospitals but now widespread in communities, will likely continue to exist in both settings as separate strains, according to a new study. Researchers at Princeton University used mathematical models to explore what will happen to community and hospital MRSA strains, which differ genetically.
Swarming is the spontaneous organized motion of a large number of individuals. It is observed at all scales, from bacterial colonies to animal herds. Physicists in Ireland have uncovered new collective properties of swarm dynamics that could ultimately guide efforts to control swarms of animals, robots, or human crowds.
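The classic minimal description of such collective motion is a Vicsek-style alignment model: each agent steers toward the average heading of its neighbors, plus noise. The sketch below is a generic illustration of that idea, not the Irish group's specific model; all parameter values are assumptions.

```python
import numpy as np

# Vicsek-style swarming sketch: N agents in a periodic box align their
# headings with neighbors within radius r, subject to angular noise.

np.random.seed(1)
N, L, r, v0, eta, steps = 200, 10.0, 1.0, 0.3, 0.2, 200

pos = np.random.uniform(0, L, (N, 2))            # agent positions
theta = np.random.uniform(-np.pi, np.pi, N)      # agent headings

for _ in range(steps):
    # Pairwise displacements with periodic (wrap-around) boundaries.
    d = pos[:, None, :] - pos[None, :, :]
    d -= L * np.round(d / L)
    neigh = (d**2).sum(-1) < r**2                # neighbor mask (includes self)
    # Circular mean of neighbor headings, plus uniform angular noise.
    mean_sin = (neigh * np.sin(theta)[None, :]).sum(1)
    mean_cos = (neigh * np.cos(theta)[None, :]).sum(1)
    theta = np.arctan2(mean_sin, mean_cos) \
            + eta * np.random.uniform(-np.pi, np.pi, N)
    # Move every agent at constant speed along its new heading.
    pos = (pos + v0 * np.column_stack((np.cos(theta), np.sin(theta)))) % L

# Order parameter: 1 means fully aligned flock, near 0 means disorder.
order = np.hypot(np.cos(theta).mean(), np.sin(theta).mean())
```

At low noise and sufficient density, agents spontaneously align into a coherent flock, which is the hallmark collective property such models exhibit.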
As computer manufacturers cram ever more processing power onto tiny chips, the connections between electronic components that measure just a few billionths of a meter across allow electrons to leak. One promising solution is to replace those electrons with photons of light. Researchers in Singapore have now developed a numerical model to simulate the performance of circuits that rely on light.
Clouds can both cool the planet, by acting as a shield against the sun, and warm the planet, by trapping heat. But why do clouds behave the way they do? And how will a warming planet affect the cloud cover? Lawrence Berkeley National Laboratory scientist David Romps has made it his mission to answer these questions.
Mechdyne Corporation has recently announced that it has licensed the CAVE2 hybrid reality environment developed by the Electronic Visualization Laboratory at University of Illinois at Chicago. The licensing agreement was signed in January of 2013, and continues the strong working relationship that began in 1994 when Mechdyne licensed the EVL-designed original CAVE technology.
According to a recent study from Rice University and the U.S. Environmental Protection Agency, there is good news and better news about ground-level ozone in American cities. While dangerous ozone levels have fallen in places that clamp down on emissions from vehicles and industry, the report suggests that a model widely used to predict the impact of remediation efforts has been too conservative.
A research team led by the University of Colorado Boulder had been looking for clues about why Earth did not warm as much as scientists expected between 2000 and 2010. They now think the culprits are hiding in plain sight—dozens of volcanoes spewing sulfur dioxide. The study results essentially exonerate Asia, including India and China, two countries that are estimated to have increased their industrial sulfur dioxide emissions by about 60% from 2000 to 2010 through coal burning.
Researchers at Pompeu Fabra University (Spain) have created a high resolution atlas of the heart with 3D images taken from 138 people. The study demonstrates that an average image of an organ along with its variations can be obtained for the purposes of comparing individual cases and differentiating healthy forms from pathologies.
Researchers have used the 3D simulation capabilities of the supercomputers at the Texas Advanced Computing Center to predict the formation of accretion disks and relativistic jets that warp and bend more than previously thought, shaped both by the extreme gravity of the black hole and by powerful magnetic forces generated by its spin. Their highly detailed models of the black hole environment contribute new knowledge to the field.
Volcanoes are well known for cooling the climate. But just how much and when has been a bone of contention among historians, glaciologists, and archeologists. Now a team of atmospheric chemists, from the Tokyo Institute of Technology and the University of Copenhagen, has come up with a way to say for sure which historic episodes of global cooling were caused by volcanic eruptions.
Maplesoft this week announced that its ongoing partnership with Toyota Motor Engineering & Manufacturing North America, Inc. has been expanded to include the use of new symbolic computation methods in control systems engineering. The new research will allow developers to consider system nonlinearities, modeling inaccuracies, and parametric uncertainties in the design process, helping Toyota shorten development time while maintaining high quality results.