Ion channels are important drug targets. A team of researchers at the University of Vienna has investigated the opening and closing mechanisms of these channels, which are large proteins of more than 400 amino acids. Their work calculates, for the first time and in atomic detail, the full energy landscape of such a protein.
Computer simulations conducted in Germany have shown that the reduction of natural dental wear may be the main cause of widespread non-carious cervical lesions—the loss of enamel and dentine at the base of the crown—in our teeth. The discovery was made by examining the biomechanical behavior of teeth using finite element analysis methods typically applied to engineering problems.
Scientists have long observed that species seem to have become increasingly capable of evolving in response to changes in the environment. But computer science researchers now say that the popular explanation of competition to survive in nature may not actually be necessary for evolvability to increase.
In efforts to prioritize and efficiently manage the repair of boats and stations damaged by Superstorm Sandy, the U.S. Coast Guard has adopted a system called Coast Guard Search and Rescue Visual Analytics (cgSARVA), developed in collaboration with Purdue University.
To understand the development of sensory representations within our brain, we have to comprehend how electrical activation is linked to the sensory experience. For this reason, researchers in Italy have analyzed the behavior and neural-network activation of rats as the animals carried out tactile object-recognition tests. The study represents the first time the activity of multiple neurons has been monitored during such tests.
An engineer in Finland has designed a new evaluation model that allows developers to determine how fatigue sets in with various welded steel materials. By accounting for the differences between traditional welds and newer, more advanced structural joining technologies, the model allows for the development of lighter structures and, as a consequence, more energy-efficient ships.
A new global-scale modeling study that takes into account nitrogen—a key nutrient for plants—estimates that carbon emissions from human activities on land were 40% higher in the 1990s than estimated by studies that did not account for nitrogen. Most existing models used to estimate global emissions changes based on land use do not have the ability to model nitrogen limitations on plant regrowth.
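The effect the study describes can be illustrated with a toy carbon-accounting sketch (all numbers and the function name are hypothetical, not from the study): when a nitrogen cap limits how fast regrowing vegetation can absorb carbon, less of the emitted carbon is reabsorbed, so net land-use emissions come out higher.

```python
def regrowth_uptake(years, rate=2.0, n_cap=None):
    """Toy cumulative carbon uptake (arbitrary units) by regrowing
    vegetation; an optional nitrogen cap limits annual uptake."""
    total = 0.0
    for _ in range(years):
        annual = rate if n_cap is None else min(rate, n_cap)
        total += annual
    return total

emitted = 100.0                                       # hypothetical gross emissions
net_no_n = emitted - regrowth_uptake(20)              # no nitrogen limit
net_with_n = emitted - regrowth_uptake(20, n_cap=1.0) # nitrogen-limited regrowth
# Nitrogen-limited regrowth absorbs less carbon, so net emissions are higher.
```

This is only the bookkeeping logic, not the study's model; real nitrogen-enabled models couple nitrogen cycling to photosynthesis and decomposition.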
A research team from Aalto University has modeled the work processes and human decision making in scientific peer review with the help of statistical physics. Their study will improve understanding of how the actions of reviewers and editors during the review process correlate with the decisions to publish or reject article manuscripts.
The Naval Research Laboratory aided both the 2009 and 2013 Presidential Inaugurations with a technology called CT-Analyst. The software modeling tool gives first responders accurate, instantaneous, 3D predictions of chemical, biological, and radiological agent transport in urban settings.
Georgia Institute of Technology researchers have developed a computational model that can predict video game players’ in-game performance and provide a corresponding challenge they can beat, leading to quicker mastery of new skills. The advance could not only improve user experiences with video games but also benefit applications beyond the gaming world.
The bones that support our bodies are made of remarkably complex arrangements of materials—so much so that decoding the precise structure responsible for their great strength and resilience has eluded scientists’ best efforts for decades. But now, a team of researchers has finally unraveled the structure of bone with almost atom-by-atom precision.
Making choices involves the evaluation of an accumulation of facts. If a wrong choice is made, Princeton University researchers have recently found, the problem may lie in the facts, or information, rather than the brain's decision-making process. The researchers report that erroneous decisions tend to arise from errors, or "noise," in the information coming into the brain.
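The accumulation-of-evidence picture the Princeton researchers describe is commonly formalized as a drift-diffusion process. A minimal sketch (all parameter values and names here are hypothetical, not the researchers' model) shows how noise in the incoming information, rather than the decision rule itself, produces wrong choices:

```python
import random

def decide(drift=0.1, noise=1.0, threshold=10.0, seed=None):
    """Accumulate noisy evidence until a decision bound is reached.

    drift > 0 means the correct answer is "A"; `noise` models
    variability in the incoming information, not in the rule.
    """
    rng = random.Random(seed)
    evidence = 0.0
    while abs(evidence) < threshold:
        evidence += drift + rng.gauss(0.0, noise)
    return "A" if evidence > 0 else "B"

# With zero input noise the same rule is always correct; adding
# noise to the incoming evidence yields occasional "B" errors.
trials = [decide(seed=i) for i in range(200)]
error_rate = trials.count("B") / len(trials)
```

The decision rule (threshold crossing) never changes between the two cases; only the quality of the incoming evidence does, which is the study's point.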
Sandia National Laboratories researchers Lisa Deibler and Arthur Brown had a ready-made problem for their computer modeling work when they partnered with the National Nuclear Security Administration’s Kansas City Plant to improve stainless steel tubing that was too hard to meet nuclear weapon requirements.
Researchers from North Carolina State University believe they have solved a puzzle that has vexed science since plants first appeared on Earth. In a paper published online in Proceedings of the National Academy of Sciences, the researchers provide the first 3D model of an enzyme that links a simple sugar, glucose, into long-chain cellulose, the basic building block within plant cell walls that gives plants structure.
For decades, scientists have used sophisticated instruments and computer models to predict the nature of droughts. The majority of these models have steadily predicted an increasingly frequent and severe global drought cycle. But a recent study from a team of researchers in the United States and Australia suggests that one of these widely used tools—the Palmer Drought Severity Index (PDSI)—may be incorrect.
According to a new study by scientists funded by the National Science Foundation (NSF) and at the National Oceanic and Atmospheric Administration (NOAA), clouds over the central Greenland Ice Sheet last July were "just right" for driving surface temperatures there above the melting point. The 2012 melt illustrates the often-overlooked role that clouds play in climate change. Current models don’t do enough, say the researchers, to account for their effects.
A team of researchers at the San Diego Supercomputer Center (SDSC) and the University of California, San Diego, has developed a highly scalable computer code that promises to dramatically cut both research times and energy costs in simulating seismic hazards throughout California and elsewhere. The accelerated code makes heavier use of graphics processing units (GPUs) than CPUs.
Blue Waters, one of the most powerful supercomputers in the world, was recently declared available for use at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (UIUC). Capable at peak performance of nearly 12 quadrillion floating-point operations per second, the system has, more importantly, demonstrated sustained performance of more than one petaflop on a range of commonly used science and engineering applications.
Rice University researchers are developing a comprehensive model that will predict how brine, oil, and gas drawn from ultradeep wells react to everything they encounter on the way up to the surface, and that will suggest strategies for maintaining the flow.
Researchers from several universities, AT&T Labs, and the American Museum of Natural History have built new models that project a widespread redistribution of Arctic vegetation. They say their findings predict a massive “greening” of the Arctic, as much as 50% over the next few decades. This transition will help accelerate climate warming, they add.
A once-promising approach for using next-generation, ultra-intense lasers to help deliver commercially viable fusion energy has been brought into serious question by new experimental results and first-of-a-kind simulations of laser-plasma interaction. The long-discussed process, called fast ignition, involves using a hollow cone to help focus laser energy on the pellet core to induce fusion. Unfortunately, the cones appear to fail in that mission.
Lawrence Berkeley National Laboratory recently hosted an international workshop that brought together top climatologists, computer scientists, and engineers from Japan and the United States to exchange ideas for the next generation of climate models as well as the hyper-performance computing environments that will be needed to process the data from those models. It was the 15th in a series of such workshops that have been taking place around the world since 1999.
Researchers at Massachusetts Institute of Technology have devised a model of granular flow in three dimensions. The team found the model accurately predicts the results of granular flow experiments, including a flow configuration that has long puzzled scientists. The model may also be useful for improving the flow of drug powders, tablets, and capsules in pharmaceutical manufacturing.
Researchers at Lawrence Livermore National Laboratory have recently performed record simulations using all 1,572,864 cores of Sequoia, the largest supercomputer in the world. The simulations are the largest particle-in-cell (PIC) code simulations by number of cores ever performed. PIC simulations are used extensively in plasma physics to model the motion of charged particles.
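The PIC method cycles through three steps: deposit particle charge onto a grid, solve for the field on that grid, then push the particles with the interpolated field. A deliberately minimal one-dimensional electrostatic sketch in normalized units (grid size, particle count, and time step are arbitrary choices, not Sequoia-scale parameters) illustrates the loop:

```python
import random

# Minimal 1D electrostatic particle-in-cell loop (illustrative only;
# normalized units, nearest-grid-point deposition, periodic domain).
NG, NP, L, DT = 64, 1000, 1.0, 0.05
DX = L / NG
rng = random.Random(0)
x = [rng.uniform(0, L) for _ in range(NP)]   # particle positions
v = [rng.gauss(0, 0.05) for _ in range(NP)]  # particle velocities
Q_OVER_M, WEIGHT = -1.0, L / NP              # electron-like macro-particles

for step in range(10):
    # 1) Deposit charge onto the grid (nearest grid point),
    #    minus a uniform neutralizing ion background.
    cells = [int(xi / DX) % NG for xi in x]
    rho = [0.0] * NG
    for c in cells:
        rho[c] += WEIGHT / DX
    rho = [r - 1.0 for r in rho]
    # 2) Field solve: integrate dE/dx = rho, then subtract the mean
    #    field so the result is consistent with periodic boundaries.
    E, acc = [], 0.0
    for r in rho:
        acc += r * DX
        E.append(acc)
    mean_E = sum(E) / NG
    E = [e - mean_E for e in E]
    # 3) Push particles with the field sampled at their cells.
    v = [vi + Q_OVER_M * E[c] * DT for vi, c in zip(v, cells)]
    x = [(xi + vi * DT) % L for xi, vi in zip(x, v)]
```

Production PIC codes use higher-order deposition and interpolation, relativistic pushers, and domain decomposition across cores, but the deposit–solve–push cycle is the same.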
Three-quarters of the DNA in evolved organisms is wrapped around proteins like a thread around a spool, forming nucleosomes, the basic units of DNA packaging. The problem lies in understanding how the DNA can then be read by such proteins. Now, in a paper just published in EPJ E, physicists have created a model showing how proteins move along DNA.
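The sliding motion such models study is often idealized as a one-dimensional random walk of the protein along the DNA. A toy sketch (the function, its parameters, and the step rule are hypothetical illustrations, not the published model) counts the steps a diffusing protein takes to first reach a target site:

```python
import random

def slide_time(start, target, length, max_steps=1_000_000, seed=0):
    """Steps until a protein diffusing one-dimensionally along DNA of
    `length` base pairs first reaches `target` (reflecting ends)."""
    rng = random.Random(seed)
    pos = start
    for t in range(1, max_steps + 1):
        pos += rng.choice((-1, 1))          # unbiased 1D diffusion
        pos = max(0, min(length - 1, pos))  # reflect at the DNA ends
        if pos == target:
            return t
    return None

t = slide_time(start=0, target=50, length=200)
```

The published model addresses how such motion copes with nucleosome-bound DNA; this sketch captures only the bare sliding picture.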