Lawrence Livermore National Laboratory researchers have developed a new simulation capability to model a classic plasma configuration. The researchers demonstrated, for the first time, a fully kinetic model of the dense plasma focus (DPF) Z-pinch device, including the electrodes, in a realistic geometry.
Heat rising from cities such as New York, Paris and Tokyo might be remotely warming winters far away in some rural parts of Alaska, Canada, and Siberia, a new study theorizes. In an unusual twist revealed by computer modeling, that same urban heat from buildings and cars may be slightly cooling the autumns in much of the Western United States, Eastern Europe, and the Mediterranean. The finding stems from the ability of "heat island" energy to change high-altitude currents.
Marking the culmination of more than 10 years of work to show, in vivo, that complex four-stranded structures exist in the human genome alongside Watson and Crick's famous double helix, researchers in the U.K. have published a paper demonstrating clear links between concentrations of these four-stranded quadruplexes and DNA replication, a process pivotal to cell division.
Our eyes may be our window to the world, but how do we make sense of the thousands of images that flood our retinas each day? Scientists at the University of California, Berkeley, have found that the brain is wired to put in order all the categories of objects and actions that we see. They have created the first interactive map of how the brain organizes these groupings.
Researchers at the University of California, San Diego School of Medicine and colleagues have proposed a new method that creates an ontology, or a specification of all the major players in the cell and the relationships between them. This computational model of the cell is made from large networks of gene and protein interactions, and is created automatically from large datasets, helping researchers see potentially new biological components.
A new NASA-funded prototype system developed by the National Center for Atmospheric Research is now providing weather forecasts that can help flights avoid major storms as they travel over remote ocean regions. The eight-hour forecasts of potentially dangerous atmospheric conditions are designed for pilots, air traffic controllers and others involved in transoceanic flights.
Understanding the arrangement of atoms in a solid is vital to materials research—but the problem can be difficult to solve in many important situations. Now, by combining the work of two different scientific camps, Northwestern University researchers have created an algorithm that makes crystal structure solution more automated and reliable.
Because of the limited spatial resolution of even today's best laptop and desktop displays, researchers and physicians often can't see phenomena that are too large, too small, too complex, or too distant. CAVE2, a next-generation, large-scale virtual environment, combines the benefits of scalable-resolution display walls with virtual-reality systems to create a revealing and seamless 2D and 3D environment that is becoming increasingly important in scientific discovery.
According to new research by the University of Delaware, renewable energy could fully power a large electric grid 99.9% of the time by 2030 at costs comparable to today’s electricity expenses. The study’s authors developed a computer model to consider 28 billion combinations of renewable energy sources and storage mechanisms, each tested over four years of historical hourly weather data and electricity demands.
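The study's core method is an exhaustive search: enumerate candidate mixes of generation and storage capacity, then score each against hourly weather and demand data. The sketch below illustrates that idea in miniature; the hourly profiles, capacity grid, and full-start storage assumption are all invented for illustration and are far simpler than the model the Delaware team built.

```python
import itertools

# Toy hourly capacity factors and demand (MW) — illustrative numbers only;
# the actual study evaluated four years of historical hourly data.
WIND = [0.8, 0.2, 0.5, 0.9]
SOLAR = [0.0, 0.9, 0.7, 0.0]
DEMAND = [50, 60, 55, 45]

def coverage(wind_mw, solar_mw, storage_mwh):
    """Fraction of hours in which generation plus storage meets demand."""
    stored = storage_mwh  # assume storage starts full (a simplification)
    met = 0
    for w, s, d in zip(WIND, SOLAR, DEMAND):
        surplus = wind_mw * w + solar_mw * s - d
        if surplus >= 0:
            stored = min(storage_mwh, stored + surplus)  # charge, capped
            met += 1
        elif stored + surplus >= 0:
            stored += surplus  # discharge to cover the shortfall
            met += 1
    return met / len(DEMAND)

# Brute-force search over a small grid of capacity combinations.
best = max(itertools.product(range(0, 101, 20), repeat=3),
           key=lambda mix: coverage(*mix))
print(best, coverage(*best))
```

Scaling this pattern from a few hundred combinations to 28 billion is primarily a matter of a finer search grid, more technology options, and real multi-year input data.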
Using a new method for estimating greenhouse gases that combines atmospheric measurements with model predictions, Lawrence Berkeley National Laboratory researchers have found that the level of nitrous oxide, a potent greenhouse gas, in California may be 2.5 to 3 times greater than the current inventory. At that level, total nitrous oxide emissions would account for about 8% of California's total greenhouse gas emissions.
A collaboration of several government and academic research organizations is hard at work on a design and manufacturing concept called "model-based design and verification." Instead of building and discarding physical prototypes, manufacturers would conduct virtually all design, testing, error identification, and revision on a computer, up to the point of commercial production.
Over the course of two weeks this fall, computer models made a startling sequence of correct and useful predictions. By running thousands of simulations on polling data, Nate Silver correctly forecasted how all 50 states would vote for president. In the case of Hurricane Sandy, meteorologists identified the potential danger to the Northeast nearly a week before the storm arrived. Computer models of many kinds have improved in recent years, and the approach is finding new, unexpected uses.
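The polling approach described above amounts to Monte Carlo simulation: treat each state as a weighted coin flip, run thousands of simulated elections, and report the fraction a candidate wins. The sketch below uses made-up state probabilities and vote counts purely for illustration; it is not Silver's actual model, which aggregates and weights many polls per state.

```python
import random

# Hypothetical per-state win probabilities and electoral votes
# (illustrative numbers, not real polling data).
STATE_PROBS = {"A": 0.90, "B": 0.60, "C": 0.45, "D": 0.20}
STATE_VOTES = {"A": 10, "B": 20, "C": 15, "D": 30}
VOTES_TO_WIN = sum(STATE_VOTES.values()) // 2 + 1

def simulate_once(rng):
    # One simulated election: each state is an independent draw
    # weighted by its polling-derived win probability.
    return sum(votes for state, votes in STATE_VOTES.items()
               if rng.random() < STATE_PROBS[state])

def win_probability(n_sims=10000, seed=0):
    rng = random.Random(seed)
    wins = sum(simulate_once(rng) >= VOTES_TO_WIN for _ in range(n_sims))
    return wins / n_sims

print(win_probability())
```

The same enumerate-uncertainty-and-count structure underlies hurricane track ensembles: run the model many times under perturbed initial conditions and look at where the outcomes cluster.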
Opsins, the light-sensitive proteins key to vision, may have evolved earlier and undergone fewer genetic changes than previously believed, according to a new study that used computer modeling to trace the evolutionary development of these structures. The analysis incorporated all available genomic information from all relevant animal lineages.
By tailoring geoengineering efforts by region and by need, a new model promises to maximize the effectiveness of solar radiation management while mitigating its potential side effects and risks. The study explores the feasibility of using solar geoengineering to counter the loss of Arctic sea ice.
To combat the effects of climate change, some scientists have proposed temporarily reducing the amount of sunlight reaching the earth. These geoengineering schemes have typically been treated as standalone fixes, but a new computer analysis of future climate change considers emissions reductions together with sunlight reduction. The model shows that such drastic steps to cool the earth would only be necessary in certain scenarios.
Ask adults what number is halfway between 1 and 9, and most will say 5. But pose the same question to small children and they're likely to answer 3. Cognitive scientists theorize that this is because it's actually more natural for humans to think logarithmically than linearly. A new information-theoretical model of human sensory perception and memory sheds light on these peculiarities of the nervous system.
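The children's answer of 3 is exactly what a logarithmic scale predicts: the midpoint between two numbers on a log scale is their geometric mean, sqrt(1 × 9) = 3, while the linear midpoint is the arithmetic mean, (1 + 9) / 2 = 5. A minimal sketch of the two notions of "halfway":

```python
import math

def linear_midpoint(a, b):
    # Arithmetic mean: the typical adult answer.
    return (a + b) / 2

def logarithmic_midpoint(a, b):
    # Midpoint on a log scale is the geometric mean:
    # exp((ln a + ln b) / 2) = sqrt(a * b)
    return math.sqrt(a * b)

print(linear_midpoint(1, 9))       # 5.0
print(logarithmic_midpoint(1, 9))  # 3.0
```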
Researchers from North Carolina State University have developed a new technique that allows users to better determine the amount of charge remaining in a battery in real time. Using the researchers' new technique, models are able to estimate remaining charge within 5%.
Using a computational model they designed to incorporate detailed information about plants' interconnected metabolic processes, scientists at Brookhaven National Laboratory have identified key pathways that appear to "favor" the production of either oils or proteins. The research may point the way to new strategies to tip the balance and increase plant oil production.
Nearly 100 years after a British neurologist first mapped the blind spots caused by missile wounds to the brains of soldiers, University of Pennsylvania scientists have perfected his map using modern-day technology. Their results yield a map of vision in the brain based on an individual's brain structure, even for people who cannot see. The work could, among other things, guide efforts to restore vision using a neural prosthesis that stimulates the surface of the brain.
The natural decay of organic carbon contributes more than 90% of the yearly carbon dioxide released into Earth's atmosphere and oceans. Understanding the rate at which leaves decay can help scientists predict this global flux of carbon dioxide. But a single leaf may undergo different rates of decay depending on a number of variables. Researchers have just built a mathematical model that incorporates these variables, and have discovered a commonality within the diversity of leaf decay.
A one-of-a-kind, high-tech modeling tool designed to simulate different situations on the electric power grid will be on display at the White House. The result of a multi-year funding effort, GridLAB-D will be demonstrated by Pacific Northwest National Laboratory researchers, who will join Energy Secretary Steven Chu to show how the tool can help power system operators, industry, innovators, and entrepreneurs understand how a change to one part of the power system affects other parts of the grid.
Researchers at NIST have developed a new computational method for identifying candidate refrigerant fluids with low global warming potential—the tendency to trap heat in the atmosphere for many decades—as well as other desirable performance and safety features. The NIST effort is the most extensive systematic search yet for a new class of refrigerants that address current concerns about climate change.
A new study has found that climate-prediction models are good at predicting long-term climate patterns on a global scale, but lose their edge when applied to time frames shorter than three decades and on sub-continental scales.
One of the greatest challenges in neuroscience is to identify the map, or “connectome”, of synaptic connections between neurons. The Blue Brain Project at the Swiss Federal Institute of Technology in Lausanne has recently announced it has identified key principles that determine synapse-scale connectivity by virtually reconstructing a cortical microcircuit and comparing it to a mammalian sample.
Global warming is expected to intensify extreme precipitation, but the rate at which it does so in the tropics has remained unclear. Now, a new study has given an estimate based on model simulations and observations: With every 1°C rise in temperature, the study finds, tropical regions will see 10% heavier rainfall extremes, with possible implications for flooding in populous regions.