A team of researchers at the San Diego Supercomputer Center (SDSC) and the University of California, San Diego, has developed a highly scalable computer code that promises to dramatically cut both research times and energy costs in simulating seismic hazards throughout California and elsewhere. The accelerated code makes heavier use of graphics processing units (GPUs) than of CPUs.
Blue Waters, one of the most powerful supercomputers in the world, was recently declared available for use at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (UIUC). Capable at peak performance of nearly 12 quadrillion floating-point operations per second, Blue Waters has, more importantly, demonstrated sustained system performance of more than one petaflop on a range of commonly used science and engineering applications.
A once-promising approach for using next-generation, ultra-intense lasers to help deliver commercially viable fusion energy has been brought into serious question by new experimental results and first-of-a-kind simulations of laser-plasma interaction. The process, known as fast ignition, involves the long-discussed possibility of using a hollow cone to help focus laser energy on the pellet core to induce fusion. Unfortunately, the new work suggests these cones fail in that mission.
Cells interact with their surroundings using proteins called integrins, which reside in a cell's outer plasma membrane. Despite their importance, for good and ill, scientists don't know exactly how integrins work. Scientists have yet to obtain the complete crystal structure of integrin within the plasma membrane, so researchers at Lawrence Berkeley National Laboratory have developed a computer model of integrin that reveals its molecular dynamics.
Computer simulations of water under extreme pressure are helping geochemists understand how carbon might be recycled from hundreds of miles below the Earth's surface. Carbon compounds are the basis of life, provide most of our fuels and contribute to climate change. The cycling of carbon through the oceans, atmosphere, and shallow crust of the Earth has been intensively studied, but little is known about what happens to carbon deep in the Earth.
Researchers at Lawrence Livermore National Laboratory have recently performed a record number of simulations using all 1,572,864 cores of Sequoia, the largest supercomputer in the world. The simulations are the largest particle-in-cell (PIC) code simulations, by number of cores, ever performed. PIC simulations are used extensively in plasma physics to model the motion of charged particles.
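At its core, a PIC code alternates a field solve with a particle push. The sketch below shows only the push step, as a leapfrog (kick-drift) update of particles in a prescribed electric field on a periodic domain; the function and parameter names are hypothetical, and this is not the LLNL code, which couples the push to a self-consistent field solve across all those cores.

```python
import math

def push_particles(xs, vs, qm, e_field, dt, steps, length):
    """Leapfrog push of charged particles with charge-to-mass ratio qm
    in a given electric field E(x), on a periodic domain of size `length`."""
    # stagger velocities back a half step (leapfrog initialization)
    vs = [v - 0.5 * dt * qm * e_field(x) for x, v in zip(xs, vs)]
    for _ in range(steps):
        vs = [v + dt * qm * e_field(x) for x, v in zip(xs, vs)]  # kick
        xs = [(x + dt * v) % length for x, v in zip(xs, vs)]     # drift
    # re-center velocities on whole time steps before returning
    vs = [v + 0.5 * dt * qm * e_field(x) for x, v in zip(xs, vs)]
    return xs, vs

# e.g. ten particles starting at rest in a standing sinusoidal field
xs = [float(i) for i in range(10)]
vs = [0.0] * 10
xs, vs = push_particles(xs, vs, qm=-1.0,
                        e_field=lambda x: math.sin(2 * math.pi * x / 10.0),
                        dt=0.05, steps=200, length=10.0)
```

A full PIC loop would also deposit charge from the particles onto a grid, solve for the field, and interpolate it back to the particle positions each step; scaling that loop is what the Sequoia runs demonstrated.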
Mechdyne Corporation has recently announced that it has licensed the CAVE2 hybrid reality environment developed by the Electronic Visualization Laboratory at the University of Illinois at Chicago. The licensing agreement was signed in January of 2013, and continues the strong working relationship that began in 1994, when Mechdyne licensed the EVL-designed original CAVE technology.
Researchers at Pompeu Fabra University (Spain) have created a high-resolution atlas of the heart from 3D images taken of 138 people. The study demonstrates that an average image of an organ, along with its variations, can be obtained for the purposes of comparing individual cases and differentiating healthy forms from pathologies.
Researchers have used the 3D simulation capabilities of the supercomputers at the Texas Advanced Computing Center to predict the formation of accretion disks and relativistic jets that warp and bend more than previously thought, shaped both by the extreme gravity of the black hole and by powerful magnetic forces generated by its spin. Their highly detailed models of the black hole environment contribute new knowledge to the field.
A Lawrence Livermore National Laboratory team is working to improve lithium-ion battery performance, lifetime, and safety. Working with Lawrence Berkeley National Laboratory, the scientists are developing a new methodology for performing first-principles quantum molecular dynamics simulations at an unprecedented scale to understand key aspects of the chemistry and dynamics in lithium-ion batteries, particularly at interfaces.
Researchers are improving the performance of technologies ranging from medical computed tomography scanners to digital cameras, using a system of models to extract specific information from huge collections of data and then reconstruct images like a jigsaw puzzle. The new approach is called model-based iterative reconstruction, or MBIR, and it is helping to greatly reduce noise in the data, providing greater clarity at lower radiation intensities.
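In spirit, iterative reconstruction repeatedly compares a candidate image against the measurements through a forward model and nudges the image to reduce the mismatch plus a regularization penalty. The following is a toy gradient-descent sketch of that idea under a simple linear forward model and quadratic penalty; it is not the actual MBIR algorithm or its physics-based forward models, and all names are illustrative.

```python
def reconstruct(A, y, beta=0.1, lr=0.01, iters=500):
    """Toy iterative reconstruction: minimize ||A x - y||^2 + beta ||x||^2
    by gradient descent, where A is the forward model (list of rows)
    and y the measured data."""
    n = len(A[0])
    x = [0.0] * n  # start from a blank image
    for _ in range(iters):
        # residual r = A x - y: forward-model prediction vs. measurement
        r = [sum(row[j] * x[j] for j in range(n)) - yi
             for row, yi in zip(A, y)]
        # gradient: 2 A^T r + 2 beta x
        g = [2 * sum(A[i][j] * r[i] for i in range(len(A)))
             + 2 * beta * x[j] for j in range(n)]
        x = [xj - lr * gj for xj, gj in zip(x, g)]
    return x
```

Real MBIR replaces the quadratic penalty with an edge-preserving image prior and uses forward models of the scanner physics, which is what lets it trade radiation dose for computation.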
The National Science Foundation (NSF), along with the journal Science, this week announced the 53 winners and honorable mentions of the International Science & Technology Visualization Challenge, a contest the two organizations jointly sponsor. The winning entries highlight the often stunning capabilities of computer-aided visualization techniques.
Using an exotic form of silicon could substantially improve the efficiency of solar cells, according to computer simulations by researchers at the University of California, Davis and in Hungary. Solar cells are based on the photoelectric effect: A photon, or particle of light, hits a silicon crystal and generates a negatively charged electron and a positively charged hole. Collecting those electron-hole pairs generates electric current. Conventional solar cells generate one electron-hole pair per incoming photon, and have a theoretical maximum efficiency of 33%. One exciting new route to improved efficiency is to generate more than one electron-hole pair per photon.
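As a back-of-the-envelope illustration of that route (my arithmetic, not the researchers' simulations, and assuming the bulk-silicon band gap of about 1.12 eV), the ideal carrier-multiplication limit simply counts how many band-gap energies fit inside one photon's energy:

```python
PLANCK_EV_S = 4.135667696e-15  # Planck constant, eV*s
C = 2.99792458e8               # speed of light, m/s

def photon_energy_ev(wavelength_nm):
    """Energy of a photon of the given wavelength, in eV."""
    return PLANCK_EV_S * C / (wavelength_nm * 1e-9)

def max_pairs(wavelength_nm, gap_ev=1.12):
    """Ideal carrier-multiplication limit: each electron-hole pair
    costs at least one band-gap energy (1.12 eV for bulk silicon)."""
    return max(0, int(photon_energy_ev(wavelength_nm) // gap_ev))
```

A 400 nm violet photon carries about 3.1 eV, enough in principle for two pairs, which is why harvesting the excess energy of blue photons, instead of losing it as heat, could push efficiency past the single-pair limit.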
Stanford Engineering's Center for Turbulence Research has set a new record in computational science by successfully using a supercomputer with more than one million computing cores to solve a complex fluid dynamics problem—the prediction of noise generated by a supersonic jet engine.
Marking the culmination of over 10 years of investigation by scientists to show—in vivo—that complex four-stranded structures exist in the human genome alongside Watson and Crick’s famous double helix, researchers in the U.K. have recently published a paper that goes on to show clear links between concentrations of four-stranded quadruplexes and the process of DNA replication, which is pivotal to cell division and production.
Armed with a better understanding of how glasses age and evolve, researchers at the University of Chicago and the University of Wisconsin-Madison raise the possibility of designing a new class of materials at the molecular level via a vapor-deposition process.
Computer simulations are essential to test theories and explore what's inaccessible to direct experiment. Digital computers can't use exact, continuous equations of motion and have to slice time into chunks, so persistent errors are introduced in the form of "shadow work" that distorts the result. Scientists have learned to separate the physically realistic aspects of the simulation from the artifacts of the computer method.
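The effect is easy to demonstrate: integrating a frictionless harmonic oscillator with a naive time-slicing scheme steadily pumps in spurious energy. This toy example (mine, not from the cited work) is a simple stand-in for how discrete time steps distort a simulation:

```python
def euler_energy_drift(steps=1000, dt=0.01):
    """Integrate a unit harmonic oscillator with forward Euler and
    return the spurious energy gained -- an artifact of slicing
    continuous time into finite chunks, not of the physics."""
    x, v = 1.0, 0.0
    e0 = 0.5 * (v * v + x * x)  # exact energy, conserved in reality
    for _ in range(steps):
        x, v = x + dt * v, v - dt * x  # forward Euler step
    return 0.5 * (v * v + x * x) - e0
```

Forward Euler multiplies the oscillator's energy by exactly (1 + dt^2) every step, so the drift grows without bound; symplectic integrators instead conserve a nearby "shadow" energy, which is the kind of separation between real physics and method artifact the scientists exploit.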
Researchers at The University of Texas at Austin have designed a simulation that, for the first time, emulates key properties of electronic topological insulators. Their simulation is part of a rapidly moving scientific race to understand and exploit the potential of topological insulators, which are a state of matter that was only discovered in the past decade.
Researchers at the University of California, San Diego School of Medicine and colleagues have proposed a new method that creates an ontology, or a specification of all the major players in the cell and the relationships between them. This computational model of the cell is made from large networks of gene and protein interactions, and is created automatically from large datasets, helping researchers see potentially new biological components.
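One crude way to see how nested structure can fall out of an interaction network automatically is greedy agglomeration: repeatedly merge the two clusters that share the most interactions, and the merge history forms a hierarchy. This is a toy sketch of that general idea, not the ontology-building method the researchers used.

```python
def build_hierarchy(edges):
    """Greedily merge the pair of clusters sharing the most
    interaction edges; return the merge history (a nested hierarchy)."""
    clusters = {n: frozenset([n]) for e in edges for n in e}
    history = []
    while len(set(clusters.values())) > 1:
        # count interaction edges between each pair of distinct clusters
        weight = {}
        for a, b in edges:
            ca, cb = clusters[a], clusters[b]
            if ca != cb:
                key = frozenset([ca, cb])
                weight[key] = weight.get(key, 0) + 1
        if not weight:
            break  # remaining clusters are disconnected
        ca, cb = max(weight, key=weight.get)  # most strongly linked pair
        merged = ca | cb
        history.append((set(ca), set(cb)))
        for n in merged:
            clusters[n] = merged
    return history
```

On a gene- or protein-interaction graph, the small early merges play the role of complexes and the later ones of broader processes, which is the intuition behind deriving an ontology from interaction data rather than from curation.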
New research from North Carolina State University provides molecular-level insights into how cellulose breaks down in wood to create "bio-oils" which can be refined into any number of useful products. Using a supercomputer, the team calculated what's occurring at the molecular level when wood is rapidly heated to high temperature in the absence of oxygen, a decomposition process known as pyrolysis.
Because of the limited spatial resolution of even today's best laptop and desktop displays, researchers and physicians often can't see phenomena that are too large, too small, too complex, or too distant. CAVE2, a next-generation, large-scale virtual environment, combines the benefits of scalable-resolution display walls with virtual-reality systems to create a revealing and seamless 2D and 3D environment that is becoming increasingly important in scientific discovery.
A decade ago, a British philosopher put forth the notion that the universe we live in might in fact be a computer simulation run by our descendants. While that seems far-fetched, perhaps even incomprehensible, a team of physicists at the University of Washington has come up with a potential test to see if the idea holds water.
By comparing simulations from 20 different computer models to satellite observations, Lawrence Livermore National Laboratory climate scientists and colleagues from 16 other organizations have found that tropospheric and stratospheric temperature changes are clearly related to human activities.
Like a homeowner prepping for a hurricane, the bacterium Bacillus subtilis uses a long checklist to prepare for survival in hard times. In a new study, scientists at Rice University and the University of Houston uncovered an elaborate mechanism that allows B. subtilis to begin preparing for survival, even as it delays the ultimate decision of whether to "hunker down" and withdraw into a hardened spore.
Metallic glass alloys (or liquid metals) are three times stronger than the best industrial steel, but can be molded into complex shapes with the same ease as plastic. These materials are highly resistant to scratching, denting, shattering, and corrosion. Mathematical methods developed by a Lawrence Berkeley National Laboratory scientist will help explain why liquid metals have wildly different breaking points.