Using computer simulations, researchers from the University of California, Davis and the Chinese Academy of Sciences in Beijing have helped to solve a mystery that scientists have puzzled over since the early 1950s: What accounts for Earth's core density?
Researchers at Sandia National Laboratories and the University of New Mexico are comparing supercomputer simulations of blast waves on the brain with clinical studies of veterans suffering from mild traumatic brain injuries (TBIs) to help improve helmet designs.
Over the course of two weeks this fall, computer models made a startling sequence of correct and useful predictions. By running thousands of simulations on polling data, Nate Silver correctly forecast how all 50 states would vote for president. In the case of Hurricane Sandy, meteorologists identified the potential danger to the Northeast nearly a week before the storm arrived. Computer models of many kinds have improved in recent years, and the approach is finding new, unexpected uses.
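The core idea behind poll-aggregation forecasts of this kind can be sketched as a Monte Carlo simulation: draw each state's outcome from its polling margin and an error estimate, repeat thousands of times, and count how often each side reaches a winning total. The sketch below is a toy version of that idea, not Silver's actual model; the three-state race, its margins, and the error figures are invented for illustration.

```python
import random

def simulate_election(states, n_sims=10000, seed=42):
    """Monte Carlo election forecast: each simulation draws every state's
    margin from a normal distribution around its polling average, then
    tallies electoral votes whenever the candidate leads that draw."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n_sims):
        ev = 0
        for margin, error, votes in states:
            # Sample a plausible election-day margin for this state.
            if rng.gauss(margin, error) > 0:
                ev += votes
        if ev >= 270:  # majority of the 538 total electoral votes
            wins += 1
    return wins / n_sims

# Hypothetical three-state race: (polling margin, std. error, electoral votes)
states = [(+2.5, 3.0, 200), (-1.0, 3.0, 180), (+0.5, 3.0, 158)]
print(f"Win probability: {simulate_election(states):.2f}")
```

Fixing the random seed makes the forecast reproducible; in practice the interesting output is how the probability shifts as new polls update the margins.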
A new way to make glass has been discovered by a collaboration of researchers at the Universities of Düsseldorf and Bristol using a method that controls how the atoms within a substance are arranged around each other. The researchers created the new type of glass in a computer by encouraging atoms in a nickel-phosphorus alloy to form a polyhedron.
Recent work by scientists in Italy provides a new tool, colloidal crystals, for better understanding how sliding friction works in nanotribology. By theoretically studying these systems of charged microparticles, researchers can analyze friction forces through molecular dynamics simulations with unprecedented accuracy.
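Nanoscale sliding friction is commonly analyzed with the Prandtl-Tomlinson model: a tip dragged by a spring across a periodic substrate potential, which reproduces the stick-slip motion such simulations resolve. The following is a minimal overdamped sketch of that standard model, not the Italian group's code; all parameters are dimensionless and invented for illustration.

```python
import math

def tomlinson_friction(k=1.0, U0=1.5, a=1.0, v=0.01, damping=5.0,
                       dt=0.01, steps=200000):
    """Overdamped Prandtl-Tomlinson model: a tip at position x is pulled
    by a spring (stiffness k) whose support moves at velocity v, across
    a sinusoidal substrate potential of amplitude U0 and period a.
    Returns the time-averaged spring (friction) force."""
    x = 0.0
    forces = []
    for i in range(steps):
        support = v * i * dt
        spring = k * (support - x)  # pulling force from the moving support
        substrate = -(2 * math.pi * U0 / a) * math.sin(2 * math.pi * x / a)
        x += (spring + substrate) / damping * dt  # overdamped dynamics
        forces.append(spring)
    # Average over the latter half, after initial transients decay.
    tail = forces[len(forces) // 2:]
    return sum(tail) / len(tail)

print(f"Mean friction force: {tomlinson_friction():.3f}")
```

When the substrate corrugation U0 is large relative to the spring stiffness, the tip sticks and slips and the mean force is high; with a shallow potential the tip slides smoothly and friction nearly vanishes.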
Conventional giant magnetoresistive devices and ferromagnetic tunnel junction devices provide only low-frequency oscillation and have been deemed unsuitable for applications requiring millimeter-wave (30-300 GHz) oscillation, including radar. Researchers in Japan have recently demonstrated, however, that oscillations of 5 to 140 GHz are theoretically possible in these devices when a direct current is supplied.
With its deeply embedded roots, sturdy trunk, and dense profusion of branches, the Tree of Life is a structure of nearly unfathomable complexity and beauty. While major strides have been made to establish the evolutionary hierarchy encompassing every living species, the project is still in its infancy. At Arizona State University's Biodesign Institute, Sudhir Kumar has been filling in the Tree of Life by developing sophisticated methods and bioinformatics tools.
MSC Software Corporation this week announced that Stanford University is using its MSC Nastran and Marc simulation tools to conduct a new study on the testing and analysis of complex composite materials. The goals of the study are to reduce extensive and expensive testing programs, optimize the design of testing configurations and redefine structural deformation and failure processes.
Chemists at the California Institute of Technology have managed, for the first time, to simulate the biological function of a channel called the Sec translocon, which allows specific proteins to pass through membranes. The feat required bridging timescales from the realm of nanoseconds all the way up to full minutes, exceeding the scope of earlier simulation efforts by more than six orders of magnitude.
In this month's issue of R&D Magazine the editors explore the instrumentation and business strategies that help bring nanotechnology to the marketplace. Other features on dynamic light scattering, reverse engineering, click chemistry, product development, rapid prototyping, and simulation software are also included.
Simulation-based engineering design helped generate a first physical prototype of a microchannel heat exchanger.
Simulation tools have evolved from complicated, pricey programs to intelligent tools for use throughout the R&D process.
In the past, designers relied on numerous prototype rounds and tests to determine a design's feasibility. Despite technological advancements, many organizations continue to rely on spreadsheets or hand calculations during the design process. This approach may have worked in the past, but modern business speeds require a more efficient approach to product design.
Current engineering practice is to build numerical computer models to explore different design concepts and evaluate their performance. A more natural way to model a system, however, is to work directly with its mathematics.
FEA predicts the initiation and evolution of damage in metals, providing an alternative to laboratory structural testing.
An imaging systems developer accelerated the implementation of advanced thermal imaging filters and algorithms on FPGA hardware.
An unmanned aerial vehicle (UAV) that will fly at speeds approaching Mach 1.4 (faster than anything in the sub-50-kg vehicle category today), using an engine two to four times more efficient than any other in its class, is under development by researchers at the University of Colorado, Boulder, through a university startup, Starcor. The prototype is expected to be ready within a year.
The aerospace and defense community is considered a pioneer in physics-based simulation development and one of its earliest adopters. Design engineers use simulation software to create virtual representations of practically anything and everything, including complete unmanned aerial systems (UAS).
Nearly 100 years after a British neurologist first mapped the blind spots caused by missile wounds to the brains of soldiers, University of Pennsylvania scientists have perfected his map using modern-day technology. Their results yield a map of vision in the brain based on an individual's brain structure, even for people who cannot see, and could, among other things, guide efforts to restore vision using a neural prosthesis that stimulates the surface of the brain.
The natural decay of organic carbon contributes more than 90% of the yearly carbon dioxide released into Earth's atmosphere and oceans. Understanding the rate at which leaves decay can help scientists predict this global flux of carbon dioxide. But a single leaf may undergo different rates of decay depending on a number of variables. Researchers have just built a mathematical model that incorporates these variables, and have discovered a commonality within the diversity of leaf decay.
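A standard way to capture variable decay rates, which the general approach here builds on, is to treat a leaf as a mixture of components that each decay exponentially at their own rate, so total mass is a weighted sum of exponentials. The sketch below illustrates that formulation only; the component weights and rates are invented, and the published model's specifics differ.

```python
import math

def leaf_mass(t, components):
    """Remaining mass fraction at time t for a leaf modeled as a mixture
    of components, each with a weight w and first-order decay rate k:
        m(t) = sum_i w_i * exp(-k_i * t)
    """
    return sum(w * math.exp(-k * t) for w, k in components)

# Hypothetical leaf: 50% labile sugars (fast), 30% cellulose, 20% lignin (slow)
leaf = [(0.5, 2.0), (0.3, 0.5), (0.2, 0.05)]

print(f"Mass after 1 year: {leaf_mass(1.0, leaf):.3f}")
# A mixture of exponentials decays more slowly at long times than a single
# exponential at the average rate would predict:
k_avg = sum(w * k for w, k in leaf)
print(f"Single-rate prediction at t=3: {math.exp(-k_avg * 3.0):.3f} "
      f"vs mixture: {leaf_mass(3.0, leaf):.3f}")
```

The long-time slowdown happens because the fast components vanish first, leaving the recalcitrant fraction to dominate; that shift in composition is one reason a single leaf shows different apparent decay rates over its lifetime.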
A one-of-a-kind, high-tech modeling tool designed to simulate different situations on the electric power grid will be on display at the White House. Pacific Northwest National Laboratory researchers will join Energy Secretary Steven Chu to demonstrate how GridLAB-D, the result of a multi-year funding effort, can help power system operators, industry, innovators, and entrepreneurs understand how a change to one part of the power system impacts other parts of the grid.
What does yogurt look like over time? The food industry will soon be able to answer this question using a new fluid simulation tool developed by scientists at the University of Copenhagen, Denmark, as part of a broad partnership with other research institutions. The method differs significantly from known simulation methods, which use mesh structures whose vertices are locked in fixed positions. In the new method, the mesh is replaced by a dynamic structure whose vertices move one at a time.
University of Oregon scientists have found a way to correctly reproduce not only the structure but also important thermodynamic quantities, such as pressure and compressibility, of a large, multiscale system at variable levels of molecular coarse-graining.
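Coarse-graining replaces groups of atoms with single "beads", drastically reducing the degrees of freedom; the hard part, which the Oregon work addresses, is keeping structure and thermodynamic quantities consistent across resolutions. The mapping step itself can be sketched as a center-of-mass reduction. This is a toy illustration of the general technique, not the published method.

```python
def coarse_grain(positions, masses, group_size):
    """Map consecutive groups of `group_size` atoms onto single beads
    placed at each group's center of mass; bead mass is the group total."""
    beads = []
    for i in range(0, len(positions), group_size):
        pos = positions[i:i + group_size]
        m = masses[i:i + group_size]
        total = sum(m)
        com = tuple(sum(mj * xj[d] for mj, xj in zip(m, pos)) / total
                    for d in range(3))
        beads.append((com, total))
    return beads

# Toy chain: 4 unit-mass atoms along x, grouped 2 atoms per bead
atoms = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0), (3.0, 0.0, 0.0)]
beads = coarse_grain(atoms, [1.0] * 4, group_size=2)
print(beads)  # two beads at x = 0.5 and x = 2.5, each of mass 2
```

The mapping conserves total mass by construction; reproducing pressure and compressibility at the bead level requires carefully constructed effective interactions between beads, which is where the real difficulty lies.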
If increasing numbers of wind turbines and photovoltaic systems feed electrical energy into the energy grid, it becomes denser and more distributed. Researchers in Germany, using model simulations, have discovered that consumers and decentralized generators can easily self-synchronize. Their results indicate that the failure of an individual supply line in a decentralized grid is less likely to cause an outage of the network as a whole. But care must be taken when adding new lines.
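Grid models of this kind typically represent generators and consumers as coupled oscillators that phase-lock when network coupling outweighs the spread in power injections. The following Kuramoto-style sketch illustrates that general mechanism on a tiny ring network; it is a simplified first-order stand-in for the swing-equation models such studies use, and all parameters are invented.

```python
import math

def simulate_grid(power, adjacency, K=4.0, dt=0.01, steps=20000):
    """Kuramoto-style toy grid: node i has phase theta_i and net power
    injection P_i (positive = generator, negative = consumer):
        dtheta_i/dt = P_i + K * sum_j A_ij * sin(theta_j - theta_i)
    Returns the final phase-coherence order parameter r in [0, 1]."""
    n = len(power)
    theta = [0.1 * i for i in range(n)]  # mildly spread initial phases
    for _ in range(steps):
        dtheta = [
            power[i] + K * sum(adjacency[i][j] * math.sin(theta[j] - theta[i])
                               for j in range(n))
            for i in range(n)
        ]
        theta = [t + d * dt for t, d in zip(theta, dtheta)]
    # Order parameter r: 1 means fully phase-locked, 0 means incoherent.
    re = sum(math.cos(t) for t in theta) / n
    im = sum(math.sin(t) for t in theta) / n
    return math.hypot(re, im)

# Four-node ring: two small generators, two consumers (injections sum to zero)
P = [1.0, -1.0, 1.0, -1.0]
ring = [[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]]
print(f"Coherence r = {simulate_grid(P, ring):.3f}")
```

With strong coupling the ring settles into a phase-locked state (r near 1); weakening K or cutting a line reduces the coupling available to balance the power injections, which is the regime where line failures and new-line side effects become interesting.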
Over the past few decades, the hunt for extrasolar planets has yielded incredible discoveries. Now, planetary researchers have a new tool: simulated models of how planets are born. A team of researchers at The University of Texas at Austin is using supercomputers to model and simulate the protostellar disks that precede the formation of planets.