Computer simulations are essential for testing theories and exploring what is inaccessible to direct experiment. But digital computers cannot integrate exact, continuous equations of motion; they must slice time into discrete steps, which introduces persistent errors, known as "shadow work," that distort the results. Scientists have now learned to separate the physically realistic aspects of a simulation from these artifacts of the computational method.
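A toy illustration (not from the article) of how time-slicing creates such artifacts: integrating a simple harmonic oscillator with the explicit Euler scheme pumps spurious energy into the system every step, while the symplectic velocity-Verlet scheme keeps the energy error bounded.

```python
# Harmonic oscillator with mass = spring constant = 1, so the exact
# energy 0.5*v**2 + 0.5*x**2 should stay constant forever.
def energy(x, v):
    return 0.5 * v * v + 0.5 * x * x

def euler(x, v, dt, steps):
    # Explicit Euler: each step multiplies the energy by (1 + dt**2),
    # a steady artificial heating analogous to "shadow work".
    for _ in range(steps):
        x, v = x + dt * v, v - dt * x
    return x, v

def verlet(x, v, dt, steps):
    # Velocity Verlet (symplectic): energy error stays bounded.
    for _ in range(steps):
        v_half = v - 0.5 * dt * x
        x = x + dt * v_half
        v = v_half - 0.5 * dt * x
    return x, v

x0, v0, dt, steps = 1.0, 0.0, 0.01, 10_000
e0 = energy(x0, v0)
e_euler = energy(*euler(x0, v0, dt, steps))
e_verlet = energy(*verlet(x0, v0, dt, steps))
print(f"Euler drift:  {abs(e_euler - e0):.4f}")
print(f"Verlet drift: {abs(e_verlet - e0):.6f}")
```

After 10,000 steps the Euler energy has grown by roughly a factor of e, while the Verlet energy error remains at the 1e-5 level; separating physics from method means accounting for exactly this kind of scheme-dependent drift.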
Researchers at The University of Texas at Austin have designed a simulation that, for the first time, emulates key properties of electronic topological insulators. Their simulation is part of a rapidly moving scientific race to understand and exploit the potential of topological insulators, a state of matter discovered only in the past decade.
Researchers at the University of California, San Diego School of Medicine and colleagues have proposed a new method that creates an ontology, or a specification of all the major players in the cell and the relationships between them. This computational model of the cell is made from large networks of gene and protein interactions, and is created automatically from large datasets, helping researchers see potentially new biological components.
New research from North Carolina State University provides molecular-level insights into how cellulose breaks down in wood to create "bio-oils," which can be refined into any number of useful products. Using a supercomputer, the team calculated what occurs at the molecular level when wood is rapidly heated to high temperatures in the absence of oxygen, a decomposition process known as pyrolysis.
Because of the limited spatial resolution of even today's best laptop and desktop displays, researchers and physicians often can't see phenomena that are too large, too small, too complex, or too distant. CAVE2, a next-generation, large-scale virtual environment, combines the benefits of scalable-resolution display walls with virtual-reality systems to create a revealing, seamless 2D and 3D environment that is becoming increasingly important in scientific discovery.
A decade ago, an Oxford philosopher put forth the notion that the universe we live in might in fact be a computer simulation run by our descendants. While that seems far-fetched, perhaps even incomprehensible, a team of physicists at the University of Washington has come up with a potential test to see if the idea holds water.
By comparing simulations from 20 different computer models to satellite observations, Lawrence Livermore National Laboratory climate scientists and colleagues from 16 other organizations have found that tropospheric and stratospheric temperature changes are clearly related to human activities.
Like a homeowner prepping for a hurricane, the bacterium Bacillus subtilis uses a long checklist to prepare for survival in hard times. In a new study, scientists at Rice University and the University of Houston uncovered an elaborate mechanism that allows B. subtilis to begin preparing for survival, even as it delays the ultimate decision of whether to "hunker down" and withdraw into a hardened spore.
Metallic glass alloys (or liquid metals) are three times stronger than the best industrial steel, but can be molded into complex shapes with the same ease as plastic. These materials are highly resistant to scratching, denting, shattering, and corrosion. Mathematical methods developed by a Lawrence Berkeley National Laboratory scientist will help explain why liquid metals have wildly different breaking points.
Since the phenomenon was discovered in 1875, hydrogen embrittlement has been a persistent problem for the design of structural materials. Despite decades of research, experts have yet to fully understand the physics underlying the problem and must still resort to a trial-and-error approach. Now, a team of researchers has shown that the answer may be rooted in how hydrogen modifies material behavior at the nanoscale.
Using computer simulations, researchers from the University of California, Davis and the Chinese Academy of Sciences in Beijing have helped to solve a mystery that scientists have puzzled over since the early 1950s: What accounts for Earth's core density?
Researchers at Sandia National Laboratories and the University of New Mexico are comparing supercomputer simulations of blast waves on the brain with clinical studies of veterans suffering from mild traumatic brain injuries (TBIs) to help improve helmet designs.
Over the course of two weeks this fall, computer models made a startling sequence of correct and useful predictions. By running thousands of simulations on polling data, Nate Silver correctly forecasted how all 50 states would vote for president. In the case of Hurricane Sandy, meteorologists identified the potential danger to the Northeast nearly a week before the storm arrived. Computer models of many kinds have improved in recent years, and the approach is finding new, unexpected uses.
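The "thousands of simulations on polling data" approach can be illustrated with a toy Monte Carlo sketch. The state names, electoral votes, margins, and uncertainties below are invented for illustration; Silver's actual model is far richer (pollster weighting, correlated state errors, and more).

```python
# Toy election forecast: treat each state's polled margin as a noisy
# estimate, simulate many elections, and report the fraction won.
import random

random.seed(0)  # reproducible runs

# state -> (electoral votes, polled margin in points, poll uncertainty)
# All numbers are made up for the sketch.
states = {
    "A": (29, +2.0, 3.0),
    "B": (18, -1.0, 3.0),
    "C": (10, +5.0, 2.5),
    "D": (13, -4.0, 3.5),
}
TOTAL = sum(ev for ev, _, _ in states.values())

def simulate_once():
    """Draw one hypothetical election; return electoral votes won."""
    won = 0
    for ev, margin, sigma in states.values():
        if random.gauss(margin, sigma) > 0:  # candidate carries the state
            won += ev
    return won

runs = 20_000
wins = sum(1 for _ in range(runs) if simulate_once() > TOTAL / 2)
print(f"estimated win probability: {wins / runs:.2f}")
```

The forecast is not a single predicted outcome but a probability: the share of simulated elections the candidate wins, which is what made the state-by-state calls above possible.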
A new way to make glass has been discovered by a collaboration of researchers at the Universities of Düsseldorf and Bristol using a method that controls how the atoms within a substance are arranged around each other. The researchers created the new type of glass in a computer by encouraging atoms in a nickel-phosphorous alloy to form polyhedra.
Recent work by scientists in Italy provides a new tool for understanding how sliding friction works in nanotribology, using colloidal crystals. By studying these systems of charged microparticles theoretically, researchers can analyze friction forces through molecular dynamics simulations with unprecedented accuracy.
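As an illustration of the physics involved, not the authors' actual simulations, here is a minimal sketch of the Prandtl-Tomlinson model, the textbook picture of atomic-scale sliding friction: a damped particle dragged by a spring across a periodic substrate potential exhibits stick-slip motion when the corrugation is strong relative to the spring.

```python
# Overdamped Prandtl-Tomlinson model: gamma * dx/dt = F_spring + F_substrate.
# All parameter values are illustrative.
import math

def drag(U0=5.0, k=1.0, v=0.1, gamma=2.0, dt=0.01, steps=40_000):
    """Drag a particle across a sinusoidal potential; return the
    spring (friction) force measured at every step."""
    x = 0.0
    forces = []
    for n in range(steps):
        support = v * dt * n                # support moves at constant speed
        f_spring = k * (support - x)        # measured lateral force
        f_substrate = -U0 * math.sin(x)     # periodic substrate, period 2*pi
        x += dt / gamma * (f_spring + f_substrate)
        forces.append(f_spring)
    return forces

f = drag()
print(f"peak force {max(f):.2f}, mean force {sum(f) / len(f):.2f}")
```

Because the corrugation U0 exceeds the spring stiffness k, the force trace is a sawtooth: the particle sticks in a potential well while the spring force ramps up to nearly U0, then slips forward by one lattice period and the force collapses. Tuning U0/k below 1 makes the sliding smooth and nearly frictionless, the kind of transition such colloid experiments can probe directly.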
Conventional giant magnetoresistive devices or ferromagnetic tunnel junction devices provide only low-frequency oscillation and have been deemed unsuitable for applications requiring millimeter-wave (30-300 GHz) oscillation, including radar. Researchers in Japan have recently demonstrated, however, that oscillations of 5 to 140 GHz are theoretically possible in these devices by supplying direct current.
With its deeply embedded roots, sturdy trunk, and dense profusion of branches, the Tree of Life is a structure of nearly unfathomable complexity and beauty. While major strides have been made to establish the evolutionary hierarchy encompassing every living species, the project is still in its infancy. At Arizona State University's Biodesign Institute, Sudhir Kumar has been filling in the Tree of Life by developing sophisticated methods and bioinformatics tools.
MSC Software Corporation this week announced that Stanford University is using its MSC Nastran and Marc simulation tools to conduct a new study on the testing and analysis of complex composite materials. The goals of the study are to reduce extensive and expensive testing programs, optimize the design of testing configurations and redefine structural deformation and failure processes.
Chemists at the California Institute of Technology have managed, for the first time, to simulate the biological function of a channel called the Sec translocon, which allows specific proteins to pass through membranes. The feat required bridging timescales from the realm of nanoseconds all the way up to full minutes, exceeding the scope of earlier simulation efforts by more than six orders of magnitude.
In this month's issue of R&D Magazine the editors explore the instrumentation and business strategies that help bring nanotechnology to the marketplace. Other features on dynamic light scattering, reverse engineering, click chemistry, product development, rapid prototyping, and simulation software are also included.
Simulation-based engineering design helped generate a first physical prototype of a microchannel heat exchanger.
Simulation tools have evolved from complicated, pricey programs to intelligent tools for use throughout the R&D process.
In the past, designers relied on numerous prototype rounds and tests to determine a design's feasibility. Despite technological advancements, many organizations continue to rely on spreadsheets or hand calculations during the design process. This approach may have worked in the past, but modern business speeds require a more efficient approach to product design.
Current engineering practice builds computer models that are numerical in nature to explore different design concepts and evaluate their performance. A more natural way to model a system, however, is with mathematics: symbolic equations that describe the system's behavior directly.
FEA predicts the initiation and evolution of damage in metals, providing an alternative to laboratory structural testing.
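A minimal sketch of the finite-element machinery underneath such predictions, restricted here to a linear-elastic 1D bar (real damage models layer failure and evolution criteria on top of this; all numbers below are made up):

```python
# 1D FEA of a bar fixed at the left end and pulled at the right tip:
# assemble the global stiffness matrix from element stiffnesses EA/le,
# then solve K u = F for the nodal displacements.
E, A, L, P = 200e9, 1e-4, 2.0, 1e4   # steel-like bar, illustrative load
n_el = 4                              # number of elements
le = L / n_el
ke = E * A / le                       # element stiffness

# Global stiffness over free nodes 1..n_el (node 0 is clamped).
n = n_el
K = [[0.0] * n for _ in range(n)]
for e in range(n_el):                 # element e connects nodes e and e+1
    i, j = e - 1, e                   # indices after removing fixed node 0
    if i >= 0:
        K[i][i] += ke
        K[i][j] -= ke
        K[j][i] -= ke
    K[j][j] += ke
F = [0.0] * n
F[-1] = P                             # point load at the free tip

# Gaussian elimination (K is small, symmetric positive definite).
for c in range(n):
    for r in range(c + 1, n):
        m = K[r][c] / K[c][c]
        for c2 in range(c, n):
            K[r][c2] -= m * K[c][c2]
        F[r] -= m * F[c]
u = [0.0] * n
for r in range(n - 1, -1, -1):
    s = F[r] - sum(K[r][c] * u[c] for c in range(r + 1, n))
    u[r] = s / K[r][r]

exact = P * L / (E * A)               # analytic tip displacement
print(f"FEA tip displacement {u[-1]:.6e}  (exact {exact:.6e})")
```

For this simple load case the discrete solution matches the analytic result P*L/(E*A) at the nodes; damage-capable FEA replaces the constant element stiffness with one that degrades as a damage variable evolves under the computed stresses.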