Researchers at the University of California, San Diego School of Medicine and colleagues have proposed a new method that creates an ontology, or a specification of all the major players in the cell and the relationships between them. This computational model of the cell is built automatically from large datasets of gene and protein interactions, helping researchers identify potentially new biological components.
New research from North Carolina State University provides molecular-level insights into how cellulose breaks down in wood to create "bio-oils" that can be refined into any number of useful products. Using a supercomputer, the team calculated what occurs at the molecular level when wood is rapidly heated to high temperature in the absence of oxygen, a decomposition process known as pyrolysis.
Because of the limited spatial resolution of even today's best laptop and desktop displays, researchers and physicians often can't see phenomena that are too large, too small, too complex, or too distant. CAVE2, a next-generation, large-scale virtual environment, combines the benefits of scalable-resolution display walls with virtual-reality systems to create a seamless, revealing 2D and 3D environment that is becoming increasingly important in scientific discovery.
A decade ago, a British philosopher put forth the notion that the universe we live in might in fact be a computer simulation run by our descendants. While that seems far-fetched, perhaps even incomprehensible, a team of physicists at the University of Washington has come up with a potential test to see if the idea holds water.
By comparing simulations from 20 different computer models to satellite observations, Lawrence Livermore National Laboratory climate scientists and colleagues from 16 other organizations have found that tropospheric and stratospheric temperature changes are clearly related to human activities.
Like a homeowner prepping for a hurricane, the bacterium Bacillus subtilis uses a long checklist to prepare for survival in hard times. In a new study, scientists at Rice University and the University of Houston uncovered an elaborate mechanism that allows B. subtilis to begin preparing for survival, even as it delays the ultimate decision of whether to "hunker down" and withdraw into a hardened spore.
Metallic glass alloys (or liquid metals) are three times stronger than the best industrial steel, but can be molded into complex shapes with the same ease as plastic. These materials are highly resistant to scratching, denting, shattering, and corrosion. Mathematical methods developed by a Lawrence Berkeley National Laboratory scientist will help explain why liquid metals have wildly different breaking points.
Since the phenomenon was discovered in 1875, hydrogen embrittlement has been a persistent problem for the design of structural materials. Despite decades of research, experts have yet to fully understand the physics underlying the problem and must still resort to a trial-and-error approach. Now, a team of researchers has shown that the answer may be rooted in how hydrogen modifies material behavior at the nanoscale.
Using computer simulations, researchers from the University of California, Davis and the Chinese Academy of Sciences in Beijing have helped to solve a mystery that scientists have puzzled over since the early 1950s: What accounts for Earth's core density?
Researchers at Sandia National Laboratories and the University of New Mexico are comparing supercomputer simulations of blast waves on the brain with clinical studies of veterans suffering from mild traumatic brain injuries (TBIs) to help improve helmet designs.
A new way to make glass has been discovered by a collaboration of researchers at the Universities of Düsseldorf and Bristol using a method that controls how the atoms within a substance are arranged around each other. The researchers created the new type of glass in a computer by encouraging atoms in a nickel-phosphorus alloy to form polyhedra.
Recent work by scientists in Italy provides a new tool, colloidal crystals, for better understanding how sliding friction works in nanotribology. By theoretically studying these systems of charged microparticles, researchers can analyze friction forces through molecular dynamics simulations with unprecedented accuracy.
Conventional giant magnetoresistive devices and ferromagnetic tunnel junction devices provide only low-frequency oscillation and have been deemed unsuitable for applications requiring millimeter-wave (30-300 GHz) oscillation, including radar. Researchers in Japan have recently demonstrated, however, that oscillations of 5 to 140 GHz are theoretically possible in these devices when direct current is supplied.
With its deeply embedded roots, sturdy trunk, and dense profusion of branches, the Tree of Life is a structure of nearly unfathomable complexity and beauty. While major strides have been made to establish the evolutionary hierarchy encompassing every living species, the project is still in its infancy. At Arizona State University's Biodesign Institute, Sudhir Kumar has been filling in the Tree of Life by developing sophisticated methods and bioinformatics tools.
MSC Software Corporation this week announced that Stanford University is using its MSC Nastran and Marc simulation tools to conduct a new study on the testing and analysis of complex composite materials. The goals of the study are to reduce extensive and expensive testing programs, optimize the design of testing configurations and redefine structural deformation and failure processes.
Chemists at the California Institute of Technology have managed, for the first time, to simulate the biological function of a channel called the Sec translocon, which allows specific proteins to pass through membranes. The feat required bridging timescales from the realm of nanoseconds all the way up to full minutes, exceeding the scope of earlier simulation efforts by more than six orders of magnitude.
In this month's issue of R&D Magazine the editors explore the instrumentation and business strategies that help bring nanotechnology to the marketplace. Other features on dynamic light scattering, reverse engineering, click chemistry, product development, rapid prototyping, and simulation software are also included.
Simulation-based engineering design helped generate a first physical prototype of a microchannel heat exchanger.
Simulation tools have evolved from complicated, pricey programs to intelligent tools for use throughout the R&D process.
In the past, designers relied on numerous prototype rounds and tests to determine a design's feasibility. Despite technological advancements, many organizations continue to rely on spreadsheets or hand calculations during the design process. This approach may have worked in the past, but modern business speeds require a more efficient approach to product design.
Current engineering practice is to create numerical computer models to explore different design concepts and evaluate their performance. However, a more natural way to model a system is to use mathematics directly.
FEA predicts the initiation and evolution of damage in metals, providing an alternative to laboratory structural testing.
An imaging-systems developer accelerated the implementation of advanced thermal imaging filters and algorithms on FPGA hardware.
Researchers at the University of Colorado, Boulder, working through a university startup, Starcor, are developing an unmanned aerial vehicle (UAV) that will fly at speeds approaching Mach 1.4, faster than anything in the sub-50-kg vehicle category today, using an engine two to four times more efficient than any other in its class. The prototype is expected to be ready within a year.
The aerospace and defense community is considered a pioneer in physics-based simulation development and one of its earliest adopters. Design engineers use simulation software to create virtual representations of practically anything and everything, including complete unmanned aerial systems (UAS).