Simulation Tools

Data visualization tool helps find the 'unknown unknowns'

July 26, 2012 3:30 am | News | Comments

A research team at the Georgia Tech Research Institute has developed a software tool that enables users to perform in-depth analysis of modeling and simulation data, then visualize the results on screen. The new data analysis and visualization tool offers improved ease of use compared to similar tools, the researchers say, and could be readily adapted for use with existing data sets in a variety of disciplines.

3D motion of common cold virus offers hope for improved drugs

July 16, 2012 6:28 am | News | Comments

University of Melbourne researchers are now simulating in 3D the motion of the complete human rhinovirus, the most frequent cause of the common cold, on Australia's fastest supercomputer, paving the way for new drug development.

New simulation method avoids mesh generation, speeds process

July 15, 2012 3:28 pm | News | Comments

Computer simulations are indispensable, but standard finite element technology requires designers to carry out a time-consuming and often error-prone mesh generation step that transfers the computer-aided design (CAD) model into the simulation model. A student in Germany has just accelerated this process by directly integrating the CAD geometry into the finite element analysis, circumventing any mesh generation.
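
The article does not name the student's specific method, but the best-known approach of this kind, isogeometric analysis, reuses the spline basis functions that define the CAD geometry as the shape functions of the analysis, so no separate mesh has to be generated. The short Python sketch below illustrates only that shared-basis idea with a plain B-spline curve; it is an illustrative assumption, not the thesis work itself.

def bspline_basis(i, p, knots, u):
    """Cox-de Boor recursion: value at parameter u of the i-th B-spline
    basis function of degree p on the given knot vector."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + p] != knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) * bspline_basis(i, p - 1, knots, u)
    right = 0.0
    if knots[i + p + 1] != knots[i + 1]:
        right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) * bspline_basis(i + 1, p - 1, knots, u)
    return left + right

# A quadratic B-spline curve standing in for a CAD boundary. In a
# CAD-integrated analysis, these same basis functions also serve as the
# finite element shape functions, so the geometry never has to be meshed.
knots = [0, 0, 0, 1, 2, 3, 3, 3]
degree = 2
control_points = [(0.0, 0.0), (1.0, 2.0), (2.0, -1.0), (3.0, 1.0), (4.0, 0.0)]

u = 1.5
weights = [bspline_basis(i, degree, knots, u) for i in range(len(control_points))]
x = sum(w * px for w, (px, _) in zip(weights, control_points))
y = sum(w * py for w, (_, py) in zip(weights, control_points))
print((x, y))  # point on the exact CAD curve at parameter u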


Peering into the heart of a supernova

July 12, 2012 6:32 am | News | Comments

Using computer simulations, researchers from the California Institute of Technology have determined that if the interior of a dying star is spinning rapidly just before it explodes in a magnificent supernova, two different types of signals emanating from that stellar core will oscillate together at the same frequency. This could be a piece of "smoking-gun evidence" that would lead to a better understanding of supernovae.

Princeton researchers working at forefront of 'exascale' supercomputing

June 28, 2012 10:22 am | News | Comments

Scientists at Princeton University are composing the complex codes designed to instruct a new class of powerful computers that will allow researchers to tackle problems that were previously too difficult to solve. These supercomputers, operating at a speed called the "exascale," will produce realistic simulations of complex phenomena in nature such as fusion reactions, earthquakes, and climate change.

ORNL, UTK team maps the nuclear landscape

June 27, 2012 10:38 am | News | Comments

An Oak Ridge National Laboratory and University of Tennessee team has used the Jaguar supercomputer to calculate the number of isotopes allowed by the laws of physics. The team used a quantum approach known as density functional theory, applying it independently to six models of the nuclear interaction to determine that there are about 7,000 possible combinations of protons and neutrons allowed in bound nuclei with up to 120 protons.
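
As a purely illustrative sketch of the "apply the same method to several interaction models and pool the results" step, per-model counts could be combined as below. The six numbers are hypothetical; only the pooled figure of roughly 7,000 comes from the article.

from statistics import mean, stdev

# Hypothetical bound-nucleus counts from six nuclear-interaction models;
# the article reports only the combined estimate of about 7,000.
counts_per_model = [6800, 6950, 7000, 7050, 7100, 7200]
print(f"{mean(counts_per_model):.0f} +/- {stdev(counts_per_model):.0f} bound nuclei")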

X-ray vision exposes aerosol structures

June 27, 2012 9:23 am | News | Comments

Researchers at SLAC National Accelerator Laboratory have captured the most detailed images to date of airborne soot particles, a key contributor to global warming and a health hazard. The discovery reveals the particles' surprisingly complex nanostructures and could ultimately aid the understanding of atmospheric processes important to climate change, as well as the design of cleaner combustion sources, from car engines to power plants.

Researchers establish structure of a new superhard form of carbon

June 27, 2012 5:38 am | News | Comments

An international team led by Stony Brook University has established the structure of a new form of carbon. The team used a novel computational method to demonstrate that the properties of what had previously been thought to be only a hypothetical structure of a superhard form of carbon, called "M-carbon," perfectly matched the experimental data on "superhard graphite."


Researchers use computer model to probe mysteries of human immune system

June 25, 2012 4:21 am | News | Comments

A new computational model developed by a team of Virginia Tech researchers provides a framework to better understand responses of macrophage cells of the human immune system. The Virginia Tech team used the Metropolis algorithm, a computer simulation technique widely used in physics and chemistry, to enumerate possible molecular mechanisms giving rise to priming and tolerance.
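
For readers unfamiliar with the technique, the following is a minimal, generic Python sketch of the Metropolis acceptance rule. The energy function, proposal step, and temperature are illustrative placeholders, not the Virginia Tech team's macrophage signaling model.

import math
import random

def metropolis(energy, propose, x0, temperature=1.0, n_steps=10000):
    """Generic Metropolis sampler: energy(x) returns a scalar 'energy'
    (lower is more probable), propose(x) returns a candidate state near x.
    Returns the Markov chain of visited states."""
    x = x0
    chain = [x]
    for _ in range(n_steps):
        candidate = propose(x)
        delta = energy(candidate) - energy(x)
        # Always accept downhill moves; accept uphill moves with
        # the Boltzmann probability exp(-delta / temperature).
        if delta <= 0 or random.random() < math.exp(-delta / temperature):
            x = candidate
        chain.append(x)
    return chain

# Toy usage: sample a one-dimensional double-well energy landscape
# (purely illustrative; not the priming/tolerance mechanisms in the study).
energy = lambda x: (x * x - 1.0) ** 2
propose = lambda x: x + random.gauss(0.0, 0.5)
samples = metropolis(energy, propose, x0=0.0)
print(len(samples), sum(samples) / len(samples))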

Technique allows simulation of noncrystalline materials

June 22, 2012 3:36 am | by David Chandler, MIT News Office | News | Comments

A multidisciplinary team of researchers at Massachusetts Institute of Technology and in Spain has found a new mathematical approach to simulating the electronic behavior of noncrystalline materials, which may eventually play an important part in new devices including solar cells; organic LED lights; and printable, flexible electronic circuits.

Aircraft engineered with failure in mind may last longer

June 15, 2012 3:46 am | by Jennifer Chu, MIT News Office | News | Comments

Complex systems inhabit a "gray world" of partial failures, Massachusetts Institute of Technology's Olivier de Weck says: While a system may continue to operate as a whole, bits and pieces inevitably degrade. Over time, these small failures can add up to a single catastrophic failure, incapacitating the system. However, de Weck and his colleagues have created a design approach that tailors planes to fly in the face of likely failures.

Repelling the drop on top

June 7, 2012 5:28 am | News | Comments

Life would be a lot easier if the surfaces of window panes, corrosion coatings or microfluidic systems in medical labs could keep themselves free of water and other liquids. A new simulation program developed by researchers in Germany can now work out just how such surfaces have to look for a variety of applications.

Research identifies precise measurement of radiation damage

June 5, 2012 6:42 am | News | Comments

Lawrence Livermore National Laboratory researchers have for the first time identified a precise measurement of the amount of radiation damage that will occur in any given material. A full understanding of the early stages of the radiation damage process gives researchers better knowledge and tools to manipulate materials to their advantage.


Nuclear weapon simulations show performance in molecular detail

June 5, 2012 6:26 am | News | Comments

U.S. researchers are perfecting simulations that show a nuclear weapon's performance in precise molecular detail. Because international treaties forbid the detonation of nuclear test weapons, tools that can accurately depict an explosion are becoming critical for national defense.

Quantum computers will simulate particle collisions

May 31, 2012 11:13 am | News | Comments

Quantum computers are still years away, but a trio of theorists has already figured out at least one talent they may have. According to the theorists, physicists might one day use quantum computers to study the inner workings of the universe in ways that are far beyond the reach of even the most powerful conventional supercomputers.

How ion bombardment reshapes metal surfaces

May 23, 2012 4:28 am | News | Comments

Ion bombardment of metal surfaces is an important, but poorly understood, nanomanufacturing technique. New research using sophisticated supercomputer simulations has shown what goes on in trillionths of a second. The advance could lead to better ways to predict the phenomenon and more uses of the technique to make new nanoscale products.

COMSOL releases Multiphysics Version 4.3

May 21, 2012 1:36 pm | News | Comments

Multiphysics, COMSOL’s software environment for modeling and simulating any physics-based system, recently received a major update. New capabilities in version 4.3 include three new discipline-specific add-on modules, fast and powerful meshing, a new "Double Dogleg" solver for mechanical contact and highly nonlinear simulations, and numerous user-inspired enhancements.

Navy pilot training enhanced by AEMASE smart machine

May 16, 2012 9:41 am | News | Comments

Navy pilots and other flight specialists will soon have a new "smart machine" installed in training simulators that learns from expert instructors how to train students more efficiently. Sandia National Laboratories' AEMASE is being provided to the Navy as a component of flight simulators.

Modeling the lithium-ion battery

May 15, 2012 5:30 am | White Papers

Modeling and simulation tools can help researchers understand and optimize the design of lithium-ion batteries.

Nanotube 'sponge' has potential in oil spill cleanup

May 10, 2012 10:11 am | News | Comments

A carbon nanotube sponge that can soak up oil in water with unparalleled efficiency has been developed with help from computational simulations performed at Oak Ridge National Laboratory.

Scientists develop large-scale simulation of human blood

May 2, 2012 5:23 am | News | Comments

A team of biomedical engineers and hematologists at the University of Pennsylvania has made large-scale, patient-specific simulations of blood function under the flow conditions found in blood vessels, using robots to run hundreds of tests on human platelets responding to combinations of activating agents that cause clotting.

A True Picture of a Product

April 19, 2012 11:43 am | by Jim Clark, Business Manager, Konica Minolta 3D Scanning Labs, Ramsey, N.J. | Articles | Comments

3D laser scanning technology captures comprehensive dimensional data for R&D, simulations, product testing, and quality control.

Simulate First; Build to Last

April 19, 2012 11:24 am | by Thierry Marchal, Industry Director, ANSYS Inc., Canonsburg, Pa. | Articles | Comments

Analyzing and modifying design parameters early and often can help companies engineer better products.

New institute to tackle 'data tsunami' challenge

April 19, 2012 6:19 am | by Louise Lerner | News | Comments

Many simulations and experiments already generate petabytes of data—a single petabyte is 2,000 times more data than you can fit on a typical laptop—and they will soon be generating exabytes. The Department of Energy’s newly established Scalable Data Management, Analysis, and Visualization (SDAV) Institute is intended to help scientists deal with the deluge of data.
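
As a back-of-the-envelope check of that comparison, assuming a typical laptop drive of about 500 GB (a figure the article does not state):

PETABYTE_GB = 1_000_000   # 1 petabyte = 10^6 gigabytes (decimal units)
LAPTOP_GB = 500           # assumed typical laptop capacity
print(PETABYTE_GB / LAPTOP_GB)          # 2000.0 laptops per petabyte
print(1000 * PETABYTE_GB / LAPTOP_GB)   # 2,000,000 laptops per exabyte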

Astronomers put forward new theory on size of black holes

March 23, 2012 1:25 pm | News | Comments

Black holes grow by sucking in gas, which forms a disc around the hole and spirals in. But this usually happens too slowly to explain the great size of black holes at the center of many galaxies, including ours. A new theory likens the process to a "Wall of Death" stunt in which two motorcycles, here gas discs, collide and both quickly fall into the hole.
