Over the course of two weeks this fall, computer models made a startling sequence of correct and useful predictions. By running thousands of simulations on polling data, Nate Silver correctly forecast how all 50 states would vote for president. In the case of Hurricane Sandy, meteorologists identified the potential danger to the Northeast nearly a week before the storm arrived. Computer models of many kinds have improved in recent years, and the approach is finding new, unexpected uses.
According to the Top500 list, the semiannual ranking of computing systems around the world that was announced Monday morning, Oak Ridge National Laboratory’s Titan is now the world’s most potent supercomputer. It eclipses the most recent top performer, Lawrence Livermore National Laboratory’s Sequoia, with a tested speed of 17.59 petaflops. Titan is a Cray XK7 hybrid system, built from 16-core processors equipped with graphics processing unit (GPU) accelerators.
Most electronic data is stored on magnetic hard drives, whose capacity cannot simply be increased by enlarging them: the spinning speeds required at larger sizes strain components. Researchers in Singapore report that an alternative technology, heat-assisted magnetic recording (HAMR), is now a significant step closer to commercial realization. The method has the potential to double the storage capacity of a given hard drive.
At IBM, scientists have for the first time precisely placed and tested more than 10,000 carbon nanotube devices in a single chip using mainstream manufacturing processes. Achieved through conventional chemistry, materials, and wafer fabrication methods, the invention helps validate the use of carbon nanotube technology for future electronic circuit design.
The U.S. Department of Energy's Oak Ridge National Laboratory launched a new era of scientific supercomputing on Tuesday with Titan, a system capable of churning through more than 20,000 trillion calculations each second—or 20 petaflops—by employing a family of processors called graphics processing units, first created for computer gaming. Titan will be 10 times more powerful than ORNL's last world-leading system, Jaguar.
With the launch of Windows 8, people are about to discover a computing experience unlike anything they've seen before. The Associated Press has written a brief guide to getting past some of the hurdles of this markedly different operating system experience.
A new computer algorithm developed at the University at Buffalo can analyze the footwear marks left at a crime scene, grouping them into clusters by footwear type, make, and tread pattern. The tool is able to group recurring patterns in a database of footwear marks, even if the imprint recorded by crime scene investigators is distorted or only a partial print.
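The clustering idea can be illustrated with a toy sketch, which is not the university's actual algorithm: represent each impression as a grid of tread cells (with unknown cells for a partial print), score similarity only over the cells both impressions share, and greedily group impressions that exceed a similarity threshold. All names and thresholds here are assumptions for illustration.

```python
# Toy sketch of grouping footwear impressions by tread similarity.
# Patterns are hypothetical binary grids; real systems extract far richer features.

def similarity(a, b):
    """Fraction of agreeing cells, ignoring unknown (None) cells in a partial print."""
    known = [(x, y) for x, y in zip(a, b) if x is not None and y is not None]
    if not known:
        return 0.0
    return sum(x == y for x, y in known) / len(known)

def cluster(prints, threshold=0.8):
    """Greedy grouping: join a print to the first cluster whose first
    member is similar enough, else start a new cluster."""
    clusters = []
    for p in prints:
        for c in clusters:
            if similarity(c[0], p) >= threshold:
                c.append(p)
                break
        else:
            clusters.append([p])
    return clusters

full    = [1, 1, 0, 0, 1, 0, 1, 1]                  # complete impression
partial = [1, 1, 0, 0, None, None, None, None]      # partial print: unknown cells
other   = [0, 0, 1, 1, 0, 1, 0, 0]                  # a different tread pattern

groups = cluster([full, partial, other])
print(len(groups))  # -> 2: the partial print joins the matching full impression
```

Because unknown cells are simply excluded from the score, a distorted or partial print can still land in the same cluster as its complete counterpart, which is the property the Buffalo tool is described as having.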
A new study by Northwestern University researchers has revealed that public domain name system (DNS) services could actually slow down users’ web-surfing experience. As a result, the researchers have developed a solution to help avoid such an impact: a tool called “namehelp” that could speed web performance by 40%.
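One reason resolver choice affects perceived speed is the round trip a lookup costs before any page content can load. The toy sketch below, which is an assumption for illustration and not namehelp's actual mechanism, simulates a resolver with a local cache: the first lookup pays a fake upstream delay, while a repeat lookup is answered locally.

```python
# Toy illustration of why DNS resolution latency matters: a cached answer
# skips the round trip that a query to a distant public resolver requires.
import time

FAKE_UPSTREAM_DELAY = 0.05  # pretend 50 ms to reach a remote resolver

class CachingResolver:
    def __init__(self):
        self.cache = {}

    def resolve(self, name):
        if name in self.cache:
            return self.cache[name]       # local hit: no upstream trip
        time.sleep(FAKE_UPSTREAM_DELAY)   # simulated upstream query
        addr = "192.0.2.1"                # documentation address (RFC 5737)
        self.cache[name] = addr
        return addr

r = CachingResolver()
t0 = time.perf_counter(); r.resolve("example.com"); cold = time.perf_counter() - t0
t0 = time.perf_counter(); r.resolve("example.com"); warm = time.perf_counter() - t0
print(warm < cold)  # -> True
```

The real study also concerns where a resolver is located, since content delivery networks use the resolver's position to pick a server; the sketch only captures the latency side of that story.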
So far, quantum researchers have only been able to manipulate small numbers of qubits, not enough for a practical machine. But researchers at Princeton University have developed a method that may allow the quick and reliable transfer of quantum information throughout a computing device, potentially allowing engineers to build computers consisting of millions of quantum bits.
Through a new website unveiled Wednesday, Google is opening a virtual window into the secretive data centers where an intricate maze of computers process Internet search requests, show YouTube video clips, and distribute email for millions of people. The photographic access to Google's data centers coincides with the publication of a Wired magazine article about how the company builds and operates them.
After leading mass spectrometer manufacturers agreed to license their technology, researchers have developed software that allows scientists to easily use and share research data collected across proprietary instrument platforms. Called the ProteoWizard Toolkit, this cross-platform set of libraries and applications is expected to bolster large-scale biological research and help improve the understanding of complex diseases like cancer.
People can let their fingers—and hands—do the talking with a new touch-activated system that projects onto walls and other surfaces and allows users to interact with their environment and each other. Developed at Purdue University, the "extended multitouch" system allows more than one person to use a surface at the same time and also enables people to use both hands, distinguishing between the right and left hand.
Named for the Greek word for wisdom, Sophia is a software sentry developed at Idaho National Laboratory that can passively monitor communication pathways in a static computer network and flag new types of conversations so operators can decide if a threat is present. It is the first such cybersecurity technology for SCADA control system network administrators that is being evaluated for deployment to industry.
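Because a control system network is largely static, monitoring of the kind described can be sketched as an allowlist of observed conversations: learn the flows seen during a baseline period, then flag anything outside that set for an operator. This is a hedged illustration of the general approach, not Sophia's implementation, and the host names and ports below are hypothetical.

```python
# Allowlist-style flow monitoring sketch: learn (src, dst, port) conversations
# during a baseline period, then flag flows that were never seen before.

def learn_baseline(flows):
    """Record every conversation observed while the network is known-good."""
    return set(flows)

def flag_new(baseline, flows):
    """Return the flows that fall outside the learned baseline."""
    return [f for f in flows if f not in baseline]

baseline = learn_baseline([
    ("plc-1", "historian", 502),   # routine Modbus traffic (hypothetical names)
    ("hmi", "plc-1", 502),
])

observed = [
    ("hmi", "plc-1", 502),         # known conversation: ignored
    ("plc-1", "10.0.0.99", 4444),  # never seen before: surfaced to an operator
]

print(flag_new(baseline, observed))  # -> [('plc-1', '10.0.0.99', 4444)]
```

As in the description of Sophia, the monitor itself is passive and makes no blocking decision; it only surfaces new conversation types so a human can judge whether a threat is present.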
Today's life scientists are producing genomes galore. But there's a problem: The latest DNA sequencing instruments are burying researchers in trillions of bytes of data and overwhelming existing tools in biological computing. It doesn't help that there's a variety of sequencing instruments feeding a diverse set of applications. Researchers from Iowa State University are developing a set of solutions using high-performance computing.
A one-of-a-kind, high-tech modeling tool designed to simulate different situations on the electric power grid will be on display at the White House. Pacific Northwest National Laboratory researchers will join Energy Secretary Steven Chu to demonstrate how GridLAB-D, the result of a multi-year funding effort, can help power system operators, industry, innovators, and entrepreneurs understand how making a change to one part of the power system impacts other parts of the grid.
Researchers at Rice University are designing transparent, two-terminal, 3D computer memories on flexible sheets that show promise for electronics and sophisticated heads-up displays. The technique is based on the switching properties of silicon oxide.
One hundred years after the birth of mathematician and computer scientist Alan Turing, whose “Turing test” stands as one of the foundational definitions of what constitutes true machine intelligence, a virtual “gamer” created by computer scientists at The University of Texas at Austin has won the annual BotPrize. The software-based bot convinced a panel of judges that it was more human-like than half of the human players it competed against.
No longer limited to narrow focus groups, painstaking in-person surveys, or artificially controlled studies, researchers today have a far easier time compiling and manipulating large data sets. At the same time, however, sharing such data can be fraught with risks. Researchers with the “Privacy Tools for Sharing Research Data” project at Harvard University aim to keep the flexibility and convenience of sharing large amounts of data while more fully protecting individual privacy.
This week, an open innovation challenge called Mozilla Ignite announced eight winning ideas for innovative applications that offer a glimpse of what the Internet's future might look like, and what the lives of Americans may look like as well. The challenge called for stellar application, or "app," ideas from anywhere in the world that would advance national priorities such as health care, public safety, clean energy, and transportation.
As data centers continue to come under scrutiny for the amount of energy they use, researchers at the University of Toronto Scarborough have a suggestion: turn the air conditioning down. Their latest research suggests that raising the temperature could save energy with little or no increased risk of equipment failure.
A Purdue University physicist, Leonid Rokhinson, has observed evidence of long-sought Majorana fermions, special particles that could unleash the potential of fault-tolerant quantum computing. Rokhinson led a team that is the first to successfully demonstrate the fractional a.c. Josephson effect, which is a signature of the particles.
The human body is proficient at making collagen. And human laboratories are getting better at it all the time. In a development that could lead to better drug design and new treatments for disease, Rice University researchers have made a major step toward synthesizing custom collagen. The scientists who have learned how to make collagen are now digging into its molecular structure to see how it forms and interacts with biological systems.
As cloud computing becomes more popular, new techniques to protect the systems must be developed. Computer scientists in Texas have developed a technique that automatically allows one computer in a virtual network to monitor another for intrusions, viruses, or anything else that could cause a computer to malfunction. Dubbed “space travel,” the technique bridges the gap between computer hardware and software systems.
University of Oregon scientists have found a way to correctly reproduce not only the structure but also important thermodynamic quantities, such as pressure and compressibility, of a large, multiscale system at variable levels of molecular coarse-graining.
Designed to improve the way businesses manage the scientific innovation lifecycle, the new Accelrys Process Management and Compliance Suite unifies Accelrys Inc.’s lifecycle management software offerings, covering the ground between product development and process execution. It is geared to help companies bring products to market faster and at a lower cost, while meeting critical quality and regulatory compliance objectives.