An international team of researchers affiliated with Göttingen University in Germany has found a way to store vast amounts of data—up to one petabyte—per square inch. The scientists developed a unique molecule with an exploitable electron that carries a spin. This serves as the memory for their electronic device, which can be read out by a magnetic reference electrode at room temperature.
The National Science Foundation (NSF) and the journal Science this week announced the 53 winners and honorable mentions of the International Science & Technology Visualization Challenge, a contest the two jointly sponsor. The winning entries highlight the often stunning capabilities of computer-aided visualization techniques.
Watson, the supercomputer famous for beating the world's best human "Jeopardy!" champions, is going to college. IBM today says it will provide a Watson system to Rensselaer Polytechnic Institute, the first time the computer is being sent to a university. Just like the flesh-and-blood students who will work on it, Watson is leaving home to sharpen its skills.
Good grammar helps people make themselves understood. But when used to concoct a long computer password, grammar provides crucial hints that can help someone crack that password. Carnegie Mellon University researchers have recently demonstrated this fact by developing a grammar-aware password-cracking algorithm that surpassed the capabilities of other state-of-the-art password crackers.
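The underlying intuition can be sketched in a few lines. This is a hypothetical toy illustration, not the CMU algorithm: the pools, template, and `candidates` function are invented here to show why a grammatical passphrase admits far fewer guesses than its raw length suggests.

```python
from itertools import product

# Toy part-of-speech pools (invented for illustration). A passphrase built
# from a grammatical template draws each word from one of these small pools.
POS_POOLS = {
    "det":  ["the", "my", "a"],
    "adj":  ["red", "lazy", "tiny"],
    "noun": ["dog", "car", "hat"],
    "verb": ["eats", "has", "sees"],
}

def candidates(template):
    """Yield every passphrase that matches a part-of-speech template."""
    pools = [POS_POOLS[tag] for tag in template]
    for words in product(*pools):
        yield "".join(words)

# A 16-character passphrase like "theredcareatshat" looks strong, yet its
# template admits only 3 ** 5 = 243 grammatical candidates.
template = ["det", "adj", "noun", "verb", "noun"]
space = 1
for tag in template:
    space *= len(POS_POOLS[tag])
print(space)  # 243
```

A brute-force attacker who models the grammar can therefore skip the astronomically larger space of arbitrary 16-character strings, which is the structural weakness the CMU work exposed.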
Dow has announced the Dow Innovation Center, a new research facility at the University of Illinois Urbana-Champaign that will develop data management solutions. At the same time, Dow has entered into an industry partnership with the National Center for Supercomputing Applications, gaining access to expertise and equipment that will accelerate Dow’s discovery processes.
Teams of scientists from across Europe are vying for a funding bonanza that could see two of them receive more than a billion dollars over 10 years to keep the continent at the cutting edge of technology. The contest began with 26 proposals, and just four have made it to the final round, including a plan to develop digital guardian angels, an accurate model of the human brain, and better ways to produce and use graphene.
A new NASA-funded prototype system developed by the National Center for Atmospheric Research now is providing weather forecasts that can help flights avoid major storms as they travel over remote ocean regions. The eight-hour forecasts of potentially dangerous atmospheric conditions are designed for pilots, air traffic controllers and others involved in transoceanic flights.
Because of the limited spatial resolution of even today's best-quality laptop and desktop displays, researchers and physicians often can’t see phenomena that are too large, too small, too complex, or too distant. CAVE2, a next-generation, large-scale virtual environment, combines the benefits of scalable-resolution display walls with virtual-reality systems to create a revealing and seamless 2D and 3D environment that is becoming increasingly important in scientific discovery.
After more than a decade of research, chip engineers at IBM Research have built a scalable, fab-ready microchip that successfully integrates a complete optical package built from silicon. This silicon nanophotonics breakthrough allows the new chip, which is built on an existing high-performance 90-nm CMOS fabrication line, to exceed a transceiver data rate of 25 Gbps per channel.
Apple’s newest iMac computer line, which went on sale last week, has something in common with the world of shipbuilding. Refined for use in constructing vessels by the Office of Naval Research, friction-stir welding enabled the design of the new iMacs. The process uses heat and pressure to join metals, and was used to achieve an extra-thin aluminum-bodied computer.
According to the Top500 list, the semiannual ranking of computing systems around the world that was announced Monday morning, Oak Ridge National Laboratory’s Titan is now the world’s most potent supercomputer. It eclipses the most recent top performer, Lawrence Livermore National Laboratory’s Sequoia, with a speed of 17.59 petaflops in testing. Titan is a Cray XK7 hybrid system, built from 16-core processors equipped with graphics processing unit (GPU) accelerators.
Most electronic data is stored on magnetic hard drives that cannot simply be enlarged to store more data. The required spinning speed for larger sizes strains components. Researchers in Singapore report that an alternative technology, heat-assisted magnetic recording (HAMR), is now a significant step closer to commercial realization. The method has the potential to double storage capacity for a given hard drive.
At IBM, scientists have for the first time precisely placed and tested more than 10,000 carbon nanotube devices in a single chip using mainstream manufacturing processes. Achieved through conventional chemistry, materials, and wafer fabrication methods, the advance helps validate the use of carbon nanotube technology for future electronic circuit design.
The U.S. Department of Energy's Oak Ridge National Laboratory launched a new era of scientific supercomputing on Tuesday with Titan, a system capable of churning through more than 20,000 trillion calculations each second—or 20 petaflops—by employing a family of processors called graphics processing units, first created for computer gaming. Titan will be 10 times more powerful than ORNL's last world-leading system, Jaguar.
A new computer algorithm developed at the University of Buffalo can analyze the footwear marks left at a crime scene according to clusters of footwear types, makes and tread patterns. The tool is able to group recurring patterns in a database of footwear marks, even if the imprint recorded by crime scene investigators is distorted or only a partial print.
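The general idea of grouping recurring tread patterns can be sketched with set similarity. This is a hypothetical toy, not the Buffalo algorithm: the feature names, `jaccard` helper, and threshold are invented here to show how a partial print can still land in the right group.

```python
# Represent each footwear mark as a set of detected tread features; group
# marks whose Jaccard similarity clears a threshold, so a distorted or
# partial print (missing some features) can still match its tread family.
def jaccard(a, b):
    """Similarity of two feature sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

def group(marks, threshold=0.5):
    """Greedily place each mark into the first sufficiently similar cluster."""
    clusters = []
    for name, feats in marks.items():
        for cluster in clusters:
            if any(jaccard(feats, marks[m]) >= threshold for m in cluster):
                cluster.append(name)
                break
        else:
            clusters.append([name])
    return clusters

marks = {
    "scene-1": {"zigzag", "circle", "logo"},
    "scene-2": {"zigzag", "circle"},      # partial print of the same tread
    "scene-3": {"hexagon", "bar"},        # unrelated tread pattern
}
print(group(marks))  # [['scene-1', 'scene-2'], ['scene-3']]
```

Real systems extract geometric features from images rather than hand-labeled tags, but the clustering step follows the same logic of tolerating missing or distorted features.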
A new study by Northwestern University researchers has revealed that public domain name services (DNS) could actually slow down users’ web-surfing experience. As a result, researchers have developed a solution to help avoid such an impact: a tool called “namehelp” that could speed web performance by 40%.
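The core comparison such a tool performs can be sketched simply. This is a toy stand-in, not the namehelp implementation: the resolver names and latency figures below are invented, and a real tool would measure live DNS lookups rather than use canned samples.

```python
from statistics import median

def best_resolver(samples_ms):
    """Recommend the resolver with the lowest median lookup latency.

    samples_ms maps a resolver name to a list of measured latencies (ms).
    Median is used so one slow outlier doesn't dominate the ranking.
    """
    return min(samples_ms, key=lambda name: median(samples_ms[name]))

# Hypothetical measurements: the ISP default is usually fast but
# occasionally spikes; the public resolvers are steadier but farther away.
samples = {
    "isp-default": [12.0, 15.0, 90.0],
    "public-a":    [25.0, 26.0, 27.0],
    "public-b":    [18.0, 19.0, 80.0],
}
print(best_resolver(samples))  # isp-default (median 15.0 ms)
```

The point of the study is that the "obvious" choice of a big public resolver is not always the fastest for a given user, which is exactly what a per-user benchmark like this reveals.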
So far, quantum researchers have only been able to manipulate small numbers of qubits, not enough for a practical machine. But researchers at Princeton University have developed a method that may allow the quick and reliable transfer of quantum information throughout a computing device, potentially allowing engineers to build computers consisting of millions of quantum bits.
After leading mass spectrometer manufacturers agreed to license their technology, researchers developed software that allows scientists to easily use and share research data collected across proprietary instrument platforms. Called the ProteoWizard Toolkit, this cross-platform set of libraries and applications is expected to bolster large-scale biological research and help improve the understanding of complex diseases like cancer.
People can let their fingers—and hands—do the talking with a new touch-activated system that projects onto walls and other surfaces and allows users to interact with their environment and each other. Developed at Purdue University, the "extended multitouch" system allows more than one person to use a surface at the same time and also enables people to use both hands, distinguishing between the right and left hand.
Named for the Greek word for wisdom, Sophia is a software sentry developed at Idaho National Laboratory that can passively monitor communication pathways in a static computer network and flag new types of conversations so operators can decide if a threat is present. It is the first such cybersecurity technology for SCADA control system network administrators that is being evaluated for deployment to industry.
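Because control-system networks are largely static, the monitoring idea reduces to learning a baseline and flagging anything outside it. This is a minimal sketch of that general pattern, not INL's Sophia itself: the class name, host names, and ports are invented for illustration.

```python
# Minimal sketch of baseline conversation monitoring: learn the set of
# (source, destination, port) "conversations" seen during a quiet baseline
# period, then flag any tuple not seen before for an operator to review.
class ConversationMonitor:
    def __init__(self):
        self.baseline = set()

    def learn(self, src, dst, port):
        """Record a conversation type observed during the baseline period."""
        self.baseline.add((src, dst, port))

    def check(self, src, dst, port):
        """Return True if this conversation type is new (worth flagging)."""
        return (src, dst, port) not in self.baseline

mon = ConversationMonitor()
mon.learn("hmi-1", "plc-4", 502)    # Modbus traffic seen during baseline
mon.learn("hist-1", "plc-4", 502)

print(mon.check("hmi-1", "plc-4", 502))    # False: known conversation
print(mon.check("laptop-9", "plc-4", 22))  # True: new pathway, flag it
```

Passive flagging like this leaves the decision to a human operator, which matters in SCADA environments where automatically blocking traffic could itself disrupt a physical process.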
Today's life scientists are producing genomes galore. But there's a problem: The latest DNA sequencing instruments are burying researchers in trillions of bytes of data and overwhelming existing tools in biological computing. It doesn't help that there's a variety of sequencing instruments feeding a diverse set of applications. Researchers from Iowa State University are developing a set of solutions using high-performance computing.
A one-of-a-kind, high-tech modeling tool designed to simulate different situations on the electric power grid will be on display at the White House. Pacific Northwest National Laboratory researchers will join Energy Secretary Steven Chu to demonstrate how GridLAB-D, the result of a multi-year funding effort, can help power system operators, industry, innovators, and entrepreneurs understand how a change to one part of the power system impacts other parts of the grid.
Researchers at Rice University are designing transparent, two-terminal, 3D computer memories on flexible sheets that show promise for electronics and sophisticated heads-up displays. The technique is based on the switching properties of silicon oxide.
One hundred years after the birth of mathematician and computer scientist Alan Turing, whose “Turing test” stands as one of the foundational definitions of what constitutes true machine intelligence, a virtual “gamer” created by computer scientists at The University of Texas at Austin has won the annual BotPrize by convincing a panel of judges that it was more human-like than half the humans it competed against.
No longer limited to narrow focus groups, painstaking in-person surveys, or artificially controlled studies, researchers today have a far easier time compiling and manipulating large data sets. At the same time, however, sharing such data can be fraught with risks. Researchers with the “Privacy Tools for Sharing Research Data” project at Harvard University aim to keep the flexibility and convenience of sharing large amounts of data while more fully protecting individual privacy.