Big data needs big power. The server farms that undergird the Internet run on a vast tide of electricity. Even companies that have invested in upgrades to minimize their eco-footprint use tremendous amounts: The New York Times estimates that Google, for example, uses enough electricity in its data centers to power about 200,000 homes. Now, a team of Princeton University engineers has a solution that could radically cut that power use.
Computerized aids that include person-like characteristics can influence trust and dependence among adults, according to a recently published study by a Clemson University psychology associate professor. The study examined how a human-like aid affects decision making, focusing on adults' trust, dependence, and performance while using a computerized decision-making aid designed for persons with diabetes.
Scientists at Brookhaven National Laboratory and Stony Brook University have been awarded processing time on a new supercomputer at Oak Ridge National Laboratory to study how proteins fold into their 3D shapes.
Since the early days of iris recognition technologies, it has been assumed that the iris is a "stable" biometric over a person's lifetime—"one enrollment for life." However, new research from the University of Notre Dame has found that iris biometric enrollment is susceptible to an aging process that causes recognition performance to degrade slowly over time.
Using computer simulations, researchers from the California Institute of Technology have determined that if the interior of a dying star is spinning rapidly just before it explodes in a magnificent supernova, two different types of signals emanating from that stellar core will oscillate together at the same frequency. This could be a piece of "smoking-gun evidence" that would lead to a better understanding of supernovae.
A vessel hunting system called "Rough Rhino," sponsored by the Office of Naval Research and deployed aboard U.S. aircraft, ships, and partner-nation ships operating in waters off the coast of Senegal and Cape Verde, has helped track more than 600 targets since entering operation. The effort has led to 24 boardings.
Cornell University researchers have developed a new method of generating terahertz signals on an inexpensive silicon chip, offering possible applications in medical imaging, security scanning, and wireless data transfer.
Biologists' capacity for generating genomic data is increasing more rapidly than computing power. A team of Massachusetts Institute of Technology and Harvard University researchers has developed a new algorithm that reduces the time it takes to find a particular gene sequence in a database of genomes.
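The brief does not detail the MIT/Harvard algorithm, but the general idea behind fast sequence lookup can be illustrated with a simple seed-and-extend index: rather than scanning every genome end to end, a precomputed hash of short k-mers narrows each query to a handful of candidate positions. The function names and k-mer length below are illustrative, not taken from the published work.

```python
from collections import defaultdict

def build_index(genome, k=4):
    """Hash every k-mer in the genome to its positions, so queries skip a full scan."""
    index = defaultdict(list)
    for i in range(len(genome) - k + 1):
        index[genome[i:i + k]].append(i)
    return index

def find(genome, index, query, k=4):
    """Seed-and-extend: look up the query's first k-mer, then verify the full match."""
    hits = []
    for pos in index.get(query[:k], []):
        if genome[pos:pos + len(query)] == query:
            hits.append(pos)
    return hits
```

Building the index is a one-time cost; afterward each query touches only the positions sharing its leading k-mer instead of the whole database.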
Northwestern University researchers have created an entirely new family of logic circuits based on magnetic semiconductor devices. The advance could lead to logic circuits up to 1 million times more power-efficient than today's circuits.
Researchers at Massachusetts Institute of Technology have taken a step toward battery-free monitoring systems. Previous work focused on the development of computer and wireless-communication chips that operate at extremely low power levels, and on a variety of devices that can harness power from natural light, heat, and vibrations in the environment. The latest development is a chip that could harness all three of these ambient power sources at once.
Innovation in computing will be essential to finding real-world solutions to sustainability challenges. The immense scale, numerous interconnected effects of actions over time, and diverse scope of these challenges require the ability to collect, structure, and analyze vast amounts of data.
Scientists at Princeton University are composing the complex codes designed to instruct a new class of powerful computers that will allow researchers to tackle problems that were previously too difficult to solve. These supercomputers, operating at a speed called the "exascale," will produce realistic simulations of complex phenomena in nature such as fusion reactions, earthquakes, and climate change.
An Oak Ridge National Laboratory and University of Tennessee team has used the Jaguar supercomputer to calculate the number of isotopes allowed by the laws of physics. The team used a quantum approach known as density functional theory, applying it independently to six models of the nuclear interaction to determine that there are about 7,000 possible combinations of protons and neutrons allowed in bound nuclei with up to 120 protons.
Researchers at IBM and Lawrence Livermore National Laboratory announced that they are broadening their nearly 20-year collaboration in high-performance computing by joining forces to work with industrial partners to help boost their competitiveness in the global economy.
Shimi, a musical companion developed by Georgia Tech's Center for Music Technology, recommends songs, dances to the beat, and keeps the music pumping based on listener feedback. Powered by an Android phone, the robot is also app-based, meaning it can perform other functions, such as face recognition, depending on the software written for it.
The promise of ultrafast quantum computing has moved a step closer to reality with a technique to create rewritable computer chips using a beam of light. Researchers from The City College of New York and the University of California, Berkeley used light to control the spin of an atom's nucleus in order to encode information.
Supercomputing performance is getting a new measurement with the Graph500 executive committee's announcement of specifications for a more representative way to rate the large-scale data analytics at the heart of high-performance computing. An international team announced the single-source shortest-path specification to assess computing performance at the International Supercomputing Conference in Hamburg, Germany.
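The kernel the new Graph500 specification measures, single-source shortest path (SSSP), is the classic problem solved by Dijkstra's algorithm. A minimal sketch follows; the graph layout and function name are illustrative and not part of the benchmark's reference code.

```python
import heapq

def sssp(graph, source):
    """Dijkstra's algorithm. `graph` maps each node to a list of (neighbor, weight)."""
    dist = {source: 0}
    pq = [(0, source)]          # priority queue of (distance, node)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue            # stale entry; a shorter path was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist
```

The benchmark stresses exactly the irregular, memory-bound access pattern this loop exhibits, which is why it complements the dense linear algebra of the traditional Top500 measure.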
Nobel winner Roger Myerson's work on single-item auctions was groundbreaking research, but his question regarding the best way to organize an auction in which bidders are competing for multiple items has remained unanswered for decades. Massachusetts Institute of Technology researchers have developed an algorithm that generalizes Myerson's approach to this multi-item setting.
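Myerson's single-item result can be seen in a quick simulation: for bidders with independent uniform [0, 1] valuations, a second-price auction earns more revenue with a reserve price of 0.5 than with no reserve at all. The sketch below assumes truthful bidding and illustrates only the single-item theory, not the MIT multi-item algorithm.

```python
import random

def avg_revenue(reserve, n_bidders=2, trials=200_000, seed=1):
    """Monte Carlo estimate of second-price auction revenue with a reserve price."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        bids = sorted(rng.random() for _ in range(n_bidders))
        highest, second = bids[-1], bids[-2]
        if highest >= reserve:              # the item sells only above the reserve
            total += max(second, reserve)   # winner pays the larger of the two
    return total / trials
```

For two uniform bidders the no-reserve expected revenue is 1/3, while Myerson's optimal reserve of 1/2 raises it to 5/12.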
A new computational model developed by a team of Virginia Tech researchers provides a framework to better understand responses of macrophage cells of the human immune system. The Virginia Tech team used the Metropolis algorithm, a computer simulation technique widely used in physics and chemistry, to enumerate possible molecular mechanisms giving rise to priming and tolerance.
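The Metropolis algorithm the Virginia Tech team used accepts or rejects random proposals so that, over many steps, the samples follow a target probability distribution. A minimal sketch sampling a standard normal is below; the target density and step size are illustrative, not the macrophage-signaling model from the study.

```python
import math
import random

def metropolis(log_p, x0, steps, step_size=1.0, seed=0):
    """Draw samples from an unnormalized density via the Metropolis acceptance rule."""
    rng = random.Random(seed)
    x, lp = x0, log_p(x0)
    samples = []
    for _ in range(steps):
        proposal = x + rng.gauss(0.0, step_size)   # symmetric random-walk proposal
        lp_prop = log_p(proposal)
        # Accept with probability min(1, p(proposal) / p(x)).
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            x, lp = proposal, lp_prop
        samples.append(x)
    return samples

# Target: standard normal, log p(x) = -x^2 / 2 (up to a constant).
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, steps=50_000)
```

Because only ratios of the density matter, the method works even when the normalizing constant is unknown—the property that makes it so widely useful in physics and chemistry.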
Researchers in the Biometric Technologies Laboratory at the University of Calgary have developed a way for security systems to combine different biometric measurements—such as eye color, face shape, or fingerprints—and create a learning system that simulates the brain in making decisions about information from different sources.
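The Calgary system's learning machinery is not described here, but the basic building block of multi-biometric decisions—score-level fusion—can be sketched simply. The weights and threshold below are illustrative placeholders for values a learning system would tune.

```python
def fuse_scores(scores, weights):
    """Score-level fusion: weighted average of per-modality match scores in [0, 1]."""
    assert len(scores) == len(weights)
    total_w = sum(weights)
    return sum(s * w for s, w in zip(scores, weights)) / total_w

def decide(scores, weights, threshold=0.5):
    """Accept the identity claim if the fused score clears the threshold."""
    return fuse_scores(scores, weights) >= threshold
```

A training procedure would adjust the weights so that more reliable modalities (say, fingerprints over eye color) dominate the fused decision.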
Memory devices for computers require a large collection of components that can switch between two states, which represent the 1s and 0s of binary language. Engineers hope to make next-generation chips with materials that distinguish between these states by physically rearranging their atoms into different phases. Researchers at the University of Pennsylvania have now provided new insight into how this phase change happens.
The U.S. Department of Energy Office of Science and the National Science Foundation have committed up to $27 million to Open Science Grid, a nine-member partnership extending the reach of distributed high-throughput computing networks.
The National Nuclear Security Administration (NNSA) announced that a supercomputer called Sequoia at Lawrence Livermore National Laboratory was ranked the world's most powerful computing system. Clocking in at 16.32 sustained petaflops, Sequoia earned the No. 1 ranking on the industry standard Top500 list of the world's fastest supercomputers.
Two years ago, a fledgling social-networking site called Blippy accidentally posted the credit card numbers of its users online. While that was a particularly egregious example, such inadvertent information leaks happen all the time. Massachusetts Institute of Technology researchers have developed a new programming system that could help prevent such leaks.
Fujitsu Laboratories, the National Institute of Information and Communications Technology, and Kyushu University jointly broke a world cryptography record with the successful cryptanalysis of 923-bit (278-decimal-digit) pairing-based cryptography, which is emerging as a next-generation cryptography standard.