A team at Cornell University has made a breakthrough in that direction with a room-temperature magnetoelectric memory device. Equivalent to one computer bit, it exhibits the holy grail of next-generation nonvolatile memory: magnetic switchability, in two steps, with nothing but an electric field.
Reality isn’t always what it seems, as we learned in the groundbreaking film The Matrix...
In a triumph for cell biology, researchers have assembled the first high-resolution, 3-D maps of...
For decades, the mantra of electronics has been smaller, faster, cheaper. Today, Stanford Univ....
The interstellar mystery of why stars form has been solved, thanks to the most realistic supercomputer simulations of galaxies yet made.
Computers are good at identifying patterns in huge data sets. Humans, by contrast, are good at inferring patterns from just a few examples. In a recent paper, Massachusetts Institute of Technology researchers present a new system that bridges these two ways of processing information, so that humans and computers can collaborate to make better decisions.
An odd, iridescent material that's puzzled physicists for decades turns out to be an exotic state of matter that could open a new path to next-generation electronics. Physicists at the Univ. of Michigan have discovered or confirmed several properties of the compound samarium hexaboride that raise hopes for finding the silicon of the quantum era. They say their results also close the case of how to classify the material.
Stanford Univ. engineers have designed and built a prism-like device that can split a beam of light into different colors and bend the light at right angles, a development that could eventually lead to computers that use optics, rather than electricity, to carry data.
Planets orbiting close to low-mass stars are prime targets in the search for extraterrestrial life. But new research led by an astronomy graduate student at the Univ. of Washington indicates some such planets may have long since lost their chance at hosting life because of intense heat during their formative years.
Biological engineers have created a new computer model that allows them to design the most complex 3-D DNA shapes ever produced, including rings, bowls and geometric structures such as icosahedrons that resemble viral particles. This design program could allow researchers to build DNA scaffolds to anchor arrays of proteins and light-sensitive molecules called chromophores that mimic the photosynthetic proteins found in plant cells.
IBM has engineered a way for everyone to join the fight against Ebola—by donating processing time on their personal computers, phones or tablets to researchers. IBM has teamed with scientists at Scripps Research Institute in southern California on a project that aims to combine the power of thousands of small computers, each attacking a tiny piece of a larger medical puzzle that might otherwise require a supercomputer to solve.
Microbes of interest to clinicians and environmental scientists rarely exist in isolation. Organisms essential to breaking down pollutants or causing illness live in complex communities, and separating one microbe from hundreds of companion species can be challenging for researchers seeking to understand environmental issues or disease processes.
In 1997, IBM’s Deep Blue computer beat chess wizard Garry Kasparov. This year, a computer system developed at the Univ. of Wisconsin-Madison equaled or bested scientists at the complex task of extracting data from scientific publications and placing it in a database that catalogs the results of tens of thousands of individual studies.
During a thunderstorm, we all know it’s common to hear thunder after we see the lightning. That’s because sound travels much slower (768 mph) than light (670,000,000 mph). Now, Univ. of Minnesota engineering researchers have developed a chip on which sound waves and light waves are generated and confined together, so that the sound can very efficiently control the light.
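The speed gap above is also why you can estimate your distance from a lightning strike by counting the seconds until the thunder arrives. A minimal sketch of that arithmetic, using the 768 mph figure from the story (the function name and five-second example are illustrative, not from the original):

```python
# Estimate distance to a lightning strike from the flash-to-thunder delay.
# Light arrives essentially instantly; sound covers about 768 mph,
# i.e. roughly 0.21 miles per second, or about 5 seconds per mile.

SPEED_OF_SOUND_MPH = 768
MILES_PER_SECOND = SPEED_OF_SOUND_MPH / 3600  # ~0.213 miles/sec

def miles_to_strike(delay_seconds):
    """Distance to the strike, given seconds between flash and thunder."""
    return delay_seconds * MILES_PER_SECOND

print(round(miles_to_strike(5), 2))  # ~1.07 miles: about 5 seconds per mile
```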
Lawrence Livermore National Laboratory and the RAND Corporation will collaborate to expand the use of high-performance computing in decision analysis and policymaking. The two organizations signed a memorandum of understanding on Friday, Nov. 21. The arrangement provides a vehicle for the two organizations to explore the use of policy analysis methodologies with supercomputing applications.
The improvements in random access memory (RAM) that have driven many advances of the digital age owe much to the innovative application of physics and chemistry at the atomic scale. Accordingly, a team led by Univ. of Nebraska-Lincoln researchers has employed a Nobel Prize-winning material and common household chemical to enhance the properties of a component primed for the next generation of high-speed, high-capacity RAM.
Cyber-security researchers say they've identified a highly sophisticated computer hacking program that appears to have been used by an as-yet unidentified government to spy on banks, telecommunications companies, official agencies and other organizations around the world. The malicious software known as "Regin" is designed to collect data from its targets for periods of months or years.
New computer models that show how microtubules age are the first to match experimental results and help explain the dynamic processes behind an essential component of every living cell, according to Rice Univ. scientists. The results could help scientists fine-tune medications that manipulate microtubules to treat cancer and other diseases.
Researchers at Nano-Meta Technologies Inc. have shown how to overcome key limitations of a material that could enable the magnetic storage industry to achieve data-recording densities far beyond today's computers. The new technology could make it possible to record data on an unprecedentedly small scale using tiny "nanoantennas," and to increase the amount of data that can be stored on a standard magnetic disk by 10 to 100 times.
Rice Univ. is preparing to offer its researchers who deal in “big data” the opportunity to compute in the cloud with fewer barriers. Rice is installing the Big Research Data Cloud (BiRD Cloud), which will allow for cloud bursting. That means data-intensive tasks can spill over into outside cloud-computing systems when necessary, essentially providing unlimited computing capacity.
Carpe diem…seize the day. This Latin phrase, coined by the Roman poet Horace in 23 BC, is used often to encourage us to take full advantage of the opportunities each day provides. In modern times with seemingly limitless amounts of data on any conceivable subject available at our fingertips, organizations globally are developing strategies to leverage this growing data volume to enhance business success.
A team of New York Univ. and Univ. of Barcelona physicists has developed a method to control the movements occurring within magnetic materials, which are used to store and carry information. The breakthrough could simultaneously bolster information processing while reducing the energy necessary to do so.
The race to make computer components smaller and faster and use less power is pushing the limits of the properties of electrons in a material. Photonic systems could eventually replace electronic ones, but the fundamentals of computation, mixing two inputs into a single output, currently require too much space and power when done with light.
Researchers from the Queen Mary Univ. of London gave a computer program the outline of how a magic jigsaw puzzle and a mind-reading card trick work, as well as the results of experiments into how humans understand magic tricks, and the system created completely new variants of those tricks that can be performed by a magician.
Lawrence Livermore National Laboratory (LLNL) announced a contract with IBM to deliver a next-generation supercomputer in 2017. The system, to be called Sierra, will serve the National Nuclear Security Administration’s Advanced Simulation and Computing program. Procurement of Sierra is part of a DOE-sponsored Collaboration of Oak Ridge, Argonne and Lawrence Livermore national labs to accelerate the development of high-performance computing.
Not long ago, it would have taken several years to run a high-resolution simulation on a global climate model. But using some of the most powerful supercomputers now available, Lawrence Berkeley National Laboratory climate scientist Michael Wehner was able to complete a run in just three months. Not only were the simulations much closer to actual observations, but the high-resolution models were far better at reproducing intense storms.
While the Martinis Lab at the Univ. of California, Santa Barbara has been focusing on quantum computation, it has also been exploring qubits for quantum simulation on a smaller scale. The team developed a new qubit architecture, an essential ingredient for quantum simulation, which allowed them to master the seven parameters necessary for complete control of a two-qubit system.
The tree has been an effective model of evolution for 150 years, but a Rice Univ. computer scientist believes it’s far too simple to illustrate the breadth of current knowledge. Rice researcher Luay Nakhleh and his group have developed PhyloNet, an open source software package that accounts for horizontal as well as vertical inheritance of genetic material among genomes.
Today, petabytes of digital information are generated daily by such sources as social media, Internet activity, surveillance sensors and advanced research instruments. The results are often referred to as “big data”—accumulations so huge that highly sophisticated computer techniques are required to identify useful information hidden within. Graph analysis is a prime tool for finding the needle in the data haystack.
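One of the simplest "needle in the haystack" graph queries is finding the most connected node in a network, for example, the account at the center of a burst of social-media activity. A minimal sketch of that idea (the edge list and names here are invented for illustration):

```python
from collections import defaultdict

# Toy graph analysis: given a list of connections ("who contacted whom"),
# find the hub -- the node touching the most edges.
edges = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "d"), ("c", "e")]

degree = defaultdict(int)
for u, v in edges:
    degree[u] += 1
    degree[v] += 1

hub = max(degree, key=degree.get)
print(hub, degree[hub])  # "c" touches 4 edges
```

Real big-data graph analysis applies the same degree, path and clustering queries at the scale of billions of edges, using distributed graph engines rather than an in-memory dictionary.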