Made from state-of-the-art silicon transistors, an ultra-low-power sensor enables real-time scanning of the contents of liquids, such as perspiration. Compatible with advanced electronics, this technology boasts exceptional accuracy – enough to manufacture mobile sensors that monitor health.
Overheating is a major problem for the microprocessors that run our smartphones and computers....
Electrical engineers in Germany have demonstrated a new kind of building block for digital integrated circuits. Their experiments show that future computer chips could be based on 3-D arrangements of nanometer-scale magnets instead of transistors. In a 3-D stack of nanomagnets, the researchers have implemented a so-called “majority” logic gate, which could serve as a programmable switch in a digital circuit.
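A majority gate outputs whichever value appears on most of its inputs, and fixing one input reprograms it: tie it to 0 and the gate acts as AND, tie it to 1 and it acts as OR. This is what makes it usable as a programmable switch. A minimal logic-level sketch (the nanomagnet physics is not modeled here):

```python
def majority(a: int, b: int, c: int) -> int:
    """Three-input majority gate: output is 1 iff at least two inputs are 1."""
    return 1 if a + b + c >= 2 else 0

# Fixing one input "programs" the gate:
#   majority(a, b, 0) behaves as a AND b
#   majority(a, b, 1) behaves as a OR b
for a in (0, 1):
    for b in (0, 1):
        assert majority(a, b, 0) == (a & b)   # programmed as AND
        assert majority(a, b, 1) == (a | b)   # programmed as OR
```

The same truth table holds however the gate is realized, which is why a 3-D stack of nanomagnets implementing it can stand in for transistor logic.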
Researchers from the Univ. of Texas at Dallas have created technology that could be the first step toward wearable computers with self-contained power sources or, more immediately, a smartphone that doesn’t die after a few hours of heavy use. This technology taps into the power of a single electron to control energy consumption inside transistors, which are at the core of most modern electronic systems.
A team of researchers has discovered a way to cool electrons to -228 C without external cooling, while the device itself remains at room temperature, an advance that could enable electronic devices to function with very little energy. The process passes electrons through a quantum well, which cools them and keeps them from heating.
Chip designers are facing both engineering and fundamental limits that have become barriers to the continued improvement of computer performance. Have we reached the limits to computation? In a review article in Nature, Igor Markov of the Univ. of Michigan examines the limiting factors in the development of computing systems to help determine what is achievable, identifying "loose" limits and viable opportunities for advancements.
The more cores a computer chip has, the bigger the problem of communication between cores becomes. For years, Li-Shiuan Peh, a professor of electrical engineering and computer science at Massachusetts Institute of Technology, has argued that the massively multicore chips of the future will need to resemble little Internets, where each core has an associated router, and data travels between cores in packets of fixed size.
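The "little Internet" picture can be made concrete with dimension-order (XY) routing, a common scheme in on-chip networks: a fixed-size packet first travels along the x-axis to its destination column, then along the y-axis to its destination row. A hypothetical sketch (the mesh coordinates are illustrative, not from the article):

```python
def xy_route(src, dst):
    """Return the sequence of (x, y) router coordinates a packet visits
    under dimension-order (XY) routing on a 2-D mesh: resolve the x
    dimension first, then the y dimension."""
    x, y = src
    dx, dy = dst
    path = [(x, y)]
    while x != dx:                      # traverse the x dimension first
        x += 1 if dx > x else -1
        path.append((x, y))
    while y != dy:                      # then the y dimension
        y += 1 if dy > y else -1
        path.append((x, y))
    return path

# A packet from the router at core (0, 0) to core (2, 1) makes 3 hops:
print(xy_route((0, 0), (2, 1)))  # [(0, 0), (1, 0), (2, 0), (2, 1)]
```

Because packets are fixed-size and routes are deterministic, each router needs only simple, fast hardware, which is what makes the approach attractive for massively multicore chips.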
The basic element of modern electronics, namely the transistor, suffers from significant current leakage. By enveloping a transistor with a shell of piezoelectric material, which distorts when voltage is applied, researchers in the Netherlands were able to reduce this leakage by a factor of five compared to a transistor without this material.
In the fictional Star Trek universe, the tricorder was used to remotely scan patients for a diagnosis. A new device under development in the U.K. could perform that function through the use of chemical sensors on printed circuit boards. This would replace the current conventional diagnostic method, which is lengthy and is limited to single-point measurements.
Although neural networks have been used in the past to solve pattern recognition problems such as speech and image recognition, it was usually in software on a conventional computer. Researchers in Belgium have manufactured such a small neural network in hardware, using a silicon photonics chip. This chip is made using the same technology as traditional computer chips but uses light instead of electricity as information carrier.
Geneticists at the Univ. of California, Davis have decoded the genome sequence for the loblolly pine. The accomplishment is a milestone for genetics because this pine’s genome is massive. Bloated with repetitive sequences, it is seven times larger than the human genome and easily big enough to overwhelm standard genome assembly methods.
Ben Recht, a statistician and electrical engineer at the Univ. of California, Berkeley, looks for problems. He develops mathematical strategies to help researchers, from urban planners to online retailers, cut through blizzards of data to find what they’re after. He resists the “needle in the haystack” metaphor for big data because, he says, people usually don’t know enough about their data to understand the goal.
The USC Viterbi School of Engineering is home to the USC-Lockheed Martin Quantum Computing Center (QCC), a super-cooled, magnetically shielded facility specially built to house the first commercially available quantum computing processors. Only two such processors are in use, and elaborate tests on the quantum processor, built by D-Wave, indicate that it does use special laws of quantum mechanics to operate.
According to recent findings by an international team of computer engineers, optical data storage does not require expensive magnetic materials because synthetic alternatives work just as well. The team’s discovery that synthetic ferrimagnets can be switched optically brings a much cheaper method for storing data using light a step closer.
Last year, a physicist and a mechanical engineer at Northeastern Univ. combined their expertise to integrate electronic and optical properties on a single electronic chip, enabling electrical signals to be switched using light alone. Now, they have built three new devices that implement this fast technology: an AND gate, an OR gate and a camera-like sensor made of 250,000 miniature devices.
The physical implementation of a full-scale universal quantum computer remains an extraordinary challenge for physicists, mainly because existing approaches lose their “quantum-ness” as they are scaled up. At the Joint Quantum Institute, a new modular architecture is being explored that offers scalability to large numbers of qubits, and its components have been tested and are available.
Computer chips keep getting faster because transistors keep getting smaller. But the chips themselves are as big as ever, so data moving around the chip, and between chips and main memory, has to travel just as far. As transistors get faster, the cost of moving data becomes, proportionally, a more severe limitation. So far, chip designers have circumvented that limitation through the use of “caches”.
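The benefit of a cache is easy to see in miniature: a small store of recently used data turns repeated requests into cheap local hits instead of long round trips to main memory. A toy least-recently-used (LRU) cache, assuming nothing about any particular chip:

```python
from collections import OrderedDict

class LRUCache:
    """Toy fixed-capacity cache with least-recently-used eviction."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store = OrderedDict()
        self.hits = self.misses = 0

    def get(self, addr, fetch_from_memory):
        if addr in self.store:
            self.hits += 1
            self.store.move_to_end(addr)       # mark as most recently used
            return self.store[addr]
        self.misses += 1
        value = fetch_from_memory(addr)        # the expensive "long trip"
        self.store[addr] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)     # evict least recently used
        return value

cache = LRUCache(capacity=2)
for addr in [0, 1, 0, 1, 0, 2, 0]:
    cache.get(addr, lambda a: a * 10)
print(cache.hits, cache.misses)  # 4 3
```

Because real programs reuse data heavily, most accesses hit the cache, so the growing cost of moving data across the chip is paid only on misses.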
For aspiring electrical engineers, New Jersey Institute of Technology has pulled together in one "tall" infographic a brief history of the breakthroughs and impact of electrical engineering advances since the 1830s, when the telegraph marked the first time that electric currents were used to transmit messages. Since then, electrical devices have had a dramatic effect on our daily lives.
European scientists from both academia and industry have begun an ambitious new research project focused on an alternative approach to extend Moore's Law. The research project, coordinated by IBM Research in Zurich and called COMPOSE³, is based on the use of new materials to replace today's silicon, and on taking an innovative design approach where transistors are stacked vertically, known as 3-D stacking.
Seeking a solution to decoherence, scientists have developed a strategy of linking quantum bits together into voting blocks, a strategy that significantly boosts their accuracy. In a recently published paper, the team found that their method results in at least a five-fold increase in the probability of reaching the correct answer when the processor solves the largest problems tested by the researchers, involving hundreds of qubits.
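The voting idea mirrors a classical repetition code: if each copy of a computation is independently correct with probability p > 1/2, a majority vote over an odd number of copies is correct more often than any single copy. A sketch of that arithmetic (the values of p and n are illustrative; the article does not give per-qubit numbers):

```python
from math import comb

def majority_vote_success(p: float, n: int) -> float:
    """Probability that a majority of n independent copies (n odd),
    each correct with probability p, yields the right answer."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# A single noisy unit that is right 70% of the time...
single = 0.70
# ...is right about 84% of the time when 5 copies vote.
print(round(majority_vote_success(single, 5), 3))  # 0.837
```

The gain grows with the block size, which is why grouping qubits into voting blocks can meaningfully raise the odds of a correct answer on large problems.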
Continuous miniaturization in microelectronics is nearing physical limits, so researchers are seeking new methods for device fabrication. One promising candidate is a DNA origami technique in which individual strands of the biomolecule self-assemble into arbitrarily shaped nanostructures. A new, simpler strategy combines DNA origami with self-organized pattern formation to do away with elaborate procedures for positioning DNA structures.
Scientists and engineers from an international collaboration have, for the first time, generated and manipulated single particles of light (photons) on a silicon chip. This accomplishment, which required shrinking down key components and integrating them onto a silicon microchip, is a major step forward in the race to build a quantum computer.
Scientists in Germany, inspired by the odor-processing nervous system of insects, have recently refined a new technology that is based on parallel data processing. Called neuromorphic computing, their system is composed of silicon neurons linked together in a similar fashion to the nerve cells in our brains. When the assembly is fed data, all the silicon neurons work in parallel to solve the problem.
“Cool it!” That’s a prime directive for microprocessor chips and a promising new solution to meeting this imperative is in the offing. Researchers with the U.S. Dept. of Energy’s Lawrence Berkeley National Laboratory have developed a process-friendly technique that would enable the cooling of microprocessor chips through carbon nanotubes.
The Advanced Institute for Computational Science at RIKEN has been selected by the Japanese government to develop a new exascale supercomputer. The new supercomputer, which is scheduled to begin working in 2020, will compute on the "exaflop" scale and will be about 100 times faster than the K computer, which was the world's fastest in 2011.
Researchers have proved the feasibility of a new type of transistor that could enable fast and low-power computing devices for energy-constrained applications such as smart sensor networks and implantable medical electronics. Called a near broken-gap tunnel field effect transistor, the new device uses the quantum mechanical tunneling of electrons through an ultra-thin energy barrier to provide high current at low voltage.
A recently developed plasma-based chip fabrication technique affords chip makers unprecedented control of plasma thanks to a population of suprathermal electrons. Such control is critical to modern microchip fabrication, but how the beam electrons transform themselves into this suprathermal population has been a puzzle. New computer simulations reveal how intense plasma waves generate suprathermal electrons.