The term "big data" is defined as a huge amount of digital information, so big and so complex that normal database technology cannot process it. The open-source software framework Apache Hadoop is a user-friendly approach for accessing vast amounts of data, but it is not able to query big datasets as efficiently as database systems that are designed for parallel processing. Researchers have recently introduced an aggressive indexing library for Hadoop that answers queries up to 100 times faster.
Say the word “drone” and the image most often conjured is a flying object that is...
Siemon, a global network infrastructure specialist, has introduced the new LC BladePatch fiber...
A world-class supercomputer called Stampede—which has already enabled research teams to predict where and when earthquakes may strike, how much sea levels could rise and how fast brain tumors grow—was officially dedicated this week at the University of Texas at Austin's Texas Advanced Computing Center. The new research tool will be used by thousands of research groups.
At the world's largest cellphone trade show in Barcelona this week, the 70,000 attendees were encouraged to use their cellphones—instead of their keycards—to get past the turnstiles at the door. But very few took the chance to do so: setting up a phone to act as a keycard proved too much of a hassle. It's a poor omen for an industry that's eager to have the cellphone replace both tickets and credit cards.
The U.S. Department of Energy's ESnet (Energy Sciences Network) is now operating the world's fastest science network, serving the entire national laboratory system, its supercomputing centers, and its major scientific instruments at 100 gigabits per second, 10 times faster than its previous generation network.
Through a new website unveiled Wednesday, Google is opening a virtual window into the secretive data centers where an intricate maze of computers process Internet search requests, show YouTube video clips, and distribute email for millions of people. The photographic access to Google's data centers coincides with the publication of a Wired magazine article about how the company builds and operates them.
Leading mass spectrometer manufacturers have agreed to license technology that enables researchers to develop software that allows scientists to easily use and share research data collected across proprietary instrument platforms. Called the ProteoWizard Toolkit, this cross-platform set of libraries and applications is expected to bolster large-scale biological research and help improve the understanding of complex diseases like cancer.
Named for the Greek word for wisdom, Sophia is a software sentry developed at Idaho National Laboratory that can passively monitor communication pathways in a static computer network and flag new types of conversations so operators can decide if a threat is present. It is the first such cybersecurity technology for SCADA control system network administrators that is being evaluated for deployment to industry.
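As a rough illustration of this kind of passive flagging, the sketch below treats a conversation as a (source, destination, port) tuple, learns a baseline of normal traffic, and alerts on anything new. The representation is an assumption for illustration, not Sophia's actual implementation:

```python
# Whitelist-style anomaly flagging for a static control network: learn the
# conversations that occur normally, then alert on any conversation type
# never seen before. Hypothetical sketch, not Sophia itself.

known = set()

def learn(flows):
    """Baseline phase: record every conversation seen on the quiet network."""
    for src, dst, port in flows:
        known.add((src, dst, port))

def monitor(flow):
    """Monitoring phase: flag conversations absent from the baseline."""
    if flow not in known:
        print(f"ALERT: new conversation {flow} -- operator review needed")

learn([("plc1", "hmi", 502), ("hmi", "historian", 1433)])
monitor(("plc1", "hmi", 502))       # silent: known traffic
monitor(("laptop7", "plc1", 502))   # flagged: never seen before
```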
Designed for remotely monitoring large infrastructures, the longest fiber-optic sensor network yet proposed would measure 250 km in length and be equipped with multiplexing technology to allow multiple information channels to be carried. Theorized by a researcher in Spain, the network would allow long-distance analysis without requiring a power source for the sensors.
As data centers continue to come under scrutiny for the amount of energy they use, researchers at the University of Toronto Scarborough have a suggestion: turn down the air conditioning. Their latest research suggests that turning up the temperature could save energy with little or no increased risk of equipment failure.
As cloud computing becomes more popular, new techniques to protect these systems must be developed. Computer scientists in Texas have developed a technique that automatically allows one computer in a virtual network to monitor another for intrusions, viruses, or anything else that could cause a computer to malfunction. Dubbed “space travel”, the technique bridges the gap between computer hardware and software systems.
Engineering researchers at the University of Arkansas have received funding from the National Science Foundation to create distortion-tolerant communications for wireless networks that use very little power. The research will improve wireless sensors deployed in remote areas where these systems must rely on batteries or energy-harvesting devices for power.
Researchers at the Max Planck Institute in Germany have developed a complex-network computer that is as capable of performing arbitrary calculations as a conventional computer, but does so under completely different conditions. Instead of the 0s and 1s of a binary system, this computer can in principle compute from, or be built from, any oscillating system, such as a pendulum.
Males of the Japanese tree frog have learned not to use their calls at the same time so that the females can distinguish between them. Scientists at the Polytechnic University of Catalonia have used this form of calling behavior to create an algorithm that assigns colors to network nodes—an operation that can be applied to developing efficient wireless networks.
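In graph terms, the frogs' trick of shifting their calls away from their neighbors' corresponds to each node taking a color its neighbors have not taken. A minimal greedy sketch of that idea, illustrative rather than the authors' exact algorithm:

```python
# Greedy graph coloring in the spirit of the frogs' desynchronization:
# each node picks the smallest "channel" not used by any of its neighbors,
# just as each frog shifts its call away from nearby calls.

def color_graph(adj):
    """adj maps each node to the set of its neighbors."""
    color = {}
    for node in adj:
        taken = {color[n] for n in adj[node] if n in color}
        color[node] = next(c for c in range(len(adj)) if c not in taken)
    return color

# Two interfering pairs that share a common hub:
network = {"a": {"b", "c"}, "b": {"a"}, "c": {"a", "d"}, "d": {"c"}}
print(color_graph(network))   # {'a': 0, 'b': 1, 'c': 1, 'd': 0}
```

In a wireless setting the colors stand for transmission slots or channels, so no two neighboring nodes interfere.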
Modern research tools like supercomputers, particle colliders, and telescopes are generating so much data, so quickly, that many scientists fear they will soon be unable to keep up with the deluge. A team of computer researchers from universities and national laboratories is fighting to keep up, and recently developed a tool that can query a massive 32-terabyte dataset in just three seconds.
The U.S. Department of Energy Office of Science and the National Science Foundation have committed up to $27 million to Open Science Grid, a nine-member partnership extending the reach of distributed high-throughput computing networks.
Two years ago, a fledgling social-networking site called Blippy accidentally posted the credit card numbers of its users online. While that was a particularly egregious example, such inadvertent information leaks happen all the time. Massachusetts Institute of Technology researchers have developed a new programming system that could help prevent such inadvertent information leaks.
Researchers at Carlos III University in Madrid are developing an algorithm, based on ants' behavior when searching for food, that accelerates the search for relationships among elements present in social networks.
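A minimal sketch of the general ant-colony idea, assuming that simulated ants wander the social graph and deposit "pheromone" on the edges they cross, so heavily reinforced edges surface the strongest relationships; all names here are hypothetical:

```python
import random

# Ant-colony-style search over a social graph: random walkers reinforce
# the edges they traverse, and the most-reinforced edges are reported as
# candidate relationships. Illustrative sketch, not the Madrid algorithm.

def ant_walk(adj, start, steps, pheromone):
    node = start
    for _ in range(steps):
        nxt = random.choice(sorted(adj[node]))          # pick a neighbor
        edge = frozenset((node, nxt))
        pheromone[edge] = pheromone.get(edge, 0) + 1    # deposit pheromone
        node = nxt

adj = {"ann": {"bob", "cat"}, "bob": {"ann", "cat"},
       "cat": {"ann", "bob", "dan"}, "dan": {"cat"}}
pheromone = {}
for member in adj:                                      # one ant per member
    ant_walk(adj, member, steps=50, pheromone=pheromone)

# The most-travelled edges are the most likely relationships:
print(sorted(pheromone.items(), key=lambda kv: -kv[1])[:3])
```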
Northwestern University researchers are the first to discover that very different complex networks—ranging from global air traffic to neural networks—share very similar backbones. By stripping each network down to its essential nodes and links, they found each network possesses a skeleton and these skeletons share common features, much like vertebrates do.
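One plausible way to strip a network down to such a skeleton is to grow a shortest-path tree from every node and keep only the links that appear in most of those trees. The sketch below illustrates that approach as an assumption; it is not necessarily the method used in the study:

```python
from collections import deque

# Skeleton extraction sketch: count how often each link appears across
# the shortest-path trees rooted at every node, then keep the links that
# most trees rely on.

def bfs_tree_edges(adj, root):
    """Edges of a breadth-first shortest-path tree rooted at `root`."""
    parent, seen, queue = {}, {root}, deque([root])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                parent[v] = u
                queue.append(v)
    return {frozenset((v, p)) for v, p in parent.items()}

def skeleton(adj, keep=0.8):
    counts = {}
    for root in adj:
        for edge in bfs_tree_edges(adj, root):
            counts[edge] = counts.get(edge, 0) + 1
    return {e for e, c in counts.items() if c / len(adj) >= keep}

adj = {"a": {"b"}, "b": {"a", "c", "d"}, "c": {"b", "d"}, "d": {"b", "c"}}
print(skeleton(adj))   # only the link ('a', 'b') sits on nearly every tree
```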
Popular firewall technology designed to boost security on cellular networks can backfire, unwittingly revealing data that could help a hacker break into Facebook and Twitter accounts, a new study from the University of Michigan shows. The researchers also developed an Android app that tells phone users when they're on a vulnerable network.
Calculating the total capacity of a data network is a notoriously difficult problem, but information theorists are beginning to make some headway. In a recently published paper, a team of information theorists has shown that in a wired network, network coding and error-correcting coding can be handled separately, without reducing the network's capacity.
In the online struggle for network security, Kansas State University cybersecurity experts are adding an ally to the security force: the computer network itself. The team is researching the feasibility of building a computer network that could protect itself against online attackers by automatically changing its setup and configuration.
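A minimal sketch of that idea, often called a moving-target defense: the network periodically reshuffles host addresses so that whatever an attacker has learned from scanning quickly goes stale. Host names and addresses below are hypothetical:

```python
import random

# Moving-target sketch: every epoch, each host gets a fresh address from
# a pool, so an attacker's reconnaissance map expires. Legitimate clients
# would learn the new mapping through a trusted channel.

HOSTS = ["web", "db", "mail"]
POOL = [f"10.0.0.{i}" for i in range(2, 50)]

def reshuffle():
    """Assign every host a fresh, distinct address from the pool."""
    return dict(zip(HOSTS, random.sample(POOL, len(HOSTS))))

for epoch in range(3):                 # e.g. rotate every few minutes
    print(f"epoch {epoch}: {reshuffle()}")
```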
Many U.S. Internet service providers have fallen in line with their international counterparts in capping monthly residential broadband usage. But according to a recent study conducted with the help of Microsoft Research, these pricing models offer few tools for consumers to manage their data usage, and lead to uninformed decisions.
Using off-the-shelf parts, a researcher in Canada has created a Star Trek-like human-scale 3D videoconferencing pod that allows people in different locations to videoconference as if they were standing in front of each other. Called TeleHuman, the device projects a full-body image that is viewable from 360 degrees.
The international Square Kilometre Array (SKA) will be the world’s largest and most sensitive radio telescope when it is built, and will require the processing power of several million of today’s fastest computers to collect the exabytes of data it will generate. IBM and the Netherlands Institute for Radio Astronomy (ASTRON) are embarking on a five-year project to solve this data collection problem.
According to the International Centre for Radio Astronomy Research, the world’s most powerful telescope—the Square Kilometre Array—will produce one exabyte of data every day when it begins operation. Though the telescope is still awaiting construction, the scientists involved are already planning how to deal with such a tremendous influx of information.
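For scale, a rough back-of-the-envelope conversion, taking one exabyte in its decimal sense of 10^18 bytes:

```latex
\frac{10^{18}\ \text{bytes}}{86\,400\ \text{s/day}}
  \approx 1.16 \times 10^{13}\ \text{bytes/s}
  \approx 11.6\ \text{TB/s}
```

That is, the telescope would have to sustain a data rate of roughly 11.6 terabytes per second around the clock.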
Konrad Juethner, a software engineering consultant, recently used Windows HPC Server to run cluster-based analysis with COMSOL Multiphysics on the hardware he had available at home. His successful setup shows just how accessible advanced supercomputing approaches have become.