Rice Univ. is preparing to offer its researchers who deal in “big data” the opportunity to compute in the cloud with fewer barriers. Rice is installing the Big Research Data Cloud (BiRD Cloud), which will allow for cloud bursting. That means data-intensive tasks can spill over into outside cloud-computing systems when necessary, essentially providing unlimited computing capacity.
As a laboratory technician or director, knowing the current status of your instrument or sample...
The final version of the U.S. Government Cloud Computing Technology Roadmap, Volumes I and II...
Today, big data is a hot topic within almost every industry. May saw Berlin Buzzwords, the biggest-ever European big data conference for technologists, while the likes of O'Reilly's Strata conference pull in huge numbers of attendees keen to learn how to adapt to this new world. Despite all the interest, a great deal of confusion remains around big data.
Eight U.S. Dept. of Energy national laboratories are combining forces to use high performance computing to build the most complete climate and Earth system model yet devised. The project, called Accelerated Climate Modeling for Energy, or ACME, is designed to accelerate the development and application of fully coupled, state-of-the-science Earth system models for scientific and energy applications.
Software developed by Univ. of California, Berkeley computer scientists seeks to tame the vast amount of visual data in the world by generating a single photo that can represent massive clusters of images. This tool can give users the photographic gist of a kid on Santa’s lap, housecats, or brides and grooms at their weddings. It works by generating an image that literally averages the key features of the other photos.
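The averaging idea can be sketched in a few lines: given a set of already-aligned photos, a pixel-wise mean yields a single representative image. This is a deliberately crude sketch, not the Berkeley tool itself, which also aligns and warps key features across images before averaging; the toy data below is made up.

```python
import numpy as np

def average_image(images):
    """Pixel-wise mean of a stack of same-sized images.

    Assumes the inputs are already aligned, a simplification of the
    real system, which matches key features before averaging.
    """
    stack = np.stack([np.asarray(img, dtype=np.float64) for img in images])
    return stack.mean(axis=0)

# Three toy 2x2 grayscale "photos" with uniform brightness 0, 50 and 100
photos = [np.full((2, 2), v) for v in (0.0, 50.0, 100.0)]
avg = average_image(photos)
print(avg)  # every pixel is 50.0
```

The same one-liner scales to thousands of photos, which is why averaging is such a cheap way to get the "gist" of a large image cluster once alignment is solved.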
An integrated, web-based platform for measuring research output and impact, monitoring trends and benchmarking, InCites is Thomson Reuters' latest effort to let users look beyond the global influence of a specific journal, conduct transparent analyses and make better decisions. The expanded assessment solution has been implemented on the 2014 edition of Journal Citation Reports.
In 2006, DARPA launched a long-term project called CORONET, which sought to develop a cloud-based technology that could enable affordable, fast bandwidth and ensure the survival of cloud networks in the event of system-wide failures. After years of work, scientists from AT&T, IBM and Applied Communication Sciences have announced a proof-of-concept technology that reduces setup times for cloud-to-cloud connectivity from days to seconds.
Big data can mean big headaches for scientists. A new library of software tools from the Howard Hughes Medical Institute's Janelia Research Campus speeds analysis of data sets so large and complex that a single workstation would need days or weeks to analyze them, if it could handle them at all. The new tool, Thunder, should help interpret data that holds new insights into how the brain works.
North Carolina-based Semiconductor Research Corporation (SRC) and Singapore’s Silicon Cloud International (SCI) are launching a new program aimed at globally advancing integrated circuit (IC) design education and research. The program will focus on increasing the quantity of IC designers in university systems worldwide, and enhancing expertise in secure cloud computing architecture.
“Big data” has yet to make a mark on conservation efforts to preserve the planet’s biodiversity. But that may soon change with a new model developed by Univ. of California, Berkeley, biologist Brent Mishler and his colleagues in Australia. This effort leverages the growing mass of data to take into account not only the number of species throughout an area, but also the variation among species and their geographic rarity, or endemism.
NIST has issued for public review and comment a draft report summarizing 65 challenges that cloud computing poses to forensics investigators who uncover, gather, examine and interpret digital evidence to help solve crimes. The report was prepared by the NIST Cloud Computing Forensic Science Working Group, an international body of cloud and digital forensic experts from industry, government and academia.
Fully automated "deep learning" by computers greatly improves the odds of discovering particles such as the Higgs boson, according to a recent study. In fact, this approach outperforms even veteran physicists, whose current practice consists of developing mathematical formulas by hand to apply to data. New machine learning methods are rendering that hand-tuned approach unnecessary.
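As an illustration of the shift the study describes (not the researchers' actual model), a tiny neural network can learn a nonlinear signal/background boundary directly from raw features, the kind of pattern no single hand-written linear formula captures. The data, network size and learning rate below are all made up for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "collision events": the signal label depends nonlinearly on
# two raw features (an XOR-like pattern in their signs), so no linear
# combination of the raw features separates signal from background.
X = rng.uniform(-1, 1, size=(400, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)

# Minimal one-hidden-layer network trained by plain gradient descent.
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)                  # hidden layer
    p = 1 / (1 + np.exp(-(h @ W2 + b2)))      # signal probability
    return h, p.ravel()

losses = []
for step in range(2000):
    h, p = forward(X)
    losses.append(-np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9)))
    grad_p = (p - y)[:, None] / len(X)        # d(loss)/d(logit), averaged
    grad_h = grad_p @ W2.T * (1 - h ** 2)     # backprop through tanh
    W2 -= 0.5 * (h.T @ grad_p); b2 -= 0.5 * grad_p.sum(0)
    W1 -= 0.5 * (X.T @ grad_h); b1 -= 0.5 * grad_h.sum(0)

_, p = forward(X)
acc = float(np.mean((p > 0.5) == y))
print(f"loss {losses[0]:.3f} -> {losses[-1]:.3f}, training accuracy {acc:.2f}")
```

The point of the sketch is only that the network discovers the nonlinear structure from raw inputs; in the study, that role is played by deep networks learning directly from low-level detector quantities instead of physicist-engineered variables.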
Microsoft Corp. and four other large American technology companies are using a Manhattan court case to draw a line in the cloud, saying the U.S. government has no right to seize computer data stored outside the country. U.S. companies that host services over the Internet and sell remote data storage say they stand to lose billions of dollars in business if emails and other files they house overseas are seen as vulnerable to U.S. snooping.
Without a specific search term in mind, it can be surprisingly hard to find information on the Internet, or even to know where to start searching. To help, computer scientists have created the first fully automated computer program that teaches itself everything there is to know about any visual concept. Called Learning Everything about Anything (LEVAN), the program searches millions of books and images to learn all possible variations of a concept.
Highlighting the impact of malicious software, Target suffered the largest retail hack in U.S. history during the Christmas shopping season of 2013. To help combat this worsening trend, Virginia Tech computer scientists have used causal relations to determine whether or not network activities have justifiable, legitimate causes. The work effectively isolates infected computer hosts and detects stealthy malware in advance.
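The causal-relations idea can be illustrated in a deliberately simplified form (this is not the Virginia Tech system, and all event fields below are invented): record what triggered each network request, then flag any request whose causal chain never traces back to a legitimate user action.

```python
# Each event records its id, where it originated, and which earlier
# event (if any) triggered it.
def has_legitimate_cause(event_id, events):
    """Walk the 'triggered_by' chain; True if it reaches a user action."""
    seen = set()
    cur = events.get(event_id)
    while cur is not None and cur["id"] not in seen:
        if cur["source"] == "user":
            return True
        seen.add(cur["id"])                    # guard against cycles
        cur = events.get(cur.get("triggered_by"))
    return False

events = {e["id"]: e for e in [
    {"id": 1, "source": "user",    "triggered_by": None},  # user clicks a link
    {"id": 2, "source": "browser", "triggered_by": 1},     # page load it caused
    {"id": 3, "source": "process", "triggered_by": None},  # beacon with no cause
]}

print(has_legitimate_cause(2, events))  # True: traces back to the click
print(has_legitimate_cause(3, events))  # False: candidate malware beacon
```

Stealthy malware tends to generate traffic with no such user-rooted cause, which is why causally orphaned activity is a useful red flag even when the payload itself looks unremarkable.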
Beckman Coulter Diagnostics has announced a strategic partnership with hc1.com of Indianapolis to help laboratories turn large amounts of clinical data into actionable insights. The new technology combines Beckman Coulter's clinical diagnostic systems with hc1.com's software-as-a-service product, Healthcare Relationship Cloud.
Industrial plants must function effectively; remedying production downtimes and breakdowns is an expensive and time-consuming business. That is why companies collect data to evaluate how their facilities are doing. At the Hannover Messe Digital Factory, held April 7-11, researchers in Germany will show how operators can analyze these huge amounts of data and use them as an early warning system when problems threaten.
The White House on Wednesday announced an initiative to provide private companies and local governments better access to already public climate data. The idea is that with this localized data they can help the public understand the risks they face, especially in coastal areas. The government also is working with Google, Microsoft and Intel to come up with tools to make communities more resilient in dealing with weather extremes.
The 360-degree views of the Grand Canyon that went live Thursday in Google's Street View map option once were reserved largely for rafters who were lucky enough to board a private trip through the remote canyon, or those willing to pay big bucks to navigate its whitewater rapids. But a partnership with the advocacy group American Rivers has allowed Google to take its all-seeing eyes down nearly 300 miles of rich geologic history.
Ben Recht, a statistician and electrical engineer at the Univ. of California, Berkeley, looks for problems. He develops mathematical strategies to help researchers, from urban planners to online retailers, cut through blizzards of data to find what they’re after. He resists the “needle in the haystack” metaphor for big data because, he says, people usually don’t know enough about their data to understand the goal.
Named “Project Lucy” after the earliest known human ancestor, IBM’s new 10-year, $100 million initiative will bring the Watson computer and other cognitive systems to Africa in a bid to fuel development and spur business opportunities across the world’s fastest growing continent. Watson, whose design team won an R&D Innovator of the Year Award in 2011, improves itself by learning and quickly accessing big data resources.
Climate researchers in the U.K. have made the world's temperature records available via Google Earth. The new format allows users to scroll around the world, zoom in on 6,000 weather stations, and view monthly, seasonal and annual temperature data more easily than ever before. Users can drill down to see some 20,000 graphs—some of which show temperature records dating back to 1850.
Computer scientists at Trinity College Dublin and IBM Dublin have made a significant advance that will allow companies to reduce associated greenhouse gas emissions, drive down costs and minimize network delays, according to their priorities. The scientists have dubbed their new system "Stratus." Using mathematical algorithms, Stratus effectively balances the load among computer servers located across the globe.
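A minimal sketch of the kind of trade-off such a system navigates (the server names and scores are made up, and Stratus's actual algorithms are not described here): score each server as a weighted sum of normalized carbon intensity, cost and latency, then route work to the server that is cheapest under the operator's chosen weights.

```python
def pick_server(servers, weights):
    """Return the server minimizing a weighted carbon/cost/latency score.

    All metric values are assumed pre-normalized to [0, 1]; the weights
    express the operator's priorities (greener, cheaper, or faster).
    """
    def score(s):
        return (weights["carbon"] * s["carbon"]
                + weights["cost"] * s["cost"]
                + weights["latency"] * s["latency"])
    return min(servers, key=score)

servers = [
    {"name": "dublin",  "carbon": 0.2, "cost": 0.9, "latency": 0.1},
    {"name": "oregon",  "carbon": 0.6, "cost": 0.3, "latency": 0.5},
    {"name": "iceland", "carbon": 0.1, "cost": 0.5, "latency": 0.8},
]

green_first = pick_server(servers, {"carbon": 0.7, "cost": 0.2, "latency": 0.1})
fast_first  = pick_server(servers, {"carbon": 0.1, "cost": 0.2, "latency": 0.7})
print(green_first["name"], fast_first["name"])  # iceland dublin
```

Shifting the weights moves the load: emphasizing carbon routes work to the low-emission site, while emphasizing latency pulls it back to the nearest one, which is exactly the emissions/cost/delay balance described above.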
Scientific innovation lifecycle management solutions provider Accelrys has added to its enterprise capabilities with the acquisition of Ireland-based QUMAS for $50 million in cash. QUMAS is a global provider of cloud-based and on-premises enterprise compliance software supporting regulatory and quality operations in life sciences and other highly regulated industries.
Even as Silicon Valley speaks out against the U.S. government's surveillance methods, technology companies are turning a handsome profit by mining personal data. Tarnished by revelations that the National Security Agency trolls deep into the everyday lives of Web surfers, companies like Apple, Facebook, Google, and Microsoft are aggressively battling any perception that they voluntarily give the government access to users' information.
DNAnexus has announced a collaboration with Stanford Univ. that has resulted in a new 1000 Genomes Project data set of genetic variation. Launched in January 2008, the 1000 Genomes Project was the first international effort to sequence a large number of individual genomes with the goal of developing a comprehensive and freely accessible resource on human genetic variation.
Einstein@Home creates a global supercomputer by connecting more than 350,000 participants who contribute to a variety of scientific projects, particularly astronomy, by running distributed analysis routines on their home computers. This resource has already found several pulsars hidden in radio telescope data. Now, "citizen scientists" have helped researchers discover four new gamma-ray pulsars.