Eight U.S. Dept. of Energy national laboratories are combining forces to use high performance computing to build the most complete climate and Earth system model yet devised. The project, called Accelerated Climate Modeling for Energy, or ACME, is designed to accelerate the development and application of fully coupled, state-of-the-science Earth system models for scientific and energy applications.
Software developed by Univ. of California,...
An integrated, web-based platform for measuring...
In 2006, DARPA launched a long-term project called...
Big data can mean big headaches for scientists. A new library of software tools from Howard Hughes Medical Institute’s Janelia Research Campus speeds analysis of data sets so large and complex they would take days or weeks to analyze on a single workstation, even if a single workstation could do it at all. The new tool, Thunder, should help interpret data that holds new insights into how the brain works.
North Carolina-based Semiconductor Research Corporation (SRC) and Singapore’s Silicon Cloud International (SCI) are launching a new program aimed at globally advancing integrated circuit (IC) design education and research. The program will focus on increasing the quantity of IC designers in university systems worldwide, and enhancing expertise in secure cloud computing architecture.
“Big data” has yet to make a mark on conservation efforts to preserve the planet’s biodiversity. But that may soon change with a new model developed by Univ. of California, Berkeley, biologist Brent Mishler and his colleagues in Australia. This effort leverages the growing mass of data to take into account not only the number of species throughout an area, but also the variation among species and their geographic rarity, or endemism.
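The core idea the item describes, weighting species by geographic rarity rather than just counting them, can be sketched with a toy range-weighted endemism score. This is an illustration only, with hypothetical data; Mishler's actual model also incorporates variation among species (phylogenetic diversity), which is omitted here.

```python
# Toy range-weighted endemism: each species contributes 1 / (its range
# size) to every grid cell it occupies, so narrow-range endemics count
# for more than widespread species. Data below are hypothetical.
from collections import defaultdict

occurrences = {
    "sp_a": {"cell1"},                    # narrow-range endemic
    "sp_b": {"cell1", "cell2", "cell3"},  # widespread
    "sp_c": {"cell2", "cell3"},
}

def weighted_endemism(occurrences):
    """Score each cell by summing 1/range_size over species present."""
    scores = defaultdict(float)
    for cells in occurrences.values():
        weight = 1.0 / len(cells)
        for cell in cells:
            scores[cell] += weight
    return dict(scores)

scores = weighted_endemism(occurrences)
# cell1 hosts the narrow endemic sp_a, so it scores highest.
```

A plain species count would rank cell1 below cells 2 and 3 equally; the rarity weighting is what surfaces it as a conservation priority.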
NIST has issued for public review and comment a draft report summarizing 65 challenges that cloud computing poses to forensics investigators who uncover, gather, examine and interpret digital evidence to help solve crimes. The report was prepared by the NIST Cloud Computing Forensic Science Working Group, an international body of cloud and digital forensic experts from industry, government and academia.
Fully automated "deep learning" by computers greatly improves the odds of discovering particles such as the Higgs boson, according to a recent study. In fact, this approach outperforms even veteran physicists, whose current practice consists of developing mathematical formulas by hand to apply to data. New machine learning methods are rendering that approach unnecessary.
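The contrast the study draws, learned features versus hand-engineered formulas, can be sketched on a toy problem. Here the class depends on a derived quantity (the product of two raw inputs, standing in for something like an invariant mass a physicist would compute by hand); a small neural network discovers the combination from the raw inputs alone. This is a minimal illustration with synthetic data, not the study's actual network or dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "raw detector" features; the label depends only on a
# derived quantity (x1 * x2), which a linear model on raw inputs
# cannot capture but a hidden layer can learn on its own.
X = rng.normal(size=(2000, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)

# One-hidden-layer network trained by full-batch gradient descent.
W1 = rng.normal(scale=0.5, size=(2, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)
lr = 1.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(3000):
    h = np.tanh(X @ W1 + b1)                 # hidden activations
    p = sigmoid(h @ W2 + b2).ravel()         # predicted P(signal)
    grad_out = ((p - y) / len(X))[:, None]   # d(cross-entropy)/d(logit)
    grad_h = (grad_out @ W2.T) * (1 - h**2)  # backprop through tanh
    W2 -= lr * (h.T @ grad_out); b2 -= lr * grad_out.sum(0)
    W1 -= lr * (X.T @ grad_h);   b1 -= lr * grad_h.sum(0)

p = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2).ravel()
accuracy = float(((p > 0.5) == y).mean())
```

No one told the network about the product feature; it is recovered from data, which is the study's point about replacing hand-derived formulas.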
Microsoft Corp. and four other large American technology companies are using a Manhattan court case to draw a line in the cloud, saying the U.S. government has no right to seize computer data stored outside the country. U.S. companies that host services over the Internet and sell remote data storage say they stand to lose billions of dollars in business if emails and other files they house overseas are seen as vulnerable to U.S. snooping.
Without a specific search term in mind, it can be surprisingly hard to find information on the Internet, or to know how to start searching. To help, computer scientists have created the first fully automated computer program that teaches itself everything there is to know about any visual concept. Called Learning Everything about Anything (LEVAN), the program searches millions of books and images to learn all possible variations of a concept.
Highlighting the impact of malicious software, Target suffered the largest retail hack in U.S. history during the Christmas shopping season of 2013. To help combat this worsening trend, Virginia Tech computer scientists have used causal relations to determine whether or not network activities have justifiable and legitimate causes to occur. The work effectively isolates infected computer hosts and detects stealthy malware in advance.
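The causal-vetting idea can be sketched with a toy rule: a network request is treated as justified only if some user-triggered event on the same host precedes it within a short window, giving it a plausible cause. This is an illustration of the general approach, not Virginia Tech's actual system; the window, data layout, and names are hypothetical.

```python
# Toy causal vetting of network traffic: flag requests with no
# preceding user event within a short window (values hypothetical).
WINDOW = 2.0  # seconds

def classify(requests, user_events, window=WINDOW):
    """Return requests lacking a preceding user event within `window`
    seconds -- candidates for stealthy, malware-initiated traffic."""
    suspicious = []
    for t_req, host in requests:
        caused = any(0 <= t_req - t_evt <= window
                     for t_evt in user_events.get(host, []))
        if not caused:
            suspicious.append((t_req, host))
    return suspicious

# Hypothetical timeline: hostA's request follows a click; hostB's
# request has no triggering user action at all.
user_events = {"hostA": [10.0]}
requests = [(10.5, "hostA"), (42.0, "hostB")]
flagged = classify(requests, user_events)  # -> [(42.0, "hostB")]
```

A real system would need richer causal chains (process ancestry, inter-host dependencies) rather than a single fixed time window, but the legitimacy test is the same shape.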
Beckman Coulter Diagnostics has announced a strategic partnership with hc1.com of Indianapolis to help laboratories turn large amounts of clinical data into actionable insights. The new technology combines Beckman Coulter’s clinical diagnostic systems with hc1.com’s software-as-a-service product, Healthcare Relationship Cloud.
Industrial plants must function effectively. Remedying production downtimes and breakdowns is an expensive and time-consuming business. That is why companies collect data to evaluate how their facilities are doing. At the Hannover Messe Digital Factory, held April 7-11, researchers in Germany will show how operators can analyze these huge amounts of data and use them as an early warning system when problems threaten.
The White House on Wednesday announced an initiative to provide private companies and local governments better access to already public climate data. The idea is that with this localized data they can help the public understand the risks they face, especially in coastal areas. The government also is working with Google, Microsoft and Intel, to come up with tools to make communities more resilient in dealing with weather extremes.
The 360-degree views of the Grand Canyon that went live Thursday in Google's Street View map option once were reserved largely for rafters who were lucky enough to board a private trip through the remote canyon, or those willing to pay big bucks to navigate its whitewater rapids. But a partnership with the advocacy group American Rivers has allowed Google to take its all-seeing eyes down nearly 300 miles of rich geologic history.
Ben Recht, a statistician and electrical engineer at the Univ. of California, Berkeley, looks for problems. He develops mathematical strategies to help researchers, from urban planners to online retailers, cut through blizzards of data to find what they’re after. He resists the “needle in the haystack” metaphor for big data because, he says, people usually don’t know enough about their data to understand the goal.
Named “Project Lucy” after the earliest known human ancestor, IBM’s new 10-year, $100 million initiative will bring the Watson computer and other cognitive systems to Africa in a bid to fuel development and spur business opportunities across the world’s fastest growing continent. Watson, whose design team won an R&D Innovator of the Year Award in 2011, improves itself by learning and quickly accessing big data resources.
Climate researchers in the U.K. have made the world's temperature records available via Google Earth. The new format allows users to scroll around the world, zoom in on 6,000 weather stations, and view monthly, seasonal and annual temperature data more easily than ever before. Users can drill down to see some 20,000 graphs—some of which show temperature records dating back to 1850.
Computer scientists at Trinity College Dublin and IBM Dublin have made a significant advance that will allow companies to reduce associated greenhouse gas emissions, drive down costs and minimize network delays, according to their own priorities. The scientists have dubbed their new system “Stratus”. Using mathematical algorithms, Stratus effectively balances the load between different computer servers located across the globe.
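The balancing act Stratus performs can be sketched as picking, per request, the server that minimizes a weighted sum of carbon intensity, price, and network delay, with the weights encoding the operator's priorities. Stratus's actual algorithms are not described in the item; the servers, metrics, and weights below are hypothetical.

```python
# Toy multi-objective dispatcher in the spirit of Stratus: choose the
# server minimizing a weighted cost. All numbers are hypothetical.
servers = {
    "dublin":    {"carbon": 0.3, "price": 0.9, "delay": 20},
    "virginia":  {"carbon": 0.7, "price": 0.5, "delay": 15},
    "singapore": {"carbon": 0.5, "price": 0.6, "delay": 140},
}

def choose_server(servers, weights):
    """Return the server with the lowest weighted sum of its metrics;
    the weights express how much the operator cares about each one."""
    def cost(metrics):
        return sum(weights[k] * metrics[k] for k in weights)
    return min(servers, key=lambda name: cost(servers[name]))

# An emissions-focused operator vs. a latency-focused one:
green = choose_server(servers, {"carbon": 100, "price": 1, "delay": 0.01})
fast  = choose_server(servers, {"carbon": 1,   "price": 1, "delay": 1.0})
# green -> "dublin" (lowest carbon); fast -> "virginia" (lowest delay)
```

Shifting the weights moves traffic between data centers without changing the mechanism, which is how one system can serve operators with different goals.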
Scientific innovation lifecycle management solutions provider Accelrys has added to its enterprise capabilities with the acquisition of Ireland-based QUMAS for $50 million in cash. QUMAS is a global provider of cloud-based and on-premises enterprise compliance software supporting regulatory and quality operations in life sciences and other highly regulated industries.
Even as Silicon Valley speaks out against the U.S. government's surveillance methods, technology companies are turning a handsome profit by mining personal data. Tarnished by revelations that the National Security Agency trolls deep into the everyday lives of Web surfers, companies like Apple, Facebook, Google, and Microsoft are aggressively battling any perception that they voluntarily give the government access to users' information.
DNAnexus has announced a collaboration with Stanford Univ. that has resulted in a new 1000 Genomes Project data set of genetic variation. Launched in January 2008, the 1000 Genomes Project was the first international effort to sequence a large number of individual genomes with the goal of developing a comprehensive and freely accessible resource on human genetic variation.
Einstein@Home creates a global supercomputer by connecting more than 350,000 participants who contribute to a variety of scientific projects, particularly astronomy, by conducting distributed analysis routines with their home computers. This resource has already found several pulsars hidden in radio telescope data. Now, “citizen scientists” have helped researchers discover four new gamma-ray pulsars.
Google has become less likely to comply with government demands for its users' online communications and other activities as authorities in the U.S. and other countries become more aggressive about mining the Internet for personal data. Legal requests from governments for people’s data have risen 21% since the second half of last year.
Google says it is investing 450 million euros to expand a data center in southern Finland as part of Europe-wide development plans totaling hundreds of millions of euros. The investment comes on top of the 350 million euros Google Inc. has spent converting an old paper mill, which started operations as a data center in 2011.
In the hugely popular game Minecraft, players can freely build and create their own world by mining and stacking different types of bricks in a sandbox-like environment. Because of its customizable dynamic, the game has also become a background platform for many user-generated modifications, or "mods". Researchers and the developers of Minecraft have built a new Google-funded mod that introduces quantum mechanics into the game's landscape.
Over the past three years, 300,000 gamers have helped scientists with genomic research by playing Phylo, an online puzzle game. Now, the McGill Univ. researchers who developed the game are making this crowd of players available to scientists around the globe. The idea is to put human talent to work to improve on what is already being done by computers in the field of comparative genomics.
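What Phylo players are actually improving is the arrangement of gaps in sequence alignments, which is scored and compared against a computed baseline. A minimal sum-of-pairs scorer makes the idea concrete; the scoring weights and sequences here are hypothetical, not McGill's actual scheme.

```python
# Toy sum-of-pairs alignment scorer: matches reward, mismatches and
# gaps penalize. Weights are hypothetical illustration values.
from itertools import combinations

MATCH, MISMATCH, GAP = 1, -1, -2

def score(alignment):
    """Score an alignment column by column over all sequence pairs."""
    total = 0
    for column in zip(*alignment):
        for a, b in combinations(column, 2):
            if "-" in (a, b):
                total += GAP
            elif a == b:
                total += MATCH
            else:
                total += MISMATCH
    return total

machine = ["ACGT", "AG-T"]  # a baseline alignment of ACGT vs AGT
human   = ["ACGT", "A-GT"]  # a player moves the same gap one column
# score(human) > score(machine): the human gap placement wins
```

Rearranging a single gap flips a mismatch-plus-gap pair into a gap-plus-match pair, exactly the kind of local improvement human pattern recognition finds faster than exhaustive search.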
The San Diego Supercomputer Center at the Univ. of California, San Diego, has been awarded a grant from the National Science Foundation to build Comet, a new petascale supercomputer designed to transform advanced scientific computing by expanding access and capacity among research domains. Comet will be capable of an overall peak performance of nearly two petaflops, or two quadrillion floating-point operations per second.