Beyond gaming, GPU technology takes on graphs, machine learning
Graphics processing units are familiar to dedicated gamers and supercomputer programmers, but these specialized chips may also find use in big data science applications. Recent developments in NoSQL databases and machine-learning services point the way. The potential of GPU technology to handle large data sets with complex dependencies led Blazegraph to build Blazegraph GPU, a NoSQL-oriented graph database running on NVIDIA general-purpose GPUs.
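To suggest why graph workloads map well to GPU hardware, here is a minimal, hypothetical CUDA sketch of one level-synchronous breadth-first-search step over a graph stored in compressed sparse row (CSR) form. This is edge-parallel traversal of the general kind GPU graph systems exploit, not Blazegraph's actual implementation; every name in it is illustrative, and error checking is omitted for brevity.

```cuda
// Hypothetical sketch: one BFS level expanded in parallel over a CSR graph.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void bfs_level(const int *rowPtr, const int *colIdx,
                          int *dist, int level, int numNodes, int *changed)
{
    int v = blockIdx.x * blockDim.x + threadIdx.x;
    if (v >= numNodes || dist[v] != level) return;   // only frontier vertices
    // Relax every outgoing edge of a frontier vertex in parallel.
    for (int e = rowPtr[v]; e < rowPtr[v + 1]; ++e) {
        int u = colIdx[e];
        if (dist[u] == -1) {      // unvisited neighbor
            dist[u] = level + 1;  // benign race: all writers store the same value
            *changed = 1;         // another sweep is needed
        }
    }
}

int main() {
    // Tiny 4-node example graph in CSR form: 0->1, 0->2, 1->3, 2->3.
    int h_rowPtr[] = {0, 2, 3, 4, 4};
    int h_colIdx[] = {1, 2, 3, 3};
    int h_dist[]   = {0, -1, -1, -1};   // BFS starts from node 0

    int *d_rowPtr, *d_colIdx, *d_dist, *d_changed;
    cudaMalloc(&d_rowPtr, sizeof(h_rowPtr));
    cudaMalloc(&d_colIdx, sizeof(h_colIdx));
    cudaMalloc(&d_dist,   sizeof(h_dist));
    cudaMalloc(&d_changed, sizeof(int));
    cudaMemcpy(d_rowPtr, h_rowPtr, sizeof(h_rowPtr), cudaMemcpyHostToDevice);
    cudaMemcpy(d_colIdx, h_colIdx, sizeof(h_colIdx), cudaMemcpyHostToDevice);
    cudaMemcpy(d_dist,   h_dist,   sizeof(h_dist),   cudaMemcpyHostToDevice);

    // Sweep level by level until no vertex changes.
    for (int level = 0, changed = 1; changed; ++level) {
        changed = 0;
        cudaMemcpy(d_changed, &changed, sizeof(int), cudaMemcpyHostToDevice);
        bfs_level<<<1, 64>>>(d_rowPtr, d_colIdx, d_dist, level, 4, d_changed);
        cudaMemcpy(&changed, d_changed, sizeof(int), cudaMemcpyDeviceToHost);
    }
    cudaMemcpy(h_dist, d_dist, sizeof(h_dist), cudaMemcpyDeviceToHost);
    for (int i = 0; i < 4; ++i) printf("dist[%d] = %d\n", i, h_dist[i]);

    cudaFree(d_rowPtr); cudaFree(d_colIdx);
    cudaFree(d_dist);   cudaFree(d_changed);
    return 0;
}
```

The races in the kernel are benign here because every thread discovering the same neighbor writes the identical level value; production GPU graph engines typically manage explicit frontier queues with atomics instead of rescanning all vertices each sweep.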
When big data gets too big, this machine-learning algorithm may be the answer
Big data may hold a world of untapped potential, but what happens when your data set is bigger than your processing power can handle? A new algorithm that taps quantum computing may be able to help.
SanDisk Maximizes Production Quality with Machine Learning and Analytics Powered by Cloudera Enterprise
PALO ALTO, Calif., Jan. 26, 2016 (GLOBE NEWSWIRE) -- Cloudera, the global provider of the fastest, easiest, and most secure data management and analytics platform built on Apache Hadoop and the latest open source technologies, announced today that SanDisk, a global leader in flash storage, has deployed Cloudera Enterprise as an enterprise data hub to store, process, analyze, and test all of its product quality data.
Artificial intelligence pioneer Marvin Minsky dies - FT.com
Marvin Minsky, a pioneer of artificial intelligence whose working life traced the long arc from early optimism to disappointment and eventual revival of enthusiasm for thinking machines, has died at the age of 88.
AI Benchmark Will Ask Computers to Make Sense of the World | MIT Technology Review
A few years ago, a breakthrough in machine learning suddenly enabled computers to recognize objects shown in photographs with unprecedented, almost spooky, accuracy. The question now is whether machines can make another leap, by learning to make sense of what's actually going on in such images.