Google Says Its Machine Learning Chip Leaves CPUs, GPUs in the Dust
Google's justification for developing a custom inference chip began building about six years ago, when the company started incorporating deep learning into more of its core search products. Based on the scant details Google shares about its data center operations - which span 15 major sites - the search-and-ad giant was facing additional capital expenditures of perhaps $15bn, assuming that a large Google data center costs about $1bn.
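The $15bn figure is just the product of the two numbers the article cites - roughly 15 major data center sites and an assumed build cost of about $1bn per large site. A minimal sketch of that back-of-envelope estimate, with both inputs treated as the article's rough assumptions:

```python
# Back-of-envelope capex estimate implied by the article.
# Both inputs are rough assumptions stated in the text, not audited figures.
num_sites = 15           # approximate count of major Google data center sites
cost_per_site_bn = 1.0   # assumed cost of one large data center, in $bn

extra_capex_bn = num_sites * cost_per_site_bn
print(f"Estimated additional capex: ${extra_capex_bn:.0f}bn")  # → $15bn
```

The point of the estimate is scale, not precision: if serving deep learning on CPUs would force build-out on the order of whole new data centers, a custom chip that does the same inference work far more efficiently pays for itself.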
Wednesday, April 19, 2017
