Wednesday, April 19, 2017

Google Says Its Machine Learning Chip Leaves CPUs, GPUs in the Dust

Google's case for developing a custom inference chip began to take shape about six years ago, when the company started incorporating deep learning into more of its core search products. Based on the scant details Google provides about its data center operations, which span 15 major sites, the search-and-ad giant was looking at additional capital expenditures of perhaps $15bn, assuming that a large Google data center costs about $1bn to build.
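The $15bn figure is a straightforward back-of-envelope estimate: roughly 15 major sites multiplied by an assumed build cost of about $1bn per large data center. The short sketch below simply restates that arithmetic; the site count and per-site cost are the round numbers quoted above, not official Google figures.

    # Back-of-envelope estimate of the additional capital expenditure implied above.
    # Inputs are the article's round numbers, not official Google figures.
    NUM_MAJOR_SITES = 15                  # major data center sites
    COST_PER_SITE_USD = 1_000_000_000     # assumed cost of one large data center (~$1bn)

    additional_capex_usd = NUM_MAJOR_SITES * COST_PER_SITE_USD
    print(f"Estimated additional capex: ${additional_capex_usd / 1e9:.0f}bn")  # -> $15bn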
