Google Says Machine Learning Chips Make AI Faster and More Efficient
Thursday, May 4, 2017
Google recently revealed the inner workings of its dedicated machine learning chip, the Tensor Processing Unit (TPU), marking the latest skirmish in the arms race for AI hardware supremacy. The chip has been tailored for use with Google's open-source machine learning library, TensorFlow, and has been in use in Google's data centers since ...
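Nothing in this excerpt shows Google's own code, but as a rough illustration of what "tailored for TensorFlow" means in practice, here is a minimal sketch of dispatching a computation to a TPU using TensorFlow's current distribution API. The locally attached TPU, the toy matrix sizes, and the matmul_step function are assumptions made for the example, not details from the article.

import tensorflow as tf

# Minimal sketch: connect to an attached Cloud TPU and run a small
# matrix multiply under a TPU distribution strategy. Assumes a TPU
# is reachable (e.g. on a Cloud TPU VM); no CPU/GPU fallback is handled.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="")  # "" = locally attached TPU
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

@tf.function
def matmul_step(a, b):
    # A dense matrix multiply is the kind of tensor operation the
    # TPU's matrix unit is designed to accelerate.
    return tf.matmul(a, b)

a = tf.random.normal([128, 256])
b = tf.random.normal([256, 64])

# strategy.run executes the traced function on the TPU replicas.
result = strategy.run(matmul_step, args=(a, b))
print(result)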