Google is building its own chip to power its AI bots

Google built its own chips to power machine learning algorithms

Perhaps Google thinks the chips available on the market can't handle the needs of its machine learning algorithms: at its I/O developer conference today, the search giant announced that it has been building its own specialized chips to power those algorithms.

Google has been using these custom-built chips, called Tensor Processing Units (TPUs), in its data centers, according to Urs Holzle, Google's senior VP for technical infrastructure, and they already power the company's machine learning workloads.

Google says it's getting "an order of magnitude better-optimized performance per watt for machine learning" and argues that this is "roughly equivalent to fast-forwarding technology about seven years into the future."

Google said it has managed to speed up its machine learning algorithms with the TPUs because they don't need the high precision of standard CPUs and GPUs. Instead of 32-bit precision, the algorithms happily run with a reduced precision of 8 bits, so every operation needs fewer transistors.
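To make the precision trade-off concrete, here is a minimal sketch of linear (affine) quantization, the common technique for mapping 32-bit floats onto 8-bit integers. This is an illustration of the general idea, not Google's actual TPU number format; the function names and scheme are assumptions for the example.

```python
# Illustrative sketch: map float values onto 8-bit integers and back.
# NOT Google's actual TPU scheme -- just standard linear quantization.
import random


def quantize(weights, num_bits=8):
    """Map a list of floats onto integers in [0, 2**num_bits - 1]."""
    qmin, qmax = 0, 2 ** num_bits - 1
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / (qmax - qmin)  # float range covered per integer step
    q = [round((w - lo) / scale) for w in weights]
    return q, scale, lo


def dequantize(q, scale, lo):
    """Recover approximate float values from the quantized integers."""
    return [v * scale + lo for v in q]


# Each weight now fits in 1 byte instead of 4; the rounding error
# is at most half a quantization step (scale / 2).
weights = [random.gauss(0.0, 1.0) for _ in range(1000)]
q, scale, lo = quantize(weights)
restored = dequantize(q, scale, lo)
max_err = max(abs(w - r) for w, r in zip(weights, restored))
```

Storing each value in one byte instead of four quarters the memory traffic, and integer arithmetic needs far fewer transistors than 32-bit floating point, which is the efficiency the article describes.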

You are already using TPUs whenever you use Google's voice recognition services. Google's Cloud Machine Learning services also run on these chips, and AlphaGo, which recently beat the Go world champion, ran on TPUs as well.

Holzle said that Google decided to build these application-specific chips instead of using more flexible FPGAs because it was looking for efficiency.

Holzle declined to disclose which foundry is actually making Google's chips, though he was willing to say that the company is currently using two different revisions in production and that they are manufactured by two different foundries.

With TensorFlow, Google offers its own open-source machine learning library, and unsurprisingly, the company has adapted it to run its algorithms on TPUs.
