Google unveils tiny new AI chips for on-device machine learning

Two years ago, Google unveiled its Tensor Processing Units, or TPUs: specialized chips that live in the company’s data centers and make light work of AI tasks. Now the company is bringing its AI expertise down from the cloud, taking the wraps off its new Edge TPU, a tiny AI accelerator that will carry out machine learning jobs in IoT devices.

The Edge TPU is designed to do what’s known as “inference.” This is the part of machine learning where an algorithm actually carries out the task it was trained to do, such as recognizing an object in a picture. Google’s server-based TPUs are optimized for the training part of this process, while the new Edge TPUs will handle the inference.
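To make the training/inference split concrete, here is a minimal sketch in plain Python (a hypothetical toy model, not Google’s API or the Edge TPU’s actual workload): training fits a model’s parameters from example data, while inference simply applies the fitted parameters to new input.

```python
# Toy illustration of training vs. inference (hypothetical example,
# not Google's API). Training fits parameters; inference applies them.

def train(samples):
    """Fit a 1-D linear model y = w * x by least squares.
    This is the expensive phase, run on server-class hardware
    such as Google's cloud TPUs."""
    num = sum(x * y for x, y in samples)
    den = sum(x * x for x, _ in samples)
    return num / den  # the learned weight w

def infer(w, x):
    """Apply the already-trained model to a new input.
    This is the cheap phase an accelerator like the Edge TPU
    would run on-device."""
    return w * x

w = train([(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)])  # learns w = 2.0
print(infer(w, 10.0))  # applies the trained model to new data
```

In a real deployment the trained model would be far larger and produced in a data center, with only the inference step shipped to the device.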

These new chips are destined for enterprise jobs, not your next smartphone. That means tasks like automating quality-control checks in factories. Doing this sort of work on-device has a number of advantages over using hardware that has to send data over the internet for analysis: on-device machine learning is generally more secure, experiences less downtime, and delivers faster results. That’s the sales pitch, anyway.

The Edge TPU is the little brother of the regular Tensor Processing Unit, which Google uses to power its own AI, and which is available for other customers to use via Google Cloud. 

Read more at The Verge blog