Google Built the Tensor Chip

Machine learning is a crucial part of Google's strategy, powering many of the company's most popular applications. In fact, it is important enough that Google has spent years working behind closed doors to develop a customized hardware solution of its own to power the company's experiences. So, in an effort to deliver faster machine learning, Google built the Tensor chip.

Machine learning today still runs mostly on traditional hardware. Google's custom chip could benefit the more than 100 products and features in which the company uses machine learning to improve capabilities and understand complex commands.

Tensor Chip for Faster Machine Learning

Machine learning provides the underlying appeal of many of Google's most popular applications. In fact, more than 100 teams at Google are currently using machine learning, in applications such as Voice Search, Inbox Smart Reply, and Street View.

Google believes that great software shines brightest when it runs on great hardware. That is the main reason the company started a stealthy project several years ago to find out whether it could build its own custom accelerators for machine learning applications.

The search engine giant has been running TPUs inside its data centers for more than a year and has found that they deliver an order of magnitude better optimized performance per watt for machine learning. That is roughly equivalent to three generations of Moore's Law (with performance doubling roughly every couple of years, three generations amounts to about an 8x gain), which is like fast-forwarding the technology about seven years into the future.

Google built the Tensor chip (TPU, or Tensor Processing Unit) expressly to run TensorFlow, the company's in-house machine learning system, which it open-sourced last year. The chip enables faster machine learning by devoting more processing power to workloads such as big-data analysis, voice recognition, and photo identification.
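To give a sense of what a TensorFlow workload looks like, here is a minimal, illustrative sketch of the kind of computation a TPU accelerates: a dense neural-network layer, which is essentially one large matrix multiply. The layer sizes and random input data are assumptions chosen for the example, not anything from Google's production systems.

```python
# Minimal, illustrative TensorFlow sketch (not Google's production code).
# A dense layer over a batch of inputs boils down to a matrix multiply,
# the kind of tensor operation a TPU is built to execute efficiently.
import numpy as np
import tensorflow as tf

# Made-up example data: a batch of 8 feature vectors of length 128.
x = tf.constant(np.random.rand(8, 128), dtype=tf.float32)

# Weights and bias for a layer with 10 outputs.
w = tf.Variable(tf.random.normal([128, 10]))
b = tf.Variable(tf.zeros([10]))

# Forward pass: one large matrix multiply plus a bias add.
logits = tf.matmul(x, w) + b
print(logits.shape)  # (8, 10)
```

Operations like the matrix multiply above are exactly the tensor computations the chip is designed to run far more efficiently per watt than general-purpose hardware.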

For more than a year, TPUs have been in constant use at Google data centers. In that time, Google has been able to squeeze more operations per second into the silicon and get better results in less time than with previous solutions. This higher level of performance lets TPUs power demanding artificial intelligence programs such as AlphaGo, the Google AI software that defeated Fan Hui, the European Go champion.

Later, the world's best Go player, Lee Sedol, also considered a master of strategy, tasted defeat against the same software after roughly three and a half hours of play. The professional South Korean player, an 18-time international title winner, conceded the game, which was broadcast live, with a YouTube stream giving thousands of people around the world access. It was the first game of the Google DeepMind Challenge Match, a contest with a $1 million prize, played at the Four Seasons Hotel in Seoul.

Google built the Tensor chip to ensure it will be at the heart of the company's forthcoming services and technologies, particularly those driven by machine learning and AI.
