The Google Tensor Processing Unit (Google TPU) is a tensor processor in the class of neural processors: an application-specific integrated circuit developed by Google for use with the TensorFlow machine learning library. It was presented in 2016 at the Google I/O conference, where it was stated that the devices had already been in use at Google for more than a year [1][2].
Description
Compared to GPUs, it is designed for a high volume of reduced-precision computation (for example, 8-bit precision [3]), delivering higher performance per watt, and lacks hardware for rasterization and texture mapping [1][2].
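To illustrate what 8-bit precision means in practice, the sketch below quantizes 32-bit floating-point values to 8-bit integers and back. It uses a generic symmetric quantization scheme chosen purely for illustration, not a description of Google's actual quantization pipeline; all names and parameters are assumptions.

```python
import numpy as np

def quantize_int8(x: np.ndarray):
    """Symmetric quantization of float32 values to int8 (illustrative only)."""
    scale = np.abs(x).max() / 127.0                 # map the largest magnitude to 127
    q = np.clip(np.round(x / scale), -128, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximate float32 tensor from the int8 representation."""
    return q.astype(np.float32) * scale

weights = np.random.randn(4, 4).astype(np.float32)
q, scale = quantize_int8(weights)
# The reconstruction error is small, which is why low-precision arithmetic
# is often sufficient for neural network inference.
print(np.max(np.abs(weights - dequantize(q, scale))))
```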
Tensor processors are reported to have been used in the series of AlphaGo games against Lee Sedol [2] and in subsequent similar matches [4]. The company also used them to process Google Street View photos for text extraction; the entire corpus was reportedly processed in less than five days. In Google Photos, a single tensor processor can process more than 100 million photos per day. The device is also used by RankBrain, the self-learning system that processes the results of the Google search engine.
Architecture
The device is implemented as a matrix multiplication unit for 8-bit numbers, driven by CISC instructions from the host processor over the PCIe 3.0 bus. It is manufactured on a 28 nm process, runs at a clock frequency of 700 MHz, and has a thermal design power of 28-40 watts. It is equipped with 28 MB of on-chip RAM and 4 MB of 32-bit accumulators that collect the results from an array of 8-bit multipliers organized as a 256 × 256 matrix. The device's instructions transfer data to and from the unit and perform matrix multiplications or convolutions [5]. The matrix unit can perform 65,536 multiplications per clock cycle, or up to 92 trillion operations per second [6].
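The 92-trillion figure follows from the array dimensions and the clock frequency. The short calculation below reproduces it, counting each multiply-accumulate as two operations; that convention is a common assumption here, not something stated explicitly in the source.

```python
# Back-of-the-envelope check of the peak throughput cited above
# (an illustrative calculation, not an official specification).
array_dim = 256                    # systolic array of 256 x 256 8-bit multipliers
clock_hz = 700e6                   # 700 MHz clock

macs_per_cycle = array_dim * array_dim          # 65,536 multiply-accumulates per cycle
ops_per_cycle = 2 * macs_per_cycle              # multiply and add counted separately (assumed)
peak_ops_per_second = ops_per_cycle * clock_hz  # ~9.2e13, i.e. ~92 trillion ops/s

print(f"{macs_per_cycle} MACs per cycle, ~{peak_ops_per_second / 1e12:.0f} TOPS peak")
```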
Notes
- Google's Tensor Processing Unit explained: this is what the future of computing looks like.
- Jouppi, Norm. Google supercharges machine learning tasks with TPU custom chip. Google Cloud Platform Blog. Google (May 18, 2016). Retrieved January 22, 2017.
- Armasu, Lucian. Google's Big Chip Unveil For Machine Learning: Tensor Processing Unit With 10x Better Efficiency (Updated). Tom's Hardware (May 19, 2016). Retrieved June 26, 2016.
- The Future of Go Summit, Match One: Ke Jie & AlphaGo on YouTube, starting at 6:03:10 (May 23, 2017).
- Jouppi, Norman P., et al. In-Datacenter Performance Analysis of a Tensor Processing Unit. 44th International Symposium on Computer Architecture (ISCA), 2017.
- Cutress, Ian. Hot Chips: Google TPU Performance Analysis Live Blog (3pm PT, 10pm UTC). AnandTech (August 22, 2017). Retrieved August 23, 2017.
Links
- The Google Tensor Processor chip will simplify machine learning and restore the force of Moore's law. 3DNews (May 21, 2016). Retrieved November 17, 2017.
- The second-generation Google TPU demonstrates higher performance than the Nvidia GV100 GPU in machine learning tasks. iXBT.com (May 19, 2017). Retrieved November 21, 2017.
- Details of the Google TPU tensor coprocessor. Servernews (August 25, 2017). Retrieved November 17, 2017.