Google's TPU (Tensor Processing Unit) was a surprising reveal yesterday. Anyone have thoughts on the die size and process node? The heat sink in the EE Times photo looks to be about 3 cm x 4 cm, which is roughly 1200 mm². Is there a rule of thumb, say 10% of heat-sink area, for estimating die size from heat-sink size? Which foundry is making this chip? And how many TPUs could you slot into a 1RU server, say a 2-socket Xeon box? It also looks like SATA connectors are being used?
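For what it's worth, here's the back-of-the-envelope math I'm doing (the 10% heat-sink-to-die ratio is purely my guess, not an established rule):

```python
# Back-of-the-envelope die-size estimate from the heat-sink footprint.
# Dimensions are eyeballed from the EE Times photo; the 10% ratio is speculation.
heatsink_w_mm = 30.0  # ~3 cm
heatsink_h_mm = 40.0  # ~4 cm
heatsink_area = heatsink_w_mm * heatsink_h_mm  # 1200 mm^2

die_ratio = 0.10  # assumed ratio of die area to heat-sink area (a guess)
est_die_area = heatsink_area * die_ratio

print(f"heat sink area: {heatsink_area:.0f} mm^2")   # 1200 mm^2
print(f"estimated die area at 10%: {est_die_area:.0f} mm^2")  # 120 mm^2
```

A ~120 mm² die would be fairly modest, which is why I'm curious whether anyone has a better ratio to use.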