While some of the largest technology companies in the world are racing to figure out the next generation of machine learning-focused chips, whether for data centers or edge devices, a whole class of startups is racing to get there first.

That includes Cerebras Systems, one of the startups that has raised a significant amount of capital, which is continuing to target next-generation machine learning operations with the hiring of Dhiraj Mallick as its Vice President of Engineering and Business Development. Prior to joining Cerebras, Mallick served as VP of architecture and CTO of Intel’s data center group. That group posted revenue of more than $5.5 billion in the second quarter of this year, up from nearly $4.4 billion in the second quarter of 2017, and more than $10 billion in the first half of the year. Before Intel, Mallick spent time at AMD and SeaMicro.

That latter part is going to be a big part of the puzzle, as Google looks to lock customers into its cloud platform with tools like the Tensor Processing Unit, the third generation of which was announced at Google I/O earlier this year. Data centers can handle much of the heavy lifting when it comes to training the models behind machine learning tasks like image recognition, since they don’t have to worry as much about space constraints (or, to a degree, heat, given that the TPU runs with liquid cooling). Google is betting on that with the TPU, optimizing its hardware for its TensorFlow machine learning framework and trying to build a whole…
