Facebook looking at hardware

Since I now have some idea what ASICs and FPGAs actually are after my Nvidia deep dive, this is interesting:


You don’t need to read it; the title pretty much says everything.

Ramifications for Nvidia? I have a feeling that the general-purpose GPU won’t be the chip of the AI future. It makes little sense that it would be, so I can see a lot of challenges coming up for Nvidia as those more specific chips become more ubiquitous. Not to say there won’t be a lot of opportunity for Nvidia.

As an aside, chip design by AI must be a thing right?



Good find. Interesting that Facebook will be testing the waters for this. My first thought, which the article also mentions, would be an ASIC for the Caffe2 deep learning framework that Facebook developed and runs. They could also be looking at smaller IoT-type devices, where there are not a lot of good options.

They are an NVDA customer. I don’t know; I think more versatile chips, by definition, should find a home in more uses than a more tailor-made, application-specific chip. That’s not to say there won’t be many an AI product that will run on more specific chips. But if you buy into the thesis that the AI market will be huge, there will be a huge chunk of that market that would not want or need to develop a specialized chip.

“As an aside, chip design by AI must be a thing right?”

A week or so ago I posted on exactly this, about how Nvidia does it (innovating so rapidly in software and chip design).


From Nvidia’s mouth

“We’re also training neural networks to understand chipset design and very-large-scale-integration, so our engineers can work more quickly and efficiently.”

“Yes, we’re using GPUs to help us design GPUs.”

Interestingly, Facebook built an exact replica of NVDA’s SATURN V in their data center. It is tied with NVDA’s at #31 fastest supercomputer in the world (and #1 in efficiency, flops/W) according to the Top500 2017 list. Exact same build and stats.


So maybe it will help Facebook’s new chipologist make a new chip? How’s the Oculus Rift thing doing?



How’s the Oculus Rift thing doing?

I noted a display for it at Best Buy this past Sunday. I think there may have been a sign saying something about an in-store demonstration.

On the same trip, I noted a number of empty slots where NVDA GPUs are supposed to be sold (implying that they were sold out). Here are a few pictures of that observation.


As an aside, chip design by AI must be a thing right?

There are several types of AI. The most popular and best examples today are supervised learning, where you train a deep neural network with thousands to millions of examples of something and the network learns how to predict the correct answer…such as in computer vision, natural language processing, and voice recognition. This won’t work for chip design, since we don’t have millions of examples of what we want designed.
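Just to make the "learn from labeled examples" idea concrete (a toy sketch, nothing like a real deep network): a single perceptron can learn the logical AND function from a handful of input/answer pairs. The function names and numbers here are all made up for illustration.

```python
# Toy supervised learning: a perceptron learns AND from labeled examples
# (inputs paired with the correct answer, i.e. the supervision signal).
def train_perceptron(examples, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in examples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred  # correct answer minus prediction
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# Labeled training data: the logical AND function.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
```

Real systems scale this same loop up to millions of parameters and examples, which is exactly why the approach stalls without a big pile of labeled examples.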

As for unsupervised learning…this is where the AI learns on its own…for example, it might look at many images and partition them into groups…that it doesn’t know the names of.
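The "partition into groups it can’t name" part is essentially clustering. A minimal sketch (toy 1-D k-means with made-up numbers, not what anyone would run in production): the algorithm splits the values into two groups without ever being told what the groups mean.

```python
# Toy unsupervised learning: 1-D k-means partitions unlabeled values
# into k groups; it never learns what the groups "are", only that
# they're different.
def kmeans_1d(values, k=2, iterations=10):
    centers = list(values[:k])  # crude initialization: first k values
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for v in values:
            # assign each value to its nearest center
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        # move each center to the mean of its cluster
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return clusters

groups = kmeans_1d([1.0, 1.2, 0.9, 8.0, 8.3, 7.9])
```

The two clusters come out as the "small" values and the "large" values, but the labels "small" and "large" are ours, not the algorithm’s.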

In reinforcement learning (like learning to play a game of chess or Go) you give the rules of the game (or task) and the metric you want to maximize or minimize, and the machine iterates over and over. The problem with having a machine do this for chip design is coming up with a concise, machine-understandable list of rules, requirements, and guidelines (some of them are flexible and some aren’t), and there are multiple metrics. And you have to be able to program it and build it with economical yields. And maybe be at least a little bit compatible with previous products. Or maybe that doesn’t matter, because the machine will write all the drivers, middleware, compilers, tools, applications, etc.

My opinion is that AI will first be used incrementally, in small portions of the process and on small chips, before anyone just says “Alexa, design a replacement server chip for yourself.”

