Wednesday, May 28, 2014

Scalability Problem of the Day: Neural Networks

Deep Learning is the next frontier in computer science. After some initial breakthroughs, scientists and engineers are running into a major scalability problem: simply increasing the number of neurons doesn't automatically improve a neural network's performance.

(MIT Technology Review, 5/21/14) We found that if you put a lot of GPUs [specialized graphics processors] together we could make a much bigger neural network—10 billion nodes, with 16 machines instead of 1,000.

We used that same benchmark [images from YouTube videos] that the Google team did. But even though we could train a much larger neural net, we didn’t necessarily get a better cat detector. Right now we can run neural networks that are larger than we know what to do with.
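A rough back-of-envelope sketch of the scale involved (the layer widths below are illustrative assumptions; the quote does not describe the actual architecture): in a fully connected layer, the number of weights grows with the product of the layer widths, so a network's capacity explodes much faster than its unit count — which is one way to end up with "neural networks that are larger than we know what to do with."

```python
# Back-of-envelope sketch: how a fully connected layer's weight count
# scales with layer width. The "10 billion" figure in the quote refers
# to network size; the layer widths here are illustrative assumptions,
# not the architecture from the article.

def dense_connections(n_in, n_out):
    """Number of weights in a fully connected layer (bias terms ignored)."""
    return n_in * n_out

# A single dense layer between two 100,000-unit layers already holds
# 10 billion weights:
print(dense_connections(100_000, 100_000))  # 10000000000

# Doubling both widths quadruples the weight count -- capacity grows
# quadratically while the unit count only doubles:
print(dense_connections(200_000, 200_000))  # 40000000000
```

The point of the arithmetic: adding hardware makes more parameters cheap to train, but nothing in the parameter count itself guarantees a better cat detector.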

This is a typical situation in Silicon Valley (as I described in an earlier post). We are at a point where Machine 1 (exponential growth in computing power) is ahead of Machine 2 (applications). Most likely, the next S-curve, i.e., a new growth cycle, will begin within the next 5–7 years.

tags: problem, scalability, constraint, machine1, machine2, silicon valley
