NVIDIA’s annual developer conference, the GPU Technology Conference, kicks off next week in San Jose, and even if you can’t be there in person you can watch the keynotes and other parts of the conference online. Many attendees are repeat visitors and have a good idea of what to expect. First and foremost, this is our developer conference, so expect it to be jam-packed with technical sessions. With over 400 sessions, you will probably want to use the online session filter tool to find the sessions you want to attend rather than reading through all 400+ session descriptions. Besides the keynotes, of course, some of the sessions I’m most looking forward to are those presented by our customers. While we have some great NVIDIA speakers, many of the most highly rated repeat speakers come from our customer community. Having just returned from three weeks of travel meeting with many of our customers, I expect the following trends to be highlighted during the conference.
Over the last several quarters we have run hundreds of customer trials of our virtual GPU (vGPU) technology. vGPU has nothing to do with virtual currencies (I haven’t spotted any sessions on those, although I definitely expect to hear some buzz about them during the show); it is our technology for bringing the full benefit of NVIDIA hardware-accelerated graphics to virtualized desktop solutions. vGPU moves GPUs and users’ desktops straight into the data center, be it an enterprise data center or a public cloud data center, with all of the advantages of virtualization that enterprises and public clouds have learned to love. In addition to hearing traditional commercial virtualization vendors like Citrix and VMware talk about our GPU solutions, I expect quite a bit of discussion about using GPUs with open source solutions like OpenStack.
But vGPU is by no means the only area where NVIDIA technology usage in the data center has grown substantially since the last GTC a year ago. We have seen an explosion in the use of GPUs for machine learning and pattern recognition, much of it in Internet data centers. A great example is this Netflix blog post discussing their experience using GPU instances on Amazon Web Services to run distributed neural networks for their recommendation engine. Traditional commercial companies aren’t missing out on this trend either. With all the international travel I have been doing, I especially enjoy using the credit card from the provider that I know is using NVIDIA GPUs to identify potentially fraudulent transactions.
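The fraud-detection use case above comes down to pattern recognition: scoring each transaction against a learned model. As a purely illustrative sketch — the feature names, weights, and `fraud_score` function here are hypothetical, and real systems run far larger neural networks on GPU clusters — a minimal transaction scorer might look like:

```python
import numpy as np

# Illustrative only: a toy logistic-regression fraud scorer.
# The features and weights are made up for this sketch; production
# systems train much larger models, often on GPUs.
def fraud_score(features, weights, bias):
    """Return the modeled probability that a transaction is fraudulent."""
    z = np.dot(features, weights) + bias
    return 1.0 / (1.0 + np.exp(-z))  # logistic sigmoid

# Hypothetical features: [amount z-score, foreign-country flag, time-of-day risk]
weights = np.array([1.2, 0.8, 0.5])
bias = -2.0

domestic_small = np.array([0.1, 0.0, 0.2])  # routine local purchase
foreign_large = np.array([3.0, 1.0, 0.9])   # large, unusual foreign charge

print(fraud_score(domestic_small, weights, bias))
print(fraud_score(foreign_large, weights, bias))
```

The unusual foreign transaction scores far higher than the routine one, which is the basic signal a card issuer acts on; the GPU angle is that evaluating (and especially training) such models over millions of transactions is a highly parallel workload.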
NVIDIA’s new Maxwell GPU has received a lot of press over the last several weeks for the energy efficiency it brings to high-performance gaming laptops. Energy efficiency is a hot topic everywhere, from laptops to electric vehicle control systems to multi-megawatt supercomputer centers, and expect to hear a lot more about what NVIDIA and our customers are doing to drive it.
And for my last trend to look for, think ARM. With many new ARM-64 processors already announced to ship this year, wherever I go in the world, everyone from small businesses to large governments asks me about using GPUs with ARM. There is so much innovation in this space right now that it is hard to bet against the success of ARM and the combined talents of all the companies working in the ARM ecosystem.
It promises to be an exciting week!