According to Microsoft, Brainwave has delivered performance increases of up to ten times for Bing AI, enabling real-time performance. Alongside the latency reductions, it has allowed a tenfold increase in model size. Beyond the platform itself, the performance gains are thanks to Intel's FPGA chips. The same type of chip has been used in Bing for some time as part of Microsoft's Catapult v2 server design, and they come with AI-focused advantages.

In a demo of Catapult v2 at Ignite 2016, Microsoft revealed what the chips can do at scale. Using the full power of the company's supercomputer, Doug Burger translated three billion words in a tenth of a second.

New Bing Features

Essentially, FPGAs are more flexible than their predecessors and offer higher power efficiency. This makes them ideal for tasks like natural language processing and machine learning. For Bing, it means the ability to analyze billions of documents across the web in a fraction of a second, while also delivering intelligent answers and better search results. These new capabilities are likely a driving force behind some of Bing’s new features, which Microsoft announced yesterday. The search engine will deliver upgraded intelligent answers by combining relevant facts from multiple sources. You can then hover over uncommon words for a definition, and search within an image to find similar elements across the web. Though Google still rules the search market, it’s clear Bing is working hard to catch up, and the added speed and features will do a lot to bridge the gap.
