It started with a simple question: “What are the different types of lighting for the kitchen?” On its face it’s a menial question. What matters is how the question is heard, understood, and handled, and that’s where the next-gen Bing app comes in.
The next-gen Bing app had to understand the question as spoken in natural language, rather than phrased like a typical web search. There were more examples along these lines, but nothing especially surprising.
The next step is obvious: Bing has to find the relevant information. What stood out was how dynamically it presented that information. It pulled together visual data such as images, generated a unique page of the key information, and then spoke a clear answer about that data. Much like Siri would, but it seemed more fluid and accurate, to me at least.
Clicking one of the images, the presenters could then select any part of it. See a lamp you like in the picture? Crop it, and Bing will find more just like it. It will also surface related information, such as items for sale, colour choices, and other contextual details.
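Under the hood, that “crop and find more like it” trick is typically done by turning each image into a numeric embedding and ranking a catalogue by similarity to the cropped region. The sketch below is a minimal illustration of that idea, not Microsoft’s or Nvidia’s actual pipeline: the `embed` function here is a hypothetical stand-in (flatten and normalize), where a real system would use a trained deep-learning encoder.

```python
import numpy as np

def embed(image: np.ndarray) -> np.ndarray:
    # Stand-in for a deep-learning image encoder: flatten the pixels and
    # L2-normalize. A real visual-search system would use a trained CNN.
    v = image.astype(float).ravel()
    return v / (np.linalg.norm(v) + 1e-12)

def find_similar(query: np.ndarray, catalogue: list, top_k: int = 3) -> list:
    # Rank catalogue images by cosine similarity to the cropped query region.
    # Since embeddings are unit-length, the dot product is cosine similarity.
    q = embed(query)
    scores = [float(q @ embed(img)) for img in catalogue]
    return sorted(range(len(catalogue)), key=lambda i: -scores[i])[:top_k]

# Hypothetical usage: the crop of the lamp should match the lamp in the catalogue.
rng = np.random.default_rng(0)
lamp = rng.random((8, 8))
catalogue = [rng.random((8, 8)), lamp, rng.random((8, 8))]
best = find_similar(lamp, catalogue, top_k=1)[0]
```

At web scale this brute-force ranking would be replaced by an approximate nearest-neighbour index, but the similarity idea is the same.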
It seems simple enough, but that’s the point: a fluid user experience, with hardcore Nvidia tech using deep learning and AI to make it easy for you. What’s more, you can demo this new search right now HERE.
Get more information on GTC 2019 here.