NOV 22, 2024

Google’s search is going visual in a big way

Google has just held its annual I/O developer conference, where it lets the folks who put the hard programming work into building apps and services on Google frameworks get together to learn what’s new.

At I/O 2019, Google released new hardware such as the much more affordable Google Pixel 3a, which is available in Australia now. It also launched the camera-equipped Nest Hub Max smart display, which we won’t see for a few months.

Hardware isn’t really Google’s game, though; search is. There’s little doubt that if Google lost search supremacy to Microsoft’s Bing or the DuckDuckGo engine, its business model would take a serious hit.

Text-based search may feel like a “solved problem” for Google, but it’s actually something the company constantly refines, not least to keep ahead of folks who try to “game” its algorithm for higher search visibility.

The business of ranking is only half the problem, however. There’s also how you present that information to a worldwide audience. At its I/O conference, Google announced smart new ways you’ll soon be able to see and experience search results.

The headline feature for search is the inclusion of Augmented Reality (AR) in search results. Google showed this off on stage with an Android phone searching for Great White Sharks.

You could go surfing off the coastline of Australia to try to get up close and personal with a Great White. That’s a risky proposition without a supply of oxygen, a shark cage and, frankly, nerves of steel.

The way Google’s AR search would have it, you’d end up with a virtual Great White projected via your phone’s display instead: a good way to get a visual appreciation of a shark, with none of the risk.

Google’s rolling out the new AR search over the coming months. If you’re on a compatible device, you should start to see AR results as an option, depending, of course, on whether anyone’s built an AR model of what you’re searching for.
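
For the developer types in the audience, AR results like these are handed off to Scene Viewer, the AR viewer that ships on ARCore-capable Android phones. Here’s a rough Kotlin sketch (not Google’s actual search code, and the model URL is a stand-in) of how any app can pass a 3D model to Scene Viewer:

```kotlin
import android.app.Activity
import android.content.Intent
import android.net.Uri

// Launch Google's Scene Viewer to show a 3D model in AR.
// The .glb URL is a placeholder; a real app would point at an actual model file.
fun launchSceneViewer(activity: Activity) {
    val modelUrl = "https://example.com/models/great-white-shark.glb" // hypothetical
    val sceneViewerUri = Uri.parse("https://arvr.google.com/scene-viewer/1.0")
        .buildUpon()
        .appendQueryParameter("file", modelUrl)
        .appendQueryParameter("mode", "ar_only")
        .build()

    val intent = Intent(Intent.ACTION_VIEW).apply {
        data = sceneViewerUri
        // Scene Viewer lives inside the Google app on supported devices.
        setPackage("com.google.android.googlequicksearchbox")
    }
    activity.startActivity(intent)
}
```

The mode=ar_only parameter asks Scene Viewer to skip the 3D preview and drop the model straight into your surroundings via the camera.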

Google is also making some significant visual changes to its Google Lens search system.

Google Lens uses your phone’s camera and some pretty smart machine learning to give you contextual information about whatever it’s looking at.

Point it at the Sydney Opera House and it can tell you plenty about this iconic Australian landmark. Point it at the MCG, and you might just get the cricket scores.
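
Lens itself is Google’s secret sauce, but you can get a taste of the underlying idea with Google’s ML Kit library, which does on-device image labelling. A minimal Kotlin sketch, assuming the ML Kit image-labelling dependency is in your app:

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.label.ImageLabeling
import com.google.mlkit.vision.label.defaults.ImageLabelerOptions

// Run on-device image labelling over a camera frame and print
// what the model thinks it's looking at, Lens-style.
fun labelFrame(bitmap: Bitmap) {
    val image = InputImage.fromBitmap(bitmap, /* rotationDegrees = */ 0)
    val labeler = ImageLabeling.getClient(ImageLabelerOptions.DEFAULT_OPTIONS)

    labeler.process(image)
        .addOnSuccessListener { labels ->
            // Each label pairs a description ("Stadium", "Landmark", ...)
            // with a confidence score between 0 and 1.
            labels.forEach { label ->
                println("${label.text}: ${"%.2f".format(label.confidence)}")
            }
        }
        .addOnFailureListener { e -> println("Labelling failed: $e") }
}
```

ML Kit’s stock model is far more generic than Lens (it’ll say “stadium” rather than “MCG”), but the camera-in, labels-out flow is the same shape.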

Google’s enhancing Lens by adding features like restaurant menu scanning, so you can see images of popular dishes on any menu that Google recognises.

That works when (and if) people have saved photos of those dishes to Google’s services, because Lens can then link the geographic data (and in some cases the text that accompanies it) to the location you’re in. Doing that matching in real time is rather neat.
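
To sketch how the client side of that might look, you could OCR the menu with ML Kit’s text recogniser and match dish names against a photo index. Everything below, including the dish-to-photo map, is hypothetical; in Google’s real pipeline the matching happens server-side:

```kotlin
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

// Hypothetical photo index: dish name -> image URL. Google's real
// system matches against user-contributed, geotagged photos instead.
val dishPhotos = mapOf(
    "pad thai" to "https://example.com/photos/pad-thai.jpg",
    "green curry" to "https://example.com/photos/green-curry.jpg"
)

// OCR the menu image, then surface a photo for each recognised dish name.
fun scanMenu(image: InputImage, onMatch: (dish: String, photoUrl: String) -> Unit) {
    val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)
    recognizer.process(image)
        .addOnSuccessListener { result ->
            result.textBlocks
                .flatMap { it.lines }   // roughly one line per menu item
                .forEach { line ->
                    val dish = line.text.trim().lowercase()
                    dishPhotos[dish]?.let { url -> onMatch(dish, url) }
                }
        }
}
```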

With an eye on the developing world, Google’s also bringing Google Lens to low-cost Android Go phones. Australia isn’t exactly a developing market, but you can pick up a number of Android Go handsets here at low prices, and pretty soon you’ll be able to use Lens on them the same way as on a full Android phone. What’s more, Google Lens will be able to read the text on signs out loud.

That’s quite a useful feature in countries where literacy rates are low, but one that could also be handy for folks here in Australia, whether for those learning English or for when you’re faced with a sign in another language and you’re simply curious.
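
The read-aloud part is something any Android developer can approximate with the platform’s built-in TextToSpeech engine. A minimal sketch, assuming the sign’s text has already been recognised (say, with an OCR call like the one above):

```kotlin
import android.content.Context
import android.speech.tts.TextToSpeech
import java.util.Locale

// Speak recognised sign text aloud with Android's built-in TTS engine.
// Initialisation is asynchronous, so speech is queued only once the
// engine reports it's ready.
class SignReader(context: Context) : TextToSpeech.OnInitListener {
    private val tts = TextToSpeech(context, this)
    private var ready = false

    override fun onInit(status: Int) {
        if (status == TextToSpeech.SUCCESS) {
            tts.setLanguage(Locale.ENGLISH)
            ready = true
        }
    }

    fun speak(recognisedText: String) {
        if (ready) tts.speak(recognisedText, TextToSpeech.QUEUE_FLUSH, null, "sign")
    }

    fun shutdown() = tts.shutdown()
}
```

Lens layers translation on top of this for foreign-language signs, but the speech side really is that simple.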

Alex Kidman
A multi-award-winning journalist, Alex has written about consumer technology for over 20 years. He has written and edited for virtually every Australian tech publication, including Gizmodo, CNET, PC Magazine, Kotaku and more.