Google has announced a redesign of its search tool that makes results more visual and adds contextual information about them.
At its Search On event, the web giant also announced new features for Google Chrome and for Google Lens, its AI-powered image-recognition software.
The main aesthetic change is visually browsable results for “searches where you need inspiration”, such as painting ideas, says Google: it will present a series of pictures at the top of the search results, without the user needing to navigate to the Images tab.
In the coming months it will also bring more relevant information, including a new ‘things to know’ section covering “different dimensions that people commonly search for”.
For example, for those looking to paint with acrylics, below the top result there will be a series of drop-down results that include a step-by-step guide, tips or style options. Google will also have “Refine this search” and “Expand this search” options, so that users can quickly move between different levels of relevant information.
Google is adding more context to search in other ways by expanding its ‘About this result’ panel, which can be accessed via the three-dot icon to the right of search results on desktop or mobile.
Currently it shows details of a source – e.g. “Granthshala” or “from Wikipedia” – but Google will now allow sites to describe themselves “in [their] own words”. Users will also be able to see what others have said about a website, including “news, reviews and other helpful background references”, which the company says will help users better evaluate sources.
Whether this turns out to be true remains to be seen; Twitter introduced a similar system called Birdwatch, which lets users annotate misleading tweets, but many users simply voice opinions or mark baseless claims – such as voter fraud in the 2020 US election – as “not misleading”. Granthshala has contacted Google for more information on how such claims will be investigated.
As well as search, Google’s updates to Lens will let users search for information based on the content of a photo – for example, taking a picture of a pattern and asking Google for “socks with this pattern”, or taking a picture of a broken bike chain and asking Google “how to fix it”. Google’s machine learning will now recognize the content of the image and search accordingly, using a technology it calls the multitask unified model (MUM), which can better understand context and comparisons.
MUM will also be used for video search results, identifying topics that may be related to a video’s content even if they are not explicitly mentioned.
For iPhone and Android users, the Lens update will be available in the Google app, making all images on a page searchable through Lens. Google has a clear advantage with its own Android operating system, but told Granthshala that the Google app is opened three billion times per month; for context, there are a billion iPhone users, who open multiple apps multiple times per day.
A new Lens update is also coming to Chrome on desktop in the coming months, where users will be able to select image, video and text content on a website with Lens to quickly view search results in the same tab, without leaving the page.
Finally, Google’s Shopping tab is getting more information about items in local stores, along with an “In stock” button to filter the results. It is launching today in the UK, US, Australia, Austria, Brazil, Canada, Denmark, France, Germany, Japan, the Netherlands, New Zealand, Norway, Sweden and Switzerland.
Credit: www.independent.co.uk