Google Doubles Down On AI Overviews


Google’s Head of Search, Liz Reid, wrote a blog post last night titled “AI Overviews: About last week.” In short, she said that overall, the vast majority of AI Overviews are really good, but Google did find examples where it can make improvements. Either way, AI Overviews are here to stay and Google will continue to show them in Google Search.

As you may remember, Google launched AI Overviews a couple of weeks ago in the US. Then, over time, many started to see and share weird and embarrassing (and sometimes harmful) examples of AI Overviews, which led to Google updating its help documentation and Google’s CEO going on the defensive.

She said, “We found a content policy violation on less than one in every 7 million unique queries on which AI Overviews appeared.” She added, “We’ll keep improving when and how we show AI Overviews and strengthening our protections, including for edge cases, and we’re very grateful for the ongoing feedback.”

Some are calling the improvements made to these AI Overviews the Ray Update. Mike King suggested the name on X, saying, “I’m gonna name the first algorithm update of the AIO era. We’re gonna call this one the ‘Ray Filter’ or the ‘Ray update’ named after Lily Ray.” Lily was instrumental in pushing Google to work harder on these AI Overviews by sharing countless examples of where they went wrong.

Here are some bullet points on what Liz Reid said; I go a bit deeper on Search Engine Land and there is more coverage on Techmeme. I should note, this is what she said, not what I am saying:

  • Searchers like the AI Overviews and are engaging more with them and with the publishers referenced in them
  • AI Overviews work very differently than chatbots and other LLM products
  • AI Overviews are integrated into core search and only show information that is backed up by top web results
  • AI Overviews generally don’t “hallucinate” or make things up in the ways that other LLM products might
  • When AI Overviews get it wrong, it’s usually for other reasons: misinterpreting queries, misinterpreting a nuance of language on the web, or not having a lot of great information available.
  • AI Overviews’ accuracy rate is on par with that of featured snippets
  • Google said they’ve “seen nonsensical new searches, seemingly aimed at producing erroneous results”
  • There have been a large number of faked screenshots shared widely
  • “But some odd, inaccurate or unhelpful AI Overviews certainly did show up,” Google admitted
  • There can be “data voids” and “information gaps” where Google might cite pages it should not, like satire content (as in the case of “How many rocks should I eat?”)
  • In some cases Google said the AI Overviews misinterpret language on webpages and present inaccurate information

So now what will Google do to improve AI Overviews?

  • Google won’t individually fix each AI Overview that goes bad; instead, it updates its models to address what went wrong so the fix works for other queries too
  • Google built better detection mechanisms for nonsensical queries that shouldn’t show an AI Overview and limited the inclusion of satire and humor content.
  • Google updated its systems to limit the use of user-generated content in responses that could offer misleading advice.
  • Google added triggering restrictions for queries where AI Overviews were not proving to be as helpful.
  • For topics like news and health, Google said it already has strong guardrails in place. For example, Google said it aims not to show AI Overviews for hard news topics, where freshness and factuality are important.
  • In the case of health, Google said it launched additional triggering refinements to enhance its quality protections.

Forum discussion at WebmasterWorld.