While we’ll have to wait and see the full impact, Bing’s move sets the stage for a new chapter in search. Microsoft, for its part, says “… our product is built on the foundation of providing the best results, and we will not compromise on quality for speed.”
Comparison with Google’s AI Overviews
BGS leverages the underlying LLMs’ training data as well as Bing’s own search index. The system accesses additional documents to provide more accurate and up-to-date information and to supplement or verify information in the LLMs’ training data. If you’ve been following research into Google AIOs, this should sound familiar. On average, BGS summaries linked to 6.6 URLs, ranging from 4 on the low end to 12 on the high end. To view the original source information, use the ‘Sources’ links.
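As a rough illustration of how a figure like that is derived, here is a minimal sketch that computes the average, minimum, and maximum number of linked URLs per summary; the counts are placeholders, not the observations behind the 6.6 average.

```python
# Hypothetical per-summary counts of linked URLs (placeholder values only).
link_counts = [4, 5, 6, 6, 7, 8, 12]

average = sum(link_counts) / len(link_counts)
print(f"average linked URLs per summary: {average:.1f}")
print(f"range: {min(link_counts)} to {max(link_counts)}")
```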
- An analysis of the SERP positions for the 18 linked URLs reveals a clear preference for top-ranked URLs, especially those in the top 2 positions (a small tallying sketch follows this list).
- Microsoft says it continues to evaluate the impact that AI in search is having on websites in terms of direct traffic and readership.
- For specific queries, the answer summaries are shifted down the page to make room for ads at the top.
- This has a significant impact on the visibility of organic search results.
- Bing will soon prioritize AI-generated answers alongside traditional search results.
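To make the position analysis concrete, here is a minimal sketch, over hypothetical rank data, that tallies how many cited URLs sit in the top 2, the top 10, or outside the top 20 for their query; the positions are illustrative, and None stands for an unranked URL.

```python
# Hypothetical SERP positions of cited URLs (None = not found in the top 20).
cited_positions = [1, 1, 2, 2, 2, 3, 5, 7, 9, 14, None, None]

top_2 = sum(1 for p in cited_positions if p is not None and p <= 2)
top_10 = sum(1 for p in cited_positions if p is not None and p <= 10)
outside_top_20 = sum(1 for p in cited_positions if p is None or p > 20)

print(f"top 2: {top_2}, top 10: {top_10}, outside top 20: {outside_top_20}")
```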
Web Page Result Mapping
Bing has provided several examples of its generative search experience, which offer some first insights into the linked sources. Several studies from Advanced Web Ranking, SE Ranking, and Authoritas found that AIO summaries included a significant number of links that ranked low, or not at all, for the query. The fact that the summaries can include documents that don’t rank for the query provides some important clues as to how the system might work: it seems that the system either seeks out documents responsive to related queries or documents with text that verifies the statements in the summary. Moreover, Bing has taken a much stronger stance than Google in providing visibility to organic results and transparency in sourcing summary content.
This change follows the earlier addition of AI search summaries through Microsoft’s Copilot technology. Copilot’s current AI in search is much cleaner and shows the sites from which the information came directly under the result, so overall it’s hard to see why this is better than a regular search engine with AI at the top of every search result. In the official API, related searches are returned in the relatedSearches.value key (an array of related searches); our API returns the equivalent in the top-level related_searches key (also an array of related searches).
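A minimal sketch of that mapping, assuming a raw response shaped like the official relatedSearches.value structure; the map_related_searches helper and the trimmed-down payload are illustrative, not part of either API.

```python
def map_related_searches(raw_response: dict) -> dict:
    """Map the official relatedSearches.value array to a top-level related_searches key."""
    related = raw_response.get("relatedSearches", {}).get("value", [])
    return {"related_searches": related}


# Example with a hypothetical, trimmed-down response payload.
raw = {"relatedSearches": {"value": [{"text": "how long do elephants live"}]}}
print(map_related_searches(raw))
# {'related_searches': [{'text': 'how long do elephants live'}]}
```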
Both BGS and Google’s AIOs aim to provide a summarized response to search queries, but they differ significantly in design and functionality. BGS adopts a more elaborate layout that takes up more real estate both horizontally and vertically on the search results page, offering detailed information about the query. Organic search results are positioned to the right of the answer summary, ensuring they remain visible and relevant. Over the past few months, we have seen a mixed response to Google’s AI Overviews, with summaries that range from inaccurate and false to downright disturbing and dangerous.
Links in Bing Generative Search Result and Search Position
- I explore this in an earlier article highlighting how AI Overviews explore related queries for documents to provide unique information.
- To see the links, you have to select the arrow under each section.
For instance, the press release features a representative search example for “How long can elephants live,” which showcases a layout markedly different from that of Google’s AI Overview (AIO). Currently, BGS is still in the preliminary stages of its initial rollout and activates for only a select few queries.
The warning states, “Generative AI is experimental.” For health- and finance-related queries, the summary directs the searcher to seek out professional advice. Both BGS and AIO summaries provide links to URLs cited as sources for the summaries. In AIOs, links may be provided in a carousel underneath the full summary or individual sections. Below each section of the summary, Bing also provides the sourced documents. Chrome and Safari show organic results automatically; in Microsoft Edge, you have to select “See more” on the right.
The next step in Bing generative search
As users ask more complex questions, search engines need to better understand them and deliver relevant results quickly. Bing aims to do that using smaller language models and advanced optimization techniques, and the update aims to improve performance and reduce costs in search result delivery. “At Bing, we are always pushing the boundaries of search technology. Leveraging both Large Language Models (LLMs) and Small Language Models (SLMs) marks a significant milestone in enhancing our search capabilities.” “While transformer models have served us well, the growing complexity of search queries necessitated more powerful models.” “To improve efficiency, we trained SLM models (~100x throughput improvement over LLM), which process and understand search queries more precisely.”
Using LLMs in search systems can create problems with speed and cost. “This is where TensorRT-LLM comes into play, reducing model inference time and, consequently, the end-to-end experience latency without sacrificing result quality.” TensorRT-LLM is a tool that helps reduce the time and cost of running large models on NVIDIA GPUs. With TensorRT-LLM, latency was reduced to 3.03 seconds per batch, and throughput increased to 6.6 queries per second per instance. Bing’s switch to LLM/SLM models and TensorRT optimization could impact the future of search.
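As a quick sanity check on those figures, here is a minimal sketch that back-computes the implied batch size from the reported latency and throughput; the relationship batch_size ≈ latency × throughput is an assumption about how the two numbers relate, not something Microsoft states.

```python
# Reported figures for the TensorRT-LLM setup described above.
latency_s_per_batch = 3.03  # seconds per batch
throughput_qps = 6.6        # queries per second per instance

# Assumption: throughput * latency approximates the number of queries per batch.
implied_batch_size = throughput_qps * latency_s_per_batch
print(f"implied queries per batch: {implied_batch_size:.0f}")  # roughly 20
```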
The BGS experience provides annotations for each section of AI-generated content. As you hover over each annotation and section of text, a box opens providing direct links to the source verifying the statement. If you select the arrow next to ‘Learn more,’ the additional documents will display. Additionally, Source Citations are prominently displayed throughout the AI-generated experience, providing greater transparency and easy verification of the information presented. The page also features a Document Index that makes it easy to quickly navigate to Related Sections, enhancing the user experience by making information more accessible.
In the 18 examples observed, it appears that Bing’s Generative Search doesn’t just pull information from the top-ranking documents related to a query but also includes lower-ranking and unranked documents. This can be good news for website owners if the query is competitive. For the query “what is a spaghetti western”, there were 6 URLs cited, and all of them are in the top 10 results for that query. Overall, 28.3% of URLs were found not to rank in the top 20 results for the query.
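To show how a share like that can be computed, here is a minimal sketch over a hypothetical set of (query, cited URL, SERP position) observations; the data is illustrative, not the measurements behind the 28.3% figure.

```python
# Hypothetical observations: (query, cited_url, serp_position); None = not in the top 20.
observations = [
    ("what is a spaghetti western", "https://example.com/a", 1),
    ("what is a spaghetti western", "https://example.com/b", 4),
    ("how long can elephants live", "https://example.org/c", None),
    ("how long can elephants live", "https://example.org/d", 12),
]

outside_top_20 = sum(1 for _, _, pos in observations if pos is None or pos > 20)
share = outside_top_20 / len(observations)
print(f"cited URLs outside the top 20: {share:.1%}")
```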