
Fetched yesterday — March 30, 2026

Google Search Console Performance Report Impressions Spiking???

March 30, 2026 at 14:51

Analytics Google

Google Search Console's performance report may have some sort of bug or tracking issue. When you set certain filters, a huge spike in impressions shows up in the report.

This may be similar to the &num=100 spike we saw last year.

Brodie Clark was the first to post about this, writing on X, "there is something bizarre going on with Google Search Console data right now." He said it happens when he filters by merchant listings: "using the 'merchant listings' search appearance filter, which has historically been independent of the influence of rank trackers (because of an impression being recorded only after a grid result is selected), is now a mess," he wrote.

Here is his screenshot:

Google Search Console Performance Report Impressions Spiking

Brodie added that this is across several large-scale eCommerce sites, "CTR data is no longer accurate for desktop, with there now being many queries appearing that are clearly related to tools, with significant increases in impressions from this past week in particular."

I am not sure how widespread this is, or whether it is limited to just the merchant listing filter...

Here is his post:

Heads-up: there is something bizarre going on with Google Search Console data right now.

Similar to the changes that came to light after the disabling of &num=100, impressions are again skyrocketing for specific surfaces on desktop.

For example, using the 'merchant listings'... pic.twitter.com/1uYaHiW7rh

— Brodie Clark (@brodieseo) March 30, 2026

Forum discussion at X.

Google Chimes In As Teen Turns To SEO To Save Family's Spain Vacation Rental

March 30, 2026 at 14:41

Spain Vacation Home

Google's John Mueller chimed in on Reddit when a teenager from a family that runs a vacation rental business said he was trying to save the business through SEO after being burned by previous SEOs. In short, it seems the family has a vacation rental in Spain, and business is significantly down due to the loss of Google traffic.

The teenager asked a number of very specific SEO questions and added, "I am really lost and dont know how to proceed, and I CANT AFFORD to FAIL HERE."

The problem is, those specific SEO questions most likely won't make any difference. John Mueller from Google chimed in and wrote:

SEO is not magic:

Fundamentally, I think you need to keep in mind that any website with magical SEO won't necessarily rank highly in search results quickly, or necessarily drive clients to a business. If there were such a thing as making a website with perfect SEO that drives all the clients to one business, everyone here would be retired and living in ... idk, Spain :).

Competition is rough:

The online market for vacation rentals is hard, there's very strong competition from large aggregators, not just in terms of ranking, but also in terms of brand recognition. This is not to say that you can't get 40-50% more traffic from search (it really depends on the situation of the site), but this is (usually!) not a matter of just putting some meta-tags on a SEO optimized page that comes out of ChatGPT, targeting "house", "surroundings", "family vacation", etc. (Aside, keep in mind that many GenAI-made sites end up with JS frameworks that are traditionally more complex for SEO.)

And his advice:

My recommendation would be to find some more experienced folks who have time & interest in helping you check out the overall situation (what's the real headroom vs what has just changed in today's world? what's a realistic timeline - in the best/worst cases?), and help you to figure out a reasonable plan of attack. I don't think this is something that random reddit comments can solve (unless ... your site is actively blocking search engines, which it doesn't sound like it is). Ultimately, there's no guarantee that doing SEO well can solve this, so IMO it makes sense to go at this in a thoughtful & realistic way -- and perhaps, spend enough time working out alternative approaches.

You can scan through the thread and see John chime in further on some advice around migrating things, which he thinks might be a bad idea.

These posts are always heartbreaking to read...

Forum discussion at Reddit.

Fetched the day before yesterday

Google Adds Properties For Discussion Forum & QA Page Markup & Documents Robots Tags Outside HTML Head

March 25, 2026 at 14:41

Google Code

Google has made a few changes to its SEO developer help documentation. First, it documented how Google Search handles and processes robots meta tags outside the HTML head. Second, it added more supported properties for Discussion Forum and QA Page markup.

(1) Google updated its robots meta tags documentation to add:

Note: Google Search doesn't enforce placement of meta robots in the HTML head and will respect robots meta tags in the body section of an HTML document as well.

Google said "behavior didn't change but was previously undocumented."
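To see what that note means in practice, here is a small illustrative sketch (using only Python's standard library, not anything from Google's implementation) of a check that collects robots meta directives anywhere in a document, not just in the head:

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collect robots meta directives wherever they appear in the document."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", ""))

# A robots meta tag placed in the body, which per the updated
# documentation Google Search will still respect.
page = """
<html><head><title>Example</title></head>
<body>
  <p>Some content</p>
  <meta name="robots" content="noindex, nofollow">
</body></html>
"""

finder = RobotsMetaFinder()
finder.feed(page)
print(finder.directives)  # ['noindex, nofollow']
```

Auditing tools that only scan the `<head>` for robots directives could miss tags like this one, which is why the documentation note matters.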

(2) Google also added new supported properties for Discussion Forum and QA Page markup.

I tried to compare the old and new documents, but the Wayback Machine was down at the time; you can compare them yourself.

Google said it made this update in order to provide "more clarity on comment thread structure to Google ingestion systems. This prevents misinterpretations in our handling of forum and Q&A content."

Forum discussion at X.

Report: Google AI Overviews Show Less Often For Breaking News

March 25, 2026 at 14:31

Robot Newspaper Bench

There is new data out from Newzdash that shows just how often AI Overviews appear within Google across news sections. While AI Overviews show more than 60% of the time for health queries, for breaking news and major headlines they show less than 6% of the time.

This data was shared by John Shehata on LinkedIn and here is the breakdown:

  • Health - 60.51%
  • Technology - 38.23%
  • Business - 36.58%
  • Entertainment - 24.19%
  • Sports - 21.51%
  • World - 16.86%
  • National - 14.40%
  • Breaking News & Major Headlines - 5.38%

Here is that chart:

Newzdash Ai Overviews Data

He broke it down by each channel within each section as well, here are more of the slides:

Forum discussion at LinkedIn.

Google Tests "Skip Digging, Start Guided Research" Driving Users to Web Guide-like Results

March 25, 2026 at 14:25

Skip digging, start guided research

Google first rolled out Web Guide in July of 2025 as a labs experiment, which uses AI to organize the search results. It initially triggered in the Web tab in the Google SERPs and not the default search results. And then Google expanded Web Guide to the All tab in the search results in December of 2025 (for those with the experiment active). It's an interesting SERP and can be helpful in certain situations. That said, I've been using it since the experiment launched and I often find myself clicking the "Classic Search" button to view the standard SERPs...

Well, now Google may be testing calls to action to drive people to a Web Guide-like SERP. @lenraleigh pinged me on X about the test he was seeing across several queries. In the test, Google provided a call to action saying, "Skip the digging, start guided research", which drives you to results that look very similar to Web Guide (organized by topic). I could not see the test across my accounts so it definitely feels like a test.

Google is testing or rolling out a new feature "Skip digging, start guided research" - seeing it on desktop.@rustybrick pic.twitter.com/Ksh3b4QJkm

— Len (@lenraleigh) March 25, 2026

Here are some screenshots based on what Len saw:

Skip digging example

More examples of skip digging

I've been wondering lately what will happen with Web Guide. Again, I often click the "Classic Search" button to exit Web Guide... but for certain queries it does provide helpful results organized by AI.

Stay tuned.

GG

Google Adds Google-Agent User Agent

March 23, 2026 at 14:11

Googlebot Lizzi Image

Google has added a new user-agent to the user-triggered fetchers named Google-Agent. Google said the "Google-Agent user agent is rolling out over the next few weeks, and will be used by Google agents hosted on Google infrastructure to navigate the web and perform actions upon user request."

The Google-Agent user agent is used by agents hosted on Google infrastructure to navigate the web and perform actions upon user request (for example, Project Mariner). It uses IP ranges from user-triggered-agents.json.

Google is also experimenting with the web-bot-auth protocol, using the https://agent.bot.goog identity.
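As a rough sketch of how a site owner might verify this new user agent in server logs, the snippet below combines a UA substring check with a lookup against published IP prefixes. The JSON excerpt and its ranges are made up for illustration only; in practice you would fetch Google's actual user-triggered fetchers IP range file (named user-triggered-agents.json above), which uses this general prefix structure.

```python
import ipaddress
import json

# Hypothetical excerpt mirroring the shape of Google's published IP
# range files; these prefixes are documentation/test ranges, NOT
# Google's real ranges.
SAMPLE_RANGES = json.loads("""
{
  "prefixes": [
    {"ipv4Prefix": "192.0.2.0/24"},
    {"ipv6Prefix": "2001:db8::/32"}
  ]
}
""")

def is_in_published_ranges(ip: str, ranges: dict) -> bool:
    """Check whether a client IP falls inside one of the published prefixes."""
    addr = ipaddress.ip_address(ip)
    for prefix in ranges["prefixes"]:
        cidr = prefix.get("ipv4Prefix") or prefix.get("ipv6Prefix")
        if addr in ipaddress.ip_network(cidr):
            return True
    return False

def looks_like_google_agent(user_agent: str, ip: str, ranges: dict) -> bool:
    """A UA string alone is trivially spoofed; require an IP match too."""
    return "Google-Agent" in user_agent and is_in_published_ranges(ip, ranges)

print(looks_like_google_agent(
    "Mozilla/5.0 (compatible; Google-Agent)", "192.0.2.10", SAMPLE_RANGES))
```

The IP cross-check matters because anyone can put "Google-Agent" in a request header; only requests from the published ranges are plausibly Google's.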

Google Agent

Forum discussion at X.

Image credit to Lizzi

Google Officially Removes "What People Suggest" Health SERP Feature

March 17, 2026 at 16:20

Google removes What People Suggest

In March of 2025 Google announced a new SERP feature that would organize and surface health conversations from online discussions. For example, from Reddit, X, Facebook, and other social media sites. As Google said at the time:

"While people come to Search to find reliable medical information from experts, they also value hearing from others who have similar experiences. That's why we're making it even easier to find this type of information on Search with a new feature labeled 'What People Suggest.' Using AI, we're able to organize different perspectives from online discussions into easy-to-understand themes, helping you quickly grasp what people are saying."

Here is what the feature looked like:
what people suggest health search feature

Well, that didn't last very long. Google confirmed to The Guardian that they have indeed removed the search feature as part of a "broad simplification" of the search results. Health and medical is a sensitive area for Search, and sourcing conversations from across the web via AI can always be challenging. That said, Google did say that the removal had nothing to do with safety or quality of the feature.

Google explained:
'It had nothing to do with the quality or safety of the feature, and we continue to help people find reliable health information from a range of sources, including forums with first-person perspectives that people find incredibly useful.'

Well, this is at least one area where Reddit will be surfaced less frequently... Google's "Check Up" event is being held today and its Chief Health Officer is set to present. Let's see if anything new is announced on the SERP feature front.

GG

Google explains undocumented way to disavow entire TLDs (and why you probably shouldn't)

March 11, 2026 at 14:45

disavow tld google

Google's John Mueller revealed an interesting tip on Bluesky the other day for anyone still disavowing links. John explained that site owners can indeed disavow entire TLDs if they want to. Note, he's not referring to disavowing all links from a site (domain)... he's actually referring to ALL links from an entire TLD (like .xyz, .biz, .info, etc.).

When I commented that the technique wasn't documented in Google's disavow documentation, John explained that 'it's a big hammer' and they probably shouldn't document that technique.
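For reference, the documented disavow file format accepts only full URLs and `domain:` entries, one per line (the domains below are placeholders). Since John's TLD-level trick has no published syntax, it isn't shown here:

```text
# Disavow file: one entry per line; lines starting with # are comments.
# Disavow a single page that links to you:
https://spam.example.com/some-page.html
# Disavow every link from a domain:
domain:shadyseo.example
```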

News on the disavow front. It ends up you CAN disavow an entire TLD (like xyz, biz, etc.) This was never documented by Google but John Mueller explained you can actually do that. He also said that Google probably shouldn't document it since "it's a big hammer"... I agree & think that's crazy to do.


— Glenn Gabe (@glenngabe.bsky.social) March 7, 2026 at 9:26 AM

Note, Christian Kunz actually covered comments from John in 2021 about disavowing at the TLD level based on tweets from John at the time, but that technique still never made its way into the documentation.

disavow tld 2021

disavow tld 2021 don't recommend

I totally agree that disavowing at the TLD level is excessive. Google has explained many times that most site owners never need to touch the disavow tool, let alone disavow entire TLDs. Google's systems are very good at ignoring spammy, junky links. If you don't have a manual action, didn't set up unnatural links, etc., then you never should have to disavow any links. Also, Bing already removed its disavow tool in 2023 and Google could do the same at any time. It's already buried in the GSC interface, and that's by design.

Anyway, it was an interesting note by John since it was never officially documented. But be careful, disavowing at the TLD level is a huge hammer (like John said). Beware.

GG

Google Search Console Crawl Stats Date Toggle/Selector Buggy

March 10, 2026 at 14:11

Google Reports Annotations

Google seems to have a bug with the Crawl Stats report in Google Search Console. The date selector appears to be acting a bit weird, where it doesn't let you select the before or after date.

Gagan Ghotra spotted this and posted a video of the issue on X, saying the "filters in crawl stat reports are buggy too." The date picker doesn't show up after clicking on any of the dropdown filters; you have to click next to the down arrow for the date picker to appear.

Here is his video:

Google Search Console Crawl Stats Date Toggle

filters in crawl stat reports are buggy too
date picker don't show up on clicking one of dropdown filter
have to click next to that arrow down for date picker to show up pic.twitter.com/tqPOhlSMDE

— Gagan Ghotra (@gaganghotra_) March 9, 2026

This feature is super deep inside Search Console. You need to go to settings, then crawl stats and then dig into one of those reports.

I can replicate the bug; I suspect it is an easy fix.

Forum discussion at X.

Google Search Console BigQuery Exports Not Working?

March 9, 2026 at 14:21

Google Schema Code

There are a number of complaints that the Google Search Console bulk data export to BigQuery is not working. The issue seems to have started several days ago, and exports are still failing today.

I spotted this via Valentin Pletzer who posted on X saying, "What's up with the GSC BigQuery exports? I've got and hearing a lot about failed exports for over a week now."

Here is his screenshot of the error:

Google Search Console Bigquery Exports Broken

There are also more complaints of these issues in the Google Webmaster Help Forums.

I suspect Google will address the issue in the near future.

Forum discussion at X.

Google: Most Sites Don't Need To Disavow Links But That's Not All Sites

March 6, 2026 at 15:51

Google Links

Google's John Mueller again spoke about the disavow link file. This time he said that while "most sites don't need it," he added that "that's not all sites." Some sites may indeed need to disavow links.

John said on Bluesky, "If you're conflicted and just want to be sure, it's totally fine to set up & use disavow files. If you notice that the bulk of the problems are from a few TLDs, you can also disavow the whole TLD." This was in response to someone who was unsure if they should disavow.

The SEO, Jacques Bouchard, asked, "I know you're not a fan of disavow files, but bear with me. A client is getting about 50 links/week redirecting to this kind of page a week. Should I include them in a disavow file, or nah? They technically don't link to the site."

John said, if you are conflicted - go for it.

Here is John's response:

If you're conflicted and just want to be sure, it's totally fine to set up & use disavow files. If you notice that the bulk of the problems are from a few TLDs, you can also disavow the whole TLD. The disavow file is a tool, not a religion :-). Most sites don't need it, but that's not all sites.


— John Mueller (@johnmu.com) March 6, 2026 at 1:56 AM

As a reminder, John Mueller of Google often says that disavowing links is a waste of time. Heck, like Bing, Google may remove the disavow tool at some point. In fact, Google said the disavow tool hurts many more sites than it helps and has doubled down on not using it.

Forum discussion at Bluesky.

Loading Content With JavaScript Does Not Make It Harder For Google Search

March 5, 2026 at 15:51

Google Code

Google has removed a whole section from its JavaScript SEO documentation because it was outdated; Google says loading content with JavaScript does not make it harder for Google Search.

Google wrote that it "Removed a section on accessibility from the JavaScript SEO basics documentation." The section was titled "Design for accessibility" and had a line in there that said, "Viewing a site as text-only can also help you identify other content which may be hard for Google to see, such as text embedded in images."

The whole section was removed because "The information was out of date and not as helpful as it used to be," Google added. "Google Search has been rendering JavaScript for multiple years now, so using JavaScript to load content is not "making it harder for Google Search". Most assistive technologies are able to work with JavaScript now as well," Google also wrote.

Here is what the section said:

Create pages for users, not just search engines. When you're designing your site, think about the needs of your users, including those who may not be using a JavaScript-capable browser (for example, people who use screen readers or less advanced mobile devices). One of the easiest ways to test your site's accessibility is to preview it in your browser with JavaScript turned off, or to view it in a text-only browser such as Lynx. Viewing a site as text-only can also help you identify other content which may be hard for Google to see, such as text embedded in images.

Google Javascript Accessibility Section

Forum discussion at X.

New Google Help Doc About Google's Web Crawling

March 4, 2026 at 15:31

Googlebot Lizzi Image

Google has posted a new help document named Things to know about Google's web crawling. This document currently lists nine things about how Google's web crawling works.

Explaining why, Google wrote: "Based on questions we've received over the years, we've put together a resource page with basic educational information about crawling to better highlight various resources about crawling that are available to site owners."

Here is the short version of this document but check out the full document as well:

  • What is crawling? In short, crawling is how Google "sees" the web
  • We have many crawlers; they each have important jobs
  • We perform repeat crawls to find the latest updates and to provide the freshest search results
  • Frequent crawling is a good sign!
  • Google's crawling has grown over time as pages have become more complex
  • We optimize crawling automatically
  • Google crawlers never go into paywall or subscription content without permission
  • Site owners have control over what gets crawled, and how
  • Our standard crawlers always respect websites' choices about how their content is accessed and used

This page is worth a read; check it out at Things to know about Google's web crawling.

Here is a screenshot of this page for archival purposes:

Google Doc How Crawlers Work

Forum discussion at X.

Image credit to Lizzi

How Google Search & Google Discover Picks Image Thumbnails

March 3, 2026 at 15:31

Google Images

Google updated its image SEO best practices and Google Discover documentation to clarify how Google picks a preferred image thumbnail for Google Search and Google Discover. Google wrote that it "uses both schema.org markup and the og:image meta tag as sources when determining image thumbnails in Google Search and Discover."

Google added a whole new section to the image SEO best practices document named Specify a preferred image with metadata. It reads:

Google's selection of an image preview is completely automated and takes into account a number of different sources to select which image on a given page is shown on Google (for example, a text result image or the preview image in Discover).

You can influence which image gets selected by providing your preferred image through one of the following metadata sources:

  • Specify the schema.org primaryImageOfPage property with a URL or ImageObject.
  • Or specify an image URL or ImageObject property and attach it to the main entity (using the schema.org mainEntity or mainEntityOfPage properties).
  • Specify the og:image meta tag.

The document also says, "When choosing your preferred image for use in schema.org markup or the og:image meta tag, follow these best practices:"

  • Choose an image that's relevant and representative of the page.
  • Avoid using a generic image (for example, your site logo) or an image with text in the schema.org markup or og:image meta tag.
  • Avoid using an image with an extreme aspect ratio (such as images that are too narrow or overly wide).
  • Use a high resolution, if possible.
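Putting the two documented mechanisms together, a page might declare its preferred thumbnail like this (the URLs and dimensions are illustrative placeholders, not from Google's examples):

```html
<!-- og:image meta tag in the head -->
<meta property="og:image" content="https://www.example.com/images/lead-1200x675.jpg">

<!-- schema.org primaryImageOfPage via JSON-LD -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "primaryImageOfPage": {
    "@type": "ImageObject",
    "url": "https://www.example.com/images/lead-1200x675.jpg",
    "width": 1200,
    "height": 675
  }
}
</script>
```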

Then in the Google Discover documentation, Google expanded on the images section and it reads:

Include compelling, high-quality images in your content that are relevant, especially large images that are more likely to generate visits from Discover. We recommend using images that meet the following specifications: At least 1200 px wide, High resolution (at least 300K), and 16x9 aspect ratio.

Google tries to automatically crop the image for use in Discover. If you choose to crop your images yourself, be sure your images are well-cropped and positioned for landscape usage, and avoid automatically applying an aspect ratio. For example, if you crop a vertical image into 16x9 aspect ratio, be sure the important details are included in the cropped version that you specify in the og:image meta tag.

Google also added this section:

Use either schema.org markup or the og:image meta tag to specify a large image that's relevant and representative of the web page, as this can influence which image is chosen as the thumbnail in Discover. Learn more about how to specify your preferred image.
- Avoid using generic images (for example, your site logo) in the schema.org markup or og:image meta tag.
- Avoid using images with text in the schema.org markup or og:image meta tag.

Forum discussion at X.

March 2026 Google Webmaster Report

March 2, 2026 at 15:21

Google Webmaster Report

Are you ready for the monthly Google Webmaster report? Well, here is the March 2026 edition, where I sum up all the larger Google organic changes in one story, as a recap for you and me. First off was the first ever Google Discover core update, which took over three weeks to roll out, finishing last Friday. We also had a lot of ongoing and very heated Google search volatility that we covered over the past month, but no, Google has nothing to share on that topic.

Google did have a brief serving bug that it confirmed, but it didn't seem too impactful. Google may have hit listicle sites, or was it a reviews update? Google's update to its file size limits caused some confusion.

The Google Search Console AI-powered configuration tool went live. But it seems like Search Console is missing page indexing data.

But Google launched new link styles in AI Mode and AI Overviews. Google AI Mode added UCP-powered checkout, plus there are tons of new AI news listed below.

Google updated its business profile reviews policies, which may be why many reviews disappeared. We may soon see Google Posts recurring schedules for Posts.

And Google announced an impressive earnings report.

Those were some of the larger changes over the past month, make sure to check out the February 2026 Google webmaster report if you missed that.

Here are the bigger Google SEO stories from the past 30 days:

Google Algorithm Updates:

Google SEO:

Google Search Console:

Google Search Features:

Google Local & Business Profiles:

Misc Google:

Google Won't Use Sitemap Files If It's Not Convinced Of New/Important Content

February 23, 2026 at 15:51

Google Sitemap

Google's John Mueller said that if Google is not convinced that there is new and important content to index on your site, then it won't use your sitemap file.

Just because you have a sitemap file, it does not mean Google will index all the pages in that file. This isn't really new, we discussed it before.

John wrote on Reddit a few days ago:

One part of sitemaps is that Google has to be keen on indexing more content from the site. If Google's not convinced that there's new & important content to index, it won't use the sitemap.

We know Google does not index everything; in fact, very few sites have all of their pages indexed by Google (unless maybe it is a five-page website).

So adding a sitemap file, while useful for many reasons, doesn't mean those pages will be indexed.
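As a refresher, a sitemap is a hint rather than a command: it tells Google where content lives and when it last changed, but indexing remains Google's call. A minimal sitemap in the standard sitemaps.org format looks like this (the URL is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/new-article</loc>
    <lastmod>2026-02-20</lastmod>
  </url>
</urlset>
```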

Also, here is a somewhat related post on Bluesky from over the weekend:

In the extreme case where Google can't crawl at all, then of course at some point pages start to drop out of the index. For everything else, our systems tend to find a good balance. I don't think it's possible to define an absolute cut-off point, & sites that care tend to watch out for speed too.

— John Mueller (@johnmu.com) February 21, 2026 at 4:03 AM

You can calculate how long it would take to crawl the whole site assuming no duplicates, but imo don't think of this as being the problem - it's more like the symptom of a number of things.

— John Mueller (@johnmu.com) February 23, 2026 at 4:59 AM

Forum discussion at Reddit.

Google Search Console Page Indexing Report Missing A Chunk Of Data

February 23, 2026 at 15:05

Google Servers

The Google Search Console page indexing report seems to be missing a chunk of data. Data prior to December 15th (or the 14th) is missing, but data from then through today is intact.

This seems to be impacting all Google Search Console profiles. So it must be some sort of bug on Google's end.

I spotted this first via Vijay Chauhan who wrote on X, "has Google made any changes to the GSC Page Indexing report? I'm seeing that data prior to Dec 15 is completely missing across multiple properties. Is this a bug or intentional?"

Here is a screenshot:

Google Search Console Page Indexing Report Missing Data

Google has not yet commented on this but a lot of SEOs are asking what is going on:

Seeing a possible reporting bug in Google Search Console — Page Indexing report shows zero/missing data between 25 Nov–19 Dec 2025, then resumes normally.
Anyone else facing gaps in the 'indexing graph' graph? @googlesearchc @rustybrick @JohnMu #SEO #GSC pic.twitter.com/VEFMjoSy42

— Nilesh Yadav (@Nilesh__Yadav) February 23, 2026

He's right. That's gone (for now anyway). Stay tuned. https://t.co/ZxbiiTitgF

— Glenn Gabe (@glenngabe) February 23, 2026

If you see this, do not panic; it seems to be impacting everyone.

Forum discussion at X.

Update: John Mueller from Google replied saying, "This is a side-effect of the latency issue from early December. This isn't a new or separate issue."

This is a side-effect of the latency issue from early December. This isn't a new or separate issue.

— John Mueller (@johnmu.com) February 23, 2026 at 8:00 AM

Google Discourages Force Indexing Pages To Search

February 20, 2026 at 15:51

Google Paper Shred

Google's John Mueller said that he discourages large sites from force indexing their pages into Google Search. He said in a comment on LinkedIn, "I strongly recommend not relying on trying to force indexing."

It is not a new question, we covered this at least twice before. In 2020, Google said sites that need to request manual indexing may have quality issues. And more recently we covered Google saying you don't need to reindex your pages using Google Search Console.

Similar here, John was asked by Rehman Ameen:

These are definitely some interesting strategic workarounds to consider for manual indexing issues.

John Mueller replied:

I strongly recommend not relying on trying to force indexing - it doesn't make sense for any reasonably large site. Use the existing mechanisms, use merchant center if you're selling products.

Forum discussion at LinkedIn.

Google: We Do Not Have A Bad Title Algorithm Filter Of Sorts

February 19, 2026 at 15:41

Google Code Algo

Google's John Mueller responded to a concern about having bad title tags and how that might impact your site in Google Search. He said on Bluesky "I don't think our systems have a "we don't like this one guy's titles" filter."

In short, John is saying there is no blacklist or system to remove a site or page from search just for having a bad title. Of course, if your title is not relevant to the content on the page, that can confuse Google and you might not rank well.

John responded to a question from Ryan Webb, who asked:

Over the course of the last 3 weeks I have updated page titles to various pages (genuinely top 10 visited), usually I see these changes update on Google within 24hours but these changes aren't showing in the SERPS. SEO plugins and source all show correct.

Titles, even yahoo, bing and duckduckgo are displaying the right title. But Google seems to have stopped updating. I would have put it down to some caching within the data centres but for 3 weeks now, it seems to long a time period! Is there some block applied if changes are made.

John replied to that saying:

The title link that's shown in Search is not necessarily the title element of a HTML page. We generate these automatically based on a variety of factors, so tweaking the text in a title element doesn't necessarily always result in the same change in Search.

Then this was Ryan's follow up question:

Thanks John. I appreciated there was more to it. Just I haven't seen this happen across so many pages before and wasn't sure of "blacklisting" so to speak.

Here is where John basically said there is no blacklist for bad titles - John wrote:

I don't think our systems have a "we don't like this one guy's titles" filter, but "we've seen some stuff" :). There are a lot of things that could go into this, so it's really hard to guess based on just what you mentioned.

Forum discussion at Bluesky.

Google Search Console Still Testing Branded Queries & Social Channels Feature

February 19, 2026 at 15:21

Google Search Console Time

The other day, Google released the Google Search Console AI-powered configuration tool and we continued to wonder when the other features, like branded queries and the social channels would go live. Well, Google's John Mueller confirmed Google is still testing those two features before they are more fully rolled out to more users.

John said on LinkedIn, "both of these are going to take a bit more time," referring to branded queries and social channels. "We're working on making sure they both work well for all users. We sometimes release features in slow, small steps, so that we can collect feedback & iterate to improve them + their data before releasing them more broadly," he added.

  • Branded queries filter feature is designed to help analyze the queries driving traffic to your site by automatically differentiating between branded and non-branded queries
  • Social channels lets you review Search performance of social channels associated with your website directly within Search Console.

So they are still in the works and still coming, just not released today.

Forum discussion at LinkedIn.
