Google: Pick A Reasonable Site Name To Rank For In Search

Google Ranking

Google's John Mueller said on Reddit that when picking a site name, make sure to pick something that you can reasonably expect to rank for. In short, don't name your site Best SEO Blog and then complain online that it doesn't rank in the number one position in Google Search for [best SEO blog].

In that Reddit thread, he wrote, "think about what's reasonable for "your site's name" in terms of search results."

The person on Reddit wrote, "But when I search my site name on Google, it doesn't appear at all."

John Mueller explained that having a generic and competitive name for your site can make it hard for your site to rank for its name. This is not new advice, but he wrote:

One thing I've seen folks get confused about is that "searching for your site's name" can be very different depending on what you consider your site's name to be. If your site's name is "Aware_Yak6509 Productions" and if your homepage is indexed, then probably you'll find your site in the search results for that name (what else can a search engine reasonably show?). On the other hand, if your site's name is "best web online .com" then almost certainly just having your homepage indexed is not going to get your pages shown for those searches. The reason is primarily because search engines assume that people doing those searches ("best web online") are not actually looking for your homepage - it's a combination of generic words, not something that uniquely identifies your homepage.

So, when picking a site name, make sure it is something that you can expect to rank for.

Forum discussion at Reddit.

  •  

Google's John Mueller Working On Christmas (2025 Edition)

John Mueller Google Zurich

Every year, for the past 18 or so years, Google's John Mueller has been working on Christmas to provide support to those who need help with their Google Search ranking and SEO concerns.

This year is no different; John Mueller has responded to several concerns on Reddit and other social media platforms today.

John Mueller has done this since at least 2007, eighteen years and counting, and he has done it again this Christmas.

Here are the previous years of John offering support on Christmas. He did it last year in 2024, and then in 2023, 2022, 2021, 2020, 2019, 2018, 2017, 2016, 2015, 2014, 2013, 2012, 2011, 2010, 2009, 2008, and 2007.

Here are the threads John responded to over Christmas:

(1) "expanding my SaaS using multiple ccTLDs" on Reddit, where John wrote:

Taking a step back, I'd look around for some international SEO guides. There are some great ones out there, and they're a lot more than just local URLs + hreflang. The best time to fix international SEO issues is before they're live, the second best time is, well, you know.

It's a bit late, but I question whether you really need to split your site across ccTLDs. Having them reserved is one thing, but by separating your site across separate domain names, you both make things harder on yourself, but you also make it harder for search engines to understand each of these sites (because they're all separate sites). YMMV of course.

There's nothing wrong with putting them all into the same Search Console account. That's what the site-picker is for.

For x-default, you don't need to create a new generic default version, you can just pick a language that works well for most of your target audience. Maybe that's English, but it doesn't need to be. You don't need a separate x-default site. The more important part is that you make sure the hreflang elements are set correctly, including all back-links, including your important pages individually. (FWIW you can set up hreflang in sitemap files, if that makes it easier to maintain)
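
To picture John's sitemap tip, here is a minimal sketch of hreflang annotations in a sitemap file, assuming a hypothetical example.com with English and German versions: each URL lists every alternate (itself included, which covers the reciprocal "back-links" John mentions), and x-default simply reuses the English page rather than a separate site:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:xhtml="http://www.w3.org/1999/xhtml">
      <url>
        <loc>https://example.com/en/pricing</loc>
        <!-- each version lists every alternate, itself included -->
        <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/pricing"/>
        <xhtml:link rel="alternate" hreflang="de" href="https://example.com/de/pricing"/>
        <!-- x-default reuses the English page; no separate generic site needed -->
        <xhtml:link rel="alternate" hreflang="x-default" href="https://example.com/en/pricing"/>
      </url>
      <url>
        <loc>https://example.com/de/pricing</loc>
        <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/pricing"/>
        <xhtml:link rel="alternate" hreflang="de" href="https://example.com/de/pricing"/>
        <xhtml:link rel="alternate" hreflang="x-default" href="https://example.com/en/pricing"/>
      </url>
    </urlset>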

(2) "Favicon is not visible in the google or any other search engine" on Reddit, where John wrote:

I'd dig up Glenn Gabe's "favi-gone" article - he covers pretty much all of the variations of what can go wrong. Also, since you mention React, I'd make sure your favicon code is in the HTML template directly, and not added with JavaScript (to minimize another possible point of failure -- it's probably fine to use client-side-rendering for favicons, but you'll use them site-wide anyway, so might as well keep it simple).
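
To illustrate John's point with a minimal sketch (the file path here is hypothetical), the favicon reference would sit in the static HTML template that the React app is injected into, not be added by a component at runtime:

    <!-- public/index.html: the favicon ships in the served HTML,
         not added with JavaScript -->
    <link rel="icon" href="/favicon.ico" sizes="48x48">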

(3) "Google SEO indexing conversion from PHP site to NextJS" on Reddit, where John wrote:

First off - there are a number of guides out there for how to deal with site migrations & SEO - I'd find them all and make plans. IMO the basics are the same across most guides, some of the more obscure things you might be able to skip.

You absolutely need to set up redirects, at least for the important pages as u/weblinkr mentioned. Without setting up redirects, you'll have a mix of old & new URLs in the search results, and the old URLs will drive traffic to your 404 page. It's normal for old URLs to remain indexed for a while, and you'll often struggle to have all links from outside your website updated, so you really need to make sure they redirect.

If you set up redirects for this, ideally pick permanent server-side redirects (308 or 301) - avoid using JavaScript redirects.

If you're also moving images, and your site gets a lot of traffic from image search, make sure to set up redirects for the images too.

Since a move like this generally also means that at minimum your pages' layouts also change (assuming you can keep the primary content the same -- with updated links of course), keep in mind that page layout changes, as well as site structure changes (the way you deal with internal linking such as in header, sidebars, footer, etc) will have SEO effects. This is not necessarily bad, but all of this basically means you should expect some visible changes in how the site's content is shown in search, definitely short-term (even if you get the URL changes perfect, you will see changes), perhaps even longer-term (and to improve for longer-term changes, let it settle down first).

Finally, having a list of old URLs is great, but especially for a non-trivially sized site (100+ pages? I'm picking a number), you'll want to have something that helps you check & track semi-automatically. I'd use some sort of website crawler before you migrate (to get a clean state from before), and to use the clean state to test all the redirects (which you can do with many crawlers), and check the final state (again using a website crawler). Website crawlers like Screaming Frog are cheap, and well worth it for a site migration, you save so much time & get more peace of mind. Finally, depending on the site's size, it might make sense to keep a static mirror around for debugging for a while.
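
Since the thread is about Next.js specifically, here is a minimal sketch of those permanent redirects in the Next.js config; the old PHP paths are hypothetical, and with permanent: true, Next.js answers with a 308:

    // next.config.js - hypothetical old PHP URLs mapped to new routes
    module.exports = {
      async redirects() {
        return [
          // permanent: true issues a 308 server-side redirect
          { source: '/article.php', destination: '/blog', permanent: true },
          // old image paths too, if image search traffic matters
          { source: '/img/:name', destination: '/images/:name', permanent: true },
        ];
      },
    };

And on top of a crawler, a quick manual spot check with curl shows whether an old URL (again hypothetical) answers with the permanent redirect and the right target:

    curl -sI https://example.com/article.php
    # expect something like:
    #   HTTP/1.1 308 Permanent Redirect
    #   location: /blog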

(4) "My Blogger website is not showing on Google even after submitting sitemap — what am I missing?" on Reddit, where John wrote:

Folks here are mostly focusing on indexing (which, yes, is never guaranteed), but it can also be that some of your pages are technically indexed but just don't show up.

One thing I've seen folks get confused about is that "searching for your site's name" can be very different depending on what you consider your site's name to be. If your site's name is "Aware_Yak6509 Productions" and if your homepage is indexed, then probably you'll find your site in the search results for that name (what else can a search engine reasonably show?). On the other hand, if your site's name is "best web online .com" then almost certainly just having your homepage indexed is not going to get your pages shown for those searches. The reason is primarily because search engines assume that people doing those searches ("best web online") are not actually looking for your homepage - it's a combination of generic words, not something that uniquely identifies your homepage.

So in short, yes, understand how indexing technically works, because it's the basis for visibility, and understand that some things take time & more evidence to be picked up. But also, think about what's reasonable for "your site's name" in terms of search results.

(5) John replied to an indexing question on Bluesky saying:

The tool hides them in the search results; it doesn't change indexing. I'd check if they really appear often in search first (it's not a given that others from your site will show instead), but it's fine to use the tool if you just don't want those URLs shown anymore.

It is so nice to see John do this every year; anyone posting their SEO concerns on Christmas Day is probably pretty concerned.

Merry Christmas and Happy Holidays!

Forum discussions at threads above.

  •  

Google JavaScript Doc Now Says Pages With Non-200 HTTP Status Codes Might Not Be Rendered

Google Code

Google updated its JavaScript SEO documentation for the third time this week, this time to say that "while pages with a 200 HTTP status code are sent to rendering, this might not be the case for pages with a non-200 HTTP status code."

The change adds the words "with a 200 HTTP status code" to the line that now reads, "Googlebot queues all pages with a 200 HTTP status code for rendering."

Google also added this new note, which says:

All pages with a 200 HTTP status code are sent to the rendering queue, no matter whether JavaScript is present on the page. If the HTTP status code is non-200 (for example, on error pages with 404 status code), rendering might be skipped.
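
The practical takeaway, sketched here with a hypothetical error page: since non-200 responses may skip rendering, ship the error message in the initial HTML instead of relying on client-side JavaScript to draw it:

    HTTP/1.1 404 Not Found
    Content-Type: text/html

    <!-- the visible message is in the initial HTML; client-side JavaScript
         on this page may never run, since non-200 pages can skip rendering -->
    <html><body><h1>Page not found</h1></body></html>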

Here is a screenshot of the section of the page that was updated:

Googlebot Javascript 200

Google also updated its JavaScript canonical advice and noindex tag advice this week.

Forum discussion at X.

  •  

Google's Danny Sullivan & John Mueller On SEO For AI: It's The Same

Google John Mueller Danny Sullivan Others

John Mueller from Google had Danny Sullivan from Google on the Search Off the Record podcast to talk about "Thoughts on SEO & SEO for AI."

In short, they both said SEO for AI is the same as SEO for traditional search but then they got into it.

You should really build out great, unique, and original content for end users. Sure, there may be shortcuts that pop up, like with SEO, but those won't last. So your best bet is to create original content. People also want authentic content.

Also, don't just focus on textual content but also video, audio, visual, and other formats.

In fact, Danny Sullivan said he feels that SEO for AI, GEO, or whatever you want to call it, should fall under SEO, as a subset of SEO.

Here is the 37-minute part one recording:

I took notes on this and posted them yesterday on X, but I figured I'd post them here as well:

  • Traditional SEO is the same as optimization for AI Search
  • If anything, GEO/AIO, etc would be a subset of SEO, under SEO
  • "it is still SEO" but the format is different
  • Still, don't worry about SEO, write for users
  • If you do SEO but your clients say they need the new stuff, and you have to "dress it up" more, tell them the long-term strategy is the same as it was with SEO.
  • Just because the format is changing doesn't mean you need to change what you do.
  • Technical SEO is built into most CMS platforms now
  • So just focus on the content these days
  • In the old days of SEO, you would make a version for each search engine, but they weren't that different
  • So the effort was not worth it
  • Over time, the differences became smaller
  • So stop focusing on specific search engines and focus on the user
  • With AI, focus on originality of your content (not new, not new for AI or SEO but...)
  • LLMs/AI systems are doing a good job of covering non-original content
  • Every year, publishers would create an article on "What time does the Super Bowl start" and this is something AI can do and is not original
  • People are seeking original content; videos, podcasts, first hand experience from forums
  • Expert takes should include this too
  • These are things to consider
  • Authentic content can't be artificially created
  • Does your content resonate with people (like what Danny sees on social media from who he follows)
  • The core is you are authentic to people who follow you, because you are simply authentic he added
  • Danny Sullivan hates the term "Multimodal"; he doesn't like using that term to explain it
  • Instead, you search one way and get a response back another way
  • So maybe start to do more content in more formats, not just text, but video, audio, etc
  • How do you know if you are successful in AI formats?
  • It is not just about clicks; it is about quality clicks/conversions
  • With these new formats, people are "more engaged"
  • We know "the time of visits"; people are spending more time on those pages
  • AI formats are better at getting people to know what they are going to click on
  • Query fan out: John Mueller says it does a whole bunch of searches for you, so you don't have to
  • It then puts all that into an AI answer
  • AI Mode gives you a lot more context and you end up where you want to be
  • So maybe the clicks you got in traditional search were not as good as AI search clicks
  • AI formats feel good to searchers, because you search more the way you want to search
  • It is like you go to a library and you ask the librarian a question and then the librarian asks you questions to dig into what you really want
  • Danny Sullivan gives examples of using geo and search trends on Google today
  • Tracking needs to improve for AI Search
  • Including changes to Search Console
  • So site owners can understand what they are being found for and should they adjust their content
  • But search engines are smarter and can figure out your content, so you don't need to do this anymore
Part one ends with: write for humans, for users, in the way they want to read it.

Forum discussion at X.

  •  

Google Updates JavaScript SEO Doc With Setting Canonical URL Advice

Google Code

Google updated its JavaScript SEO best practices document with a new section on how to set the canonical URL when using JavaScript. Google wrote, "The best way to set the canonical URL is to use HTML, but if you have to use JavaScript, make sure that you always set the canonical URL to the same value as the original HTML."

So Google should figure it out but if it doesn't, don't blame Google. Google wrote:

The rel="canonical" link tag helps Google find the canonical version of a page. You can use JavaScript to set the canonical URL, but keep in mind that you shouldn't use JavaScript to change the canonical URL to something else than the URL you specified as the canonical URL in the original HTML. The best way to set the canonical URL is to use HTML, but if you have to use JavaScript, make sure that you always set the canonical URL to the same value as the original HTML. If you can't set the canonical URL in the HTML, then you can use JavaScript to set the canonical URL and leave it out of the original HTML.

Google explained, "Canonicalization happens before and after rendering, so it's important to make the canonical URL as clear as possible. With JavaScript, this means setting the canonical URL to the same URL as in the original HTML or if that isn't possible, to leave the canonical URL out of the original HTML."
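
In markup terms, with a hypothetical URL, the advice boils down to this:

    <!-- preferred: declare the canonical in the served HTML -->
    <link rel="canonical" href="https://example.com/widgets/">

    <!-- if JavaScript must set it instead, inject exactly this same value,
         or leave the canonical out of the original HTML entirely -->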

Always test to make sure Google can process your JavaScript and one way to do that is to test it using the URL Inspection tool in Google Search Console.

Forum discussion at X.

  •  

Google Search Console Page Indexing Report Now Up To Date

Google Pages

After almost a month of delays with the page indexing report within Google Search Console, the report is now back to its normal timeframe. Yesterday, Google also fixed the delays with the performance reports.

This morning, the report showed a last-updated date of November 21st. Now it shows December 14th, which is a normal delay for this report.

Here is a screenshot of what I see now:

Gsc Page Indexing Report Date

Here is what it looked like this morning:

Gsc Page Indexing Report Date Delay

You also probably started to receive emails from Google Search Console related to these reports. Here is an example I received earlier:

Gsc Indexing Email

So now you can go at it and do your indexing reporting. You can ignore the notice at the top of the page that says, "Due to internal issues, this report has not been updated to reflect recent data."

Again, the delay in the reporting has nothing to do with any issues with your site performance in Google Search. That was just a reporting delay and bug that is now resolved.

Forum discussion at Bluesky.

  •  

Google: Optimization For AI Search Is The Same As For Traditional Search

Google Nick Fox Ai Inside Podcast

Nick Fox, the SVP of Knowledge and Information at Google, was interviewed again on the AI Inside channel by Jason Howell and Jeff Jarvis. He said a lot of interesting things, but said no, Google won't be offering standardized licensing deals for all publishers. He also said that optimizing for AI Search is the same as what you do for web search and normal SEO.

Update: A Google spokesperson said Nick Fox never said that there won't be licensing deals. He did not outright reject the idea.

Here is the video:

Jason Howell also wrote this up on ZDNet under the title "Google search chief talks future of news content amid AI scramble" (note: it was previously titled "Google's search chief rejects this strategy for licensing news content amid AI scramble," but that was inaccurate).

On optimizing for AI Search, Nick Fox said it is "the same" as what you'd do when optimizing for web search and Google Search; it is SEO. Build great sites and great content, he said. Nick Fox said, "the way to optimize to do well in Google's AI experiences is very similar, I would say, the same as how to perform well in traditional search. And it really does come down to build a great site, build great content."

Jeff Jarvis asked, "Is there guidance for enlightened publishers who want to be part of AI about whether they should view their content in any way differently?"

Nick Fox responded:

I deeply believe, we deeply believe, that journalism is important and that's important for society. It's important for accuracy. It's important for all of these things. And so it's important that we figure out a model that actually does work that supports the generation of high quality content.

We also need to as an industry figure out a way in which to do this in a way that's sustainable for the creation of high quality journalism. Our approach has been: let's work with a large number of sites. One of the things we announced this week is that we have commercial partnerships with over 3,000 organizations, publishers, publications, organizations around the world in 50 plus countries. And so, to your point, it's not just one or two at the head but rather there's a wide range.

We partner with organizations primarily in two ways. I would say number one, the primary one, is driving clicks, and I believe that continues. I believe that users will continue to click through and read underlying sources. And then there's also commercial partnerships around the financial side of it as well. But it really needs to be both and I believe the core of the way that Google will partner with news organizations and websites overall will be through traffic and links within these experiences.

He went on to say that the true partnership is sending clicks and traffic to these publishers. That is the partnership he sees.

He also doubled down when questioned about the click studies on AI Overviews and AI Mode. He said the stats will vary by site and that some of the third-party studies were "one-offs" and "cherry-picked." He even said he saw studies showing AI results send even more traffic to some sites.

Here are my main notes for this interview:

Expansionary moment:
(1) People are able to use Google more and ask more questions.
(2) More people using the web means more overall usage.

Higher quality links? Can we trust Google?
(1) Stats will vary by site
(2) Third party studies are 'Cherrypicked' and 'one-offs'
(3) Also studies showed increases in traffic

How do small publishers make a content deal with Google for AI use?
(1) We need to figure out how to work together.
(2) You can't fight users and you need to deliver the experiences users want
(3) Journalism is important
(4) We need to figure out a model that supports journalism
(5) They have partnered with thousands of publishers/organizations in 50 countries
(6) We partner with publishers:
(A) Primarily by driving clicks and traffic to publishers
(B) Google improved those links this past week
(C) 'Short answer is no' for coming up with a standard licensing model for all publishers

How to rank better in AI Search?
(1) Optimizing for AI Search is the same as SEO for normal search
(2) Build great content and a great site

What have you learned in the last six months with AI Mode:
(1) The types of deeper queries people are doing
(2) User adoption is stronger in earlier launch markets
(3) Regions with less content on the web in that language prefer the AI responses because they give a more satisfying answer
(4) Younger users resonate more with AI Mode

Hey all, today has been CRAZY! But I'm happy to finally tweet my article on ZDNET that covers our conversation with @thefox on today's special episode of the AI Inside podcast. Definitely some new nuggets in this one! https://t.co/w7WSojwxii @Techmeme @rustybrick

— Jason Howell (@jasonhowell) December 16, 2025

Forum discussion at X.

  •  

Google Warns: Don't Use NoIndex Tags In Pages That Use JavaScript

Google Code

Google updated its JavaScript SEO documentation to warn against using a noindex tag in the original page code on JavaScript pages. Google wrote, "if you do want the page indexed, don't use a noindex tag in the original page code."

Google said, "While Google may be able to render a page that uses JavaScript, the behavior of this is not well defined and might change. If there's a possibility that you do want the page indexed, don't use a noindex tag in the original page code."

The updated text says:

When Google encounters the noindex tag, it may skip rendering and JavaScript execution, which means using JavaScript to change or remove the robots meta tag from noindex may not work as expected. If you do want the page indexed, don't use a noindex tag in the original page code.

The old version said:

If Google encounters the noindex tag, it skips rendering and JavaScript execution. Because Google skips your JavaScript in this case, there is no chance to remove the tag from the page.

Using JavaScript to change or remove the robots meta tag might not work as expected. Google skips rendering and JavaScript execution if the robots meta tag initially contains noindex. If there is a possibility that you do want the page indexed, don't use a noindex tag in the original page code.

Here is a screenshot of the before and after:

Google Javascript No Index Docs Change
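
In markup terms, the warning is about this pattern (a minimal, hypothetical sketch):

    <!-- don't ship this in the initial HTML if you want the page indexed:
         Google may skip rendering once it sees noindex, so JavaScript that
         later changes or removes this tag may never run -->
    <meta name="robots" content="noindex">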

This is different from never using JavaScript for schema for Google Shopping. As a reminder, Google had a bug with noindex tags on JavaScript pages. The recommendations on this topic have changed over the years.

Forum discussion at X.

  •  

Google Search Console Average Position Dropping Back Down

Google Fire Chart Down

Earlier this month, we documented that the average position sharply increased for many sites when Google made the num=100 change, blocking a lot of automated queries. Well, for some sites, that number is dropping again, which may mean there is a workaround for the blocking of those bots and automated queries.
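
For context, the parameter was simply appended to a Google results URL to return up to 100 results in one request, which is what many rank trackers and scrapers relied on (illustrative query):

    https://www.google.com/search?q=best+seo+blog&num=100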

Mike Sullivan on Bluesky and Glenn Gabe on X asked about this. Mike wrote, "something is happening... GSC has been extra slow at loading data since Thursday last, really long 'prompts' are appearing in the data, and impressions are up/positions down again (for me at least)."

Glenn Gabe explained his theory:

Remember the num=100 situation where Google removed support for the parameter based on AI scrapers and bots scraping the Google SERPs? When they did that, average position surged heavily and impressions dropped (since those scrapers were being excluded). Well, @AnalyticsEdge pinged me about a recent change for some sites. And I'm seeing it too for *some* sites (not all).

Right around 12/3 or 12/4, some sites are seeing a big drop in average position while impressions skyrocket again. That leads me to believe some scrapers are now getting through. Check your stats. You might be one of them. Now lets see if Google fights back as part of the cat and mouse game...

Glenn Gabe also posted this on LinkedIn and many in the comments are seeing this as well.

I see it too on many, but not all, sites I have access to in Search Console - here are some examples:

Gsc Performance Ave Pos1

Gsc Performance Ave Pos2

Gsc Performance Ave Pos3

Are any of you seeing this?

Hey @glenngabe.bsky.social , something is happening... GSC has been extra slow at loading data since Thursday last, really long 'prompts' are appearing in the data, and impressions are up/positions down again (for me at least).

[image or embed]

— Mike Sullivan (@analyticsedge.com) December 11, 2025 at 10:25 AM

Remember the num=100 situation where Google removed support for the parameter based on AI scrapers and bots scraping the Google SERPs? When they did that, average position surged heavily and impressions dropped (since those scrapers were being excluded). Well, @AnalyticsEdge… pic.twitter.com/hqWPjJjCY1

— Glenn Gabe (@glenngabe) December 15, 2025

Is it reversing again this morning?

After I shared about this yesterday, many have seen the same thing. But, I think Google started addressing this already. The latest GSC data shows avg position increasing again and impressions dropping. It's a cat and mouse game with AI scrapers and bots and Google just countered… https://t.co/S6RgbLIf8T pic.twitter.com/Vm1sOkbAdl

— Glenn Gabe (@glenngabe) December 16, 2025

All three charts I posted above do show reversals of this pattern...

Forum discussion at Bluesky and X.

  •  

Google Search Console Performance Reports Finally Caught Up

Google Report Speed Clock

Google Search Console's performance report seems to finally be all caught up and up-to-date as of last night and this morning. I am seeing the normal two-hour or so delay, which has always fluctuated from an hour to five hours on a normal day.

But for the past few weeks, the performance report had been delayed by 50+ hours. While we have seen this report get stuck for that long or longer before, we have not seen it take weeks to recover, as it did now. I think this may have been the most extended period of delay; not in terms of the hour count (we have seen 100+ hour delays before), but in terms of how long, over a period of days and weeks, the delay persisted.

In any event, it is now fixed.

Here is a screenshot from this morning:

Gsc Performance Report Normal

Now, the page indexing report is so stuck, so delayed, that a few days ago Google added a message to the top that reads, "Due to internal issues, this report has not been updated to reflect recent data." It has been stuck since November 21st or so, almost a month. This seems like a more serious issue, and for those trying to check on indexing and validate fixes, this lingering delay is not fun.

Gsc Page Indexing Delay Notice

John Mueller from Google over the weekend wrote on Bluesky:

I don't have an update, sorry. Both the page indexing and the performance reports are currently delayed past what we'd consider normal. I realize it's frustrating & also makes things hard for you all, sorry about that. :-/

Forum discussion at Bluesky.

  •  

Google: Pre-Announcing Google Core Updates Isn't Possible

Google Calendars

Google's John Mueller said that it is impossible for Google to pre-announce quality improvements to Google Search, including core updates. He said, "having quality changes ready for a specific date/time is never a given, and pre-announcing for a fixed date isn't possible."

John added on Bluesky, "We try to launch quality improvements as soon as they're ready & evaluated." Preparing a specific date for a core update is not as easy: "Preparing a date for a launch of a logo-change, for example, is much easier," he added.

Here is his post:

We try to launch quality improvements as soon as they're ready & evaluated. Having quality changes ready for a specific date/time is never a given, and pre-announcing for a fixed date isn't possible. Preparing a date for a launch of a logo-change, for example, is much easier.

— John Mueller (@johnmu.com) December 13, 2025 at 9:46 AM

Now, I will do the Barry thing, and say that on June 8, 2019, Danny Sullivan, the former Google Search Liaison, kinda said the opposite. He said Google will try to pre-announce future core updates. Although we know that never really happened.

Also, I can say, Google does pre-announce a lot of search features and I get embargoed statements from Google on some of those. I almost never get the same for core updates or quality updates. I think I may have received one for the first helpful content update, but even then, the date for when it would be released was not set in stone. I never get notices from Google that a core update will be released in the coming days; I just don't think Google knows for sure until it is fully ready, and when it is, they just push it.

I am not sure why they can't get it ready and then decide to push it a day or two later. But maybe they just want it out as soon as possible?

Forum discussion at Bluesky.

  •  

Google Tests Property Listings In Search Results

Google Homes

Google is testing showing full detailed property sale listings within the Google Search results. This is getting a lot of interest from those in the real estate business.

Mike DelPrete posted about this on his blog saying, "Google putting for sale listings directly into search results — even if it's a test — is a big deal."

Google Search, in this test, is showing full property detail pages, links to request a tour, links to contact an agent, and more within the listings. It also says that this is a partnership with ComeHome.

Google wrote, "These results are a curated selection of properties brought to you in paid partnership between Google and ComeHome. Property results are not supplied or sponsored by listing agents or brokers."

Here is a screenshot from Mike:

Google Search Property Sales Listings

Here is the partnership disclaimer:

Google Search Property Sales Listings2

Matt McGee, who has been doing real estate SEO for probably two decades now, also posted about it:

Pretty sure this is a new Google ad type that shows property listings. I see it on mobile for "homes for sale chicago" and "homes for sale austin," but not PHX or SEA. Carousel shows 25 listings + "show more" keeps adding 25 more. Also has 2 agent bio cards in there.

[image or embed]

'" Matt McGee (@seosavvyagent.com) December 13, 2025 at 2:00 PM

Glenn Gabe also notified me about this:

Anyone focused on real estate seen this before? Mike believes this is new, and a big deal -> Google Enters The Portal Wars

Looks like a partnership between Google and ComeHome. See the third screenshot below.

"Google putting for sale listings directly into search results — even if… pic.twitter.com/EldK73yaNw

— Glenn Gabe (@glenngabe) December 13, 2025

Forum discussion at Bluesky.

  •  

Google Preferred Sources Now Global & Adds Spotlighting Subscriptions

Google Newspapers

Google announced that Preferred Sources is now rolling out globally, after launching just in the US and India in August, following its beta period in June. Plus, Google announced Spotlighting subscriptions are coming to Gemini, AI Mode, and AI Overviews.

If you love this site, you can add this site as a preferred source on Google by clicking here.

Google wrote:

We're now launching this feature globally: in the coming days, it will be available for English-language users worldwide, and we'll roll it out to all supported languages early next year. This builds on the great early feedback we've heard from users and websites. People have selected a wide range of preferred sources — nearly 90,000 unique sources, from local blogs to global news outlets. When someone picks a preferred source, they click to that site twice as much on average.

To activate Preferred Sources, search for a topic that's in the news and click on the icon to the right of Top Stories. Then search for and select your preferred sources, refresh your results, and you will see more from your favorite sites.

Google Preferred Sources

Google also posted a publisher resource section on this feature. Google wrote:

In Top stories, you can select your preferred sources. Next to the 'Top stories' header, click the preferred sources star icon. You can search for and choose the sources and outlets you'd like to find. For relevant news queries, these sources show up more often in 'Top stories' and 'From your sources.' Learn more about Top Stories in Search.

Google also announced Spotlighting subscriptions. Google wrote, "We're launching a new feature that highlights links from your news subscriptions, making it easier to spot content from sources you trust and helping you get more value from your subscriptions."

Google said it will "also prioritize links from your subscribed publications, and show these links in a dedicated carousel."

Google is first bringing this to the Gemini app in the coming weeks, with AI Overviews and AI Mode to follow.

Forum discussion at X.

  •  

Google: Discover Minimally Aligned To Search Ranking

Andy Almeida Google

Google has told us that core updates impact visibility within Google Discover, but that may have changed at some point. Andy Almeida from Google's Trust and Safety team spoke about Google Discover, and one of his slides read, "Minimal alignment to Search ranking..."

He presented this at the Google Search Central Live event in Zurich yesterday. I posted the slide on X; here it is:

Google Discover Min Align Search Ranking

Glenn Gabe asked me to find out more about this, so I did:

I was able to meet up with Andy Almeida and ask him to clarify this. He told me (I am paraphrasing from memory) that the Google Discover team wants to be able to highlight and promote content from smaller publishers, ones that may not rank well in Google Search. He said Google Discover aims to do that, and to do that, it can't just use Google's quality signals, which would only promote the most authoritative sites and content.

Instead, a site doesn't need to rank highly in Google Search to be promoted in Google Discover. This allows smaller, lesser-known publishers to do well in Google Discover, even if they don't currently rank well in Google Search.

He told me that, as things currently stand, you don't even have to rank for most of your queries to show up in Google Discover.

I think this may be partially why Google Discover has such a spam problem. It is way easier to take a new domain and get it to rank in Google Discover than in Google Search because of this.

I asked if this is a delicate balance between using Google Search trust signals and not using them, so Google can surface new or lesser-known publishers while trying not to let spam junk up the Google Discover feed. He nodded.

That is how I understood these slides based on my follow up questions.

I am not there, but that second line sure caught my attention. We know that Discover is an extension of Search, but Google just presented "Minimal alignment to Search ranking gives us the tools we need to combat emerging abuse". I'm trying to find more info about that line, but I… https://t.co/kNFbuBghbN

— Glenn Gabe (@glenngabe) December 9, 2025

Forum discussion at X.

  •  

Google Shopping Crawlers Are Too Fast For JavaScript Generated Structured Data

Google Speed Shopping

14 months ago, Google updated its documentation around its shopping-related structured data, warning not to generate your structured data dynamically through JavaScript. Yesterday, at the Search Central Live event in Zurich, a Google engineer explained why: the shopping bot crawls a lot, and fast.

Yes, the shopping crawler has to consume shopping structured data incredibly fast so it has up-to-date pricing, inventory, availability and so forth for the Google Shopping Graph. Because of the speed and quantity of feeds it needs to consume over and over again, it does not have time to wait for JavaScript to dynamically generate the structured data.

This is different from how normal Googlebot and Google Search handle it. Googlebot for Search will render your JavaScript and wait for the structured data it generates.
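
In practice, for product pages, that means serving the structured data in the initial HTML rather than injecting it client-side. Here is a minimal sketch with hypothetical product values:

    <!-- product structured data shipped in the initial HTML, so the shopping
         crawler can read price and availability without waiting on rendering -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example Widget",
      "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock"
      }
    }
    </script>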

Here are those posts on this:

Google Shopping won't process structured data loaded through JavaScript because Shopping crawls a lot, fast, because it needs real time pricing/availability. It is different from Google Search.

— Barry Schwartz (@rustybrick) December 9, 2025

Is client side rendered JS bad for structured data?

In search: they're parsing it. This invites drift though if you have any content gap and is reflected in the page. They should be aligned.

For shopping: it's different because they crawl a lot. If they have to crawl… pic.twitter.com/06q9sggrwK

— Aleyda Solis (@aleyda) December 9, 2025

Forum discussion at X.

  •  

Google Search Still Working On New Core Update; Update Should Be Soon

John Mueller Google Zurich

Google's John Mueller said at the Google Search Central Live event in Zurich today that Google is still working on the next core update. This update should be out soon, but it is unclear if it will happen in the next couple of days or after the holiday season.

John joked that it might not come today, but he wouldn't be surprised if one launches in the coming weeks. He then quickly added that hopefully it won't come before the holidays. Even Googlers don't like when these core updates launch around the holidays, because it keeps them busy.

We previously had the June 2025 core update and then before that the March 2025 core update, and more recently the August 2025 spam update.

Of course, we have had tons of unconfirmed Google updates, but we are still waiting for the next big confirmed core update. Meanwhile, Google runs smaller core updates more often and does not confirm them.

A year ago at the same event, John Mueller said we could expect more core updates, more often, but that obviously did not happen this year.

@JohnMu said there will be a core update in the future but he wouldn't be surprised if one launches in the coming weeks but hopefully not before the holidays.

— Barry Schwartz (@rustybrick) December 9, 2025

Forum discussion at X.

  •  

Google Search Console Adds Weekly & Monthly Views For More Granular Data

Google Data Pipes

Google announced today at the Search Central Live event in Zurich that Search Console is rolling out more granular data. Google launched weekly and monthly views in Search Console for performance reports.

This goes beyond the 24-hour view and lets you dive deeper into this data in a weekly or monthly view.

Here are some photos from the presentation:

Google Search Console New Week Monthly Views1

Google Search Console New Week Monthly Views2

Google Search Console New Week Monthly Views3

Meanwhile, the Google Search Console performance report and some other reports are significantly delayed. Google is working on fixing this but it has not been fixed yet.

Forum discussion at LinkedIn.

I see the new Google Search Console views are live for me - https://t.co/6byDZdcOWB pic.twitter.com/tzvITv8TDq

— Barry Schwartz (@rustybrick) December 9, 2025

Update: The following day, Google posted about it saying, "This new functionality allows you to adjust the time aggregation of any of the performance charts, helping you smooth out daily changes and focus on the overall trend of traffic to your website."

  •  

Google Search Console Insights Adds Social Channels

Gsc Reporting Laptop Ai

Google announced it is adding social channels within the Google Search Console Insights report. This "lets you review Search performance of social channels associated with your website directly within Search Console," Google wrote.

Note: this is a limited rollout, and not everyone will see it yet, like the other features Google recently rolled out. Meanwhile, there are still huge reporting delays.

Here is a screenshot of the report:

Youtube Social Search Console

Google said this report gives you:

  • Total reach: total clicks and impressions driving traffic from Google to your social channel.
  • Content performance: top social channel pages, as well as those trending up or down.
  • Search queries: top and trending queries leading users to your social profiles.
  • Audience location: top countries where users are clicking on your social channel in Search results.
  • Additional traffic sources: total clicks your site receives from additional sources such as Image Search, Video Search, News Search, and Discover.

Here is how you can add social channels to the Insights report:

Search Console Insights Social Channels

Google explained, "At this first stage, the insights are available only for sites and channels that Search Console has identified automatically. On the Search Console Insights report, you will be prompted to add the social channels that Search Console has automatically identified and associated with your website."

Facebook too!

— Daniel Waisberg (@danielwaisberg) December 9, 2025

Forum discussion at X.

  •  

Google Search Console Performance Report Tests AI Powered Configurator

Google Robot Looking At Reports

Google is testing a new AI-powered configuration tool, as it calls it, to help you build instant reports based on your natural language questions. So it is basically Google Ads and Analytics Advisor, but for Search Console; plus, this just builds reports and doesn't take actions - yet.

Google wrote, "we're excited to announce an experimental feature in the Performance report designed to reduce the effort it takes for you to select, filter, and compare your data: AI-powered configuration."

The Google Search Console AI-powered configuration tool is an experimental feature that uses AI to customize the Search performance report based on the data you ask to see. Instead of manually applying filters, setting up comparisons, and selecting metrics, you can ask the assistant to configure the report for you, and it should. You are restricted to the filters that are available today, so no, you won't be gaining access to any new data, like AI Overviews or AI Mode data.

"Powered by AI, this feature lets you describe the analysis you want to see in natural language. Your inputs are then transformed into the appropriate filters and settings, instantly configuring the report for you," Google added.

Note: I don't see it yet; it is "experimental" and available for a "limited set of websites," but Google said it "will be gradually expanding it over time."

Here is a GIF of it in action:

Google Search Console Performance Report Tests AI Powered Configurator

When you do get access to it, you are limited to 20 requests per day. Here is how to use it:

(1) Click the filter icon in the Performance report header to open the tool in the side panel.

(2) Type your request into the prompt field. The tool will then suggest the corresponding filters, comparisons, and metric settings. You must then confirm the suggestion to apply those settings to your report.

Google does warn that this is AI, so it might not work properly and it might give you the wrong information. Google wrote, "Since this is an AI feature, the tool may generate filters that don't match your request. Always review the filters applied to the report to ensure they match your intended query."

The AI-powered configuration feature is designed to streamline your analysis by handling three key elements for you:

  • Applying filters: Narrow down data by query, page, country, device, search appearance or date range.
  • Configuring comparisons: Set up complex comparisons (like custom date ranges) without manual setup.
  • Selecting metrics: Choose which of the four available metrics — Clicks, Impressions, Average CTR, and Average Position — to display based on your question.

Here are some of the documented limitations:

  • Scope: it supports only the Performance report for Search results. It is not available for Discover or News reports.
  • Accuracy: AI can sometimes misinterpret requests. Always review the suggested filters to ensure they match your intention before analyzing the data.
  • Limitations: The feature is designed for configuration (filters, comparisons, metrics). It cannot perform actions like sorting the table or exporting data.

You can learn more in this help document but again, you probably won't see this feature for some time.

Jimmy Hartill has access to this feature, which is neat; he posted screenshots on LinkedIn. So does Pedro Dias, and I asked him how he likes it:

Ha! That was fast
GSC AI-driven reports pic.twitter.com/F4AWpHSUrf

— Pedro Dias (@pedrodias) December 5, 2025

Also, it won't do any advanced regex for you. It applies basic filtering most times

— Pedro Dias (@pedrodias) December 5, 2025

Forum discussion at LinkedIn.

  •  

Google Search Console Average Position Increased For Many

Google Chart Up Fire

If you look at some of the profiles/sites in your Google Search Console performance reports, you may notice a significant increase in the average position being reported. This happened when Google removed the num=100 parameter and is likely due to scrapers no longer messing up your data.

Here is an example showing the increase/improvement in the average position in this report:

Gsc Avg Position Jump

I posted about this across X, LinkedIn, and other networks, and the answer is the num=100 change.

We had this explanation earlier, which I covered here, but now, a couple of months later, it is much more noticeable.

Elie Berreby wrote:

With the parameter's removal, Google results are now paginated for all queries, meaning an impression should only be counted when a URL appears on a page a user actually views.

This has led to the drop in total impressions for many websites (not just yours), but each impression now represents a more accurate measurement of actual user visibility.

Dan Lauer wrote:

Yup, I am seeing this across clients Barry Schwartz and it 100% aligns with the Google 100=NUM parameter change, however, impression data is inconsistent --- some see no change after the big impression drop on 9/10, others more recently have seen impressions almost get back to pre 9/10 levels - especially in the last 2 to 3 weeks --- assuming that is more seasonality with BF/CM...... I do see a correlation recently with big impression spikes coinciding with avg. position declines for those date(s).

Alexander Rodionov wrote:

num=100 was deprecated = SEO tools could not crawl easily anymore = fewer impressions from bots. I see a variety of results across accounts; mostly affected are the ones where the search terms the website ranks for are popular among those using SEO tools.

Is this when they dropped "&num=100"? (resulting in fewer impressions after position 10)

— Cyrus Maxx (@zyppy.com) December 4, 2025 at 3:31 PM

That Average Position UP / Impressions DOWN trend you're seeing is the "Mathematical Illusion" in action. Google killed the &num=100 parameter, cleaning out spammy "ghost impressions" from Page 2+. Your ranking didn't improve, the data just got cleaner. Don't panic, but DO reset…

— Eric Smith (@ESmithdigital) December 4, 2025

This is from the removal of &num=100 query parameter. Avg ranking improves because all of the rank tracking tools that looked at deeper pages stopped working overnight. There was no real clicks coming from those tools, so clicks are unchanged. Imp diff is negligible as well. https://t.co/Z0TJoADfXE

— Robert Ramirez (@ramirez_robert) December 4, 2025

It's from Google dropping the num=100 parameter. A lot of tools either stopped scraping or limited their checks to 1-2 pages, and those tools were the biggest viewers of the page 3+ SERPs.

— Zak Kann (AI Automation) (@zrkann) December 4, 2025

I've been seeing this for a while now and when I asked my husband (also an SEO) he guessed it had to do with the results = 100 change but I didn't look into it further to confirm

— Taylor Berg Chapa bluesky @taylorberg (@taylorannberg) December 4, 2025

It's because num=100 parameter was removed on that exact day. Go read on it, it's just reporting

— Dylan Ander | CRO & SplitTesting (@DylanAnder) December 4, 2025

Seeing it across many sites. But impressions are also down. No site with an average position lower than 10 anymore, not even the testing ones. Seems related to num=100 removal.

— Mirela Iancu (@SEOPuzzleSolver) December 4, 2025

Could it have anything to do with Google removing the ability to show up to 100 results per page in Mid-Sept? Essentially pushing up avg position. pic.twitter.com/WkaPoipgDG

— Best. Doug. Ever. (@Baxter23603538) December 4, 2025

Yes. I think it is for the change of the 100 limit of results on Google too, but... interesting thing: This site over the competition (blue line) has also greatly improved visibility on Semrush pic.twitter.com/U0hRhwG0sy

— Rafa Martin (@rafainatica) December 5, 2025

Yeah, the num=100 fix removed many botted impressions.

— Joe Manna (@JoeManna) December 5, 2025

So don't be shocked when you see this in your Search Console performance reports. Many sites are seeing this.

Forum discussion at X and LinkedIn.
