Google No Longer Automatically Indexes Websites – WTF?

by Nate Hoffelder

After a decade of running The Digital Reader, Nate is a veteran web publisher with experience in design, maintenance, recovery, and troubleshooting. What little he doesn't know, he can learn.

May 30, 2023

Can you please take a few minutes today, go to Google, and search for your website using the site search filter? (EX: site:mysite.com) Do you have about the right number of pages showing up, or just a few pages, or worse yet – none?

The reason I am asking you to do this is that I’ve noticed recently that Google no longer automatically indexes websites, and thus is not sending traffic to those sites.

Here are a few examples:

  • A couple of months ago, a new client came to me and asked why her 4-month-old website wasn’t showing up in Google. It was in Bing and DuckDuckGo, but not Google. (I fixed this, but I never did find the cause.)
  • Yesterday I helped out someone on Reddit who could not figure out why his two-month-old website had not been indexed by Google. (I told him to submit a sitemap in Search Console, and see if that helped.)
  • This morning I discovered that a site where I am the volunteer webmaster had not been indexed by Google. The site is a year old and perfectly functional, and I had already submitted a sitemap in Search Console. Yet even though that sitemap shows 9 pages, Google had only indexed the home page. (DDG had indexed the entire site.)

Folks, I have been a web designer since 2016, and before that I was a blogger for six-plus years. I have been deeply interested in Google SEO since 2010, and in all that time I’ve never heard of Google not indexing a site. Oh, sure, sometimes the website owner sets a noindex tag (accidents happen), but I have never heard of Google failing to index a site for no reason.

Google used to just index every site whether you wanted it to or not. That’s why the noindex and nofollow tags even exist: to give website owners more control over what Google does with their site.
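
For anyone who hasn’t used them, those controls are just a meta tag in the page’s head (or a rel attribute on an individual link). A minimal sketch, with a placeholder URL:

```html
<!-- Ask crawlers not to index this page and not to follow its links -->
<meta name="robots" content="noindex, nofollow">

<!-- Or mark a single outbound link as not endorsed -->
<a href="https://example.com/some-page" rel="nofollow">some link</a>
```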

This is so unheard of that I honestly never used to bother checking whether a site shows up in Google unless I was asked to. I’ve literally never needed to do that when working on SEO (it would be like making sure gravity still works). In fact, Google’s own help pages say that you can pretty much assume that Google will index your site.

So if you have time today, you really should make sure that your site is showing up in Google. If Google neglected to index your site, you’re missing out on traffic.

So how do you fix this?

TBH, I am not sure. I would have thought that submitting a sitemap would fix this issue, but that didn’t help with the convention website mentioned above (the site where I am the volunteer webmaster). So what I am currently trying as a solution is using the “URL Inspection” feature in Search Console to force Google to index each page. It’s time- and labor-intensive, but I don’t have another option.

EDIT: The URL Inspection feature does work to get pages added to Google search results, and that is a relief. Alas, Google is only indexing the specific page I put into URL Inspection, which means I still have to solve this one page at a time. That is going to suck for anyone who has this problem with a huge site containing hundreds of pages.

EDIT: This post got a lot of attention on Hacker News and elsewhere, so much so that my site crashed several times on Monday morning. Several people listed the steps you are supposed to take to get indexed by Google. Those steps did not work for me, but they might help you, so:

How do you get your site listed in Google’s search results?

  1. Set up a Google Analytics account for the site, and install the GA code so that Google can get data.
  2. Set up a Google Search Console account, and install the GSC tag on the site.
  3. Add a sitemap to that GSC account. (A minimal example sitemap is shown after this list.)
  4. Email a link to the site from a Gmail account.
  5. Tweet a link to the site, or share it on FB.
  6. Link to the site from other sites.
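
If your CMS or SEO plugin does not already generate a sitemap for you, it is just an XML file listing your URLs. A minimal sketch, with placeholder URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2023-05-30</lastmod>
  </url>
  <url>
    <loc>https://example.com/about/</loc>
  </url>
</urlset>
```

Most WordPress SEO plugins (Yoast, Rank Math, and the like) generate one of these automatically; you just paste its URL into the Sitemaps section of Search Console.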

Hi, I'm Nate.

I build and fix websites for authors, and I am also a tech VA. I can build you a website that looks great and turns visitors into fans, and I can also fix your tech when it breaks. Let me fight with tech support so you don’t have to.

My blog has everything you need to know about websites and online services. Don’t see what you need, or want personalized help? Reach out.

You May Also Like…

7 Website SEO Tips for Authors

Search Engine Optimization, or SEO, is a vast and complicated topic with ever changing best practices. Google changes...

How to Find an Indie Bookstore

Back when I was a blogger, I used to have a beef with the American Bookseller Association. You see, the ABA had a...

13 Comments

  1. Kaz

    Checked to see what parts of my writing website Google indexed.

    Literally just my NSFW writing page. Googling my site name only yielded the NSFW index and six Tumblr posts that had an SFW story *in the recommendations*. Good grief.

  2. Neal

    My sites show hundreds of results and my logs show it is being indexed often by Googlebot.

    Google seems to use different methods for finding web sites.
    The most common method follows links to sites. If your site isn’t mentioned by anyone else or linked by anyone else, then it won’t be indexed by Google. They also have a lot of unpublished rules (that seem to change often) for deterring people from planting fake sites or gaming the page rating system.

    Another method includes sending the URL via gmail or google chat to other people. If you paste the URL, Google will go out and get a preview of the URL. Google will also add the URL for indexing later. (This is how they found one of my honeypots.)

    A third method is to register your domain using Google’s Domain Tools. But that’s not Google “finding” the site as much as you telling Google about the site.

    • Nate Hoffelder

      I used to think putting the sitemap into Search Console would do the trick. That doesn’t work any more.

      Also, that convention website I mentioned had been shared via Gmail multiple times, and Google still only indexed the homepage.

  3. Paul Lutus

    >> “EX: site:mysite.com”

    This is a mistake. I just submitted my site (arachnoid.com) as in the example:

    site: arachnoid.com

    And got one valid link; the remainder of the links were medical references to words like “arachnoid”. But if I submit a full link including “https”, then I get a full index of my site’s pages. So, to date at least, my site is still being indexed.

    The article’s text might not show the actual search string, of course. If true, this reply falls into the broad category of tempests in teapots.

    • Alexander Williamson

      We deal with courses, articles, and jobs on our site. A few days ago we found that the jobs functionality, where Google scans pages for certain schemas, had completely stopped being indexed worldwide, so Jobs in Google Search died, and Google didn’t know much about it. That got fixed in the end.

      We also use the paid Google Indexing API, which lets you bulk-submit updated and deleted URLs to tell Google to reindex a page.

  4. Greg

    This “phenomenon” has been ongoing since at least 2021. Back then I remember seeing a client’s newly published posts taking forever to get indexed (like, 6 months) and also discussing the issue with other SEOs.

    This particular client had a low-authority domain/site so I suspect these problems are predominantly with new sites, rather than with more authoritative ones.

    In GSC one often sees “Crawled, currently not indexed” which IMO is Google’s way of saying, “Meh, content on this page doesn’t add anything to search results so… let’s not index this at the moment.”

    You have to see the issue from Google’s side: a deluge of average posts regurgitating the same old content is being published at an ever-increasing rate, and the issue is only getting worse with AI. Even sites with great content are unfortunately affected by this.

    • Joe

      Greg is exactly correct. I have seen this on my own sites and some clients’ sites for the past few years, but especially since 2022. It’s what I consider the post-COVID “E-E-A-T” algo update, which aggressively expanded the emphasis on topical authority instead of just domain authority. In other words, it’s getting harder and harder to get pages indexed if they’re not unique, topically authoritative (who are you?), and good quality. For example, you could have a high domain authority site about dogs, but if you dared blog about COVID, that blog post probably didn’t get indexed.

      I tried launching a new discussion forum website about web design last year. The traffic started growing rapidly for a few months, and then suddenly the entire domain was deindexed, with no reasons given in GSC. I assume it was “thin content” or something like that due to the new domain… it’s insane, because this just rewards the “domaining” gurus who buy dropped domains to spam stuff, or companies like Forbes who buy backlinks and PR, while hurting average Joes who just want to start an average blog or forum.

      In other words, Google is no longer populist: the rich are getting richer and the poor are getting poorer. I also refer to this huge shift in their corporate values as the “WEF algo update”, as it’s incredibly elitist and is turning Google into an information filter and narrative shaper like never before, with spammers and elites being rewarded most.

  5. Sonia Margolle

    I used the “URL Inspection” feature MULTIPLE times for a website (a designer portfolio) where I made a lot of modifications and added new pages.
    That was last September, and all the new pages are still in “Crawled, currently not indexed”… SO infuriating.

  6. Rami

    Hey Nate,

    Google has to be very picky about the websites it indexes, given that the size of the internet is only increasing. Most likely there are issues with duplicate content, or, if the website isn’t optimized, Google will most likely ignore it.

    If you manually index your web pages, they will show up in Google Search within 3 days. Manual indexing is very tedious, so I recommend using Google’s Indexing API directly so you can write some code around it that will automatically submit all of your web pages. Or you can pay for a tool to do it, or hire an offshore assistant to tackle it for you.
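
    A minimal sketch of what that could look like, assuming Python with the google-auth package installed and a service account that has been added as an owner of the property in Search Console (the key file name and URLs below are placeholders):

    ```python
    # Notify Google's Indexing API that URLs were added or updated.
    from google.oauth2 import service_account
    from google.auth.transport.requests import AuthorizedSession

    SCOPES = ["https://www.googleapis.com/auth/indexing"]
    ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

    # Placeholder key file; the service account must be a Search Console owner.
    credentials = service_account.Credentials.from_service_account_file(
        "service-account.json", scopes=SCOPES
    )
    session = AuthorizedSession(credentials)

    def notify_updated(url: str) -> dict:
        """Submit one URL; use type URL_DELETED for removed pages."""
        response = session.post(ENDPOINT, json={"url": url, "type": "URL_UPDATED"})
        response.raise_for_status()
        return response.json()

    for page in ["https://example.com/", "https://example.com/about/"]:
        print(notify_updated(page))
    ```

    (Worth noting that Google documents the Indexing API as intended for pages with JobPosting or BroadcastEvent markup, so mileage on ordinary pages may vary.)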

  7. DJ Mat

    Also observed this for my personal website. After a revamp somewhere in 2023, Google kept indexing the old content (which was preserved with 301 redirects), and never indexed the new content. This situation is still the same almost one year later. The whole new website appears on Bing and DDG. Luckily it’s not a high-traffic website, more like a personal blog at this point. But still, there is some information in there that might be useful to someone. Maybe 😀

  8. Abbad Zaheer

    To resolve the sitemap issue for your convention website, try these steps:

    1. Check for Crawl Errors in Google Search Console.

    2. Verify Indexing with “site:yourwebsite.com” in Google.

    3. Review robots.txt to ensure it’s not blocking important pages (see the example after this list).

    4. Improve Internal Linking for better crawlability.

    5. Update and Resubmit the Sitemap.

    6. Check for Manual Actions in Google Search Console.

    7. Ensure SEO plugins are properly configured.
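
    For step 3, a healthy robots.txt is usually very short. A sketch of what to look for, with a placeholder sitemap URL:

    ```text
    # This pair of lines would block crawling of the entire site:
    # User-agent: *
    # Disallow: /

    # A permissive baseline looks like this instead:
    User-agent: *
    Disallow:

    Sitemap: https://example.com/sitemap.xml
    ```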

