Blogger (blogspot) default robots.txt ignores pages

Why my Blogger "pages" are not indexed by Google (or any other search engine)

Blogger generates a default robots.txt file, which is visible at your blog's URL + /robots.txt,
 e.g.  https://etlx.blogspot.com/robots.txt

It will look something like this:

------------
User-agent: Mediapartners-Google
Disallow: 

User-agent: *
Disallow: /search
Disallow: /share-widget
Allow: /

Sitemap: https://etlx.blogspot.com/sitemap.xml
------------

A major problem is that, as of September 2025, sitemap.xml only lists your "posts" and ignores your "pages".
As a result, your pages will never be indexed by Google (or any other search engine).
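You can confirm the gap yourself by listing the <loc> entries in a sitemap. The sketch below parses sitemap XML with Python's standard library; the sample sitemap content and URLs are illustrative, not fetched from a real blog. (Blogger serves its static pages under the /p/ path.)

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace, per the sitemaps.org protocol.
SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def urls_in_sitemap(xml_text: str) -> list[str]:
    """Return all <loc> URLs from a sitemap XML document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall(".//sm:loc", SITEMAP_NS)]

# Illustrative sitemap.xml content: only blog posts appear.
posts_sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://etlx.blogspot.com/2025/09/some-post.html</loc></url>
</urlset>"""

urls = urls_in_sitemap(posts_sitemap)
# Blogger "pages" live under /p/; none of them show up here.
assert not any("/p/" in u for u in urls)
```

In practice you would fetch your real sitemap.xml and sitemap-pages.xml and compare their URL lists the same way.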

How to get my Blogger pages indexed by Google

There is a separate sitemap for Blogger "pages", available at your blog's URL + /sitemap-pages.xml, which you need to add to your robots.txt.  In other words, you cannot rely on the default robots.txt; you have to customize it yourself.

My robots.txt now looks like this (I just added the sitemap-pages.xml line):

--------------
User-agent: Mediapartners-Google
Disallow: 

User-agent: *
Disallow: /search
Disallow: /share-widget
Allow: /

Sitemap: https://etlx.blogspot.com/sitemap.xml
Sitemap: https://etlx.blogspot.com/sitemap-pages.xml
--------------
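A quick way to sanity-check the customized file is to extract its Sitemap: lines and verify that sitemap-pages.xml is among them. This is a minimal sketch that parses robots.txt text held in a string (taken from the example above); in practice you would fetch the live file from your blog first.

```python
def sitemap_urls(robots_txt: str) -> list[str]:
    """Extract the URLs of all 'Sitemap:' directives from robots.txt text."""
    urls = []
    for line in robots_txt.splitlines():
        # Split on the first colon only, so URLs (which contain ':') stay intact.
        key, _, value = line.partition(":")
        if key.strip().lower() == "sitemap":
            urls.append(value.strip())
    return urls

# The customized robots.txt from above.
robots = """\
User-agent: *
Disallow: /search
Disallow: /share-widget
Allow: /

Sitemap: https://etlx.blogspot.com/sitemap.xml
Sitemap: https://etlx.blogspot.com/sitemap-pages.xml
"""

# Both sitemaps are now declared, including the one for pages.
assert any(u.endswith("/sitemap-pages.xml") for u in sitemap_urls(robots))
```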

This is a terrible design by Blogger. They should fix the default robots.txt so that it includes sitemap-pages.xml.