Developer forum

Forum » Ecommerce - Standard features » Google robot
Tomas Gomez

Hi,

The Google robot is not crawling our website correctly. A Google search for "site:mysite.com" shows all of the site's PDFs but only five of its roughly 5,000 pages.

According to the post, we are only using robots.txt. We don't use the robots meta tag in the master template, nor the X-Robots-Tag header in web.config.
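
Just to show what I mean (we do not actually have either of these), a robots meta tag in the master template and an X-Robots-Tag header in web.config would look roughly like this:

<meta name="robots" content="noindex, nofollow">

<system.webServer>
  <httpProtocol>
    <customHeaders>
      <!-- not present in our web.config -->
      <add name="X-Robots-Tag" value="noindex" />
    </customHeaders>
  </httpProtocol>
</system.webServer>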

The robots.txt is:

Sitemap: https://mysite.com/Sitemap.xml
User-agent: *
Disallow: /Files/Papirkurv/
Disallow: /*?Promotion=True
Disallow: *.pdf$
Disallow: /*?cartcmd=*

Sitemap.xml contains the URLs of all categories, products, etc.
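
Each entry follows the standard sitemap protocol, roughly like this (the URL shown is just an example):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://mysite.com/products/example-product</loc>
  </url>
</urlset>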

What can we do to get all of the site's pages listed in Google searches?

Regards,
Tomas


Replies

Nicolai Pedersen

We cannot tell you why. I believe this is a Google-related question, and you need to use their tools to find out why you are not being indexed.

 
