Developer forum


Robots.txt for an e-commerce setup

Davy Capiau

Hi,

What is an ideal robots.txt for an e-commerce setup where you don't have any special additions to the file system? Asking for DW9.


These are the things I came across while inspecting other DW webshops:


Sitemap: [sitemap]
User-agent: *
Disallow: /Files/Papirkurv/
Disallow: /*?cartcmd=*
Disallow: /*?GetRevealPage=KbasnmPermission*
Disallow: /admin
Disallow: /files/system
Disallow: /files/templates


Replies

Nicolai Pedersen (Dynamicweb Employee)
This post has been marked as an answer

I would not add this much, as it reveals internals and can be a security risk. Usually we need to remove everything from robots.txt when we receive 'security' reports.

I would not add anything except the sitemap.xml reference unless there is a specific need to do so.
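
Following that advice, a minimal robots.txt would contain only the sitemap reference and an open ruleset (the sitemap URL below is a placeholder for your site's actual address):

```
# Minimal robots.txt: expose only the sitemap, allow all crawling.
Sitemap: https://www.example.com/sitemap.xml

User-agent: *
Disallow:
```

An empty `Disallow:` directive means nothing is blocked; anything that genuinely must not be reachable should be protected server-side (authentication or access rules) rather than listed here, since robots.txt is publicly readable.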

Votes for this answer: 1
Davy Capiau

Hi Nicolai,

Thanks for the feedback.

So you would tune it down to the default like this?

Sitemap: [sitemap.xml url]
User-agent: *
Disallow: /Files/Papirkurv/
Disallow: /*?cartcmd=*

