Commit 043bb8a (parent: e40d98a)

Added how do i configure a robots.txt file in shopware 6 on hypernode

---
myst:
  html_meta:
    description: Learn how to configure a robots.txt file in Shopware 6 on Hypernode to optimize how search engines index your webshop.
    title: 'How do I configure a robots.txt file in Shopware 6 on Hypernode?'
---

<!-- source: https://support.hypernode.com/en/ecommerce/shopware/how-do-i-configure-a-robots-txt-file-in-shopware6-on-hypernode/ -->

# How do I configure a robots.txt file in Shopware 6 on Hypernode?

In this article, we explain how to set up a `robots.txt` file in Shopware 6 on Hypernode to optimize the indexing of your webshop by search engines.

A `robots.txt` file lets you instruct search engine crawlers which parts of your website they may or may not index. This helps prevent duplicate content, saves crawl resources, and results in more efficient indexing and better SEO performance.

## Setup for a single robots.txt file without multistore

Create a new text file named `robots.txt` and place it in the `public` directory of your Shopware 6 installation. You can insert the example text below into this file:

```text
User-agent: *
Allow: /
Disallow: */?
Disallow: */account/
Disallow: */checkout/
Disallow: */widgets/
Disallow: */navigation/
Disallow: */bundles/

Disallow: */imprint$
Disallow: */privacy$
Disallow: */gtc$

Sitemap: https://YOUR_DOMAIN/sitemap.xml
```

Adjust the `Sitemap` rule so that it points to the exact URL of your sitemap.
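
If you prefer working from the shell, below is a minimal sketch that writes the file in place. It assumes your Shopware root is `/data/web` (the usual location on Hypernode); adjust the path if your installation lives elsewhere.

```bash
# Sketch: write the example rules from above into public/robots.txt.
# Run from the Shopware root; /data/web is an assumption, adjust as needed.
cd /data/web
cat > public/robots.txt <<'EOF'
User-agent: *
Allow: /
Disallow: */?
Disallow: */account/
Disallow: */checkout/
Disallow: */widgets/
Disallow: */navigation/
Disallow: */bundles/

Disallow: */imprint$
Disallow: */privacy$
Disallow: */gtc$

Sitemap: https://YOUR_DOMAIN/sitemap.xml
EOF
```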

Use the [robots.txt tester](https://support.google.com/webmasters/answer/6062598) in Google Search Console to verify that the configuration is correct and that the desired pages are blocked or allowed.
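
As a quick command-line check, you can also fetch the live file directly and confirm the expected rules are served; replace `YOUR_DOMAIN` with your actual domain:

```bash
# Fetch the deployed robots.txt from the live shop.
curl -s https://YOUR_DOMAIN/robots.txt
```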

## Setup for a multistore robots.txt in a Shopware 6 webshop

From the root of your Shopware 6 installation, create a new folder called `robots` inside the `public` directory:

```bash
mkdir public/robots
```

Within the newly created `robots` directory, create a separate `robots.txt` file for each domain. For example:

- `public/robots/www.example.com.txt`
- `public/robots/shop.example.com.txt`

Each file should contain the rules specific to the corresponding domain. Below is an example for `www.example.com`:

```text
User-agent: *
Allow: /
Disallow: */?
Disallow: */account/
Disallow: */checkout/
Disallow: */widgets/
Disallow: */navigation/
Disallow: */bundles/

Disallow: */imprint$
Disallow: */privacy$
Disallow: */gtc$

Sitemap: https://www.example.com/sitemap.xml
```

Adjust the `Sitemap` URL and other rules according to the specific requirements of each domain.
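
To avoid editing each file by hand, you could also generate the per-domain files from a shared template. The sketch below is an illustration rather than part of the official setup; it assumes a hypothetical template at `public/robots/template.txt` containing the placeholder `YOUR_DOMAIN`:

```bash
# Render one robots file per storefront domain by substituting the domain
# into the hypothetical template (including its Sitemap line).
for domain in www.example.com shop.example.com; do
  sed "s/YOUR_DOMAIN/${domain}/g" public/robots/template.txt \
    > "public/robots/${domain}.txt"
done
```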

Edit or create an NGINX configuration file at `/data/web/nginx/server.robots` and add the following rewrite rule:

```nginx
# Serve requests for /robots.txt from the matching per-domain file
# in public/robots/, based on the requested hostname.
rewrite ^/robots\.txt$ /robots/$host.txt;
```

This rule ensures that requests to `/robots.txt` are transparently rewritten to the correct file based on the domain making the request.
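
To verify that each storefront serves its own rules, you can request `/robots.txt` on every domain; a quick sketch using the example domains from above:

```bash
# Each request should return the matching per-domain file; the Sitemap
# line makes it easy to see which file was actually served.
for domain in www.example.com shop.example.com; do
  echo "== ${domain} =="
  curl -s "https://${domain}/robots.txt" | grep -i '^sitemap'
done
```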
