Optimizing your website's search engine visibility starts with controlling how it is crawled and indexed. One crucial file for this purpose is robots.txt, which tells search engine crawlers which parts of your site they may crawl and which they should skip.
This guide walks you through the step-by-step process of creating and configuring a robots.txt file using Ciriks Site Builder.
To begin, navigate to the Static Pages section in your Ciriks Site Builder dashboard.
Next, create a new static page that will host the robots.txt file.
In the robots.txt page settings, configure the following details:
Enter "robots.txt" in the title field. The page will then be served at https://yourdomain.com/robots.txt.
Important: Do not add anything in the CSS and JavaScript sections; leave them empty.
Here’s an example of basic robots.txt rules:
User-agent: Googlebot
Disallow: /nogooglebot/

User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
For more details on robots.txt rules, you can check out Google's official guidelines.
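Before pasting rules into the page, you can sanity-check them locally. As one possible approach (not part of Ciriks Site Builder), Python's standard-library urllib.robotparser can parse the example rules above and report what each crawler is allowed to fetch; the domain www.example.com is just a placeholder:

```python
from urllib.robotparser import RobotFileParser

# The same rules shown in the example above.
RULES = """\
User-agent: Googlebot
Disallow: /nogooglebot/

User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(RULES.splitlines())

# Googlebot may not crawl /nogooglebot/, but may crawl everything else.
assert not parser.can_fetch("Googlebot", "https://www.example.com/nogooglebot/page")
assert parser.can_fetch("Googlebot", "https://www.example.com/about")

# Every other crawler falls under the "User-agent: *" group and is allowed.
assert parser.can_fetch("Bingbot", "https://www.example.com/nogooglebot/page")
```

If an assertion fails after you edit the rules, a crawler is being blocked (or allowed) somewhere you did not intend.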
Once everything is set up, click Save to store your robots.txt page settings.
A well-configured robots.txt file helps search engines understand which pages to crawl and which to exclude, improving your site's SEO. With Ciriks Site Builder, managing this file is simple, allowing you to take full control over your website’s indexing preferences.