Adding additional directives to your robots.txt file

Overview

The robots.txt file controls how search engines access and index pages on your website. Vacation Labs provides a default robots.txt to maintain functionality, but you can add custom rules to refine search engine behavior.

Why Add Custom Rules?

Adding custom directives helps optimize SEO by guiding search engines to focus on important pages while avoiding unnecessary ones. This improves crawl efficiency and protects sensitive information.


Default Robots.txt in Vacation Labs

The default robots.txt file in Vacation Labs includes essential rules to prevent indexing of pages that should not appear in search results.

Default Robots.txt Content:

User-agent: *
Disallow: /itineraries/bookings
Disallow: /search
Disallow: /*/dates_and_rates
Disallow: /*/date_based_upcoming_departures

Sitemap: https://www.econatuarls.life/sitemap_index.xml

Default Rules and Their Purpose:

  • Disallow: /itineraries/bookings – Prevents search engines from indexing itinerary booking pages to avoid cluttering search results with transactional pages.

  • Disallow: /search – Blocks search result pages to prevent indexing of duplicate content that can dilute SEO value.

  • Disallow: /*/dates_and_rates – Stops indexing of real-time pricing pages, which can change frequently and are not useful for search engine results.

  • Disallow: /*/date_based_upcoming_departures – Prevents indexing of date-based departure pages to avoid outdated or irrelevant pages appearing in search results.
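As a sanity check, the effect of the literal rules above can be sketched with Python's standard-library robots.txt parser. Note that urllib.robotparser implements the original robots exclusion spec and does not understand the `*` wildcard used in the last two rules, so only the literal paths are tested here; example.com stands in for your own domain.

```python
from urllib import robotparser

# The literal default rules (wildcard rules omitted -- urllib.robotparser
# treats "*" in a path literally rather than as a wildcard).
ROBOTS_TXT = """\
User-agent: *
Disallow: /itineraries/bookings
Disallow: /search
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Disallowed paths are rejected for all user agents.
print(rp.can_fetch("*", "https://example.com/itineraries/bookings"))  # False
print(rp.can_fetch("*", "https://example.com/search"))                # False
# Pages not covered by a Disallow rule remain crawlable.
print(rp.can_fetch("*", "https://example.com/tours/paris"))           # True
```

This is only a local simulation of crawler behavior; well-behaved crawlers apply the same prefix-matching logic when they fetch your live robots.txt.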

Note: The default robots.txt file cannot be modified or removed. However, you can add additional directives to customize search engine behavior.

How to Add Custom Rules

To refine the robots.txt file further, follow these steps:

  1. Go to the Vacation Labs dashboard.

  2. Navigate to SEO & Analytics Settings.

  3. Locate the Additional Content for Robots.txt section.

  4. Add your custom rules.

  5. Save changes. Search engines will update during their next crawl.
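Presumably, the file served at your site's /robots.txt then combines the default rules with whatever you entered; the exact placement of appended rules is an assumption. For example, adding a hypothetical Disallow: /drafts rule would produce something like:

```text
User-agent: *
Disallow: /itineraries/bookings
Disallow: /search
Disallow: /*/dates_and_rates
Disallow: /*/date_based_upcoming_departures

# Your additional directive (hypothetical example)
Disallow: /drafts

Sitemap: https://www.econatuarls.life/sitemap_index.xml
```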


Example Custom Rules

  • Blocking an unwanted crawler from a section of the site:

    User-agent: BadBot  
    Disallow: /pricing
  • Allowing specific search engines:

    User-agent: Googlebot  
    Allow: /
    User-agent: Bingbot
    Allow: /
Tip: Avoid blocking critical pages, such as product listings or key content, to maintain a strong SEO presence. Blocking important pages may reduce visibility in search results and impact traffic.
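The per-crawler examples above can be verified the same way with Python's standard-library parser, which matches rules by user-agent name (example.com and the /pricing path are illustrative only):

```python
from urllib import robotparser

# The example custom rules: BadBot is blocked from /pricing,
# while Googlebot is explicitly allowed everywhere.
CUSTOM_RULES = """\
User-agent: BadBot
Disallow: /pricing

User-agent: Googlebot
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(CUSTOM_RULES.splitlines())

print(rp.can_fetch("BadBot", "https://example.com/pricing"))     # False
print(rp.can_fetch("BadBot", "https://example.com/tours"))       # True
print(rp.can_fetch("Googlebot", "https://example.com/pricing"))  # True
```

Rules apply only to the user agent named in their block, which is why BadBot is still free to crawl paths outside /pricing.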


Related Articles

  • Adding/Modifying pages on the website

  • Adding Social Media Links to the website

  • Adding sliders to the homepage banner

  • Adding a chat widget to the website

  • Adding a contact form on your website