Adding additional directives to your Robots.txt file

Overview

The robots.txt file tells search engine crawlers which pages on your website they may access, which in turn shapes what appears in search results. Vacation Labs provides a default robots.txt to maintain functionality, but you can add custom rules to refine search engine behavior.

Why Add Custom Rules?

Adding custom directives helps optimize SEO by steering search engines toward important pages and away from unnecessary ones. This improves crawl efficiency and keeps low-value or transactional pages out of search results. Keep in mind that robots.txt is a crawling directive, not an access control, so it should not be your only safeguard for sensitive information.
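
For example, a custom rule is simply a User-agent line followed by one or more Allow/Disallow paths. A minimal sketch, using a purely hypothetical /internal-reports path:

User-agent: *
Disallow: /internal-reports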


Default Robots.txt in Vacation Labs

The default robots.txt file in Vacation Labs includes essential rules to prevent indexing of pages that should not appear in search results.

Default Robots.txt Content:

User-agent: *
Disallow: /itineraries/bookings
Disallow: /search
Disallow: /*/dates_and_rates
Disallow: /*/date_based_upcoming_departures

Sitemap: https://www.econatuarls.life/sitemap_index.xml

Default Rules and Their Purpose:

  • Disallow: /itineraries/bookings – Prevents search engines from indexing itinerary booking pages to avoid cluttering search results with transactional pages.

  • Disallow: /search – Blocks search result pages to prevent indexing of duplicate content that can dilute SEO value.

  • Disallow: /*/dates_and_rates – Stops indexing of real-time pricing pages, which can change frequently and are not useful for search engine results.

  • Disallow: /*/date_based_upcoming_departures – Prevents indexing of date-based departure pages to avoid outdated or irrelevant pages appearing in search results.
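
For instance, the wildcard rules above use * to match any tour slug, so hypothetical URLs such as these would be blocked:

/paris-day-tour/dates_and_rates
/goa-beach-trek/date_based_upcoming_departures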

Note: The default robots.txt file cannot be modified or removed. However, you can add additional directives to customize search engine behavior.


How to Add Custom Rules

To refine the robots.txt file further, follow these steps:

  1. Go to the Vacation Labs dashboard.

  2. Navigate to SEO & Analytics Settings.

  3. Locate the Additional Content for Robots.txt section.

  4. Add your custom rules.

  5. Save your changes. Search engines will pick up the new rules on their next crawl.
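
Anything entered in the Additional Content for Robots.txt section is served alongside the default rules. As a rough sketch (the exact placement of your additions may differ), adding the hypothetical /internal-reports rule from earlier would give crawlers something like:

User-agent: *
Disallow: /itineraries/bookings
Disallow: /search
Disallow: /*/dates_and_rates
Disallow: /*/date_based_upcoming_departures
Disallow: /internal-reports

Sitemap: https://www.econatuarls.life/sitemap_index.xml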


Example Custom Rules

  • Blocking an unwanted crawler from specific pages (use Disallow: / instead to block it from the entire site):

    User-agent: BadBot
    Disallow: /pricing

  • Explicitly allowing specific search engines:

    User-agent: Googlebot
    Allow: /

    User-agent: Bingbot
    Allow: /

    On its own, Allow: / only restates the default; it becomes meaningful when paired with broader Disallow rules for other user agents.
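
  • Throttling the crawl rate. Crawl-delay is a hedged example here: it is honored by some crawlers, such as Bingbot, but ignored by Googlebot:

    User-agent: Bingbot
    Crawl-delay: 10
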
Tip: Avoid blocking critical pages, such as product listings or key content, to maintain a strong SEO presence. Blocking important pages can reduce visibility in search results and hurt traffic.


Related Articles

  • Adding sliders to the homepage banner
  • Adding Multiple Sections to your Website
  • Getting started with your Vacation Labs account
  • Configure your account settings
  • Connecting Google Analytics to your Vacation Labs Website