What Is Robots.txt?

Dec 29, 2020

Robots.txt Tells Crawlers To Crawl Your Website In Part Or In Its Entirety

Robots.txt, also known as the Robots Exclusion Protocol, is a standard that lets websites communicate with web crawlers about whether and how they should be crawled. It tells search engine crawlers which pages can or can’t be requested from your site.

Robots.txt is found at the root of a domain, like: https://corkboardconcepts.com/robots.txt.
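A simple robots.txt might look like the following (the /wp-admin/ rules and the sitemap URL shown here are typical WordPress defaults, used purely for illustration):

```
# Apply these rules to all crawlers
User-agent: *
# Block the WordPress admin area...
Disallow: /wp-admin/
# ...but still allow the AJAX endpoint many themes rely on
Allow: /wp-admin/admin-ajax.php

# Point crawlers to the XML sitemap
Sitemap: https://corkboardconcepts.com/sitemap.xml
```

Each `User-agent` line names which crawlers the rules below it apply to (`*` means all), and `Disallow`/`Allow` paths are matched against the URL path after the domain.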

Its purpose is to limit crawler requests to your site, which helps prevent your server from being overloaded. Robots.txt acts as a guide that tells search engines and other crawlers which pages to crawl, making it a valuable tool for search. Popular WordPress SEO plugins create robots.txt files and sitemaps by default. It’s a foundational step of SEO.

It’s important to note that robots.txt is not a way of keeping a web page out of Google: a page blocked by robots.txt can still be indexed if other sites link to it.
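To reliably keep a page out of search results, a noindex directive is the appropriate tool instead, for example in the page’s HTML head:

```html
<!-- Tells crawlers not to include this page in their index -->
<meta name="robots" content="noindex">
```

Note that for Google to see this tag, the page must not be blocked by robots.txt, since a blocked page is never fetched in the first place.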
