
Disallow


What is Disallow?

Disallow is a directive in the robots.txt file, part of the Robots Exclusion Protocol, and it serves to deny access to a specific page or directory. It tells search engine robots not to crawl content that we do not want to appear in their results.

Its counterpart is Allow, which indicates to the crawler a URL or directory that it may crawl and index. Both directives apply only to the user agents specified above them, and it is possible to include multiple Disallow lines for different user agents.
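
For instance, a minimal robots.txt sketch that combines both directives for different user agents (the /private/ directory is just an illustration):

    User-agent: Googlebot
    Disallow: /private/

    User-agent: *
    Allow: /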

To create a Disallow All rule in robots.txt, combine User-agent: * with Disallow: /. The first line makes the instruction apply to all robots; the second blocks access to every page on the domain.
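
Written out, the complete file looks like this:

    User-agent: *
    Disallow: /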

The downside is that the rules in this file are not binding. Even with a Disallow in place, a robot can choose whether or not to obey it, so some crawlers may still index the page or link despite the instruction. To block all robots reliably, you can use robots meta tags and add them to the relevant page templates.
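
For example, this standard robots meta tag, placed in the <head> of a page template, tells compliant crawlers not to index the page or follow its links:

    <meta name="robots" content="noindex, nofollow">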

What is Disallow for?

It is useful for anyone developing a project or web venture who, while it is still in the development stage, does not want its links to be displayed and indexed in search engines. This way, you can keep your work private until it is ready to be released to the public.

It is also useful when the site contains information that only certain users should access, or when you want to reach a selective audience by country, region, age, or other characteristics. There may be several reasons to use this directive; ideally, you should know what each rule does and how to apply it correctly.

Disallow Examples

There are different Disallow rules for totally or partially blocking a link, a page, or an entire site. One of them is Disallow: /admin, which excludes the administrator directory of a website from crawling. Another is Disallow: /*.gif, which prevents access to the GIF files on the site.
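
Put together in a robots.txt file, these two rules would look like this (note that the * wildcard is honored by major crawlers such as Googlebot, but was not part of the original protocol):

    User-agent: *
    Disallow: /admin
    Disallow: /*.gif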

Each rule serves a different purpose, from blocking images to denying access to entire directories.

More information about Disallow

If you are interested in learning how to use the robots.txt file and write these blocking rules correctly, we suggest reading the following post: