
How To Create A Robots.txt File In ASP.NET Core


A robots.txt file tells crawlers where they are allowed to scrape on the website. You can tell search engines which links you don't want them to index, and you can indicate where the sitemap is located.

A robots.txt file is not mandatory, but it's recommended because it makes crawling easier for robots. For example, maybe your sitemap is located somewhere other than the usual place.

A robots.txt file is fairly simple. Look at this website's robots file for an example:

User-agent: *
Disallow:

Sitemap: https://programmingcsharp.com/sitemap_index.xml

You indicate the crawler's user agent and what it is allowed or disallowed to crawl.
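For example, to block crawlers from a couple of sections while leaving the rest of the site open, a robots.txt file might look like this (the /admin/ and /search paths are hypothetical, just to illustrate the syntax):

User-agent: *
Disallow: /admin/
Disallow: /search

Sitemap: https://example.com/sitemap.xml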

How to configure a robots.txt file in ASP.NET Core

In ASP.NET Core, there are several options for configuring your robots.txt file. You can create it by hand and place the file in the wwwroot folder. Documentation for writing a robots file can be found on the Google website.
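If you take the manual route, the only moving part is static file serving; here is a minimal sketch, assuming the classic Startup class that the rest of this article uses:

public class Startup
{
    public void Configure(IApplicationBuilder app)
    {
        // Serves every file under wwwroot at the site root,
        // so wwwroot/robots.txt is returned for requests to /robots.txt.
        app.UseStaticFiles();
    }
}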

Another approach is to use a library to do this for you. I personally use the RobotsTxtCore library.
RobotsTxtCore is a middleware that converts your C# code into a valid robots.txt file.

First, install the NuGet package:

Install-Package RobotsTxtCore
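If you prefer the .NET CLI over the Package Manager Console, the equivalent command is:

dotnet add package RobotsTxtCore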

After that, you should register the service. Take a look at this example:

services.AddStaticRobotsTxt(b =>
{
    b
        .AddSection(section => section
            .AddUserAgent("Googlebot")
            .Allow("/"))
        .AddSection(section => section
            .AddUserAgent("Bingbot")
            .Allow("/"))
        .AddSitemap("http://yourwebiste.com/sitemap.xml");

    return b;
});

I specify that the Google and Bing bots are allowed to crawl everything. I also specify the path to the sitemap file.
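With that registration in place, the middleware should serve a robots.txt along these lines (the exact formatting and ordering may vary by library version):

User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

Sitemap: http://yourwebiste.com/sitemap.xml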

The final step is to add the middleware to the pipeline:

app.UseStaticFiles(new StaticFileOptions
{
    OnPrepareResponse = ctx =>
    {
        // Cache static files for one week (HeaderNames comes from
        // the Microsoft.Net.Http.Headers namespace).
        const int durationInSeconds = 60 * 60 * 24 * 7;
        ctx.Context.Response.Headers[HeaderNames.CacheControl] =
            "public,max-age=" + durationInSeconds;
    }
});
app.UseRouting();

// Serves the robots.txt configured in ConfigureServices.
app.UseRobotsTxt();

From now on, you can access the robots.txt file at the /robots.txt path.
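You can quickly verify the endpoint by requesting the file directly, for example (the port here is just a placeholder for whatever your app listens on):

curl https://localhost:5001/robots.txt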
