"Restricted Pages Found" refers to pages on your website that return 401 (Unauthorized) or 403 (Forbidden) HTTP status codes, yet are still accessible via inter
When restricted pages are linked from public-facing areas or included in sitemaps, search engines may attempt to crawl them, wasting crawl budget and potentially exposing sensitive or administrative URLs. This can also confuse users who encounter inaccessible pages, leading to a poor user experience and possible security concerns.
Search engines encountering restricted pages may flag them as crawl errors, reducing crawl efficiency. Users who follow internal links to these pages may see error messages, which can erode trust. In some cases, sensitive URLs could be exposed to the public, increasing security risks.
SEO crawlers and site audit tools detect restricted pages by following internal links and checking the HTTP status codes of the destination URLs. If a page returns a 401 or 403 status, it is flagged as restricted. Restricted URLs found in sitemaps are reported as well.
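As an illustration, here is a minimal sketch of such a check in Python, using the requests and beautifulsoup4 packages. The start URL is a placeholder, and the single-page, same-host crawl is an assumption for the example, not a specific audit tool's implementation.
# Minimal sketch: follow internal links from one page and flag 401/403 responses.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

START_URL = "https://example.com/"  # placeholder

def find_restricted_pages(start_url):
    html = requests.get(start_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    restricted = []
    for link in soup.find_all("a", href=True):
        url = urljoin(start_url, link["href"])
        # Only check links pointing at the same host
        if urlparse(url).netloc != urlparse(start_url).netloc:
            continue
        status = requests.get(url, timeout=10).status_code
        if status in (401, 403):
            restricted.append((url, status))
    return restricted

for url, status in find_restricted_pages(START_URL):
    print(f"{status} -> {url}")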
Problem: Public page linking to a restricted admin page
<!-- index.html -->
<a href="/admin/dashboard">Admin Dashboard</a>Fix: Remove or conditionally render the link based on user a
<!-- index.html -->
<!-- Only show this link to authenticated users -->
<!-- Pseudocode for template logic -->
{% if user.is_authenticated %}
<a href="/admin/dashboard">Admin Dashboard</a>
{% endif %}
Problem: Restricted page included in sitemap.xml
<url>
<loc>https://example.com/admin/dashboard</loc>
</url>
Fix: Remove the restricted page from sitemap.xml
<!-- Remove this entry from sitemap.xml -->
Restricted pages are often found because they are linked from public-facing pages or included in sitemaps. Crawlers follow these links or sitemap entries and attempt to access the pages, even if they are protected.
You can use robots.txt to disallow crawling of restricted pages, but this does not prevent them from being discovered or accessed if they are linked publicly. Robots.txt is not a security measure—always use proper authentication and authorization controls.
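For completeness, a robots.txt rule discouraging crawlers from an admin area might look like the sketch below; the /admin/ path is an assumed placeholder. This only asks well-behaved crawlers not to fetch those URLs; it does not restrict access and can even advertise the path, so it must be paired with real authentication.
# robots.txt — /admin/ is an example path, not a security control
User-agent: *
Disallow: /admin/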
While it's normal for a login page to be publicly accessible, you should avoid linking to admin or other sensitive pages from public areas and exclude them from sitemaps. Exposing such URLs can increase security risks.
Configure your sitemap generation tool to exclude URLs that require authentication or are meant to be private. This may involve updating your CMS or sitemap plugin settings.
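If you generate the sitemap yourself rather than through a CMS plugin, a simple path filter can exclude private URLs. The sketch below is a minimal Python example; the URL list and the PRIVATE_PREFIXES values are assumptions for illustration.
# Minimal sketch: build sitemap.xml while excluding private URL paths.
from urllib.parse import urlparse

PRIVATE_PREFIXES = ("/admin/", "/account/", "/internal/")  # assumed paths

def build_sitemap(urls):
    entries = []
    for url in urls:
        # Skip URLs that require authentication or are meant to be private
        if urlparse(url).path.startswith(PRIVATE_PREFIXES):
            continue
        entries.append(f"  <url>\n    <loc>{url}</loc>\n  </url>")
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>"
    )

urls = [
    "https://example.com/",
    "https://example.com/blog/post-1",
    "https://example.com/admin/dashboard",  # excluded by the filter
]
print(build_sitemap(urls))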
If certain pages should only be accessible to authenticated users, ensure links to these pages are only visible after login or via user-specific navigation. Do not include them in public-facing navigation or sitemaps.
Run a scan to see if Restricted Pages Found affects your pages.