Overview
Checks whether a page is crawlable, indexable, and followable by robots to ensure SEO performance.
The SEO Robots Inspector is a powerful Chrome extension designed to help SEO professionals and webmasters ensure their webpages are properly configured for search engine crawling, indexing, and following. The tool evaluates crucial SEO factors and provides detailed insights to optimize your site's performance on search engines.

Key Features:
- Meta Robots Tag Analysis: Checks whether the page has a meta robots tag and determines its directives (index, noindex, follow, nofollow).
- Canonical URL Detection: Identifies the canonical URL and verifies that it matches the current page URL to prevent duplicate content issues.
- Local Robots.txt Parsing: Fetches and analyzes the robots.txt file locally, without relying on external APIs, to ensure that important URLs are not blocked from being crawled.
- Visual Indicators: Provides clear, color-coded feedback for indexability, followability, crawlability, and canonical status directly in the browser's extension icon.
- Blocked URLs Highlighter: Highlights URLs blocked by robots.txt directly on the webpage for easy identification.
- HTTP Headers Inspection: Fetches and displays the X-Robots-Tag header to check for additional indexing and following rules set at the server level.
- On-Demand Analysis: Analyzes any active tab on demand to give up-to-date information on the page's SEO status.

How It Works:
- Active Tab Analysis: Automatically analyzes the active tab when the page loads, when the user activates the extension, or when tab/window focus changes.
- Data Collection: Collects the meta robots content, canonical URL, X-Robots-Tag header, and robots.txt information (see the sketches after this overview).
- Local Robots.txt Processing: Fetches and parses robots.txt files locally, applying rules for different user agents, including Googlebot.
- Status Display: Shows the SEO status in a popup window with easy-to-understand indicators and detailed information.
- Highlight Blocked URLs: Lets users highlight all URLs blocked by robots.txt on the page for immediate visibility.

User Interface:
- Results Section: Shows the indexable, followable, and crawlable status, meta robots content, X-Robots-Tag content, robots.txt information, and canonical URL.
- Actions Section: Includes a button to highlight blocked URLs on the page.
- Explanation Section: Provides a legend for the visual indicators used in the extension icon and analysis results.

Permissions:
- Active Tab: Needed to analyze the content of the currently active tab.
- Scripting and Storage: Used for injecting scripts and storing tab status locally.
- Host Permissions: Required to fetch and analyze robots.txt files and X-Robots-Tag headers from all websites.

Installation and Usage:
- Install the Extension: Add the SEO Robots Inspector from the Chrome Web Store.
- Analyze Pages: Navigate to any webpage and click the extension icon to analyze its SEO status.
- View Results: Check the detailed results in the popup window.
- Highlight Blocked URLs: Click the "Highlight Blocked URLs" button to see which links are blocked by robots.txt.

Advanced Features:
- Real-Time Updates: The extension icon updates in real time as you navigate between tabs or pages.
- Efficient Robots.txt Handling: Uses caching and timeouts for efficient robots.txt fetching and parsing.
- Detailed Rule Matching: Reports exactly which robots.txt rule matched the current URL.

Enhance your SEO workflow and ensure your website's pages are optimized for search engines with the SEO Robots Inspector!
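As a rough illustration of the meta robots and canonical checks described above, the sketch below shows how a content script could collect those signals from the active page. The names (collectPageSignals, PageSignals) are illustrative assumptions, not the extension's actual code.

```ts
// Minimal sketch of collecting meta robots and canonical data from the current page.
// Assumes it runs as a content script in the active tab; names are hypothetical.

interface PageSignals {
  metaRobots: string | null;    // e.g. "noindex, nofollow", or null if no tag is present
  canonical: string | null;     // resolved href of <link rel="canonical">, if any
  canonicalMatches: boolean;    // does the canonical point at the current URL?
}

function collectPageSignals(): PageSignals {
  const meta = document.querySelector<HTMLMetaElement>('meta[name="robots" i]');
  const link = document.querySelector<HTMLLinkElement>('link[rel="canonical" i]');
  const canonical = link?.href ?? null;
  return {
    metaRobots: meta?.content ?? null,
    canonical,
    // Simplified comparison; a real implementation would normalize trailing slashes, etc.
    canonicalMatches: canonical !== null && canonical === location.href,
  };
}
```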
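The local robots.txt handling could look roughly like the following: fetch the file once per origin, cache it with a timeout, and scan the Disallow rules for a given user agent. This is a simplified sketch with assumed helper names (fetchRobotsTxt, matchRule); it ignores Allow rules, wildcards, and longest-match precedence, which a production parser would need.

```ts
// Sketch of local robots.txt fetching (with a cache and timeout) and naive rule matching.
// Hypothetical helpers; the extension's real parser and rule semantics may differ.

const robotsCache = new Map<string, string>();      // origin -> robots.txt body

async function fetchRobotsTxt(origin: string): Promise<string> {
  const cached = robotsCache.get(origin);
  if (cached !== undefined) return cached;
  const res = await fetch(`${origin}/robots.txt`, { signal: AbortSignal.timeout(5000) });
  const body = res.ok ? await res.text() : "";      // treat a missing file as "allow all"
  robotsCache.set(origin, body);
  return body;
}

// Returns the matching Disallow rule for the path, or null if the path is not blocked.
function matchRule(robotsTxt: string, userAgent: string, path: string): string | null {
  let groupApplies = false;
  let matched: string | null = null;
  for (const raw of robotsTxt.split("\n")) {
    const line = raw.split("#")[0].trim();          // strip comments
    const colon = line.indexOf(":");
    if (colon === -1) continue;
    const field = line.slice(0, colon).trim().toLowerCase();
    const value = line.slice(colon + 1).trim();
    if (field === "user-agent") {
      groupApplies = value === "*" || userAgent.toLowerCase().includes(value.toLowerCase());
    } else if (groupApplies && field === "disallow" && value && path.startsWith(value)) {
      matched = `Disallow: ${value}`;               // Allow/wildcard/longest-match omitted
    }
  }
  return matched;
}
```

For example, `matchRule(await fetchRobotsTxt(location.origin), "Googlebot", location.pathname)` would return something like the rule shown in the popup's robots.txt info field, or null when the page is crawlable.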
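For the HTTP header inspection, one simple approach (assumed here for illustration, not necessarily how the extension obtains headers) is to re-request the page and read the X-Robots-Tag header, then combine it with the meta robots value when deciding indexability:

```ts
// Sketch: read the X-Robots-Tag header via a HEAD request (hypothetical approach).
async function fetchXRobotsTag(url: string): Promise<string | null> {
  const res = await fetch(url, { method: "HEAD" });
  return res.headers.get("x-robots-tag");           // e.g. "noindex" or null if absent
}

// A page counts as indexable only if neither source contains a "noindex" directive;
// followability can be derived the same way from "nofollow".
function isIndexable(metaRobots: string | null, xRobotsTag: string | null): boolean {
  return ![metaRobots, xRobotsTag].some(v => v?.toLowerCase().includes("noindex"));
}
```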
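Finally, the "Highlight Blocked URLs" action described above could be sketched as a content-script function that reuses the hypothetical robots.txt helpers from the previous sketch; the red outline styling and the Googlebot user agent are assumptions for illustration.

```ts
// Sketch: outline same-origin links whose paths hit a Disallow rule.
// Reuses the hypothetical fetchRobotsTxt/matchRule helpers from the sketch above.
async function highlightBlockedLinks(): Promise<number> {
  const robotsTxt = await fetchRobotsTxt(location.origin);
  let blocked = 0;
  for (const a of document.querySelectorAll<HTMLAnchorElement>("a[href]")) {
    const url = new URL(a.href, location.href);
    if (url.origin !== location.origin) continue;   // other origins have their own robots.txt
    if (matchRule(robotsTxt, "Googlebot", url.pathname)) {
      a.style.outline = "2px solid red";            // simple visual marker
      a.title = "Blocked by robots.txt";
      blocked++;
    }
  }
  return blocked;                                   // e.g. reported back to the popup
}
```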
Get instant insights into crawling, indexing, and canonical issues without leaving your browser.
5 out of 5 (58 ratings)
Héctor Abril (Jul 8, 2024)
I have been using it for a month now in my day-to-day work and have gotten used to it. The color system is very convenient for spotting issues, and it helps me be a little more productive.
David Garcia (Jun 6, 2024)
Simple but very useful application for discovering URLs blocked by robots.txt. It has become a tool I work with day to day.
Daniel Peris (Jun 5, 2024)
Awesome Chrome SEO extension to check anything related to SEO robots (crawling, indexing). Good job, Esteve!
Details
- Version: 1.7
- Updated: July 2, 2024
- Size: 13.05 KiB
- Languages: English
- Developer: Esteve Castells
- Address: Carrer de Sant Joan, 49, 2n 1er, Arenys de Mar, Barcelona 08350, ES
- Email: esteve@estevecastells.com
- Phone: +34 627 27 74 54
- Trader: This developer has identified itself as a trader per the definition from the European Union.
- D-U-N-S: 472033643
Privacy
SEO Robots Inspector has disclosed the following information regarding the collection and usage of your data. More detailed information can be found in the developer's privacy policy.
SEO Robots Inspector handles the following:
This developer declares that your data is:
- Not being sold to third parties, outside of the approved use cases
- Not being used or transferred for purposes that are unrelated to the item's core functionality
- Not being used or transferred to determine creditworthiness or for lending purposes
Support
For help with questions, suggestions, or problems, visit the developer's support site.