Google on Monday introduced the Google Search Console URL Inspection API, which provides programmatic access to URL-level data for properties that marketers manage.
"Please, please, please all SEO Crawlers," Aleyda Solis, an international SEO consultant, tweeted. "Integrate to provide status along with the one you show from crawl simulations and log files."
The idea, according to Google, is to let site owners debug and optimize their pages, and to add URL inspection capabilities directly to their own content management systems, dashboards and third-party tools.
The API gives developers and marketers access to this data outside of Search Console, through external applications and products.
Some users are already using the Search Console APIs to build custom solutions that view, add or remove properties and sitemaps, and run advanced queries on Search performance data.
There are limits on the number of API calls developers can make: usage is capped at 2,000 queries per day and 600 queries per minute.
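Those quotas matter in practice: a tool that inspects pages in bulk needs to throttle itself client-side or its calls will start failing once a window is exhausted. The sketch below builds the JSON body for a single inspection call against the documented `urlInspection/index:inspect` endpoint and adds a minimal quota tracker; authentication (an OAuth 2.0 bearer token with a Search Console scope) is assumed and not shown, and the throttle logic here is an illustration, not Google's enforcement mechanism.

```python
import time
from collections import deque

# Documented URL Inspection API endpoint (POST, JSON body).
# An OAuth 2.0 access token is assumed but omitted from this sketch.
INSPECT_ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"


def build_inspection_request(site_url: str, page_url: str) -> dict:
    """Build the JSON body for one URL inspection call."""
    return {"siteUrl": site_url, "inspectionUrl": page_url}


class QuotaLimiter:
    """Client-side throttle mirroring the published quotas
    (2,000 queries per day, 600 queries per minute)."""

    def __init__(self, per_minute: int = 600, per_day: int = 2000):
        self.per_minute = per_minute
        self.per_day = per_day
        self.calls = deque()  # timestamps of calls made in the last day

    def allow(self, now=None) -> bool:
        """Record and permit a call if both quota windows have room."""
        now = time.time() if now is None else now
        # Forget timestamps older than one day (86,400 seconds).
        while self.calls and now - self.calls[0] >= 86400:
            self.calls.popleft()
        in_last_minute = sum(1 for t in self.calls if now - t < 60)
        if in_last_minute >= self.per_minute or len(self.calls) >= self.per_day:
            return False
        self.calls.append(now)
        return True
```

A caller would check `limiter.allow()` before each POST to `INSPECT_ENDPOINT` and back off (or queue the URL for the next day) when it returns `False`.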
While building the new API, Google said it consulted search engine optimization professionals and publishers to learn how they would use the API to create solutions with this data.
Here are some options:
- SEO tools and agencies can provide ongoing monitoring for important pages and single-page debugging options. For example, checking if
there are differences between user-declared and Google-selected canonicals, or debugging structured data issues from a group of pages.
- CMS and plugin developers can add page or
template-level insights and ongoing checks for existing pages. For example, monitoring changes over time for key pages to diagnose issues and help prioritize fixes.
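The canonical-mismatch check from the first bullet can be sketched against parsed API responses. The field names below (`inspectionResult.indexStatusResult.userCanonical` and `googleCanonical`) follow the published response schema, but treat the exact paths as an assumption to verify against the API reference; the functions operate on already-fetched JSON, so no network access is involved.

```python
def canonical_mismatch(inspection_response: dict) -> bool:
    """Return True when the user-declared and Google-selected canonicals
    disagree. Field paths are taken from the URL Inspection API response
    schema (verify against the current API reference)."""
    status = inspection_response.get("inspectionResult", {}).get("indexStatusResult", {})
    user = status.get("userCanonical")
    google = status.get("googleCanonical")
    return bool(user and google and user != google)


def flag_mismatches(responses: dict) -> list:
    """Given {url: parsed inspection response}, return the URLs whose
    canonicals differ -- candidates for an ongoing-monitoring alert."""
    return [url for url, resp in responses.items() if canonical_mismatch(resp)]
```

Running this over a group of key pages on a schedule is one way a tool could implement the ongoing monitoring described above.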