Hello, I am trying to remove some pages in my sitemap that are causing Google crawl errors. Is there any way to change the auto-generated sitemap? I'd rather not create a new one because my site is so large.
Hey,
If you can't regenerate sitemap.xml, you can add the URLs to robots.txt to tell crawlers to ignore them.
Example:
Disallow: /my/url/here
Note that the directive needs a colon after Disallow, and the path should start with a /.
If you're not familiar with robots.txt, you can validate your file with this tool: http://tools.seobook.com/robots-txt/generator/
Reference: http://www.robotstxt.org/robotstxt.html
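To make this concrete, here is a minimal robots.txt sketch. The paths shown are placeholders, not your actual URLs; the file itself must live at the root of the domain (e.g. example.com/robots.txt) or crawlers won't find it:

```
# Applies to all crawlers
User-agent: *
# Block the problem pages reported in the crawl errors (example paths)
Disallow: /old-section/
Disallow: /broken-page.html
```

One caveat worth knowing: Disallow only stops compliant crawlers from fetching the pages. It does not guarantee removal from the index, since a URL can still be indexed from external links. For actual de-indexing, Google's documented mechanisms are a noindex directive on the page (robots meta tag or X-Robots-Tag header) or the removals tool in Search Console.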
We've already done that, but the URLs are still being indexed and we are getting an error from Google Search Console.