Website won’t crawl in SEMrush & other SEO tools

Ally Adamson
I'm New Here
September 17, 2018
We keep getting a crawl error when trying to use SEMrush and other audit tools to check our site for issues and errors. I’ve checked our code and our robots.txt: nothing is being blocked, and I’ve given the SEMrush bots crawl access. The site will crawl in some online tools, but with others we get errors like “URL is invalid,” and in SEMrush we keep getting a crawl error attributed to a “server error,” even though we can’t find anything wrong on our end. Any insight as to what the issue may be?

1 answer

0 votes
Alyss
Atlassian Team
September 18, 2018

@Ally Adamson Diagnosing the issue likely depends on pinning down the specific server error that SEMrush is getting.

You could start with the URL Inspection tool in Google Search Console to identify obvious indexing issues.

If SEMrush can't provide a specific server error/status code (401, 500, etc.), then you could try to recreate the error. The most straightforward way would be to make an HTTP GET request to the URL and set the User-Agent request header to the string the web crawler uses.
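
For example, here's a minimal sketch with cURL, assuming SemrushBot's published user-agent string (verify the current one in SEMrush's bot documentation) and a placeholder URL:

# Print only the HTTP status code returned when the request identifies
# itself as SemrushBot; swap in one of your own page URLs.
curl -sS -o /dev/null -w "%{http_code}\n" \
  -A "Mozilla/5.0 (compatible; SemrushBot/7~bl; +http://www.semrush.com/bot.html)" \
  "https://www.example.com/"

A 200 means the server answered that user-agent normally; a 403 or 5xx would suggest the server, a firewall, or a CDN rule is treating bot traffic differently from a browser.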

If you aren't comfortable with cURL requests or terminals, I'd recommend Postman as a more user-friendly interface.
