
This website unfortunately ended up with a large number of dead links when it was restructured. Search engines don't like this. That's why I wrote a Go program which helped me locate the pages with dead links and also creates a sitemap.

The following links are reported:

1) Internal page links which are not OK (404, 403, ...)

2) External page links which are not OK (404, 403, ...)

3) External page links used on the website which are OK (200)


In addition, every page which refers to a broken link is listed. Fixing all of this is tedious, but in the end I was able to remove these invalid/dead links.
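
To give an idea of how such a check can work, here is a minimal Go sketch that fetches each link and buckets it by HTTP status. This is not the genSitemap code itself; the URLs and the output format are made up for illustration.

package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	// Links to check; in the real crawler these come from parsing the pages.
	links := []string{
		"https://example.com/",
		"https://example.com/does-not-exist",
	}

	// Use a timeout so a hanging server does not block the check forever.
	client := &http.Client{Timeout: 10 * time.Second}

	for _, link := range links {
		resp, err := client.Get(link)
		if err != nil {
			fmt.Printf("DEAD   %s (%v)\n", link, err)
			continue
		}
		resp.Body.Close()

		// 200 counts as OK; everything else (404, 403, ...) is reported as broken.
		if resp.StatusCode == http.StatusOK {
			fmt.Printf("OK     %s\n", link)
		} else {
			fmt.Printf("BROKEN %s (%d)\n", link, resp.StatusCode)
		}
	}
}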


If you want to use it, either just to create a sitemap or to detect invalid links on a website, invoke the crawler on Linux x86 or a Raspberry Pi in the following way. No Go installation is required.


curl https://raw.githubusercontent.com/framps/golang_tutorial/master/genSitemap/startCrawler.sh | bash -s -- https://<website>


Now you can restart the crawler any time with ./startCrawler.sh. The option -worker numberOfWorkers can be used to change the default number of workers.
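
The workers are essentially goroutines which check links concurrently. Here is a minimal sketch of the worker-pool pattern behind such an option; all names are illustrative and the crawler's actual internals may differ.

package main

import (
	"fmt"
	"net/http"
	"sync"
	"time"
)

func main() {
	numberOfWorkers := 4 // what the -worker option controls
	urls := make(chan string)
	var wg sync.WaitGroup

	client := &http.Client{Timeout: 10 * time.Second}

	// Start the worker pool: each goroutine pulls URLs from the channel.
	for i := 0; i < numberOfWorkers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for url := range urls {
				resp, err := client.Get(url)
				if err != nil {
					fmt.Printf("DEAD %s (%v)\n", url, err)
					continue
				}
				resp.Body.Close()
				fmt.Printf("%d %s\n", resp.StatusCode, url)
			}
		}()
	}

	// Feed the queue and wait until all workers are done.
	for _, u := range []string{"https://example.com/", "https://example.com/about"} {
		urls <- u
	}
	close(urls)
	wg.Wait()
}

More workers mean more parallel HTTP requests, so a higher number speeds up the crawl at the cost of putting more load on the target server.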