How It Works
Almost all websites accumulate broken links in the same ways: errors occur, files are renamed or moved, and external pages change or disappear. Until you check your links regularly and learn how they tend to break, you are likely to repeat the same classic mistakes over and over.
Link checkers work like a search engine spider. They "crawl" your website to find broken internal and external links. The crawl is recursive: the spider follows each link on a page, building a tree of pages until every branch has been explored. The spider continues until it reaches a defined end point, for example when all internal branches have been examined, or when all first-level external links have also been checked. A good link checker also produces a report of broken links, enabling you to determine where each broken link is and why it is down.
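The recursive crawl described above can be sketched in a few lines of Python. This is a minimal illustration, not a real link checker: the `SITE` dictionary is a hypothetical in-memory stand-in for HTTP fetches, and a missing entry plays the role of a 404 response.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

# Hypothetical in-memory "site": URL -> HTML body. A real checker
# would issue HTTP requests and treat 404s/timeouts as broken links.
SITE = {
    "http://example.com/": '<a href="/a.html">A</a> <a href="/b.html">B</a>',
    "http://example.com/a.html": '<a href="/missing.html">gone</a>',
    "http://example.com/b.html": "no links here",
}

class LinkExtractor(HTMLParser):
    """Collect the href targets of all <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def crawl(start):
    """Follow links from `start` until every branch is explored;
    return the set of broken (unfetchable) URLs found."""
    seen, broken = set(), set()
    stack = [start]
    while stack:
        url = stack.pop()
        if url in seen:          # don't revisit pages (avoids loops)
            continue
        seen.add(url)
        page = SITE.get(url)     # stand-in for an HTTP GET
        if page is None:
            broken.add(url)      # dead end: record it in the report
            continue
        parser = LinkExtractor()
        parser.feed(page)
        for href in parser.links:
            stack.append(urljoin(url, href))  # resolve relative links
    return broken

print(crawl("http://example.com/"))  # -> {'http://example.com/missing.html'}
```

A production checker would add the end points the text mentions, such as stopping at first-level external links, plus politeness delays and a report of *where* each broken link was found (the referring page), not just which URL failed.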
Note: Your server may be configured, via a robots.txt file or .htaccess rules, to block "robots" (search engine spiders). If so, you will not be able to scan for broken links unless you configure your link checker to get past the block, or temporarily remove the block from the server. Some scanners, such as Xenu's Link Sleuth, do not respect "no robots" files, so they work even when robots are blocked.
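Python's standard library can show how a well-behaved checker decides whether a robots.txt block applies. The robots.txt content and URLs below are placeholders for illustration, not a real server's policy.

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt that blocks all robots from the entire site.
robots_txt = """\
User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A polite link checker asks before fetching each URL; here the
# answer is False, so a compliant spider would skip the page.
print(rp.can_fetch("MyLinkChecker", "http://example.com/page.html"))  # False
```

Scanners that ignore robots.txt simply skip this check and fetch every URL regardless, which is why they keep working on "blocked" servers.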