If the Dashboard shows that more URLs have been found than have been analyzed, it’s usually due to the project settings. Here are the main reasons for the discrepancy:
The URL volume (the maximum number of URLs to be analyzed) is set too low
Subdomains are excluded from the analysis
URLs are excluded via robots.txt
URLs are excluded via the blacklist
Only certain URLs are crawled because a whitelist is set
The analysis is limited to a specific subfolder
You can check all of these in your project settings. At the end of the project settings, the tab “Previous analyses” shows whether the discrepancy has always existed or only affects newer crawls. This helps you determine whether it was caused by a recent change to the settings.
To see whether the robots.txt is followed or subdomains have been excluded from the crawl, review the section “What should be analyzed?” in your project settings.
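If you are unsure whether a specific URL is blocked, you can test it against your robots.txt yourself. The following is a minimal sketch using Python’s standard urllib.robotparser module; the domain, paths, and rules are placeholders for illustration:

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt for illustration; your real file lives at
# https://www.example.com/robots.txt (hypothetical domain).
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A crawler that honors robots.txt will find these URLs (e.g. via links)
# but skip the blocked one during analysis.
print(parser.can_fetch("*", "https://www.example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://www.example.com/public/page.html"))   # True
```

Any URL for which `can_fetch` returns False is found but not analyzed by a crawler that respects robots.txt, which widens the gap between found and analyzed URLs.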
A bit further down you will find the tab “Advanced analysis.” Here you can verify that you haven’t limited the analysis to a single subfolder (“Advanced analysis” -> “What to analyze” -> “Analyze subfolder”) and check whether your subdomains are listed (“Analyze Subdomains”).
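To illustrate the effect of the subfolder setting: when the analysis is limited to a subfolder, only URLs whose path begins with that subfolder are analyzed. A minimal sketch, assuming a hypothetical subfolder value and example URLs:

```python
from urllib.parse import urlparse

# Hypothetical setting: the analysis is limited to the /blog/ subfolder.
SUBFOLDER = "/blog/"

def in_scope(url: str) -> bool:
    """Return True if the URL's path lies inside the configured subfolder."""
    return urlparse(url).path.startswith(SUBFOLDER)

print(in_scope("https://www.example.com/blog/post-1"))   # True  -> analyzed
print(in_scope("https://www.example.com/shop/item-42"))  # False -> skipped
```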
You should also check your blacklist and whitelist for any rules that restrict the crawl (“Advanced analysis” -> “Ignore/Include URLs”).
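How blacklist and whitelist rules interact can be sketched as follows. The patterns and the exact matching syntax below are illustrative assumptions, not the product’s actual implementation:

```python
import fnmatch

# Hypothetical rules; the product's real pattern syntax may differ.
BLACKLIST = ["*/print/*", "*sessionid=*"]   # never crawl matching URLs
WHITELIST = ["*/en/*"]                      # if set, crawl ONLY matching URLs

def allowed(url: str) -> bool:
    """Apply the whitelist first, then the blacklist, to decide if a URL is crawled."""
    if WHITELIST and not any(fnmatch.fnmatch(url, p) for p in WHITELIST):
        return False
    return not any(fnmatch.fnmatch(url, p) for p in BLACKLIST)

print(allowed("https://www.example.com/en/page"))        # True
print(allowed("https://www.example.com/en/print/page"))  # False (blacklisted)
print(allowed("https://www.example.com/de/page"))        # False (not whitelisted)
```

Note that an overly broad blacklist pattern, or a whitelist entry that matches fewer URLs than you expect, can silently shrink the number of analyzed URLs.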