The term crawler describes a type of software that visits websites available on the Internet and examines their content, with the aim of building search engine indexes and keeping them up to date. One distinctive feature of a crawler is its ability to visit every single page of a website, also following any external links it encounters.
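To make this behavior concrete, the following is a minimal sketch of such a crawler using only the Python standard library: it starts from a given page, extracts the links it finds, and visits them breadth-first while recording pages already seen. The start URL "https://example.com" and the `max_pages` limit are placeholders for illustration, not part of any real crawler's configuration.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href targets of all <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=50):
    """Breadth-first crawl from start_url, following links found on each page."""
    queue = deque([start_url])
    visited = set()
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip unreachable or malformed pages
        visited.add(url)
        # A real crawler would index the page content here.
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)  # resolve relative links against the page URL
            if absolute.startswith("http") and absolute not in visited:
                queue.append(absolute)
    return visited


if __name__ == "__main__":
    # "https://example.com" is a hypothetical starting point.
    for page in crawl("https://example.com", max_pages=10):
        print(page)
```

Because the queue accepts any absolute HTTP link, the sketch naturally follows external links as well as internal ones, which is the behavior described above; a production crawler would additionally respect robots.txt, rate-limit its requests, and persist the index it builds.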