Crawling

Must-know software terms | Verb

another creepy thing Google does in addition to stalking you. Search engines use automated scripts called "web crawlers" (also known as "bots" or "spiders") to visit websites and index their content for ranking in the search engine results pages (SERPs). As crawlers visit websites, they follow the links found on those pages to discover other sites, so they literally crawl from link to link. Crawlers also take quality metrics into account, such as the number of broken links, duplicate content, and sitemap accuracy. That's why people invest in Search Engine Optimization (SEO) to boost their website's ranking.
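The link-following part can be sketched in a few lines of Python. This is only a toy illustration, not how Google's crawler actually works: the HTML string below stands in for a page the crawler just fetched, and a real crawler would fetch and parse each discovered URL in turn.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags, the way a crawler discovers new pages."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Stand-in for a fetched page; a real crawler would download this over HTTP.
page = '<html><body><a href="/about">About</a> <a href="https://example.com">Partner</a></body></html>'

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # the crawler would queue these URLs for its next visits
```

Each extracted link goes into a queue of pages to visit next, which is how a crawler "crawls" from one page to another.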


Developer: "This task is killing me. I'm gonna go crawl into a corner and die." SEO Specialist: "You can't crawl there. We blocked it from being indexed with a 'noindex' meta tag."

Added by Get IT Guy