Definitions

A software robot (also called a bot, web bot, web crawler, or spider) is

a computer program which operates across the Internet to perform searching, copying and retrieving functions on the web sites of others.[1]
one common type of bot that systematically searches the Internet and downloads copies of web pages, which can then be indexed by a search engine.[2]
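
The crawling behavior described in the second definition (systematically visiting pages, copying them, and following links) can be sketched as a simple breadth-first traversal. This is a minimal, illustrative sketch only: the `SITE` dict is a hypothetical in-memory stand-in for live HTTP fetches, and the names `crawl` and `SITE` are not drawn from any source cited here.

```python
from collections import deque

# Hypothetical in-memory "web": URL -> (page text, outgoing links).
# A real bot would retrieve these pages over HTTP instead.
SITE = {
    "/": ("home page", ["/a", "/b"]),
    "/a": ("page a", ["/b"]),
    "/b": ("page b", ["/"]),
}

def crawl(start):
    """Systematically visit every page reachable from `start`,
    copying each page exactly once into an index."""
    seen = set()
    index = {}                   # page copies a search engine could index
    queue = deque([start])
    while queue:
        url = queue.popleft()
        if url in seen:          # skip pages already copied
            continue
        seen.add(url)
        text, links = SITE[url]  # "download" the page
        index[url] = text
        queue.extend(links)      # follow links to further pages
    return index
```

Because the crawler issues its requests programmatically in a tight loop, it can visit pages far faster than a human user, which is the resource-consumption concern discussed in the overview below.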

Overview

The term bot is a generic term for a program that performs some function automatically. Accordingly, bots can be benign or malicious, and multiple types of malware can technically be referred to as bots (or zombies).[3]

A software robot

is capable of executing thousands of instructions per minute, far in excess of what a human can accomplish. Robots consume the processing and storage resources of a system, making that portion of the system’s capacity unavailable to the system owner or other users. Consumption of sufficient system resources will slow the processing of the overall system and can overload the system such that it will malfunction or ‘crash.’ A severe malfunction can cause a loss of data and an interruption in services.[4]

References

  1. eBay, Inc. v. Bidder’s Edge, Inc., 100 F.Supp.2d 1058, 1060 (N.D. Cal. 2000) (citations omitted).
  2. hiQ Labs, Inc. v. LinkedIn Corporation, 938 F.3d 985, 991 n.2 (9th Cir. 2019); see also Associated Press v. Meltwater U.S. Holdings, Inc., 931 F.Supp.2d 537, 544 (S.D.N.Y. 2013); Definition of web crawler, Merriam-Webster Dictionary.
  3. Malicious bots are generally created by finding vulnerabilities in computer systems, exploiting those vulnerabilities with malware, and inserting the malware into those systems.
  4. eBay, 100 F.Supp.2d at 1060 n.2 (citations omitted).
