Ralf D. Kloth, Fight SPAM, catch Bad Bots. Archived 2006-06-01 at the Wayback Machine: "Generating web pages with long lists of fake addresses to spoil the spam bot's address data base is not encouraged, because it is unknown if the spammers really care and on the other hand, the use of those addresses by spammers will cause additional traffic load on network links and involved innocent third party servers."
SEO Glossary. Archived 2010-12-28 at the Wayback Machine: "A spider trap refers to either a continuous loop where spiders are requesting pages and the server is requesting data to render the page or an intentional scheme designed to identify (and "ban") spiders that do not respect robots.txt."
Hohlfeld, Oliver; Graf, Thomas; Ciucu, Florin (2012). Longtime Behavior of Harvesting Spam Bots (PDF). ACM Internet Measurement Conference. Archived (PDF) from the original on 2014-07-25. Retrieved 2014-07-18.
robotcop.org. Archived 2019-10-20 at the Wayback Machine: "Webmasters can respond to misbehaving spiders by trapping them, poisoning their databases of harvested e-mail addresses, or simply block them."