Blocking Bots – Good or Bad for Innovation and Performance?
On 6th October 2015, as part of a double header: Talk 1 – Blocking Bots; Talk 2 – Progressive Enhancement
Malicious bots are an increasing problem for website engineers seeking to improve performance. Whether it is competitors repurposing data or programs scraping prices, bad bots can cause capacity issues and site slowdowns, wiping out hard-fought performance gains.
On the other hand, many start-ups and fledgling websites need seed data and content at the start of their journey. Is it really so wrong to crawl a corporate or government website, particularly if you are going to use that data in new and interesting ways? After all, isn’t web crawling and data scraping how Google was established?
In this talk Tudor discusses the history of web scraping, what makes a bot ‘good’ or ‘bad’, and how blocking bad bots can aid web performance.
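For a flavour of the kind of technique the talk touches on, here is a minimal sketch (not taken from the talk itself) of one common approach: rejecting requests whose User-Agent matches a deny-list of bad-bot signatures, written as a plain Python WSGI middleware. The signatures listed are illustrative placeholders, not a vetted list.

```python
# Minimal sketch: deny-list filtering on the User-Agent header.
# The signatures below are hypothetical examples, not a real deny-list.
BAD_BOT_SIGNATURES = ("scrapy", "python-requests", "curl")

def block_bad_bots(app):
    """Wrap a WSGI app and answer 403 to user agents on the deny-list."""
    def middleware(environ, start_response):
        user_agent = environ.get("HTTP_USER_AGENT", "").lower()
        if any(sig in user_agent for sig in BAD_BOT_SIGNATURES):
            # Short-circuit before the request reaches the application,
            # saving the capacity a bad bot would otherwise consume.
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Bots are not permitted."]
        return app(environ, start_response)
    return middleware
```

User-Agent checks are easy to spoof, of course, which is part of what makes the ‘good bot or bad bot’ question interesting in the first place.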