Blocking Bots – Good or Bad for Innovation and Performance?

On 6th October 2015, as part of Double header: Talk 1 – Blocking bots; Talk 2 – Progressive Enhancement

Malicious bots are an increasing problem for website engineers seeking to improve performance. Whether it is competitors repurposing data or automated programs scraping prices, bad-bots can cause capacity issues and site slowdowns, wiping out hard-fought performance gains.

On the other hand, many start-ups and fledgling websites need seed data and content at the start of their journey. Is it really so wrong to crawl a corporation or government website, particularly if you are going to use that data in new and interesting ways? After all, isn't web crawling / data scraping how Google was established?

In this talk, Tudor discusses the history of web scraping, what makes a bot 'good' or 'bad', and how blocking bad-bots can aid web performance. As a rough illustration of the kind of technique involved, a minimal sketch follows below.
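
The sketch below is not from the talk and is not how Distil Networks works; it is a minimal, assumed example of one naive approach, filtering requests by their self-reported User-Agent header in a small Python WSGI middleware. The blocklist patterns and the wrapped hello_app are illustrative assumptions only, and real bot mitigation relies on far more than the User-Agent string.

# Minimal, illustrative sketch of naive bot blocking: reject requests
# whose self-reported User-Agent matches a blocklist. The patterns and
# wrapped app are hypothetical, not taken from the talk or Distil Networks.
import re

# Hypothetical signatures of clients we choose not to serve.
BAD_BOT_PATTERNS = [
    re.compile(sig, re.IGNORECASE)
    for sig in (r"scrapy", r"python-requests", r"curl", r"wget")
]


def block_bad_bots(app):
    """WSGI middleware that returns 403 for blocklisted user agents."""
    def middleware(environ, start_response):
        user_agent = environ.get("HTTP_USER_AGENT", "")
        if any(p.search(user_agent) for p in BAD_BOT_PATTERNS):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Forbidden\n"]
        return app(environ, start_response)
    return middleware


def hello_app(environ, start_response):
    """Stand-in application protected by the middleware."""
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello, human visitor\n"]


if __name__ == "__main__":
    from wsgiref.simple_server import make_server
    with make_server("", 8000, block_bad_bots(hello_app)) as server:
        server.serve_forever()

Because a scraper can trivially spoof its User-Agent, approaches like this are only a first line of defence; commercial mitigation of the kind discussed in the talk layers on rate limiting, fingerprinting and behavioural analysis.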

Slides

Presented by

Tudor James

Tudor James (Distil Networks) has worked in and run technical operations teams for high-traffic websites since the late 1990s. During this long IT career, running directory publishing websites in the UK and US, he encountered more than his fair share of bad-bots taking his company's data. Having seen how Distil mitigates this problem far better than his own efforts did, he was delighted to join the company and help other website teams defend against these bad actors.

Event

Double header: Talk 1 – Blocking bots; Talk 2 – Progressive Enhancement

Date

6th October 2015

Skill level

Beginner