
When the Powers That Be Try to Rewrite History

Bradley M. Kuhn

I just discovered again, as has happened more than five times that I can recall: someone sent me a URL that was public and existed when they emailed it to me, but in the 24 hours it took me to read their message, the page disappeared from the Internet.

In my GPL enforcement work, I cover a lot of beats where people try to rewrite history by taking material off the Internet. While it's nearly impossible to remove most things from the Internet entirely, the material I want to look at usually interests very few people (e.g., a GPL violation report, or once-public discussions about some GPL issue), and the very few people who do care about it (e.g., GPL violators) often want it off the Internet as quickly as possible.

My thought just now was that perhaps I should have a script that looks at all my incoming email, finds URLs in it, and does two things:

(a) immediately downloads a copy,

(b) submits the URL to archive.org.

Does anyone know of such a thing, or do I need to write this myself?
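
If I do end up writing it myself, a minimal sketch might look like this (assuming Python 3, that each incoming message is piped in on stdin by something like procmail, that local copies land in a hypothetical /var/mail-archive/ directory, and that the Wayback Machine's https://web.archive.org/save/ endpoint is used to request a snapshot):

    #!/usr/bin/env python3
    # Sketch only: read one mail message on stdin, find URLs in its
    # text parts, (a) download a local copy of each, and (b) submit
    # each URL to archive.org's Wayback Machine.
    # /var/mail-archive/ is a hypothetical destination directory.

    import email
    import email.policy
    import hashlib
    import re
    import sys
    import urllib.request

    URL_RE = re.compile(r"""https?://[^\s<>"')\]]+""")

    def extract_urls(raw):
        msg = email.message_from_bytes(raw, policy=email.policy.default)
        urls = set()
        for part in msg.walk():
            if part.get_content_type() in ("text/plain", "text/html"):
                urls.update(URL_RE.findall(part.get_content()))
        return urls

    def save_local_copy(url):
        # (a) immediately download a copy, named by a hash of the URL
        name = hashlib.sha256(url.encode()).hexdigest()[:16]
        with urllib.request.urlopen(url, timeout=30) as resp:
            with open("/var/mail-archive/" + name, "wb") as out:
                out.write(resp.read())

    def submit_to_wayback(url):
        # (b) ask the Wayback Machine to take a fresh snapshot
        urllib.request.urlopen("https://web.archive.org/save/" + url,
                               timeout=60)

    if __name__ == "__main__":
        for url in extract_urls(sys.stdin.buffer.read()):
            try:
                save_local_copy(url)
                submit_to_wayback(url)
            except Exception as exc:
                print("failed on %s: %s" % (url, exc), file=sys.stderr)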

Meanwhile, as a public service announcement to those who care about the GPL and other things: EVERY TIME YOU SEE a URL that has SOMETHING about the GPL that looks politically important, be it evidence of a GPL violation or anything else, submit that URL to archive.org ASAP!

Stephen Michael Kellat, Carlos Augusto ARES, Claes Wallin (韋嘉誠), Jason Self like this.

Stephen Michael Kellat, Olivier Mehani, Claes Wallin (韋嘉誠) and 1 other shared this.


You're probably better off having a filter that matches URLs and either auto-responds reminding emailers to make a local copy and submit the URL to archive.org, or that creates a queue of URLs for you to review. Or do both :)

Auto-downloading links sounds like a bad way to go for many reasons.
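
A minimal sketch of the queue variant, under the same assumptions as the script above (each message arrives on stdin; the queue file path is a hypothetical example):

    #!/usr/bin/env python3
    # Sketch only: record URLs from an incoming message into a queue
    # file for manual review; never fetch anything automatically.

    import re
    import sys

    URL_RE = re.compile(r"""https?://[^\s<>"')\]]+""")

    # Crude but serviceable for a sketch: scan the raw message text
    # rather than parsing MIME parts.
    urls = sorted(set(URL_RE.findall(sys.stdin.read())))

    with open("/var/mail-archive/url-queue", "a") as queue:
        for url in urls:
            queue.write(url + "\n")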

der.hans at 2015-05-28T22:21:20Z

Stallman wrote that he uses something like (a).

mnd at 2015-05-28T22:38:59Z

Andrew E likes this.

If you can find the funds to pay me as your secretary/OA I'll gladly monitor your e-mail for you. My current employer has been getting weirder than usual lately.

Stephen Michael Kellat at 2015-05-29T02:24:20Z

Claes Wallin (韋嘉誠), ghostdancer like this.

One other problem... A robots.txt "Disallow:" line will prevent archive.org from saving a given URL, per the Internet Archive's stated policy, which says the Internet Archive "does respect robots.txt instructions, and even does so retroactively".
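
For example, a site owner can exclude their entire site with a two-line robots.txt served at the site root:

    User-agent: *
    Disallow: /

Given the retroactive policy quoted above, serving this even long after the fact can make the Wayback Machine's existing snapshots of the site unavailable.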

Dan Scott at 2015-06-03T04:08:54Z

Claes Wallin (韋嘉誠) likes this.