@Edent That's a good summary of the situation, and I certainly agree scrapers should be checking robots.txt - Big Centralized Social should be doing it too. I don't think randomly spot-checking validity is likely to work well, but perhaps user reports would be fine. Or perhaps a standard where a website publishes a public key with which it signs all its OpenGraph cards, so their validity can be checked? Though I guess you're still DDoSing the site by fetching the robots.txt and/or the public key. I suppose they can both be static, at least.