Battling spam
From Crankshaft Coalition Wiki
Most wiki spam comes from [http://en.wikipedia.org/wiki/Spambot spambots], automated computer programs specifically designed to spam wikis by editing articles to include links to spammy websites.

Occasionally, a human spammer will also spam a wiki, manually editing an article to include links to a website. Such occurrences are relatively rare, and most anti-spam efforts focus on deterring spambots.
==Current anti-spam methods in use==
*[http://www.mediawiki.org/wiki/Extension:SpamBlacklist SpamBlacklist]
*[http://www.mediawiki.org/wiki/Extension:TorBlock TorBlock]
*[http://www.umasswiki.com/wiki/UMassWiki:Bad_Behavior_2_Extended Bad Behavior 2 Extended] -- added on 07/08/09, working well.

The SpamBlacklist extension includes a cleanup.php script for cleaning up existing spam, which we haven't yet used.
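
Extensions like these are typically enabled from LocalSettings.php. A minimal sketch, assuming the entry-point filenames the extensions shipped with at the time (the exact paths are assumptions; check each extension's own installation notes):

```php
# Hypothetical LocalSettings.php fragment -- file names are assumptions
# and vary by extension version.
require_once "$IP/extensions/SpamBlacklist/SpamBlacklist.php";
require_once "$IP/extensions/TorBlock/TorBlock.php";
```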
===Blank user agents forbidden===
Via a .htaccess file in the root wiki directory, 403 Forbidden errors are returned to anyone connecting with a blank user agent.
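
As an illustrative sketch (assuming mod_rewrite is available; not necessarily the exact rules in use here), such a .htaccess block can look like:

```apache
# Return 403 Forbidden to any client that sends a blank User-Agent header.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^$
RewriteRule .* - [F]
```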
− | |||
− | |||
− | |||
− | |||
− | |||
==Spamblocking methods to add if the above does not suffice==
If the current methods don't suffice, we can add CAPTCHAs for unregistered users. This would likely consist of using [http://www.mediawiki.org/wiki/Extension:ConfirmEdit ConfirmEdit] (additional details [http://www.umasswiki.com/wiki/UMassWiki:Blocking_Spam_in_MediaWiki#ConfirmEdit here]), probably together with [http://www.mediawiki.org/wiki/Extension:ReCAPTCHA ReCAPTCHA].
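
A minimal sketch of that setup in LocalSettings.php might look like the following; the file paths and placeholder keys are assumptions, and the exact setting names should be checked against the ConfirmEdit documentation:

```php
# Hypothetical sketch: ConfirmEdit with the reCAPTCHA module.
require_once "$IP/extensions/ConfirmEdit/ConfirmEdit.php";
require_once "$IP/extensions/ConfirmEdit/ReCaptcha.php";
$wgCaptchaClass = 'ReCaptcha';

# Keys issued by the reCAPTCHA service (placeholders).
$wgReCaptchaPublicKey  = 'your-public-key';
$wgReCaptchaPrivateKey = 'your-private-key';

# Challenge ordinary edits and edits adding URLs, but let registered
# users skip the CAPTCHA so only unregistered editors are affected.
$wgCaptchaTriggers['edit']   = true;
$wgCaptchaTriggers['addurl'] = true;
$wgGroupPermissions['user']['skipcaptcha'] = true;
```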
+ | |||
+ | Also, we could use [http://www.mediawiki.org/wiki/Extension:SpamRegex the SpamRegex extension], although it requires memcached. | ||
+ | |||
+ | ==Miscellaneous== | ||
+ | Once a certain spam handling protocol proves reasonably successful, it might be possible to unprotect most of the [[Special:ProtectedPages|protected pages]]. | ||
==External resources for handling wiki spam==
*http://meta.wikimedia.org/wiki/Proxy_blocking
===MediaWiki extensions===
*http://www.mediawiki.org/wiki/Extension:Lockdown
*http://www.mediawiki.org/wiki/Extension:PageSecurity
*http://www.mediawiki.org/wiki/Extension:EditSubpages
===MediaWiki manual pages===
*http://www.mediawiki.org/wiki/Manual:Preventing_access
*http://www.mediawiki.org/wiki/Manual:User_rights
*http://www.mediawiki.org/wiki/Manual:Combating_spam
*http://www.mediawiki.org/wiki/Manual:$wgGroupPermissions

[[Category:Wiki information]]