Spam is a real problem in many Wiki communities, often forcing administrators to at least temporarily restrict editing rights. Most recent attempts at a solution focus on captchas and spam lists. Captchas may be effective to some extent; the problem is that to make them unreadable for bots, they must be distorted enough to become difficult for humans to read as well. Lists seem less and less effective, often accumulating thousands of entries while still leaving enough gaps for spammers. Spammers frequently use the Wiki search box to check whether there is already spam on the site - existing spam suggests the Wiki may be poorly maintained and that they can add more. Hence it may make sense to implement delayed indexing, although this also delays the indexing of legitimate content. Blocking IP addresses is no longer useful either, because with DHCP most spammers do not keep the same address for long.
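The delayed-indexing idea can be sketched in a few lines: pages whose latest edit is younger than some waiting period are served with a "noindex" robots directive, so freshly added spam never shows up in the spammer's search-box probe. The function name and the 48-hour window below are assumptions for illustration, not part of any particular Wiki engine.

```python
from datetime import datetime, timedelta

# Assumed waiting period before a revision becomes indexable;
# a real site would tune this against how fast edits get reviewed.
INDEX_DELAY = timedelta(hours=48)

def robots_directive(last_edit: datetime, now: datetime) -> str:
    """Hypothetical helper: pick the robots meta value for a page."""
    if now - last_edit < INDEX_DELAY:
        return "noindex"   # too fresh - keep it out of search engines
    return "index"         # content has aged past the delay window

now = datetime(2024, 1, 3, 12, 0)
print(robots_directive(datetime(2024, 1, 3, 0, 0), now))   # fresh edit -> noindex
print(robots_directive(datetime(2023, 12, 1, 0, 0), now))  # old page  -> index
```

The trade-off mentioned above is visible here: legitimate edits are hidden from search for the same 48 hours as spam.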
One solution may be to use combined protection rather than relying on a single "killer" approach. The rationale is to force spammers to invest more and more work into building their spam bots. Requiring a complex bot does not make the attack impossible, but it may statistically eliminate a significant percentage of spammers who are not willing to invest enough resources.
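One way to picture combined protection is as several cheap, individually weak checks that each add to a spam score, with the edit rejected only when the total crosses a threshold. Everything below - the check names, weights, and threshold - is an illustrative assumption, not a description of any existing Wiki engine.

```python
import re

# Hypothetical blocklist fragments; a real list would be much larger.
BLOCKLIST = {"cheap-pills", "casino-bonus"}

def spam_score(edit_text: str, links_added: int, account_age_days: int) -> int:
    """Combine several weak signals into a single score (weights assumed)."""
    score = 0
    if links_added > 3:                # bots tend to dump many external links
        score += 2
    if account_age_days < 1:           # brand-new accounts carry more risk
        score += 1
    lowered = edit_text.lower()
    if any(pattern in lowered for pattern in BLOCKLIST):
        score += 3                     # known spam phrase matched
    if re.search(r"(http://\S+\s*){5,}", edit_text):
        score += 2                     # raw link flood in the edit body
    return score

def is_probably_spam(edit_text: str, links_added: int,
                     account_age_days: int, threshold: int = 3) -> bool:
    return spam_score(edit_text, links_added, account_age_days) >= threshold

print(is_probably_spam("buy cheap-pills now!!!", links_added=5, account_age_days=0))
print(is_probably_spam("fixed a typo in the intro", links_added=0, account_age_days=120))
```

No single check here would stop a determined attacker, which is exactly the point: a bot must now defeat all of them at once, raising the cost of each attack.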