Slashdot (and perhaps Advogato): Why bother?

Posted 30 Jul 2000 at 00:41 UTC by edw

As I was reading the posts on Slashdot about Bruce Perens' post on Technocrat.net, I thought about an interesting post I could make on Advogato: what is the point of all of these different forums?

Now, don't get me wrong; I realize that Advogato was created as an experiment in peer review mechanisms (or something like that), but fundamentally, it's an intimate Slashdot with a higher signal-to-noise ratio (for now). But let's face it: aren't newsgroups or mailing lists more suitable to whatever it is we're trying to do when we come to Slashdot, Technocrat.net, Advogato, Freshmeat, et al. and read or post?

(I threw in Freshmeat there at the end even though it's a bit different; the focus is on updates to programs. Gee, has anyone ever heard of an announcements mailing list?)

Is the web fundamentally better than these older technologies for carrying on discussions? Taking Slashdot as an example, I would have to say "No way!" Following a thread is painful and slow (here come the GIFs!), and the medium itself (the Slashdot web site, that is) doesn't encourage the development of threads at all. As a conceptual tool Slashdot encourages non-linear, context-free "conversations" almost as bad as (and sometimes worse than) what happens on IRC. Of course, IRC can be great if you have only a few people, and similarly, Advogato is great in part because not everyone is banging on the keyboard at the same time, writing the same thing. But as more and more people join a conversation, it becomes less and less interesting (and insightful, and intelligent, and...).

With a newsgroup or a mailing list, you can always just split it when things get too messy. No one splits a web site. Which gets me to thinking... No one has ever gotten rich off of creating a newsgroup or mailing list. But there are probably other reasons for the popularity of web sites over these older, sometimes more-capable tools. Stuff on the web has a definite "there-ness". A site appears as a thing with a permanence lacking in a newsgroup or mailing list. A mailing list feels more like pure Heraclitean flux. (Please excuse the obscure pre-Socratic philosopher reference; such things are almost required in posts like this.)

Another possible reason for the shift to the web for such conversations can be explained by the old adage that states that everything starts looking like a nail once you get a hammer in your hands. Put another way, technology is easily stricken by the sickness of haute couture. Java, the Web, B2C, B2B, P2B, "community", "click and mortar", whatever the flavor of the month is, it becomes an all-consuming worldview that demands that everything be explained in terms of itself.


Discussion or News?, posted 30 Jul 2000 at 01:08 UTC by ZachGarner » (Observer)

I've always thought of Slashdot-style sites as news sites, primarily. And as a news feed, Usenet really isn't my favorite (email is). With Slashdot, you get the news topics and brief descriptions on one page and discussion on subpages. With Usenet, it's more chaotic; people start off-topic discussions at the root level and such. Email is good for news, I feel, but when discussion at the level Slashdot has starts coming in, it's just too much to handle. I prefer OpenBSD's web-based archive of the mailing lists to the actual mailing lists.

Mainly, the Slashdots of the world can provide a good mixture of news and discussion in a way that makes it easy to read the news only, for those who want to ignore the discussion. And, in theory I guess, discussion could be better handled as well (moderation and neat things like the web of trust here).

When it comes to Freshmeat... I primarily use it for its appindex, and I have come across interesting new software by glancing over its front page. For software whose new releases I care about, I follow the changes directly. Freshmeat is about finding software that you never knew existed.

In short, I think that Slashdot is NOT about discussion; it's about news.

Why did you bother?, posted 30 Jul 2000 at 03:59 UTC by deekayen » (Master)

Another interesting question might be, why did you bother posting on Advogato?

Sites like Slashdot are around because they provide a news medium that's friendly to the eye. If I were to get the volume of news that Slashdot puts out daily combined with replies, I'd end up setting up a mail filter. I use email for personal communication and the web for news and that's just the best way to do it IMO.

Sites like /. have also started going beyond just news distribution on their websites by giving headlines on the sides of the page, or diaries on this site. We could probably do a whole story on why people make entries in their diaries.

Why did I bother?, posted 30 Jul 2000 at 04:48 UTC by edw » (Apprentice)

Another interesting question might be, why did you bother posting on Advogato?

I thought you were being a smart-ass until I started thinking about my answer to your question: why the hell do I post at Advogato?

I'm probably not alone when I say that I go where the conversation is. Compared to Slashdot, Advogato is an oasis of intelligent conversation. Each post seems to be a good faith attempt to add to the conversation, not merely some sort of existential "I exist!" post.

The reason I read and post on Slashdot, Advogato, etc. rather than mailing lists and newsgroups is accessibility and visibility. It takes thirty seconds to see if there's anything new at a web site. It can take several minutes to switch out of the browser and into an ssh window to run mail (yes, I read mail with /usr/bin/mail) or open Outlook Express on my Mac to read news.

However, I've never seen the sort of camaraderie on a web-based discussion forum that was present on sci.skeptic and comp.sys.mac.programmer when I was hanging out on USENET back in '91.

proposal: syndication, posted 30 Jul 2000 at 17:40 UTC by higb » (Observer)

I'd like to propose something I touched on back in the thread RIP: The Free Software Community: Put the message base (articles, responses, diary entries) into an open database, and allow different sites to provide their views of the same data. As it happens, I'm 'higb' here and on Slashdot. If I post to Slashdot, why shouldn't it appear here under my 'person' page? If Slashdot likes an Advogato article, why shouldn't they be able to link it into their pages, and in their case present (a superset of) responses based upon their metric?

'Community' would become centered on the perspective one shares.

Clique, posted 30 Jul 2000 at 17:47 UTC by tetron » (Journeyer)

I know this is sort of hypocritical since I'm a relative newbie (I found out about this place through Salon), but I'm starting to think that the cliquish nature of Advogato may be a good thing, and here's why: the fewer people actually posting, the greater the amount of scrutiny each post will get from readers. Therefore, people posting will put a bit more effort into their posts, and tend to be more intelligent - they know people are going to read and judge them based on their words. You don't have that sort of dynamic in a crowd. Slashdot stories generally rack up at least 100 comments. How many people really read all of those? Think about it! Do you? Of course not. There's no time for it.

Slashdot's group filtering doesn't really work for this problem. A lot of comments never really get read or commented on at all, so when you browse through at +2 or +3 you're not necessarily skimming the cream of the comments, you're just reading the crap that floated to the top (lovely image, I know).

So rather than pulling a few good comments out of a lot of mediocre ones, maybe it's more useful to try to raise the quality of all comments, and one way to do that is to have a relatively small group and use peer pressure to "encourage" people to make intelligent posts, possibly with some sort of feedback system to comment authors.

<pie in="the" sky="mode"> Then to make this scale, you create a whole bunch of parallel cliques - when someone joins, they get assigned to a specific one. These groups are fairly small, say maybe a hundred people (most are going to be lurkers), and they primarily see each other's posts. Over time, because you're mainly discussing subjects with these other people, you have more of an opportunity to get to know people and their opinions, so you can have more in-depth discussions with people you actually know! Couple this with some sort of moderation metric and you could promote the best comments from each group into a higher-level comment space that everyone reads, so you get a mix of comments from your own group and the -best- comments from the other groups. Best of both worlds, and no one is overwhelmed by 300 crappy comments. </pie>

Anyway, social systems have scalability problems just like computer systems. And like computer systems, we need to design around these basic issues of there being only 24 hours in the day (unless you use EvStacks! **)

(** obscure GGI humor)

I've been reading slashdot for years. Before they had user accounts at ALL, and there would be maybe 20 comments to a story. The quality of comments used to be somewhat better, or at least you could get your own voice heard. Now Slashdot is big. Size matters ;-p

Slashdot has been Slashdotted., posted 30 Jul 2000 at 20:46 UTC by edw » (Apprentice)

It has probably been observed many times before, but I think it's interesting that Slashdot itself has come to be paralyzed by the Slashdot effect.

To be less charitable, I think Slashdot represents the oafish bourgeoisie who wander in a herd, trampling to death anything they find interesting.

Higb, I like your idea. The role of a site in that scenario would be to exercise editorial judgement. Some sites may reflect a particular individual's opinion, whereas others would reflect the collective opinions of one group or another.

The Web has advantages, posted 31 Jul 2000 at 03:56 UTC by gnuchris » (Journeyer)

There is one obvious reason for choosing the web for discussions... you can view a website from anywhere in the world. Every computer has a web browser; there is no configuring needed, and it is quick and easy. I can sit down at any PC at my office and check Advogato. Now I can't say the same for newsgroups... I need to configure a newsreader.

Also, the web benefits from hyperlinks. I can post a hyperlink in a discussion thread, and it is easy for someone to follow the link... the web and HTML do have some advantages over the older technologies.

The Web has other advantages also..., posted 31 Jul 2000 at 08:03 UTC by tja » (Journeyer)

Other useful advantages of web fora include the ease of disengaging. If the uninformed rantings on Slashdot get you down, you have the easy and immediate option of not reading Slashdot. No action required on your part, except the exercise of will to avoid doing something. However, if you're on a mailing list and don't like the SNR, it takes effort to unsubscribe.

A web forum also makes it easy to browse. No commitment. You can have a quick peek at Slashdot every now and then to see how the SNR is getting on (ie to confirm your opinion :)), without having to commit yourself to doing so ever again. Subscribing to a mailing list represents a commitment that browsing a web site doesn't.

A site like Advogato has a degree of complexity that no mailing list or usenet group could capture. You get to pick and choose which (if any) diary entries you read, for example, with whatever frequency you wish. You can ignore some people, and then later change your mind and go and look at their diaries. You couldn't get anywhere near that level of flexibility with any other medium, even with a hyperintelligent procmail filter.

I also like Higb's suggestion, except for the phrase "into an open database". The use of the singular bothers me. It has to be distributed, otherwise it's useless and we may as well all just read the newspapers or watch the TV news (which reminds me: Channel 9 in Australia boast in their advertising that "more Australians get their news from Channel 9 than from any other source". Very frightening. Decentralise, please!). Everyone must be able to add content, and have that propagated to anyone who is interested, with the various layers of editorial filtering being applied via whatever view a particular user wants to use. This would imply a vast distributed database, with automatic content swaps going on all the time... A nice project for someone with a few developer-years free :-). For something more modest that goes at least one step along the road to this, have a look at Alan Cox's portaloo if you haven't already.

Not to pound on the disadvantages..., posted 31 Jul 2000 at 14:50 UTC by edw » (Apprentice)

...but one thing that's really a pain is keeping track of the last thing you read. For example, when I came to Advogato this morning, I saw that this discussion had eight posts. "Hmm... I think that's more than last night. I'll click on it... Here are the posts, did I read this one? Yes, yes, yes, maybe..." This isn't just a problem with discussion web sites: whenever I go to Slate, I spend a minute or so figuring out if there's something I missed.

I like your addition to Higb's idea, but it's starting to sound a lot like USENET. Or maybe a discussion tool could be layered upon Freenet (though I have serious reservations about it; that's for the other discussion...).

why syndicate?, posted 31 Jul 2000 at 15:37 UTC by mobius » (Master)

The idea of a few[?] central databases to hold the news until sites like /. put their own spin on it seems rather useless.
It would just add an extra layer of complexity in between the source of the news and the readers. Currently, the news gets filtered by the news sites (ZDNet, etc.), then gets filtered by the forums. It seems a bit ridiculous to add another layer of obfuscation, where the database maintainers put their own spin on the articles. Personally, I'd rather be closer to the source than farther from it.

Not to pound on the disadvantages..., posted 31 Jul 2000 at 16:20 UTC by jooon » (Journeyer)

...but one thing that's really a pain is keeping track of the last thing you read.

I agree, this is very annoying. You can solve it with some effective cookie handling, but the downside is that you then put perhaps too much work on the server. You would like the client to take care of this. Anyway, it can be done on the web.

Hey, a good reason for me to peek at the mod_virgule source. :-)
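
For what it's worth, the last-read bookkeeping jooon describes can be sketched in a few lines. This is a toy, not mod_virgule code: the `last_visit` store, the post dictionaries, and the `new_posts` function are all invented for illustration, and a real site would keep the timestamp in a cookie or session rather than in a module-level dict.

```python
import time

# Hypothetical in-memory store: user name -> time of their last visit.
last_visit = {}

def new_posts(user, posts, now=None):
    """Return the posts published since this user's previous visit,
    then record the current visit."""
    now = time.time() if now is None else now
    cutoff = last_visit.get(user, 0.0)
    unread = [p for p in posts if p["posted_at"] > cutoff]
    last_visit[user] = now
    return unread

posts = [
    {"title": "Why bother?", "posted_at": 100.0},
    {"title": "Clique", "posted_at": 200.0},
]
first = new_posts("edw", posts, now=250.0)    # first visit: everything is new
second = new_posts("edw", posts, now=300.0)   # nothing has appeared since
```

The point of the sketch is only that the state is one timestamp per reader, which is exactly the sort of thing a client-side cookie could hold instead of the server.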

Advogato is the prototype for future services, posted 1 Aug 2000 at 10:26 UTC by PaulJohnson » (Journeyer)

I see Advogato as a prototype, rather than a finished product. As such it does contribute to the problems of web-based discussion forums, as well as suffering from the classic ones itself (this "Post a reply" web form being an example). However, unlike Slashdot, Advogato was explicitly designed as an experiment in a certain kind of social engineering, and it's working because we are learning things. It works even when it "fails" as a forum, because in the process we learn how not to do things.

I've been around Usenet since the early 90s, and I remember well the feeling of community. Then September '93 happened, and it's been September ever since. It's been September for so long that these days it's hard to remember what non-September time felt like. (For the uninitiated, September was the time when lots of freshman undergrads discovered the Internet for the first time). Then I discovered Slashdot, and that seemed to have re-invented the elite feeling with lots of knowledgeable postings. That feeling has now gone.

The problem seems to be that there is a certain optimum size of active on-line community: too small and you don't get the critical mass of postings and responses. Each posting generates < 1 followup on average, and discussions just die. Too big and you get too many posts to read. IME, once a forum exceeds 100 posts per day it takes too long to sift through the header lines to identify the good stuff. What happened with Slashdot was that it segmented off a sub-population of users, and so brought the numbers below the upper limit. Now that Slashdot is popular and well-known, it has itself gone back over the limit. Meanwhile Advogato has sectioned off another sub-population of users, and so things are interesting here (for the time being).

Technology can help with the information overload. I remember using "rn" to page through every posting on a newsgroup (which left me with the useful skill of being able to scan a page of text and make a "read/reject" decision in a second or so). Then I discovered threaded newsreaders, and suddenly groups that I had abandoned due to volume became accessible once more. (Note to Advogato: please can you add Threading here?) Now we need to invent the next step up in information management. In the rest of this post I'd like to suggest a few possibilities:

  • Contributor reputation management. The point is that when you are trying to decide which drops to sip from the firehose, "who posted it" can be a very useful metric: people who have posted good stuff in the past will generally post good stuff in the future. I apply this informally on Usenet by scanning for author names that I know, but my memory can only hold so many names, and I don't know what anyone else thinks about other people. What is needed is some kind of communal opinion of who is worth reading, based on votes for their postings. Slashdot includes an element of this in the "+1 for karma" posting option: if your "karma" (sum of moderation) exceeds 20 then your posts automatically start with a moderation of 2 rather than the standard 1. My karma is now 90 (smug grin), which I hope says something about my general posting quality. But Slashdot karma above 20 doesn't earn any extra posting points.

  • Multi-faceted reputations. I'm rated "Journeyman" here, which is a bit embarrassing because I'm not currently working on any Free Software. However I like to think that I can make interesting and well-thought-out contributions to this kind of discussion. If I could be rated separately for the two dimensions then the model would be more accurate. But there needs to be a standard-but-extensible set of rating areas, possibly with rules for proximity of areas. So if I'm rated as a Guru in OO software engineering then by default I'm considered Fairly Expert in general software, and Mostly Harmless in systems engineering. Also there should be a "community services" area where you earn reputation for general good works (such as file hosting, free software, web site running etc).

  • A way for good posts from Newbies to be recognised. If everyone is only reading posts from the Gurus then Newbies won't be read, and hence won't be able to climb the ranks. Maybe part of "community services" could be "talent spotting": if you rate a newbie then you gain karma points according to how well your rating predicts the ratings of others. So if you are the first to spot a newbie who deserves Guru status, you gain karma. Amongst other things, talent spotting requires some expertise in the subject under discussion, so effective talent spotting would boost reputation in the field in which you certify them.

  • A way to prove you have the karma in other contexts. So if I gain "Guru" status in the OO community I can use this on my CV under my list of qualifications. Amongst other things, this would make such status valuable and hence worth seeking and protecting.

  • Forum independence. Ideally the mechanism would work with a number of forums, so it could be incorporated into (say) Slashdot and Usenet, and a good post in one forum would boost reputation in all forums. This leads to the vision of a specialist reputation server which merely accepts karma queries and certifications, and leaves the mechanism of posting or whatever to other software.
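
Two of the mechanisms above are concrete enough to sketch as toy functions. The first follows the Slashdot rule described in this thread (karma above 20 gives new comments a starting score of 2 instead of 1); the second is one possible scoring rule for "talent spotting", and the formula `max(0, 10 - 2 * error)` is invented purely for illustration, not taken from any real system.

```python
def starting_score(karma, use_bonus=True):
    """Default score for a new comment, per the Slashdot-style rule:
    karma above 20 earns a +1 posting bonus (which can be declined)."""
    if karma > 20 and use_bonus:
        return 2
    return 1

def talent_spotting_karma(early_rating, consensus_rating):
    """Toy 'talent spotting' reward: the closer your early rating of a
    newbie is to the eventual consensus, the more karma you earn.
    Ratings here are on an arbitrary 0-10 scale."""
    error = abs(early_rating - consensus_rating)
    return max(0, 10 - 2 * error)

print(starting_score(karma=90))                   # 2: the bonus applies
print(starting_score(karma=90, use_bonus=False))  # 1: bonus declined
print(talent_spotting_karma(8, 8))                # 10: perfect prediction
print(talent_spotting_karma(3, 9))                # 0: a wild guess earns nothing
```

Note how the second function rewards accuracy rather than volume, which is the property that would let newbie-spotting feed back into a rater's own subject-area reputation.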

Incidentally, has anyone else here read "Distraction" by Bruce Sterling? It includes a description of a society based on such reputation services. It's rather vague on details, but if you are interested in the subject then it's a fascinating read.

Paul.

Optimal size for groups, posted 1 Aug 2000 at 17:43 UTC by tetron » (Journeyer)

Paul, you touched upon a point I was trying to get at in my previous posting ("Clique") that a lot of people don't seem to notice - there IS an optimal size for a discussion group. Too small and discussion may be only a trickle; too large and it is a flood. People have tried to remedy this using the various techniques you have described, but I think one point hasn't actually been raised - how do you deal with the situation where there are too many intelligent comments on a subject? In other words, a perfectly working moderation system would still produce 100 pretty clever comments on a certain topic. What then?

Essentially, if you assume not that there are smart people and dumb people, but that there are smart people and smarter people, then you can't simply filter comments based on quality. You have to find other metrics.

Or we attack the problem at its root. A forum is too big. How about we make it smaller? When a group becomes too big, split it. Over time, you will get a collection of parallel groups discussing the same topics, but there is the capacity for individual, personal conversations within a group, because the size is not overwhelming. Smaller groups also tend to encourage much more intelligent posting - you're not spamming a large audience, and you might get to know some of the people you're talking to after a while.
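
The splitting rule itself is simple enough to sketch. The group limit of 100 comes from the numbers floated in this thread; the random split is just the simplest possible policy, and a real forum might split by interest or posting history instead.

```python
import random

GROUP_LIMIT = 100   # rough optimal group size suggested in this thread

def maybe_split(group):
    """Return [group] unchanged if it is small enough, otherwise
    split it randomly into two roughly equal halves."""
    if len(group) <= GROUP_LIMIT:
        return [group]
    members = list(group)
    random.shuffle(members)
    half = len(members) // 2
    return [members[:half], members[half:]]

big_group = ["user%d" % i for i in range(150)]
groups = maybe_split(big_group)
print([len(g) for g in groups])   # [75, 75]
```

Applying `maybe_split` each time membership grows is what produces the "collection of parallel groups" described above, with no group ever far over the limit.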

We can keep talking about moderation and reputation and trust systems, but the fact of the matter is that human social behavior is not well suited to large-scale, semi-anonymous group discussions. People want to be heard, and if an intelligent comment won't get attention, then an immature one probably will.

Because it is fast (in a way...), posted 1 Aug 2000 at 19:58 UTC by penguin42 » (Journeyer)

Fundamentally the web is a faster mechanism than, say, Usenet. The time to propagate on Usenet is a problem in today's society - I might check /. 3 or more times a day. We can then argue with each other about an article and get replies in fast. Doing that with Usenet tended to result in getting follow-ups to articles which hadn't propagated to you yet!

With Usenet and a few days' propagation time, your friends round the planet (or on other ISPs) won't be looking at the same articles at the same time. With these web newspapers you can email a friend with the 'have you seen that ridiculous article...' and chat about it.

It is rather unfortunate that a centralised server is faster than a distributed propagation system, but heck, that's the net for you.

Hypertext redresses the balance to some degree; instead of the webpapers publishing copies of articles (as you often found on Usenet) you've got the hypertext links.

Accountability is the key, posted 1 Aug 2000 at 21:48 UTC by Talin » (Journeyer)

A number of people have made the connection between population size and discussion quality. (One of my sig lines says "politeness doesn't scale").

In large groups, interactions between individuals tend to be anonymous - there are only so many people you can know intimately. In small groups, our interactions tend to resemble primate behavior - stroking, flirting, nit-picking, contesting for territory and dominance. There is a feeling of being part of a tribe. But in large groups, our behavior tends to be more game-theoretic, more based on optimal winning strategies against faceless, objectified opponents.

But there is another important factor which I think makes Advogato special, and that's accountability. The best part about the certification is not that you can be given certification, but rather that it can be taken away. Think what Advogato would be like if you could never decrease anyone's cert. In this case, all you would have to do is trick a few people into calling you a Master, and then it's "first post!" everywhere.

Accountability is a vital part of a stable society, which is what Slashdot and the others have forgotten. Of course, there is always a place for anonymity, but it should cost. Free and easy anonymity is a quick road to social breakdown.

Group size, accountability, and good behaviour, posted 1 Aug 2000 at 22:50 UTC by PaulJohnson » (Journeyer)

Everyone interested in these topics and how they relate should go and read "The Origins of Virtue" by Matt Ridley (author of "The Red Queen"), especially the chapter "Public Goods and Private Gifts" and the final chapter "Trust".

The book is about how co-operation evolves, and it starts with chromosomes (there is such a thing as a parasitic chromosome which breaks the "trust" of the other chromosomes) and works up through cells, organs, social insects, pack animals, and eventually reaches human societies. It's a stunning tour-de-force and a major eye-opener. One of the big points of the book is that the same evolutionary balance between competition and co-operation is at work at every level.

Anyway, the "Public Goods" chapter looks at the provision of meat in hunter-gatherer societies. Briefly, vegetables are considered private property, but meat is public property because a large animal is a rare catch which cannot be consumed by a single person before it rots. In effect a large animal is a public good within the tribe, and the Tragedy of the Commons would appear to be a problem for any hunters. To get around this the men effectively swap meat for kudos within the tribe. This kudos brings many benefits, of which sex is one and reciprocal sharing by the next lucky hunter is another. OTOH a hunter who tries to hoard meat will find himself excluded by everyone else, and one who is rarely successful will find himself at the back of the queue when choice portions are handed out by others.

This works as long as the society is small enough for each individual to keep track of the kudos of everyone else. However once the society goes over that critical size (which seems to be in the 100-150 range) the system starts to break down. You keep meeting people, and having to deal with them, without a clear idea of where in the hierarchy they fit.

(Aside: I recently heard a lecture from someone who had been part of a company that grew past this critical size, and he mentioned the odd feeling of encountering other employees whom he didn't know, and didn't know if he could trust with important work. Companies which grow past this size have to institute bureaucracies to manage things which were previously managed informally).

The basic concept behind a reputation server is to augment human memory and communication in order to get past this scaling problem. The vision is that whenever I encounter someone for the first time I will be able to check his reputation on my local server and instantly know how trustworthy, knowledgeable and reliable he is. Furthermore he will have an interest in living up to that reputation when dealing with me, because he knows that I will be feeding my experiences back into the database, and trying to earn reputation for the accuracy of my insights as I do so. And vice versa of course.
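
A toy version of such a reputation server, in the spirit of the paragraph above. Everything here is hypothetical: a real system would authenticate clients, record who each rater was, and weight opinions by the raters' own reputations (as Advogato's trust metric does) rather than taking a plain average.

```python
from collections import defaultdict

class ReputationServer:
    """Accumulates experiences fed back by clients and answers
    reputation queries with a plain average (0-10 scale)."""

    def __init__(self):
        self._ratings = defaultdict(list)   # subject -> list of scores

    def feed_back(self, subject, score):
        """Record one client's experience of dealing with a subject.
        (A real server would also record and weight the rater.)"""
        self._ratings[subject].append(score)

    def query(self, subject):
        """Average score, or None for someone nobody has dealt with yet."""
        scores = self._ratings[subject]
        return sum(scores) / len(scores) if scores else None

server = ReputationServer()
server.feed_back("newcomer", 9)
server.feed_back("newcomer", 7)
print(server.query("newcomer"))   # 8.0
print(server.query("stranger"))   # None
```

Even this crude version shows the two halves of the vision: a query before a first encounter, and a feedback step that keeps the subject honest afterwards.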

Paul.

The web is addressable. Usenet is not, posted 2 Aug 2000 at 03:11 UTC by kmself » (Journeyer)

See subject. It's a key distinction between the web and Usenet.

Usenet is a data transfer protocol (NNTP), it is not a protocol for marking up the content (though some clients support hyperlinks and/or varying levels of markup). More importantly, NNTP doesn't provide for addressability of Usenet posts. Once a post has been generated, there is no equivalent to a URL -- Uniform Resource Locator -- to identify, track, or call up the initial post. It's far more difficult, by comparison, to refer to, call up, or identify a particular Usenet post without duplicating its content entirely.

Usenet is a very good discussion and forum mechanism. The addition of archives to Usenet (first on an ad hoc group-by-group basis, later on a generalized model with Deja and Remarq) makes it a truly valuable tool. One of the benefits of Deja was that it did create a means to persistently address a particular post -- something I found useful, though it appears to be broken now.

I see the next mode of discussion tools as having elements of current weblogs, old-style Usenet, and other technologies, best currently described as vaguely similar to Wiki, Everything2 (at Slashdot), or Kuro5hin (it will return). Oh, and this funky site with a cat on it.

Key elements:

  • Distributed content -- perhaps not as universally distributed as Usenet, but definitely taking advantage of caching and distributed storage architectures.
  • Unique Resource Identifiers -- A URL by another name: something that distinctly identifies each piece of content.
  • Persistence. Maybe not infinite persistence, maybe variable by piece, but the ability to retrieve data after an arbitrary amount of time.
  • Filtering tools. Blacklists, killfiles, whitelists, both individual and distributed. Collaborative filtering tools as well (K5 moderation and mojo, Advogato trust metrics).
  • Arbitrary choice of ordering criteria: not rigidly defined as time, topic, or interest.
  • Some level of distributed user directory. Not necessarily One True ID carried to all sites, but the option to adopt or refuse a persona across one or more portions of the network.
  • Evolutionary linkages. The ability to revise and update links between constituent parts over time.
  • Evolutionary content. The ability to revise and update content over time.

$0.02.

NNTP does (sort of) have a URL mechanism., posted 2 Aug 2000 at 16:16 UTC by cmacd » (Journeyer)

Actually, there is a way to refer to a given usenet post...

Every article has a message ID, which is a unique alphanumeric "number" followed by an at sign and the name of the host that generated the post. Thus you can look for news://news.your.isp/aa34986744BYX@posting.isp.com.

The news server is optional, and in many cases will default to your main news host.

This means that if your site gets any of the sports skating newsgroups, you should be able to get the last version of my "welcome" FAQ by asking for news:/sports/skating/welcome_963220159@rtfm.mit.edu (or is it news:sports/skating/welcome_963220159@rtfm.mit.edu?) (the host name in this case being the robot that posts it for me).

Of course this is limited by the fact that the volume of Usenet results in many sites expiring articles in less than a week. The FAQ posts are a special case on many sites.
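
Python's standard URL parser happens to handle both forms cmacd mentions, which makes the structure easy to see. The message ID below is the made-up one from his post; the point is only the scheme/server/message-ID split, not any real article.

```python
from urllib.parse import urlparse

# Bare form: no server named, so the client falls back to its
# configured news host.
bare = urlparse("news:aa34986744BYX@posting.isp.com")
print(bare.scheme, bare.path)    # news aa34986744BYX@posting.isp.com

# Explicit form: a server is named before the message ID.
explicit = urlparse("news://news.your.isp/aa34986744BYX@posting.isp.com")
print(explicit.netloc)                # news.your.isp
print(explicit.path.lstrip("/"))      # aa34986744BYX@posting.isp.com
```

Either way the message ID itself is the globally unique part, which is exactly what makes it usable as an address even when the serving host varies.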

augmenting reputation management, posted 3 Aug 2000 at 13:38 UTC by aigeek » (Journeyer)

I think the points about accountability and group size are right on target. I also agree with PaulJohnson when he frames reputation managers as tools to augment our natural abilities in this realm. Maybe there are other, simpler things we can do to help.

One of the biggest problems I have with building and maintaining impressions of other people I know only online is that faceless text all looks the same. I read a name, I read a message. I read another name, I read another message. Writing style and content help differentiate, but not enough. It all blends together and I can't remember who said what. When I have a conversation in real life, there's not just different word choice; there's also a different voice using a different style of inflections, and a different face making a different set of expressions.

I could use more help in associating a message with its author. Signatures help. I've tended not to use them, because I always thought they were redundant, but the truth is that names in headers aren't prominent enough or distinct enough to make an impression.

Even just printing those usernames in a larger and bolder typeface would help. Better yet would be to print them every few lines in a margin next to the message.

It would help even more if the appearance of our messages were more personalized. For example, we could attach some distinct decoration to our messages. It could be a small graphic, a photo, or even just having our names appear in custom color schemes (which might have to pass an automatic contrast tester). I would find it much easier to accumulate separate impressions for people this way.

Until it got abused, that is, and then it would suck. Masquerading would be easier, since while it's easy to prevent duplicate usernames, it's hard to prevent similar graphics. (Real physical faces are hard to fake, which is why we're good at recognizing them.) I don't think there's a good automatic solution, and moderated systems probably wouldn't scale. But maybe someone else can think of something.
