W3C considering Fee Based Standards

Posted 30 Sep 2001 at 06:18 UTC by jamesh

It looks like the W3C is considering changes to their patent policies that would allow recommendations to be covered by patents provided under "Reasonable And Non-Discriminatory" (RAND) licensing terms. For information on how this affects you, go to this analysis of the proposed policy and how it may affect Free Software.

The last call review period ends 30 September 2001. If you have comments, you should send them to www-patentpolicy-comment@w3.org. Comments sent to this address are publicly archived.

Seems really dangerous..., posted 30 Sep 2001 at 08:42 UTC by adulau » (Journeyer)

I got some feedback from Bernard Lang. The authors of the document are all big patent holders.

I suspect some patent holder will slip an "innovation" into a standard and say: "OK, we have a patent on that, but there's no problem for free software, they get RAND licensing."

How can we trust these companies? There are serious conflicts between free-software licensing and RAND (random? 8-) licensing.

The standard has to be open, without any restriction at all!

Dangerous to web standards, posted 30 Sep 2001 at 12:33 UTC by sjmurdoch » (Apprentice)

At the moment W3C standards have to compete with proprietary ones, and in some cases they look like they are not winning. Introducing recommendations encumbered by patents would throw away the W3C's major advantage, namely that their standards can be freely implemented by anyone. This change in policy risks a return to the damaging browser wars, which most of us hoped never to see again.

so, does that mean..., posted 30 Sep 2001 at 13:35 UTC by cmm » (Journeyer)

...that fewer otherwise mostly intelligent people will be seen blabbing "XML this", "*ML that" and "*MLBLAH other" with their eyes glazed over? It's high time to realize that the W3C is largely a machine built to mass-produce multitudes of idiotic "standards" in a (futile) bid to embrace life, the universe and everything. Let them charge for their drivel and die a painful death.

Ugh, not here too, posted 1 Oct 2001 at 03:21 UTC by robla » (Master)

Before everybody shoots off at the mouth about the W3C's proposed new policy, they should probably understand the old/existing policy first.

The fact of the matter is that the new policy makes it a lot clearer when patented technology is making its way into a specification. Under the old policy, something could make it all the way to Recommendation without a discussion about what terms the specification would be available under.

It's good that the debate about RAND vs. RF is out in the open, but stop demonizing the W3C on this issue. They've done a good job of bringing this debate into the community.

IETF, posted 1 Oct 2001 at 03:31 UTC by mtearle » (Journeyer)

If you wish to work on standards, the IETF is a much better and more democratic model. It is very difficult for an individual outside one of the W3C member organisations to have any effect on their standards process (unless previously invited to participate).

Where to start..., posted 1 Oct 2001 at 03:38 UTC by mnot » (Journeyer)

I can't really believe the hot air going around on this one. A few thoughts and random corrections:

  • This is not a surprise; the Last Call Working Draft was announced on the W3C's home page on August 20, and has been available for comment since.
  • The Process hasn't been modified for this draft; skipping Candidate Recommendation is normal for certain types of documents.
  • The W3C doesn't produce Standards, it makes Recommendations. That's all.
  • The Patent Policy doesn't change the W3C's policy; it establishes the W3C's patent policy. Before this there was no documented patent policy, except for some brief disclosure requirements in the Process document.
  • The default mode of operation in Working Groups in the proposed policy is not RAND; it's determined on a case-by-case basis, and if it is RAND, there must be a good reason for it.
  • Ultimately, the operating mode of each Activity and Working Group is up to the W3C Advisory Committee, when the groups are chartered. The AC is made up of one representative for each company (disclaimer: I'm an AC rep, but only writing this on behalf of myself). Large companies get one vote. Small companies get one vote. There are over 400 companies, most of them small.
  • Even more ultimately, Tim and the Team write the charters. They more than anyone else have the interest of the Web at heart.

It would be a pity..., posted 1 Oct 2001 at 09:14 UTC by scav » (Observer)

...if the W3C began introducing non-free standards. I can't see why anyone other than the patent-holder would adopt them.

There's certainly a critical mass of free-software developers out there who could design and implement an alternative. PNG and Ogg Vorbis prove that, no matter how complex the technology, you don't need patents to spur the invention of something that is wanted badly enough.

The W3C would eventually become irrelevant if their standards, however well thought out, fail the first test: "Can I use it?"

GNU standards.com?, posted 1 Oct 2001 at 14:49 UTC by Malx » (Journeyer)

How about creating an independent standards group?
It would support other standards if they are free, or develop its own if they are not.

Notes, posted 2 Oct 2001 at 07:13 UTC by lilo » (Master)

It does appear that the process of public comment broke down on this one, for whatever reason. Maybe nobody expected to see this important an issue quietly working its way through the W3C. The time breakdown of comments is interesting.

The first five comments on the mailing list, spanning most of a month, were spam directed at the mailing list. Of the next two comments, one seems to be mostly in the nature of proofreading. The other is a favorable comment to the effect that the new policy will make it clearer when patent-encumbered recommendations are adopted. This takes us up through September 27. Two comments arrived on the 28th, one of them apparently from the person who pointed the problem out to the community that day. Twenty comments came on the 29th. The bulk of the month's 755 comments appeared on the 30th.

It seems obvious that nobody on the committee has thought much about the downside of accepting patent-encumbered recommendations at all. It also seems obvious that the W3C is not used to getting this kind of public outcry. Their communications director seems to indicate that they didn't get useful commentary; if it were me, I think I'd be willing to accept that 700 or 800 emphatic "No!" votes were probably an indication that people feel strongly about the issue.

Let's hope they're willing to accept and work with what is clearly an unexpected form of commentary.

W3C extends review period, posted 2 Oct 2001 at 13:32 UTC by logic » (Journeyer)

It looks like the W3C has responded to the public feedback on this, and has decided to extend the deadline for comments until October 11, in preparation for their meeting on the subject on October 15. So, if you've got something to say on the matter, now is the time to do it. All comments are archived and publicly viewable.

Brainwashed by the not-so-mass-media, posted 2 Oct 2001 at 19:08 UTC by piman » (Journeyer)

This proposal has been available for review and comment, on the same page as the other W3C working drafts, for a month and a half. Why is everyone only throwing a fit now? The only trigger I can see is a post on Slashdot, which means the majority of those 1,000 new comments are probably from people who have no clue what's going on, but do whatever Slashdot tells them. Looking at some of the actual comments (the linked one is someone claiming that the W3C is going to start patenting things, which is not the case at all), I have to conclude that the majority of posters really don't have a clue what's going on, and are probably just causing the W3C (who are a bunch of nice people who do a shitload of work to make the web a better place) a major headache.

Some Observations, posted 3 Oct 2001 at 03:11 UTC by forrest » (Journeyer)

piman: I bet you had to look really hard to find that incompetent comment. I've been reading the archives, and most of the comments I've seen are cogent and eloquent.

mnot: You cite "Tim and the Team write the charters" as evidence of the W3C's noble intent. Let's not forget that Tim Berners-Lee is one of the founders of Curl Corporation, a pusher of some sort of proprietary web technology.

It's good that you and robla point out that this proposal establishes a policy where none existed before. In this sense, what the W3C is doing is definitely a good thing -- they just need to come to the right conclusion and not adopt any standards which preclude free software implementations. Many of the comments I've read in the archive get this right ... perhaps your comments here helped bring that about. Thanks.

I think the avalanche of late comments points to one thing: people trusted the W3C, and thus hadn't kept a close eye on them. Following standards is a lot of work, so usually only those who are closely involved with a particular technology keep up. That's why the standards review procedure didn't attract too much attention. Fortunately, someone noticed this proposal and its implications before the comment period expired. The W3C is on the verge of becoming an untrustworthy organization, but there's still a chance to try to help them put this in perspective.

I know I will be sending my comments in sometime before the 11th ... I need to find the time to read through all the documents and come up with a comment that is close to being as eloquent as many of those I've read.

The W3C is trustworthy, posted 3 Oct 2001 at 04:11 UTC by piman » (Journeyer)


Fortunately, someone noticed this proposal and its implications before the comment period expired. The W3C is on the verge of becoming an untrustworthy organization, but there's still a chance to try to help them put this in perspective.

This is the kind of W3C-demonizing I can't stand. Why is the W3C on the verge of becoming an untrustworthy organization? They had the draft on their site for a month and a half. It's not their fault that the blind masses don't bother to read for themselves and only found out a day before the deadline. This doesn't make the W3C untrustworthy. It means the people who should care about tracking stuff like this (if they really do care that much) are just lazy or uninformed.

You're right, the avalanche of comments did point to one thing: that no matter how much people bitch and moan about wanting free standards on news or discussion sites, most of them never take the initiative.

The price of freedom is eternal vigilance.

W3C did play games here, posted 3 Oct 2001 at 07:19 UTC by db » (Master)

It wasn't Slashdot or Linux Today which really publicized this; it was Adam Warner, who posted the first generally available analysis I've seen. That found its way through a lot of mailing lists, and then, relatively late in the game, to those other sites.

But it is curious that nobody else really picked up on this until it was pretty late. A few reasons come to mind.

  • First, the general way that W3C generates recommendations involves over a year elapsing IN PUBLIC SIGHT, so there's usually no need to rush a review cycle. It's quite unusual to see any document get ramrodded through this quickly.

  • Second, the information there really soft-pedaled the impact on free software. It's not like they said, in bold letters, "we want to adopt a closed systems model."

  • Third, 9/11/2001 has been taking over a lot of mind share, and not just the grey cells that had been preoccupied with the Gary Condit media sharkfest. And that was just when those other cells started to really get back from August vacation, too.

W3C trustworthiness ... people In The Know have been raising issues there for a while now. There's a lot of stuff going on behind that code of omerta protecting members from public view, and some of it certainly has raised red flags in the past. What's new is that more people outside W3C are now able to see how much its decision process can be skewed against the Public Interest and in favor of larger vendors.

I think a patent policy is needed, but it shouldn't be rushed, and it should have a strong bias towards "Zero Cost Licensing". (I get confused by their so-called "Royalty Free Licensing", which I've always seen to mean huge up-front fees, in contrast to per-unit royalties.) And it shouldn't have so many dubious details and huge loopholes as this current draft.

Demonizing?, posted 3 Oct 2001 at 18:24 UTC by forrest » (Journeyer)

piman, what we have here is our own failure to communicate.

When I said that the W3C risks becoming untrustworthy, I meant that this organization which has been a powerful advocate of open standards would, if the draft proposal passes unmodified, no longer be that. They could no longer be trusted to influence companies to adopt data formats that are open and can be manipulated with free tools.

I finally figured out from your comment that you believe that I must be implying that the review process is some sort of conspiracy to avoid public input. That's not what I meant at all. The communication process didn't work well, but that's not the W3C's fault, and I never meant to imply that it was.

Communication is difficult. That's just a fact of life.

Some thoughts, posted 4 Oct 2001 at 07:44 UTC by raph » (Master)

1. While the aims of the W3C are reasonably noble, most of the people actually on the committees are representatives from proprietary software companies. Thus, a patent policy such as this comes as no surprise.

2. "Open standards" has never meant freely implementable. There are lots and lots of open standards that require patents. Freely implementable standards are far better, of course, but the world is not set up to pay people (such as xiphmont) who put time and effort into them.

3. The IETF has for a while had a patent policy not unlike the one being considered here. However, the IETF, having been burned several times, now has a strong preference for unencumbered technology when it is a viable alternative.

4. Some of the patents under discussion are the bottom of the patent scum-barrel. A particularly egregious example is Apple's patent on classic alpha compositing. Sickening, isn't it?

I'm mostly just glad this issue is getting attention. It's important to choose patent-free technology, but most people have no idea.
