Older blog entries for tgw (starting at number 8)

As was posted on Slashdot yesterday, and here a day before that, the CalTech-MIT Voting Technology Report was released a few days ago. I haven't had the chance to read through the entire report. The fonts used do not render well when printed on my DeskJet printer - or when displayed on my laptop screen - so the text of the report is difficult to decipher.

However, the one section that I have been able to read through is contained in pages 60 through 66. This section introduces their AMVA, which stands for "A Modular Voting Architecture". The authors begin the section by stating:

This section presents a new framework - a reference architecture - for voting that we feel has many attractive features. It is not a machine design, but rather a framework that will stimulate innovation and design. It is potentially the standard architecture for all future voting equipment.
After reading this I thought, "Hmm. Interesting. Let's see what they came up with." I was then more than a little amused to realize that this "standard architecture for all future voting equipment" is almost an exact duplicate of a voting system design I had posted online three and a half months ago.

For the fourth version of TDP Notes I had written up a new section called - ironically enough - The Future of Voting Systems. In it I described Hybrid Paper/Electronic (HPE), Paper-to-Electronic (P2E), Electronic-to-Paper (E2P), Electronic-to-Electronic (E2E), and Peer-to-Peer (P2P) voting systems. (This page is also cached at Google. Scroll down to see the section I'm referring to.)

Of these, the P2E and E2P descriptions simply laid out what others had already suggested or implemented. However, the E2E and P2P explanations were new - as was the terminology I was using. Of particular interest is the E2E design I laid out.
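If the names alone don't make the taxonomy clear, here is a minimal sketch of how I read these categories - as pairs of (how the ballot is marked, how it is cast and counted). The field names and the characterizations below are my own illustration, not definitions taken from TDP Notes or from any standard.

```python
from dataclasses import dataclass

@dataclass
class VotingSystemType:
    """Illustrative (ballot-marking medium, casting/counting medium) pair."""
    name: str
    marked_on: str   # medium the voter uses to express choices
    counted_as: str  # medium in which the ballot is cast and tallied

# Hypothetical encoding of the TDP Notes categories - not a formal spec.
SYSTEM_TYPES = [
    VotingSystemType("HPE", "paper + electronic", "paper + electronic"),
    VotingSystemType("P2E", "paper", "electronic"),    # e.g. scanned paper ballots
    VotingSystemType("E2P", "electronic", "paper"),    # e.g. machine-printed ballots
    VotingSystemType("E2E", "electronic", "electronic"),
    VotingSystemType("P2P", "electronic", "electronic (peer-to-peer)"),
]

for t in SYSTEM_TYPES:
    print(f"{t.name}: marked on {t.marked_on}, counted as {t.counted_as}")
```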

With the CalTech-MIT AMVA, the authors specify generic designs for both a paper-based and an electronic voting system. The paper-based system is simply a traditional mark-an-X-on-a-paper-ballot type of system, where the paper ballots are counted by hand. However, the AMVA electronic voting system design is almost an exact duplicate of my E2E design.

I don't believe I had ever heard of this type of split, two-step voting system design before I thought of it, and wrote about it, earlier this year. To my knowledge I was the first to publicly suggest this type of design, when version 0.4 of TDP Notes was posted online in March 2001. I realize someone else may have publicly suggested it before March, but if they did I was not - and still am not - aware of it.
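For anyone who wants to see the split made concrete, here is a minimal sketch of the two-step idea: one device generates a tamper-evident ballot record, and a separate device verifies and casts it. Everything here - the function names, the HMAC scheme, the data format - is a hypothetical illustration, not the AMVA specification and not actual TDP code.

```python
import hashlib
import hmac
import json

# Hypothetical shared key between the two modules; a real system would need
# proper key management and public-key signatures, not a hardcoded secret.
GENERATOR_KEY = b"demo-key-not-for-real-elections"

def generate_ballot(choices: dict) -> dict:
    """Step 1 (vote-generation device): produce a tamper-evident ballot record."""
    payload = json.dumps(choices, sort_keys=True)
    tag = hmac.new(GENERATOR_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def cast_ballot(ballot: dict, ballot_box: list) -> bool:
    """Step 2 (separate vote-casting device): verify the record, then cast it."""
    expected = hmac.new(GENERATOR_KEY, ballot["payload"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, ballot["tag"]):
        return False  # reject anything the generation device did not produce
    ballot_box.append(json.loads(ballot["payload"]))
    return True

ballot_box = []
ballot = generate_ballot({"mayor": "Candidate A"})
assert cast_ballot(ballot, ballot_box)  # voter reviews, then casts on device two
```

The point of the split is that the casting device never has to trust the generation device blindly - it can independently check what it is being asked to record.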

I will probably post more of a comparison between the AMVA and the E2E design to the TDP mailing list when I get the chance. In the meantime, I will be feeling more than a little pleased that the CalTech-MIT team has validated my work in such a positive way. The sad part is, I didn't have the benefit of a quarter-million-dollar grant to fund my efforts. :-<

The fourth version of TDP Notes is now online. For anyone interested in using open/free software for e-voting, e-democracy, and e-government, you might want to check it out.

Even though there is more material to write, this may be the last version created.

Any future news on Techno Democracy Project will be posted to the TDP Mailing List.

An updated version of TDP Notes is now available.

Earlier this year, before starting this Advogato journal, I traveled from Chicago out to Washington DC to attend several events related to voting. I've been meaning to document them somewhere. This seems to be as good a place as any.

In January 2000, I was able to attend a one-day symposium on The Future of Internet Voting, sponsored by The Brookings Institution and Cisco Systems. The event included a number of big-name people.

In February 2000, I was able to attend the founding assembly of the Internet Voting Technology Alliance. Although the IVTA seemed to have a solid beginning, it hasn't really accomplished much as of this writing.

In March/April 2000, I was able to attend a conference put on by the Voting Integrity Project. It was quite a good conference - mixing, in one place, people from a number of different disciplines related to voting.

In August 2000, I drove out to DC to attend a meeting sponsored by the US Federal Election Commission. The meeting was to review a partial draft of updated US Voting System Standards published by the FEC, and used by most of the US states to certify voting systems for public elections. It was a beneficial meeting.

In the next few weeks I plan to publish my feedback to the FEC on this partial draft of the updated VSS. I am fairly certain that I am the only person from the open-source/free-software realm working with the FEC on this. All the other people either work for a for-profit voting system vendor, are government officials, or perform certification tests on the voting systems.

My primary concern is ensuring that none of the FEC requirements prevent an open-source or free-software voting system from being certified. My secondary concern is trying to ensure they don't make the requirements unnecessarily narrow - and thus prevent voting systems with non-traditional designs from being built.

Two days ago I attended an "Informal Roundtable Discussion" with the title "The Internet in Power - Networked Governance or Virtual Disconnect?". It was facilitated by Steven Clift, the moderator of the 1400-subscriber Democracies Online Newswire. We were using facilities provided by the Center for Democracy and Technology. There were 23 people in the room and 2 more who teleconferenced in.

It was a good event - a new group of people that I hadn't been around before. Several people there had previously worked on Capitol Hill (for the US Congress), one gentleman was from the White House, and I sat next to Owen Ambur from the still-forming XML.gov. There were also people from various other groups, universities, and consulting companies.

The gentleman from the White House (I didn't catch his name) spoke about how no government agency is responsible for creating e-government solutions. A lot of agencies have partial responsibility, but there is no person or organization in the US government to act as a "hub" for the whole government's e-government initiatives. Another gentleman suggested that this would be the job of a government-wide CIO.

A few minutes later I was able to speak up and tell them that they were describing TDP. Among other things, TDP is intended to be exactly what the man from the White House described - a "hub" to facilitate the creation of e-democracy and e-government Open/Free software. I said that it doesn't make sense for 50 state governments - plus however many provincial, national, continental, and local governments - to all build basically the same pieces of software from scratch. Everyone is, very inefficiently, re-inventing the wheel. It makes more sense to create one application with the 95% of functionality everyone needs, and then let each government add its own 5% of customized functionality. This makes much more sense economically, and in other ways too.
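To sketch what that 95%/5% split could look like in code: a shared core application exposes hooks that each jurisdiction overrides with its own rules. The class, the hook, and the ballot-formatting example are all hypothetical illustrations of the architecture, not TDP code.

```python
from typing import Callable, List

class ElectionApp:
    """Hypothetical shared 'hub' application: the common 95%."""

    def __init__(self) -> None:
        # Each jurisdiction overrides only the hooks it needs: its own 5%.
        self.format_ballot: Callable[[List[str]], str] = self._default_ballot

    @staticmethod
    def _default_ballot(candidates: List[str]) -> str:
        # Default rule: list candidates in the order provided.
        return "\n".join(f"[ ] {name}" for name in candidates)

    def run(self, candidates: List[str]) -> None:
        print(self.format_ballot(candidates))

# One state's 5%: the same core application, with a local alphabetical rule.
app = ElectionApp()
app.format_ballot = lambda cs: "\n".join(f"[ ] {n}" for n in sorted(cs))
app.run(["Smith", "Jones", "Adams"])
```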

I stayed after and was able to have good discussions with Steve Clift and Owen Ambur. I had wanted to speak with some of the others, but they were out the door before I got the chance.

I just finished writing an article on VoteAuction.com & The Whack-A-Mole Defense. I also put together some quotes and links explaining the Whack-A-Mole technique. I've been up all night working on this. It's 8:00am - time to get some sleep.

Last week I attended an e-Voting Workshop here in Washington DC. It was sponsored by the Internet Policy Institute and featured a panel with quite a lot of big guns (big credentials) on it. Big credentials don't impress me much; competence and contribution do. This panel, however, produced some excellent dialog on the topics of e-voting and Internet voting.

Last week I was able to attend a meeting on Capitol Hill hosted by the Congressional Internet Caucus Advisory Committee. It was about Internet Voting. The panel consisted of Gary McIntosh of the National Association of State Election Directors, Jim Adler of VoteHere.net, Marc Strama of Election.com, Tony Wilhelm of the Benton Foundation, and Deborah Phillips of Voting Integrity Project. Dr. Lorrie Cranor of AT&T Labs was the moderator.

Just like most of the panel discussions I've been to over the past nine months, this one was an introductory-level look at Internet Voting. Information-wise, these panels tend to be pretty worthless for people like me who work with this stuff every day. However, they're a great place to network and meet new people who work in the same problem space.

A good thing about last week's meeting, in particular, is that it raised the level of visibility and understanding of Internet Voting among the 120 or so Congressional staffers who were in attendance. The ironic thing about it is that elections in the United States are controlled at the state and local level. So, there's a very limited amount that Congress can do when it comes to Internet Voting.

The main thing Congress can do is create permanent, ongoing funding for the FEC to keep pace with the rapidly changing nature of current technology and update their Voting System Standards (VSS) on an ongoing, yearly basis. Currently, the FEC needs to seek out special funding to update the VSS - so it happens very infrequently. Too infrequently. The original FEC VSS was completed in 1990. Only now, 10 years later, is it being updated. The 1990 standards are pretty useless when you try to apply them to modern client/server and Internet-based voting systems. Congress definitely needs to create ongoing funding for this.

9 Sep 2000 (updated 18 Sep 2000 at 17:20 UTC)

I just completed an article entitled Introduction to Open Source and Free Software. It is the most comprehensive introduction to Open Source, Free Software, and the differences between them that I am aware of.

The article explains the key terms, definitions, people, organizations, and acronyms of Open Source and Free Software in a way that both non-technical and technical people can understand. It also offers answers to "What is Open Source and Free Software?", "Why does this matter?", and "Does this approach really work?". There are also plenty of links out to other sites for readers to dig deeper on the topics that interest them.

The article is scheduled to be published in the September 2000 issue of The Bell newsletter. It is also available online at www.technodemocracy.org/people/tgw/docs/ossfs.html.
