Older blog entries for muks (starting at number 22)

Tinyproxy developer list membership lost

With a brown paper bag over my head, I’m sorry to announce that the Tinyproxy developer mailing list’s membership information has been lost. The list was deleted by mistake today. The list archives were backed up and have been restored to a freshly created list, but unfortunately, the list’s membership wasn’t backed up (it didn’t make the list of things to back up!). I don’t know how else to announce to all the list members that they have to re-subscribe.

:(

Syndicated 2008-09-17 18:10:49 from Mukund Blog

Tinyproxy 1.6.4 released

Tinyproxy 1.6.4 was released recently, nearly 4 years after the previous release. It contains several bug fixes, and current users are encouraged to upgrade to it.

For those who haven’t heard of Tinyproxy, it is a lightweight HTTP proxy daemon for POSIX operating systems, written with special consideration for low-resource environments such as embedded systems. It is also easy to modify.

Syndicated 2008-09-12 11:54:51 from Mukund Blog

State of Transmission on Windows

I’ve been working on getting Transmission up and running on Windows. After a ton of patching, it now builds and works to an extent under Wine. There are still some bugs in the libevent and I/O code which need to be ironed out. However, I don’t have the mojo to complete it in a hurry. Debugging issues under Windows sucks. And doing things differently for Windows sucks.

On a related note, it’s easy to build a GCC cross-compiler under Linux for building win32 apps. One can build GTK+ apps and make installers for them, all from the comfort and elegance of Linux. However, an up-to-date document of the process, with the gotchas spelled out, would help programmers; I’ll post a link to such a document shortly.
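Until then, here is a rough sketch of the kind of process such a document would cover; the package names and host triplet below are from Fedora’s MinGW tooling and should be treated as assumptions, not exact instructions:

```shell
# Cross-build a win32 GTK+ app from Linux (a sketch, not gospel).
# Assumes Fedora's MinGW cross toolchain; package names are illustrative.
yum install mingw32-gcc mingw32-gtk2

# Point autotools at the cross compiler via the host triplet
./configure --host=i686-pc-mingw32 \
    --prefix=/usr/i686-pc-mingw32/sys-root/mingw
make

# Smoke-test the result under Wine before building an installer
wine src/myapp.exe
```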

Syndicated 2008-09-11 11:29:39 from Mukund Blog

New Banu logo

Hylke Bons drew a new logo for Banu yesterday. I had asked him for a cuddly brown bear and, adapting Linus's words about Tux, said the bear should look contented and happy, as if it had just had a lot of honey :) Hylke replied within 2 hours with this logo image, which is the sweetest bear I've ever seen. It even seems to be hiding a jar of honey behind it :) It's amazing he created it in so little time. Thank you Hylke!

Banu logo

Syndicated 2008-07-25 05:48:00 from Mukund's blog

Firefox 3 and SSL certificates

I should not single out Firefox 3 for this issue, but because it's the browser I use, it gets the criticism. Recent UI changes in web browsers for handling self-signed and other “invalid” SSL certificates leave a lot to be desired.

Take my use-case. I want to use an HTTPS-secured connection for bugzilla.banu.com (a website I set up for my projects). I don't have the dough to get my wildcard certificate for *.banu.com signed by a CA, so I use a self-signed certificate. That self-signed certificate does not make the Bugzilla website, accessed via HTTPS, any more malicious to an end-user than the main Banu website at www.banu.com, accessed via plain HTTP.
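For reference, creating such a self-signed wildcard certificate is a one-liner with OpenSSL (the file names here are just examples):

```shell
# Generate a key and a self-signed wildcard certificate valid for a year;
# no CA signs it, so browsers will of course flag it as "invalid"
openssl req -x509 -newkey rsa:2048 -nodes \
    -keyout banu.key -out banu.crt \
    -days 365 -subj '/CN=*.banu.com'
```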

I want new visitors who use my Bugzilla to be able to use it as any other plain-old website without suggestion that it's somehow malicious. Google or Wikipedia for example wouldn't like it if the browser screamed “This host uses an invalid security certificate” when someone visited http://www.google.com/ or http://en.wikipedia.org/.

HTTPS is simply an access protocol here. It can serve both authenticated and unauthenticated sessions. The whole issue would look even more absurd if, instead of HTTPS, we had something like STARTTLS for HTTP. Most web surfers don't know the difference between HTTP and HTTPS; they go by what their browser tells them about whether a website is to be trusted. The current wording of the messages shown when a certificate isn't signed by a known CA suggests that the remote website is somehow malicious. A website using a self-signed certificate need not be malicious; in fact, most websites are not. Browsers will gladly present any content over plain HTTP, yet an unknown certificate is a hard stop, and in a browser such as Firefox 3 you have to click through a whole series of steps to get to the website.

A more usable UI would indicate that the session is protected only when the certificate is deemed valid (via a padlock icon, the green/blue Extended Validation info, or the yellow URL bar). Otherwise, it would let the user simply browse the website, with no secure-connection indicator in the UI and no extra steps to accept a self-signed certificate.

This would raise some questions. What about forms which post to HTTPS URLs? Should the browser stop you when the form's target has an “invalid” certificate? No, that would serve no purpose: users hardly ever check a form's action URL to see whether it's SSL-protected before submitting. They trust the page containing the form in the first place to do the right thing.

With the change suggested above, a user visiting my Bugzilla website would see no icons or other UI indicators in her browser saying that her connection is authenticated, even though she's using HTTPS. Nothing would discourage her from using my website. On the other hand, if I add my self-signed wildcard certificate to my list of personal certificates in Firefox, I get an indication that my session is authenticated.

Update: In response to my own post, it occurred to me that if the above were implemented, someone could hijack the connection, force a renegotiation with a malicious server, and capture posted form fields: your form might be served by an authenticated website, but when you submitted it, a MITM attack could direct the post to a different web server. So this is probably a bad idea.

Syndicated 2008-07-05 13:54:00 from Mukund's blog

Some cool programmer software

Here are some more software programs for Linux that you may find useful. I used most of them at my last place of work.

  • Coverity Prevent is a non-free static analysis tool for C and C++, similar to lint and Sparse, with a pretty good signal-to-noise ratio. It checks for and catches many programming errors, along with the occasional false positive. It's a good tool to have around if your company can afford it. Sparse is also useful with a lot of C programs.
  • If you are a Git user, msmtp is something you can use as a helper to git-send-email, letting you send email through an SMTP server that only does TLS.
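A minimal setup looks something like the sketch below; the host name and account details are placeholders, not a real configuration:

```shell
# Sketch: wire msmtp into git-send-email (all names are placeholders).
# First, an ~/.msmtprc along these lines:
#
#   account default
#   host smtp.example.com
#   port 587
#   auth on
#   tls on
#   tls_starttls on
#   user muks
#   from muks@example.com
#
# msmtp refuses config files readable by others:
chmod 600 ~/.msmtprc

# git-send-email treats an smtpserver value that starts with "/" as a
# sendmail-like program, so just point it at the msmtp binary:
git config --global sendemail.smtpserver /usr/bin/msmtp
```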
  • If you are an Emacs user working with projects in C and C++, you probably already know of the wonderful (and non-free) Xrefactory, a source code navigation and refactoring tool. I had looked for something which came close to IntelliSense and Visual Assist X on Linux, and Xrefactory is it. The commercial version is much better than the $free version if you're gonna try it. The program's maintainer has expressed a willingness to release the $free version under a free software license if someone wants to package it for Debian. Cscope with the xcscope.el module is also pretty decent, but it won't navigate scope the way Xrefactory does.
  • Wireshark rules! You already know that if you are a network programmer, but even an ordinary web user can get a lot of bang from it. For example, say a Flash object on a website downloads some data (.flv?) from the web server and you want to know what URL it's fetching. Or you want to get at a URL (to download it using curl) that is constructed by JavaScript and works only when the browser also sends several cookies along. You can dig out all of this by doing a packet capture with Wireshark and inspecting what you've captured: Wireshark breaks the packets up into protocol-specific layers with plenty of annotations. The EFF has an article on how you can detect packet spoofing by ISPs using Wireshark. Wireshark does parse slowly if you're working with multi-million-frame captures, and its filter expression syntax is pretty basic; indexing of pcap files would probably help with the speed issue. Scapy is another useful tool that lets you interactively construct and deconstruct packets.
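For the Flash example above, Wireshark's command-line sibling tshark can pull the requested URLs straight out of a live capture; the interface name below is an assumption:

```shell
# Print the Host header and URI of every HTTP request seen on eth0;
# together they reveal the URL the Flash object is fetching.
# (Needs root; -f is a capture filter, -R a read/display filter.)
tshark -i eth0 -f 'tcp port 80' -R http.request \
    -T fields -e http.host -e http.request.uri
```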

Syndicated 2008-07-05 02:36:00 from Mukund's blog

history meme

On my workstation, which is one of two machines I use:

[muks@jurassic ~]$ uname -a
Linux jurassic 2.6.24.4-64.fc8 #1 SMP Sat Mar 29 09:15:49 EDT 2008 x86_64 x86_64 x86_64 GNU/Linux
[muks@jurassic ~]$ history | awk '{a[$2]++}END{for(i in a){print a[i] " " i}}' | sort -rn | head
326 ls
168 cd
92 git
74 clear
66 joe
30 grep
28 svn
15 ssh
15 cat
13 cp
[muks@jurassic ~]$

I seem to use clear a lot!

Syndicated 2008-04-16 08:42:11 from Mukund's blog

So the nerd is “in” now?

“Touch My Body” has hit #1 this week on the Billboard Hot 100. She has all of my 99 cents. The song is a tune, and it seems we’re the demographic (you’ll have to watch the video for that).

What the heck is 802.11n? You don’t want me to compile your kernel? ;)

Syndicated 2008-04-06 02:32:13 from Mukund's adventures

Try Sparse

Try Sparse to analyze your C code. Unlike Splint, it works well with the GTK libraries and reports a wide range of issues. It also fits neatly into the autotools build system: run your autotools steps as usual to prep for a make, then call:

make CC=cgcc

cgcc is a wrapper which invokes sparse first, and then gcc. Then go through the warnings and fix them. :)
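To get a taste of what it catches, here's a contrived example of the kind of thing Sparse flags while gcc stays quiet by default:

```c
#include <stddef.h>

/* Sparse warns "Using plain integer as NULL pointer" on the return
 * statement below; gcc compiles it without a peep. */
static char *find_nothing(void)
{
    return 0;   /* Sparse would rather see NULL here */
}
```

Running `make CC=cgcc` on a tree containing this prints the Sparse warning alongside gcc's usual output.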

Syndicated 2008-03-28 23:24:33 from Mukund's adventures
