benad is currently certified at Apprentice level.

Name: Benoit Nadeau
Member since: 2002-04-03 17:23:09
Last Login: 2014-04-22 17:48:00


Homepage: http://benad.me/

Notes:

I am Benoit Nadeau, jr. eng. in Software Engineering,
living and working in Montreal, Canada.

Projects

Recent blog entries by benad


My Ubuntu Tips

Now that I’ve used an Ubuntu Linux laptop for a couple of months, I’ve realized that in day-to-day use I don’t need many, if any, arcane commands on the command line. And while I can edit source code to fix bugs myself (and I did, for the duplicity backup tool) or install software by compiling its source code, I rarely needed to do so. In fact, most of my customizations and installed software were straightforward.

XPS 13 Specific Tips

For some unknown reason (licensing with Microsoft?), the “super” key (the one that still has the Windows logo) cannot be used to show the Ubuntu launcher by tapping the key. As mentioned on the Ubuntu forums, to restore that functionality, simply remove the dell-super-key package.
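
Removing it is a one-liner on the command line (the package name is as given on those forums):

  sudo apt-get remove dell-super-key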

When I use an external monitor, I typically use the internal screen at half its resolution with a global UI scaling factor of 1, and without the external monitor I set it back to its native resolution with a UI scaling factor of 2. To do so on the command line, without having to log out, I use the commands below. The external display is DP1 and the internal one is eDP1. Note that the touch screen’s virtual mouse might be a bit confused until you log out. Also, you may have to restart some programs for them to pick up the scaling changes.

Half-res with scale of 1:

  # Unity's per-monitor scale-factor is expressed in eighths: 8 means 1x
  xrandr --output eDP1 --mode 1600x900 && \
  gsettings set org.gnome.desktop.interface scaling-factor 1 && \
  gsettings set com.ubuntu.user-interface scale-factor "{'DP1': 8, 'eDP1': 8}"

Full-res with scale of 2:

  # 16 in eighths means a 2x scale for the internal HiDPI panel
  xrandr --output eDP1 --mode 3200x1800 && \
  gsettings set org.gnome.desktop.interface scaling-factor 2 && \
  gsettings set com.ubuntu.user-interface scale-factor "{'DP1': 8, 'eDP1': 16}"

I’m using the Plugable USB-C Hub, and for some reason the drivers tend to get confused by the hub’s network and audio ports if I just unplug the USB-C cable. To do a “clean eject” of the hub’s two internal USB hubs, I run the following. Use lsusb -t before and after connecting the hub to find the bus IDs in your setup.

  # Bus IDs 3 and 4 match my setup; find yours with "lsusb -t"
  sudo bash -c 'for i in 3 4 ; do echo 1 > /sys/bus/usb/devices/$i-1/remove ; done'

GUI Themes

While there are countless GUI themes on Linux, the one I currently use is Adapta with the Noto fonts. It mimics the latest Android look, and while I’m not a huge fan of the latest “Material” look, at least it avoids the ugly “orange everywhere” Ubuntu look.

To set it up follow this guide. After that, install the packages unity-tweak-tool, fonts-noto, fonts-noto-mono, fonts-noto-hinted and fonts-noto-cjk, and finally use the Unity Tweak Tool to select Adapta as the theme and the various Noto fonts for the system fonts.
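
Assuming all of these packages are in the standard Ubuntu repositories, the installation step is a single command:

  sudo apt-get install unity-tweak-tool fonts-noto fonts-noto-mono \
    fonts-noto-hinted fonts-noto-cjk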

In Ubuntu parlance, the menu bar items on the top-right are called “indicators”. Any program can add indicators when launched, but “pure” indicator programs only need to be launched once and from then on they will automatically start each time you log in.

The indicator-multiload displays graphs of CPU load, I/O, network and so on.

The indicator-cpuprefs displays the current CPU frequency. Also, on some CPUs, you can select a specific frequency, or select the CPU “governor”. For example, on my Kaby Lake processor I can set it to “ondemand”, “conservative”, “performance” or “powersave”.
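
The same can be done without the indicator through sysfs; a minimal sketch (this touches a single core; repeat over cpu0, cpu1, and so on for all of them):

  # List the governors supported by CPU 0, then switch it
  cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_available_governors
  echo powersave | sudo tee /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor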

The my-weather-indicator simply displays the current weather, and lets you view the weather forecast.

The psensor program tracks your system’s various temperature sensors and when running adds a useful indicator menu with the current temperature.
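
For reference, here is how I install the ones that are in the standard Ubuntu repositories; indicator-cpuprefs and my-weather-indicator come from third-party PPAs, so their exact installation steps may vary:

  sudo apt-get install indicator-multiload psensor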

Great Apps

My current favourite editor, Visual Studio Code from Microsoft, is based on Electron (similar to Atom, but simpler and more lightweight), and is available as a Debian and Ubuntu package.

A beta version of Skype for Linux, again from Microsoft (who knew?), is now available with both microphone and camera support.

NixNote is an open-source client for Evernote. It is generally more convenient to use than the Evernote web client, especially for offline use.

There is a desktop application for the DevDocs web site. It is a convenient way to search and read documentation for various programming languages and tools, even in offline mode. I found it to be a compelling and often superior replacement for Dash.

Ubuntu has a built-in screenshot tool, but for recording the screen as a video file or a GIF there is Peek. It is a breeze to set up and use.

If you use Twitter, Corebird is a nice Twitter client, and can be installed on Ubuntu by following the instructions here.

Other Tips

To open files or URLs from the command-line, you can use xdg-open. To view a man page in a GUI window, you can run xdg-open man:crontab.5.
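
For example (the URL is just my home page, and the man page is the one mentioned above):

  xdg-open https://benad.me/    # opens in the default web browser
  xdg-open man:crontab.5        # shows the man page in a GUI help viewer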

The built-in VNC server is quite limited and doesn’t support the better image compression techniques. Instead, you can use x11vnc by following these instructions for Ubuntu.
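
Those instructions boil down to a command along these lines (a sketch using standard x11vnc options):

  # Share the current X session, keep serving after clients disconnect,
  # and require the password saved beforehand with "x11vnc -storepasswd"
  x11vnc -display :0 -forever -usepw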

Syndicated 2017-03-19 00:24:21 from Benad's Blog

My Linux Laptop

Well, I got impatient and waited only two months (rather than a year) to buy my new laptop. I went with the “New” Dell XPS 13 Developer Edition, the 9360 model to be exact. The model I picked came bundled with a Dell-supported version of Ubuntu 16.04 LTS, and so is $150 CAD cheaper than with Windows 10 Pro. Also the UEFI firmware seems to be fully unlocked.

Apart from a few infrequent glitches (I’ll write a separate post about working around these), everything works great out of the box. All the ports and components work fine, and Ubuntu easily supported my exFAT and NTFS formatted USB 3 drives. The processor is quite fast for a 13” laptop, battery life is amazing, and it has a full 16 GB of RAM, a very fast 512 GB SSD, and a touch screen that effectively has more pixels than my mid-2013 MacBook Pro’s “retina display”.

Speaking of high-resolution displays, Ubuntu’s support for HiDPI is decent at best. While all GNOME GTK 3 programs properly support 2x UI scaling, Qt 4 programs still draw tiny toolbar buttons, and anything else (or older) barely supports UI scaling, if at all. There is no way to scale individual windows or programs, other than running a local VNC server with the program running in it and scaling that in a VNC client, and you can’t set different scaling values on different displays. As a compromise, when using an external 1080p screen I change the resolution of my internal screen to exactly half its normal size and use a UI scaling of 1x everywhere. At that resolution, the bilinear scaling is barely noticeable, and I’m looking more at the external display anyway.

For my software development, I already feel more productive on it than I was on my Mac. Most development tools feel “native”, even in the rare cases when I have to recompile them because they’re not already available in Ubuntu’s repositories or as a Linux binary. Setting up custom server-like services, such as a second SSH server running on a separate port, is trivial compared to macOS. New Mac development tools are targeting web front-end development and are quite expensive, so apart from a few Mac programming tools, I don’t miss them much. And since almost everything on Linux is open source, customization for me goes as far as changing the source code and recompiling.
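
As a sketch of that SSH example (the port number is an arbitrary choice; a permanent setup would use a copy of sshd_config and its systemd unit):

  # Start a second, independent SSH daemon on port 2222;
  # sshd must be invoked with an absolute path
  sudo /usr/sbin/sshd -p 2222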

Overall, the transition was much faster and easier than expected, even compared to transferring from one Mac laptop to another. It also feels quite nice to have a built-in socket for a Noble lock, USB 3 ports, an SD card slot, and even a battery charge indicator, all abandoned by Apple over the years. Compared to the nearly equally priced MacBook Pro 13” with Touch Bar (2016), the XPS 13 has complete (and public) service manuals with an iFixit repairability score of 7/10 versus an abysmal 1/10 for the MacBook Pro, twice the RAM, a 7th-generation i7 processor versus a 6th-generation i5, an entire screen that is touch enabled (while keeping physical function keys) versus just a “touch bar”, and a normal combination of ports versus “courage” and lots of expensive dongles.

Sure, I can’t unlock my laptop with my Apple Watch anymore and enjoy macOS’ iPhone integration, nor does the branding of a Dell laptop impress anyone, but if you actually have to work or do serious software development, it’s really a “Pro” laptop.

Syndicated 2017-02-11 19:48:36 from Benad's Blog

On the Usability of Strings

I’ve recently read an article about why programmers should favour Python 2 over Python 3 (“The Case Against Python 3”), and most of it is an incoherent rant that exposes the author’s deep misunderstanding of how bytecode is used internally in scripting languages and how the “market forces” of backwards compatibility work against new languages. Somebody else already rebutted those arguments better than I would, and unlike the original author’s, his later edits are clear and don’t involve “it was meant as a joke”. One interesting and valid technical argument remains: Python 3’s opaque support for Unicode strings can be unintuitive for those used to manipulating strings as transparent sequences of bytes.

Many programming languages came from an era where text representation was either for English, or for Western languages that would neatly fit all their possible characters in 8-bit values. Internationalization, then, meant at worst indicating what “code page” or character encoding the text was in. Having started programming on 90s Macintosh computers, I knew the Pascal string as the go-to string memory representation, where the first byte indicates the string length. This meant that performing the wrong memory manipulation on the string, using the wrong encoding to display it, or even attempting to display corrupted memory would at worst display 255 random characters.

There is a strong argument that UTF-8 should be used everywhere, and while it is a good occasion to educate programmers about Unicode (for a more complete “Unicode for programmers”, see this article and this more recent guide), doing so seems to conflate two different design (and usability) issues: what encoding should be used to store human-readable text, and what abstractions (if any) should programming languages offer to represent strings of text?

The “UTF-8 Everywhere” document already has strong arguments for UTF-8 as the best storage format for text, and looking at the popularity of UTF-8 in web standards, all that remains is to move legacy systems to it.

For strings in programming languages, you could imagine one that has absolutely no support for any form of strings, though it’s difficult to sell the idea of a language that doesn’t even support string literals or a “Hello World” program. The approach of “UTF-8 Everywhere” is very close to that, and seems to indicate the authors’ bias towards the C and C++ languages: transparently use UTF-8 to store text, and shift the burden of not breaking multi-byte code points back to the programmer. The argument that counting characters, or “grapheme clusters”, is seldom needed is misleading: splitting a UTF-8 string in the middle of a code point will break the validity of the UTF-8 sequence.
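
This is easy to demonstrate with a Python 3 one-liner (used here purely as an illustration):

  # Keeping only the first byte of the two-byte code point for "é"
  # leaves an invalid UTF-8 sequence, and decoding it fails
  python3 -c "b = 'é'.encode('utf-8'); b[:1].decode('utf-8')"
  # UnicodeDecodeError: 'utf-8' codec can't decode byte 0xc3 ...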

In fact, it can be argued that programming languages that offer native abstractions for text strings not only give greater protection against accidentally building invalid byte representations, but also get a chance to do a myriad of other worthwhile optimizations. Languages that present strings as immutable sequences of Unicode code points, or that transparently use copy-on-write when characters are changed, can optimize memory by de-duplicating identical strings. Even if de-duplication is done only for literals (like in Java), it can greatly help with memory reuse in programs that process large amounts of text. The internal memory representation of strings can even be optimized for size based on the biggest code point used in them, like Python 3.3 does.
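
That size optimization is easy to observe; the exact byte counts depend on the interpreter version, but they grow with the widest code point in the string:

  # Three strings of 10 characters each, with increasingly wide code points
  python3 -c "import sys; print([sys.getsizeof(c * 10) for c in 'aé😀'])"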

Of course, the biggest usability issue with using abstracted Unicode strings is that it forces the programmer to explicitly state how to convert a byte sequence into a string and back. The article “The Case Against Python 3” mentioned above argues that the language’s runtime should automatically detect the encoding, but that is highly error-prone and CPU intensive. “UTF-8 Everywhere” argues that since both sides use UTF-8, the conversion boils down to a memory copy, but then breaking code points is still a risk, so you’ll need some kind of UTF-8 encoder and parser anyway.

I personally prefer the approach of most modern programming languages, including Perl, Python 3, Java, JavaScript and C#: support both a string and a “char” type, and force the programmer to explicitly mention the input and output encodings when converting to bytes. Java and JavaScript are older, made when it was naively thought that the biggest code point would fit in 2 bytes, before these days of Emojis, so with their UTF-16 strings and 2-byte characters they can still let you accidentally break 3- or 4-byte code points. Also, it would be nice to do like C# and assume by default that the encoding used when decoding or encoding is UTF-8, instead of having to say so explicitly each time like in Perl 5 and Java. Still, providing those string and “char” abstractions while using UTF-8 as the default byte representation reduces the burden on programmers when dealing with Unicode. Sure, learning about Unicode code points and how UTF-8 works is useful, but it shouldn’t be required from novice programmers who write a “Hello World” program that outputs a Unicode Emoji to a text file.
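
For instance, in Python 3 the encoding only appears explicitly at the I/O boundary (the file name is just a placeholder):

  # The string holds code points; bytes only exist in the resulting file
  python3 -c "open('hello.txt', 'w', encoding='utf-8').write('Hello \U0001F600')"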

Syndicated 2016-12-26 15:11:21 from Benad's Blog

The Dongle Generation

Apple finally updated their MacBook Pros, and professionals weren’t impressed. Last time, with the late 2013 model, they reduced the number of ports, but having bought one I managed to live with it using a few dongle adapters. While I like that they moved to USB-C, I am annoyed that they moved solely to USB-C, since I would have to buy a new set of dongles, not to mention USB-A to USB-C adapters for “normal” USB devices. Beyond that, the specs are average for the price, and the “touch bar” has little to no use for a developer who frequently uses the function keys.

All that being said, I’m not planning to buy a new laptop until roughly a year from now. In the meantime, it does raise the question of whether a MacBook Pro, or macOS in general, is what I need for my programming work. Each time I upgrade macOS I have to recompile countless packages from MacPorts, to the point where I realize that almost all of my work is done with command-line tools easily available on Linux. I have to constantly run a Windows 7 virtual machine, so having Windows 10 in that strange Boot Camp + Parallels Desktop setup doesn’t seem to be necessary.

So I’m seriously considering buying a Linux laptop as my next programming laptop. Something like the New XPS 13 Developer Edition, the one that comes with Ubuntu, would be nice, and hopefully by next year they will have fixed that annoying coil noise. If I feel adventurous I might take a typical ThinkPad with well-known components supported in Linux and install Linux myself. Yes, I get the irony that a “Mac guy” would buy what used to be an IBM laptop. Either way, I might both save money (even more in the former case, since I don’t pay for Windows) and potentially time, since most of my development tools would be easy to set up. I might still have to buy some dongle for an Ethernet network connection if I get a thinner laptop, though interestingly both my Apple DisplayPort to DVI and Thunderbolt to Ethernet adapters may still work. Or I could even go with a thicker “portable computer” (like the ThinkPad P50) and use a docking connector, the 90s solution to dongles… In fact, if I’m willing to let go of thinness and battery life, like I did with my first 17” MacBook Pro, I’d be able to get more storage and 32 GB of RAM.

I should admit that I have no experience with a Linux laptop or desktop, in the context of one attached to a display for day-to-day use. All my Linux systems were either “headless” or running in virtual machines, so I can’t tell whether dealing with Xorg configuration files is going to be difficult or not. The same can be said for multiple displays, Bluetooth mice, and so on. But from what I’ve read, as long as I stay away from smaller ultrabooks I should be OK.

I’m not going to stop using Macs for home use, though the price increases may restrict my spending on a much-needed upgrade to my mid-2011 Mac mini. Home use now seems the natural fit for Macs anyway. Long gone is the business-oriented Apple of the mid-90s.

Syndicated 2016-11-10 08:30:07 from Benad's Blog

HTTPS, the New Standard

The “web” used to be simple. A simple plain-text protocol (HTTP) and a simple plain-text markup format (HTML). Sadly, after 20 years, things are not as simple anymore.

Nowadays, it is commonplace for ISPs to inject either “customer communications” or downright advertisement into unencrypted HTTP communications. Using web sites from unencrypted or “open” WiFi is often a vector for a malicious user to inject viruses into any web page, or to steal passwords and login tokens for popular web sites. On a larger scale, governments now have the capability to do deep packet inspection to systematically either censor or keep a record of all web traffic.

So, indirectly, my simple, unencrypted web site can become dangerous.

Buying an SSL certificate (actually TLS) used to be something both expensive and difficult to set up. Now with the help of “Let’s Encrypt”, any web site can be set up to use HTTPS, for free. Sure, the certificate merely says that HTTPS traffic came from the real web site, but that’s good enough. And for a personal web site, there is limited value in buying one of those expensive “Extended Validation” certificates.

This is why my web site is now using HTTPS. In fact, HTTPS only, though by doing so I’ve had to cut off browsers like Internet Explorer 6, since they do not support modern, secure cryptographic algorithms. It breaks my rule of graceful degradation, but ultimately the security of the people who visit my web site is more important than supporting their 15-year-old web browsers.

What is sad about this, though, is that as older cryptographic algorithms become obsolete, so do the machines too old to support the new algorithms, let alone those “Internet appliances” that aren’t supported anymore. This means that, unlike with the original idea of simple, plain-text protocols, web browsers have to be at most a decade old to be usable.

And still, HTTP with TLS 1.2 is merely “good enough”. There are simply too many root certificates installed in our systems, with many from states that could hijack secure connections to popular sites by maliciously creating their own certificates for them. HTTP/2 is a nice update, but pales compared to modern techniques used in QUIC. Considering that even today only a fraction of the Internet is using IPv6, it may take another decade before HTTP/2 becomes commonplace, let alone QUIC.

For now, enjoy the green lock displayed on my web site!

Implementation Notes

The site Cipherli.st is an excellent starting point to configure your web server for maximum security. I also used the Qualys SSL Labs SSL test service to verify that my server has the highest security grade (A+).
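
For illustration, the kind of Apache directives such a configuration ends up with looks like this (a sample in the spirit of those guides, not my exact configuration):

  # Disable legacy SSL/TLS protocol versions and enforce the server's
  # cipher preferences (mod_ssl)
  SSLProtocol all -SSLv3 -TLSv1 -TLSv1.1
  SSLHonorCipherOrder On
  # HSTS: tell browsers to always use HTTPS for this domain (mod_headers)
  Header always set Strict-Transport-Security "max-age=63072000; includeSubDomains"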

I was also tempted to move from Apache to Caddy, as Caddy supports HTTP/2, QUIC and even Hugo (what I use for the blog section of this site), but then I remembered that I specifically chose Apache on Debian for its long-term, worry-free security updates, compared to a bleeding edge web server.

Syndicated 2016-09-13 01:15:00 from Benad's Blog


benad certified others as follows:

  • benad certified benad as Journeyer
  • benad certified llasram as Journeyer
  • benad certified shlomif as Journeyer

Others have certified benad as follows:

  • benad certified benad as Journeyer
  • llasram certified benad as Journeyer
  • pasky certified benad as Apprentice
