Recent blog entries

28 Jul 2014 oubiwann   » (Journeyer)

The Future of Programming - Adopting The Functional Paradigm?

Series Links

Survivors' Breakfast

The previous post covered some thoughts on the future-looking programming themes present at OSCON 2014.

Following that wonderful conference, long-time Open Source advocate, Pythonista, and instructor Steve Holden was kind enough to host his third annual "OSCON Survivors' Breakfast", with tens of esteemed attendees, speakers, and organizers enjoying great company and conversation, relaxing together after the flurry of conference activity, planning a leisurely day in Portland, and -- most immediately -- having some much-needed breakfast.

The view from the 23rd floor was quite an eyeful, and the conversation ranged across equally panoramic topics. Sitting with Alex Martelli, Anna Ravenscroft, and Katie Miller, the conversation inevitably turned to thoughts programmatical. One thread of the discussion was so compelling, that it helped crystallize this series of blog posts. That was kicked off with Katie's question:

Why [have some large companies] not embraced functional programming to the extent that other large ones have?

Multiple points of discussion spawned from this, some of which still continue. The rest of this post explores these. 


Large Companies?

What constitutes a large company? We settled on discussing Fortune 500 companies, which, by definition, are:
  • U.S. Companies
  • Ranked by gross revenue (after adjustments for excise taxes).

Afterwards, I looked up the 2013 top 25 tech companies in the Fortune 500. I've listed them below; in parentheses is the Fortune 500 ranking. After the dash are the functional programming languages used on various company projects -- these are listed only if I have talked to someone who has worked on a project (or interviewed for a job that used the language), or if I have read an article by an employee who has stated that they use the listed language(s) [1].
  1. Apple (6) - Swift, Clojure, Scala
  2. AT&T (11) - Haskell
  3. HP (15) - F#
  4. Verizon Communications (16) - ?
  5. IBM (20) - ?
  6. Microsoft (35) - F#, F*
  7. Comcast (46) - Scala
  8. Amazon (49) - Haskell
  9. Dell (51) - Erlang
  10. Intel (54) - Haskell, SML, PLT Scheme
  11. Google (55) - Haskell [2]
  12. Cisco (60) - ?
  13. Ingram Micro (76) - ?
  14. Oracle (80) - Scala
  15. Avnet (117) - ?
  16. Tech Data (119) - ?
  17. Emerson Electric (123) - ?
  18. Xerox (131) - Scala
  19. EMC (133) - ?
  20. Arrow Electronics (141) - ?
  21. Century Link (150) - ?
  22. Computer Sciences Corp. (176) - ?
  23. eBay (196) - Scala 
  24. TI (218) - ?
  25. Western Digital (222) - ?

The companies that have committed to projects of guessed-to-be-significant business value written in FP languages include Apple, HP, and eBay. Possibly also Oracle and Intel. So, a rough estimate is that 3 to 5 of the top 25 U.S. tech companies have made a significant investment in FP.

Why not Google?

The next two sections offer summaries of some views on this.


Ideal Use Case?

Is an FP language suitable for large organisations? Are smaller companies better served by them? During breakfast, it was postulated that dealing with such things as immutable data, handling I/O in pure FP languages, and creating/using higher-order functions is easier for small startups, due to the shorter amount of time required to hire or train a critical mass of skilled programmers.

It is certainly true that it will take larger organisations longer to train their personnel, simply due to sheer numbers and, even with enough trainers, logistics. But this argument can be made for any corporate-level instruction; in my book, it cancels out on both sides and is not an argument unique to hard topics, much less one specifically pertinent to FP adoption.


Brain Fit?

I've heard this one a bit: "Some people just don't think in FP terms." They need loops and iteration, not higher order functions and recursion. Joel Spolsky makes reference to this in his article The Guerrilla Guide to Interviewing. In particular, he says that "For some reason most people seem to be born without the part of the brain that understands pointers." This has been applied to topics in FP as well as C.

To be fair, Joel's comment was probably made with a bit of lightness and not meant to be a statement on the nature of mind or a theory of cognition. The context of the article is a very practical one: hiring. When trying to identify whether a programmer would be an asset for your team, you're not living in the space of cognitive theory, rather you inhabit the realm of quick approximations, gut instincts, and fast, economical decisions.

Regardless, I find this perspective -- Type Physicalism [3] -- fairly objectionable. This is because I see it as a kind of intellectual "racism." Early social sciences utilized this form of reasoning to justify all sorts of discriminatory thinking in the name of "science", reinforcing a rigid mentality of "us" vs. "them." In my own experience, I've seen this sort of approach used to shut down exploration, enforce elitism, and dismiss ideas that threaten the authority of the status quo.

Rather than seeing the problem of comprehending FP as a physical limitation of the individual, I see instructional failure as the obstacle to overcome. If we start with the proposition that certain brains are deficient, we are essentially abandoning education. It is the responsibility of the instructor to engage creatively with each student's learning style. When adhering to the idea that certain brains are limited, one discards creative engagement; one doesn't even consider working with the students and their learning styles. This is a view that, however implicitly, can be used to shun diversity and dismiss potential.

I believe the essence of what Joel was shooting for can be approached in a much kinder fashion (adapted for an FP discussion):

None of us was born knowing GOTO statements, global state, mutable data, or for loops. There are many programmers alive, though, whose first contact with programming involved one or more of these. That's their "home town", as it were; their programmatic birth place. Having utilized -- as well as taught -- imperative, OOP, and functional styles of programming, I do not feel that one is intrinsically any harder than another. However, they are sometimes so vastly different from each other in style or syntax or semantics that once a student has solidified around the concepts of a particular paradigm, it can be a challenge retraining to work easily in another.
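To make the contrast concrete, here is a minimal sketch (Python, chosen only for familiarity; neither function is offered as "the right way") of the same small task in the two styles:

    from functools import reduce

    # Imperative style: a mutable accumulator and a loop.
    def sum_of_squares_imperative(numbers):
        total = 0
        for n in numbers:
            total += n * n
        return total

    # Functional style: no mutation; a higher-order function drives the iteration.
    def sum_of_squares_functional(numbers):
        return reduce(lambda acc, n: acc + n * n, numbers, 0)

    assert sum_of_squares_imperative([1, 2, 3]) == sum_of_squares_functional([1, 2, 3]) == 14

Both are straightforward once learned; the retraining cost is in switching habits, not in any intrinsic difficulty of either form.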


Why the Objections?

If both "ideal use case" and "brain fit" are given as arguments against adopting FP (or any other new paradigm) in large organisations, and neither are considered logically or philosophically valid, what's at the root of the resistance?

It is not uncommon for changes in an industry or field of study to be met with resistance. The amount of resistance is very often proportional to how big the change is, or how different it is from the status quo. I suspect that this is really what we're seeing when companies take a stance against FP. There are very often valid business concerns: "we've made an investment in OOP" or "it will cost too much to train/hire/migrate to FP."

I would remind those company leaders, though, that new sources of revenue, product innovation, and changes in market adoption do not often come from maintaining or enforcing the current state. Instead, that is an identifying characteristic of companies whose relevance is fading.

Even if your company has market dominance or is a monopoly, there is still a good incentive for exploring alternative paradigms. At the very least, one can uncover inefficiencies and apply new knowledge to remove duplication of efforts, increase margins, etc.


Careers

As a manager, I have found that about half of the senior engineers up for promotion have very little to no interest in taking on different (new to them) programmatic paradigms. They consider current burdens sufficient (or too much) and would rather spend what little free time they have available to them in improving existing systems.

Senior engineers who have a more academic or research bent (or are easily bored) are much more likely to embrace this sort of change. Interestingly, senior engineers who have little to no competitive drive will more readily pick up something new if the need arises. This may be due to such things as not perceiving accumulated knowledge as territory to defend, for example.

Younger engineers with less experience (and less of an investment made in a particular school of thought) are much more willing to take on new challenges. I believe there are many reasons for this (including an interest in becoming more professionally competitive with their peers).

Junior or senior, I have found that programmers who are currently looking for new employment are almost invariably not only willing to take on the challenge of learning different paradigms, but are usually going about it proactively and engaging in self-study.

I want to work with programmers who can take on any problem space in any paradigm and find creative solutions, contributing as valued members of a team. This is certainly an ideal set of characteristics, but one that I have seen in the wilds of the workplace on multiple occasions. It has nothing to do with FP or OOP paradigms, but rather with the people themselves.

Even if a company is locked into well-established processes and views on programming, it may find it in its best interests to offer a more open-minded approach to the employees who would enjoy it. Retention rates could very well increase dramatically.


Do We Need To?

Philosophy and hiring strategies aside, do we -- as programmers, software projects, or organizations that support programming -- need to take on the burden of learning or adopting functional programming? Quite possibly not.

If Google's plans around Go involve building a new operating system (in the spirit of 1970s C and UNIX), the systems programmers may find pure functions too cumbersome to work with. FP may be too burdensome a fit for that type of work.

If one is not tied to a historical analogy with UNIX, as Mozilla is not with Rust, doing something like creating a new browser engine (or running a remote services company) may be a good fit for FP, especially if one has data showing reduced error counts when using type systems.

As we shall see illustrated in the next post, the usual advice continues to apply: the decision of which paradigm to employ for any given project should be dictated by the best fit and not ideological inflexibility. The bearing this has on programming is innovation: it is the early adopters who have the best chance of leading us into the future.

Up next: Retrospective on Programming Paradigms
Previously: Themes at OSCON 2014


Footnotes

[1] If anyone has additional information as to which FP languages are used by these top 25 companies, please let me know, and I will include that information. Bonus points for knowing of business-critical applications.

[2] Google Switzerland are using Haskell.

[3] Type Physicalism is a form of reductive materialism, also known as Mind-Brain Identity Theory, that does not allow for mental states to be realized in organisms or computational systems that do not have a brain. See "Criticisms of Type Physicalism" at http://en.wikipedia.org/wiki/Identity_theory_of_mind#Multiple_realizability.

Syndicated 2014-07-28 13:00:00 (Updated 2014-07-28 13:00:01) from Duncan McGreggor

27 Jul 2014 oubiwann   » (Journeyer)

The Future of Programming - Themes at OSCON 2014

Series Links


A Qualitative OSCON Debrief

As you might have noticed from the OSCON Twitter-storm this year, the conference was a blast. Even if you weren't physically present, given the 17 tracks, you can imagine that the presentations -- and subsequent conversations -- were deeply varied.

This was the second OSCON I'd attended; the first was in 2008 as a guest of Michael Bernstein, a friend who was speaking there. OSCON 2008 was a zoo - I'm not sure of the actual body count, but I've heard that attendees + vendors + miscellaneous topped 12,000 people over the course of the week (I would love to hear if someone has hard data on that -- googling didn't reveal much). OSCON 2008 was dominated by Big Data, Hadoop, endless buzzword bingo, and business posturing by all sorts. The most interesting bits of that conference were the outlines that formed around the conversations people weren't having. In fact, over the following 6 months, that's what I spent my spare time pondering: what people didn't say at OSCON.

This year's conference seemed like a completely different animal. It felt like easily one-third to one-half the number of attendees of 2008. Where that one had all the anonymizing feel of rush hour in a major metropolitan hub, OSCON 2014 had a distinctly small-town vibe to it -- I was completely charmed. Conversations (overheard as well as participated in) were not littered with examples from the latest bizspeak, but rather focused on essence. The interactions were not continually distracted, but rather steadily focused, allowing people to form, express, and dispute complete thoughts with their peers.


Conversations

So what were people talking about this year? Here are some of the topics I heard covered during lunches, in hallways, and at podiums; at pubs, in restaurants and at parks [1]:
  • What communities are thriving?
  • Which [projects, organisations, companies, etc.] are treating their people right?
  • What successful processes are being followed at [project, organisation, etc.]?
  • Who is hiring and why should someone want to work there?
  • Where can I go to learn X? Who is teaching X? Who shares the most about X?
  • Which [projects, organisations] support X?
  • Why don't more [people, projects, organisations] care about [possible future X]?
  • Why don't more [people, projects, organisations] spend more time investigating the history of X for "lessons learned"?
  • There was so much more X in computing during the 60s and 70s -- what happened? [2]
  • Why are we reinventing X?
  • When is X going to be invented, and who's going to do it?
  • Everything is changing! I can't keep up anymore.
  • I want to keep up, but how?
  • Why can't we stop making so many X?
  • Nobody cares about Y anymore; we're all doing X now.
  • Full stack developers!
  • Haskell!
  • Fault-tolerant systems!

After lots of reflection, here's how I classified most of the conversations I heard:
  • Developing communities,
  • Developing careers and/or personal/professional qualities, and
  • Developing software, 

along lines such as:
  • Effective maintenance, maturity, and health,
  • Focusing on the "art",  eventual mastery, and investments of time,
  • Tempering bare pragmatism with something resembling science or academic excellence,
  • Learning the new to bolster the old,
  • Inspiring innovation from a place of contemplation and analysis,
  • Mining the past for great ideas, and
  • Figuring out how to better share and spread the adoption of good ideas.


Themes

Generalized to such a degree, this could have been pretty much any congregation of interested, engaged minds since the dawn of civilization. So what does it look like if we don't normalize quite so much? Weighing these with what may well be my own bias (and the bias of like-minded peers), I submit to your review these themes:

  • A very strong interest in programming (thinking and creating) vs. integration (assessing and consuming).
  • An express desire to become better at abstraction (higher-order functions, composition, and types) to better deal with growing systems complexities.
  • An interest in building even more complicated systems.
  • A fear of reimplementing past mistakes or of letting dust gather on past intellectual achievements.

As you might have guessed, these number very highly among the reasons why the conference was such an unexpected pleasure for me. But it should also not come as a surprise that these themes are present:

  • We have had several years of companies such as Google and Amazon (AWS) building and deploying some of the most sophisticated examples of logic-made-manifest in human history. This has created perceived value in our industry and many wish to emulate it. Similarly, we have single purpose distributed systems being purchased for nearly 20 billion USD -- a different kind of complexity, with a different kind of perceived reward.
  • In the 70s and 80s, OOP adoption brought with it the ability to create large software systems in ways that people had not dared dream of, or that had been impractical to realize. Today's growing adoption of the functional paradigm is giving early signs of allowing us to better integrate complex systems with more predictability and fewer errors.
  • Case studies of improvements in productivity, or of the capacity to handle highly complex or previously intractable problems with better abstractions, have ignited the passions of many. Not wanting to limit their scope of knowledge or sources of inspiration, people are not simply limiting themselves to the exploration of such things as Category Theory -- they are opening the vaults of computer science with such projects as Papers We Love.

There's a brave new world in the making. It's a world for programmers and thinkers, for philosophers and makers. There's a lot to learn, but it's really not so different from older worlds: the same passions drive us, the same idealism burns brightly. And it's nice to see that these themes arise not only in small, highly specialized venues such as university doctoral programs and StrangeLoop (or LambdaJam), but also in larger intersections of the industry like OSCON (or more general-audience ones like Meetups).

Up next: Adopting the Functional Paradigm?
Previously: An Overview


Footnotes

[1] It goes without saying that any one attendee couldn't possibly be exposed to enough conversations to form a perfectly accurate sense of the total distribution of conversation topics. No claim to the contrary is being made here :-)

[2] I strongly adhere to the multifaceted hypothesis proposed by Bret Victor here, in the section titled "Why did all these ideas happen during this particular time period?"


Syndicated 2014-07-27 21:25:00 (Updated 2014-07-28 05:25:57) from Duncan McGreggor

27 Jul 2014 oubiwann   » (Journeyer)

The Future of Programming - An Overview

Art by Philip Straub
There's a new series of blog posts coming, inspired by ongoing conversations with peers, continuous inspection of the development landscape, habitual navel-gazing, and participation at the catalytic OSCON 2014. As you might have inferred, these will be on the topic of "The Future of Programming."

Not to be confused with Bret Victor's excellent talk last year at DBX, these posts will be less about individual technologies or developer user experience, and more about historic trends and viewing the present (and near future) through such a lens.

In this mini-series, the aim is to present posts on the following topics:

I did a similar set of posts on the future of cloud computing, conceived in late 2008 and published in 2009, entitled After the Cloud. It was a very successful series, and the cloud industry seems to be heading towards some of the predictions made in it -- ZeroVM and Docker are an incremental step towards the future of distributed processes/functions outlined in To Atomic Computation and Beyond.

In that post, though, are two quotes from industry greats. These provide an excellent context for this series as well, hinting at an overriding theme:
  • Alan Kay, 1998: A crucial key to growing large systems is effective communications between components.
  • Joe Armstrong, 2004: To effectively model and solve problems in a distributed manner, we need concurrency... this is made easier when we isolate processes and do not share data.

In the decade since these statements were made, we have seen individuals, projects, and companies take that vision to heart -- and succeed as a result. But as an industry, we continue to struggle with the definition of our art; we are still tormented by change -- both internal and external -- and do not seem to adapt to it well.

These posts will peer into such places ... in the hope that such inspection might guide us better through the tangled forest of our present into the unimagined forest of our future.

Syndicated 2014-07-27 18:02:00 (Updated 2014-07-28 05:17:59) from Duncan McGreggor

27 Jul 2014 apenwarr   » (Master)

Wifi, Interference and Phasors

Before we get to the next part, which is fun, we need to talk about phasors. No, not the Star Trek kind, the boring kind. Sorry about that.

If you're anything like me, you might have never discovered a use for your trigonometric identities outside of school. Well, you're in luck! With wifi, trigonometry, plus calculus involving trigonometry, turns out to be pretty important to understanding what's going on. So let's do some trigonometry.

Wifi modulation is very complicated, but let's ignore modulation for the moment and just talk about a carrier wave, which is close enough. Here's your basic 2.4 GHz carrier:

    A cos (ω t)

Where A is the transmit amplitude and ω = 2π f, with f = 2.4e9 (2.4 GHz). The wavelength, λ, is the speed of light divided by the frequency, so:

    λ = c / f = 3.0e8 / 2.4e9 = 0.125m

That is, 12.5 centimeters long. (By the way, just for comparison, the wavelength of light is around 400-700 nanometers, or 500,000 times shorter than a wifi signal. That comes out to 600 Terahertz or so. But all the same rules apply.)

The reason I bring up λ is that we're going to have multiple transmitters. Modern wifi devices have multiple transmit antennas so they can do various magic, which I will try to explain later. Also, inside a room, signals can reflect off the walls, which is a bit like having additional transmitters.

Let's imagine for now that there are no reflections, and just two transmit antennas, spaced some distance apart on the x axis. If you are a receiver also sitting on the x axis, then what you see is two signals:

    cos (ω t) + cos (ω t + φ)

Where φ is the phase difference (between 0 and 2π). The phase difference can be calculated from the distance between the two antennas, r, and λ, as follows:

    φ = 2π r / λ

Of course, a single-antenna receiver can't *actually* see two signals. That's where the trig identities come in.

Constructive Interference

Let's do some simple ones first. If r = λ, then φ = 2π, so:

    cos (ω t) + cos (ω t + 2π)
    = cos (ω t) + cos (ω t)
    = 2 cos (ω t)

That one's pretty intuitive. We have two antennas transmitting the same signal, so sure enough, the receiver sees a signal twice as tall. Nice.

Destructive Interference

The next one is weirder. What if we put the second transmitter 6.25cm away, which is half a wavelength? Then φ = π, so:

    cos (ω t) + cos (ω t + π)
    = cos (ω t) - cos (ω t)
    = 0

The two transmitters are interfering with each other! A receiver sitting on the x axis (other than right between the two transmit antennas) won't see any signal at all. That's a bit upsetting, in fact, because it leads us to a really pivotal question: where did the energy go?

We'll get to that, but first things need to get even weirder.
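Before the weirdness, both cases above are easy to check numerically. A minimal sketch (Python with numpy; the 10 Hz carrier is an arbitrary stand-in for 2.4 GHz, chosen only to keep the sample counts readable):

    import numpy as np

    w = 2 * np.pi * 10.0          # carrier; a 10 Hz stand-in for 2.4 GHz
    t = np.linspace(0, 1, 10000)  # one second of samples

    def two_tx_sum(phi):
        # What a receiver sees from two transmitters with phase difference phi.
        return np.cos(w * t) + np.cos(w * t + phi)

    # phi = 2*pi (r = lambda): constructive, peak amplitude doubles to ~2.
    print(np.max(np.abs(two_tx_sum(2 * np.pi))))

    # phi = pi (r = lambda/2): destructive, the signals cancel to ~0.
    print(np.max(np.abs(two_tx_sum(np.pi))))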

Orthogonal Carriers

Let's try φ = π/2.

    cos (ω t) + cos (ω t + π/2)
    = cos (ω t) - sin (ω t)

This one is hard to explain, but the short version is, no matter how much you try, you won't get that to come out to a single, unshifted cos or sin wave. Symbolically, you can only express it as the two separate terms, added together. At each point, the sum has a single value, of course, but there is no formula for that single value which doesn't involve both a cos and a sin. This happens to be a fundamental realization that leads to all modern modulation techniques. Let's play with it a little and do some simple AM radio (amplitude modulation). That means we take the carrier wave and "modulate" it by multiplying it by a much-lower-frequency "baseband" input signal. Like so:

    f(t) cos (ω t)

Where ω >> 1, so that for any given cycle of the carrier wave, f(t) can be assumed to be "almost constant."

On the receiver side, we get the above signal and we want to discover the value of f(t). What we do is multiply it again by the carrier:

    f(t) cos (ω t) cos (ω t)
    = f(t) cos² (ω t)
    = f(t) (1 - sin² (ω t))
    = ½ f(t) (2 - 2 sin² (ω t))
    = ½ f(t) (1 + (1 - 2 sin² (ω t)))
    = ½ f(t) (1 + cos (2 ω t))
    = ½ f(t) + ½ f(t) cos (2 ω t)

See? Trig identities. Next we do what we computer engineers call a "dirty trick" and, instead of doing "real" math, we'll just hypothetically pass the resulting signal through a digital or analog filter. Remember how we said f(t) changes much more slowly than the carrier? Well, the second term in the above answer changes twice as fast as the carrier. So we run the whole thing through a Low Pass Filter (LPF) at or below the original carrier frequency, removing high frequency terms, leaving us with just this:

    (...LPF...)
    → ½ f(t)

Which we can multiply by 2, and ta da! We have the original input signal.
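The whole AM round trip fits in a few lines. A sketch (Python/numpy; the moving average is a crude stand-in for a proper low pass filter, and all frequencies are scaled way down from the real ones):

    import numpy as np

    fs = 100000.0                      # sample rate
    t = np.arange(0, 1, 1 / fs)
    w = 2 * np.pi * 1000.0             # carrier stand-in (1 kHz)
    f_t = np.cos(2 * np.pi * 2.0 * t)  # slow baseband signal f(t), 2 Hz

    tx = f_t * np.cos(w * t)           # transmit: f(t) cos(wt)
    mixed = tx * np.cos(w * t)         # receive: multiply by the carrier again

    # LPF: averaging over several carrier cycles removes the 2w term,
    # leaving 1/2 f(t); multiply by 2 to recover f(t).
    kernel = np.ones(500) / 500
    recovered = 2 * np.convolve(mixed, kernel, mode='same')

    # Aside from filter edge effects, the error is tiny.
    print(np.max(np.abs(recovered[1000:-1000] - f_t[1000:-1000])))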

Now, that was a bit of a side track, but we needed to cover that so we can do the next part, which is to use the same trick to demonstrate how cos(ω t) and sin(ω t) are orthogonal vectors. That means they can each carry their own signal, and we can extract the two signals separately. Watch this:

    [ f(t) cos (ω t) + g(t) sin (ω t) ] cos (ω t)
    = [f(t) cos² (ω t)] + [g(t) cos (ω t) sin (ω t)]
    = [½ f(t) (1 + cos (2 ω t))] + [½ g(t) sin (2 ω t)]
    = ½ f(t) + ½ f(t) cos (2 ω t) + ½ g(t) sin (2 ω t)
    [...LPF...]
    → ½ f(t)

Notice that by multiplying by the cos() carrier, we extracted just f(t). g(t) disappeared. We can play a similar trick if we multiply by the sin() carrier; f(t) then disappears and we have recovered just g(t).

In vector terms, we are taking the "dot product" of the combined vector with one or the other orthogonal unit vectors, to extract one element or the other. One result of all this is you can, if you want, actually modulate two different AM signals onto exactly the same frequency, by using the two orthogonal carriers.
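The same sketch as before, extended to carry two unrelated signals on the two orthogonal carriers and pull them apart again (same scaled-down stand-in frequencies as above):

    import numpy as np

    fs = 100000.0
    t = np.arange(0, 1, 1 / fs)
    w = 2 * np.pi * 1000.0
    f_t = np.cos(2 * np.pi * 2.0 * t)  # signal riding on cos(wt)
    g_t = np.sin(2 * np.pi * 3.0 * t)  # unrelated signal riding on sin(wt)

    tx = f_t * np.cos(w * t) + g_t * np.sin(w * t)

    kernel = np.ones(500) / 500
    def lpf(x):
        return np.convolve(x, kernel, mode='same')

    f_rec = 2 * lpf(tx * np.cos(w * t))  # g(t) drops out here...
    g_rec = 2 * lpf(tx * np.sin(w * t))  # ...and f(t) drops out here

    print(np.max(np.abs(f_rec[1000:-1000] - f_t[1000:-1000])))  # ~0
    print(np.max(np.abs(g_rec[1000:-1000] - g_t[1000:-1000])))  # ~0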

QAM

But treating it as just two orthogonal carriers for unrelated signals is a little old fashioned. In modern systems we tend to think of them as just two components of a single vector, which together give us the "full" signal. That, in short, is QAM, one of the main modulation methods used in 802.11n. To oversimplify a bit, take this signal:

    f(t) cos (ω t) + g(t) sin (ω t)

And let's say f(t) and g(t) at any given point in time each have a value that's one of: 0, 1/3, 2/3, or 1. Since each function can have one of four values, there are a total of 4*4 = 16 different possible combinations, which corresponds to 4 bits of binary data. We call that encoding QAM16. If we plot f(t) on the x axis and g(t) on the y axis, that's called the signal "constellation."
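Just to make the counting concrete, a tiny sketch of the encoding side (the bit-to-level mapping here is an arbitrary choice for illustration, not the 802.11n one):

    # The four amplitude levels mentioned above.
    levels = [0, 1/3, 2/3, 1]

    def qam16_encode(bits):
        # bits: a 4-tuple of 0/1 -> the (f, g) pair for the two carriers.
        f = levels[bits[0] * 2 + bits[1]]
        g = levels[bits[2] * 2 + bits[3]]
        return f, g

    print(qam16_encode((1, 0, 0, 1)))  # one of the 16 constellation points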

Anyway we're not attempting to do QAM right now. Just forget I said anything.

Adding out-of-phase signals

Okay, after all that, let's go back to where we started. We had two transmitters both sitting on the x axis, both transmitting exactly the same signal cos(ω t). They are separated by a distance r, which translates to a phase difference φ. A receiver that's also on the x axis, not sitting between the two transmit antennas (which is a pretty safe assumption) will therefore see this:

    cos (ω t) + cos (ω t + φ)
    = cos (ω t) + cos (ω t) cos φ - sin (ω t) sin φ
    = (1 + cos φ) cos (ω t) - (sin φ) sin (ω t)

One way to think of it is that a phase shift corresponds to a rotation through the space defined by the cos() and sin() carrier waves. We can rewrite the above to do this sort of math in a much simpler vector notation:

    [1, 0] + [cos φ, -sin φ]
    = [1 + cos φ, -sin φ]

This is really powerful. As long as you have a bunch of waves at the same frequency, and each one is offset by a fixed amount (phase difference), you can convert them each to a vector and then just add the vectors linearly. The result, the sum of these vectors, is what the receiver will see at any given point. And the sum can always be expressed as the sum of exactly one cos(ω t) and one sin(ω t) term, each with its own magnitude.

This leads us to a very important conclusion:

    The sum of reflections of a signal is just an arbitrarily phase shifted and scaled version of the original.

People worry about reflections a lot in wifi, but because of this rule, they are not, at least mathematically, nearly as bad as you'd think.

Of course, in real life, getting rid of that phase shift can be a little tricky, because you don't know for sure by how much the phase has been shifted. If you just have two transmitting antennas with a known phase difference between them, that's one thing. But when you add reflections, that makes it harder, because you don't know what phase shift the reflections have caused. Not impossible: just harder.

(You also don't know, after all that interference, what happened to the amplitude. But as we covered last time, the amplitude changes so much that our modulation method has to be insensitive to it anyway. It's no different than moving the receiver closer or further away.)
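Here's that vector bookkeeping as a sketch (Python/numpy; the amplitudes and phases of the "reflections" are made up). Each arrival a·cos(ωt + φ) becomes the vector [a cos φ, -a sin φ]; summing the vectors gives exactly the same waveform as summing the signals directly, and the total is one scaled, shifted cosine:

    import numpy as np

    w = 2 * np.pi * 10.0
    t = np.linspace(0, 1, 10000)

    # Each arrival: (amplitude, phase shift) -- made-up reflection values.
    arrivals = [(1.0, 0.0), (0.6, 1.1), (0.3, 4.0)]

    # Sum of the actual waves...
    direct = sum(a * np.cos(w * t + phi) for a, phi in arrivals)

    # ...equals the sum done as vectors, since
    # a cos(wt + phi) = (a cos phi) cos(wt) + (-a sin phi) sin(wt).
    A = sum(a * np.cos(phi) for a, phi in arrivals)
    B = sum(-a * np.sin(phi) for a, phi in arrivals)
    vector_sum = A * np.cos(w * t) + B * np.sin(w * t)

    print(np.max(np.abs(direct - vector_sum)))  # ~0: identical signals
    print(np.hypot(A, B))                       # magnitude of the combined wave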

Phasor Notation

One last point. In some branches of electrical engineering, especially in analog circuit analysis, we use something called "phasor notation." Basically, phasor notation is just a way of representing these cos+sin vectors using polar coordinates instead of x/y coordinates. That makes it easy to see the magnitude and phase shift, although harder to add two signals together. We're going to use phasors a bit when discussing signal power later.

Phasors look like this in the general case:

    A cos (ω t) + B sin (ω t)
    = [A, B]

      Magnitude = M = (A² + B²)^½

      tan (Phase) = tan φ = B / A
      φ = atan2(B, A)

    = M∠φ

or the inverse:

    M∠φ
    = [M cos φ, M sin φ]
    = (M cos φ) cos (ω t) + (M sin φ) sin (ω t)
    = [A, B]
    = A cos (ω t) + B sin (ω t)
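In code, both directions are just rectangular/polar conversion. A sketch matching the formulas above:

    import numpy as np

    def to_phasor(A, B):
        # [A, B] coefficients of A cos(wt) + B sin(wt) -> (M, phi)
        return np.hypot(A, B), np.arctan2(B, A)

    def from_phasor(M, phi):
        # (M, phi) -> [A, B]
        return M * np.cos(phi), M * np.sin(phi)

    M, phi = to_phasor(3.0, 4.0)
    print(M, phi)               # 5.0, 0.927...
    print(from_phasor(M, phi))  # back to (3.0, 4.0)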

Imaginary Notation

There's another way of modeling the orthogonal cos+sin vectors, which is to use complex numbers (i.e. a real axis for cos, and an imaginary axis for sin). This is both right and wrong, as imaginary numbers often are; the math works fine, but perhaps not for any particularly good reason, unless your name is Euler. The important thing to notice is that all of the above works fine without any imaginary numbers at all. Using them is sometimes a convenience, but not strictly necessary. The value of cos+sin is a real number, not a complex number.

Epilogue

Next time, we'll talk about signal power, and most importantly, where that power disappears to when you have destructive interference. And from there, as promised last time, we'll cheat Shannon's Law.

Syndicated 2014-07-26 01:11:15 from apenwarr

26 Jul 2014 dmarti   » (Master)

Newspaper dollars, Facebook dimes

Hard to miss the Facebook earnings news this week.

Facebook earnings beat expectations as ad revenues soar

Facebook Beats In Q2 With $2.91 Billion In Revenue, 62% Of Ad Revenue From Mobile, 1.32B Users

Let's take a look at those numbers. (I'd like to fill in more and better data here, so any extra sources welcome.)

Mobile ads: 62% of ad revenues.

Total US ad revenue: $1.3 billion.

Which would make US mobile revenue about $800 million. (Other countries are heavier on mobile, so this might even be high.)

Americans spend 162 minutes per day on a mobile device, of which 17% is Facebook. So figure about 28 minutes per day on average. (That's an average over all US "consumers", not just mobile or Facebook users.)

That's double the time spent reading the printed newspaper.

US users spend an average of 14 minutes/day on printed newspapers. (Average of newspaper readers and non-readers. Just print, not web or mobile.)

But how are newspapers doing with the ad revenue?

Even after a sharp decline, newspaper print ad revenue in the USA is at $17.3 billion/year. That's the 2013 number, so it's reasonable to expect it to continue to come down as newspaper-reading time continues to decline.

Let's say it comes down another 10 percent for this year (which is faster than trend) and take a quarter of that. That's $3.9 billion.

So the newspaper brings in more than four times as much ad money by being in front of users for half the time. The newspaper completely lacks all the advanced behavioral targeting stuff, and Facebook is full of it.
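Using the post's round numbers, the back-of-the-envelope version (a sketch; all figures are the rough quarterly estimates above):

    # Rough quarterly US ad revenue, in billions of dollars.
    newspaper_rev = 3.9
    facebook_mobile_rev = 0.8

    # Average minutes per day in front of the average American.
    newspaper_min = 14.0
    facebook_min = 28.0

    print(newspaper_rev / facebook_mobile_rev)  # ~4.9x the revenue...
    print(newspaper_min / facebook_min)         # ...for 0.5x the attention

    # Value per minute of attention: roughly an order of magnitude apart.
    print((newspaper_rev / newspaper_min) / (facebook_mobile_rev / facebook_min))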

What's going on here? Why is Facebook—the most finely targeted ad medium ever built—an order of magnitude less valuable to advertisers than the second-oldest low-tech ad medium is?

Here's my best explanation so far for the "print dollars to digital dimes" problem.

Advertising is based on a two-way exchange of information. You, the reader, give advertising your attention. Advertising gives you some information about the advertiser's intentions. That's often not found in the content of the ad. The fact that it's running in a public place at all is what builds up your mental model of the product, or brand equity.

On the other hand, advertising that's targeted to you is like a cold call or an email spam. You might respond to it at the time, but it doesn't carry information about the advertiser's intentions. (For example, you might be the one sucker who they're trying to stick with the last obsolete unit in the warehouse, before an incompatible change.)

As Bob Hoffman, the Ad Contrarian, wrote: "Online advertising has thus far proven to be a lousy brand-building medium. Walk through your local supermarket or Target or Walmart and see if you can find any brands built by online advertising. So what is web advertising good for? Thus far, it has been effective at search and moderately effective at a certain type of direct response."

Without the signaling/brand building effect, those targeted Facebook ads don't pull their weight, and come in at less valuable than newspaper ads.

I'm not saying we should go back to dead trees, but clearly mobile is leaving money on the table here. What's the solution? Paradoxically, it's going to have to involve some privacy tech on the user's end—preventing some of the one-sided data flow towards advertisers in order to introduce signaling effect.

More: Targeted Advertising Considered Harmful

Syndicated 2014-07-26 14:43:05 from Don Marti

25 Jul 2014 Stevey   » (Master)

The selfish programmer

Once upon a time I wrote a piece of software for scheduling the classes available to a college.

There was a bug in the scheduler: Students who happened to be named 'Steve Kemp' had a significantly higher chance (>=80% IIRC) of being placed in lessons where the class makeup was more than 50% female.

This bug was never fixed. Which was nice, because I spent several hours both implementing and disguising this feature.

I was a bad coder when I was a teenager.

These days I'm still a bad coder, but in different ways.

Syndicated 2014-07-25 13:16:54 from Steve Kemp's Blog

25 Jul 2014 caolan   » (Master)

Dialogs and Coverity, current numbers

Converting LibreOffice dialogs to .ui format, 54 conversions remaining

We've now converted all but 54 of LibreOffice’s classic fixed widget size and position .src format elements to the GtkBuilder .ui format. This is due to the much appreciated efforts of Palenik Mihály and Szymon Kłos, two of our GSOC2014 students, who are tackling the last bunch of hard to find or hard to convert ones.

Current conversion stats are:
  • 778 .ui files currently exist
  • There are 20 unconverted dialogs
  • There are 34 unconverted tabpages
  • An estimated additional 54 .ui files are required

We are 93% of the way through.

Coverity Defect Density: LibreOffice vs Average

According to Coverity's overview dashboard, our current status is:

LibreOffice: 9,425,526 lines of code and a defect density of 0.09

Open Source Defect Density By Project Size

    Lines of Code (LOC)      Defect Density
    Less than 100,000        0.35
    100,000 to 499,999       0.5
    500,000 to 1 million     0.7
    More than 1 million      0.65
Note: Defect density is measured by the number of defects per 1,000 lines of code, identified by the Coverity platform. The numbers shown above are from our 2013 Coverity Scan Report, which analyzed 250 million lines of open source code.
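For scale, density times size gives the rough number of outstanding defects the Scan platform is reporting (a quick sketch using the figures above):

    loc = 9425526        # LibreOffice lines of code
    density = 0.09       # defects per 1,000 lines of code

    print(density * loc / 1000.0)  # ~848 defects, vs ~6,127 at the 0.65 average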

Syndicated 2014-07-25 12:05:00 (Updated 2014-07-25 12:05:08) from Caolán McNamara

25 Jul 2014 marnanel   » (Journeyer)

Gentle Readers: catch them, Rimeq

Gentle Readers
a newsletter made for sharing
volume 1, number 15
24th July 2014: catch them, Rimeq
What I’ve been up to

I read a choose-your-own-adventure science fiction book when I was little. It concerned the efforts of an alien named Rimeq to take over the world, and the hero's efforts to stop him. This was made more difficult because Rimeq possessed the ability to move objects around with his mind (telekinesis). The only part which has stayed in my head is towards the end, when the hero has reached Rimeq's room but Rimeq has paralysed him by telekinesis, the police have been stopped similarly, and so have the spaceships bringing help, and the stress is showing on Rimeq's face. Finally the hero manages to take some rings off his fingers and throw them at Rimeq, shouting, "Catch them, Rimeq, they're grenades!" This is the final straw; the stress on Rimeq's mind is too much, and he is taken away catatonic.

So as I mentioned earlier, we have been moving house, and several moments have made me think, "Catch them, Rimeq"-- in particular, I meant to put out an edition of Gentle Readers on Monday as usual, but exhaustion won. Sorry for the interruption in service; meanwhile, I've been very encouraged by the messages I've had telling me how much you enjoy reading Gentle Readers.

Many people are due public thanks for helping us get through the last week. In particular, I want to thank the people of St John's church, Egham; as the obstacles to getting moved grew more and more formidable, so more and more people from St John's turned up unasked to help. We couldn't have managed without you. Thanks also go to the Gentle Reader who offered a garage when the movers needed to deliver before the landlord could give us the key. And thanks to the people from the Runnymede Besom, who turned up to take away some furniture we'd donated, but then came back later to help clean up. That's what love in action looks like, and I'll do my best to pay it forward. Thank you all.

A poem of mine

THE ECHOES OF AN AMBER GOD
(T54)

Electric sparkles in your touch,
the echoes of an amber god.
You fill my batteries with such
electric sparkles in your touch,
that Tesla would have charged too much
and Franklin dropped his lightning-rod:
electric sparkles in your touch,
the echoes of an amber god.

A picture

I was going to draw you a cartoon as usual, but my tablet is still packed away. Instead, here are some photos I took when I was working in London earlier this year.

http://thomasthurman.org/pics/clapham-junction
Trains in the sidings at Clapham Junction, the busiest railway station in Britain.
More than a hundred trains an hour come through.

http://thomasthurman.org/pics/binder
The tombstone of Jason Binder:
"He respected all living things. His inspiration lives on."
And it lives on with me, too, even though his epitaph is all I know about him.

 

Something from someone else

Does this one really need an introduction? Well, if you've never seen it before, then you have the joy of seeing it for the first time; the Guardian has a decent analysis if you're interested in digging into it. "Baggonets" is an archaic form of the word "bayonets", and Kensal Green is a large London cemetery, one of the magnificent seven. There is a pub called "Paradise" near there now; it was named for the poem.

THE ROLLING ENGLISH ROAD
by G K Chesterton

Before the Roman came to Rye or out to Severn strode,
The rolling English drunkard made the rolling English road.
A reeling road, a rolling road, that rambles round the shire,
And after him the parson ran, the sexton and the squire;
A merry road, a mazy road, and such as we did tread
The night we went to Birmingham by way of Beachy Head.

I knew no harm of Bonaparte and plenty of the Squire,
And for to fight the Frenchman I did not much desire;
But I did bash their baggonets because they came arrayed
To straighten out the crooked road an English drunkard made,
Where you and I went down the lane with ale-mugs in our hands,
The night we went to Glastonbury by way of Goodwin Sands.

His sins they were forgiven him; or why do flowers run
Behind him; and the hedges all strengthening in the sun?
The wild thing went from left to right and knew not which was which,
But the wild rose was above him when they found him in the ditch.
God pardon us, nor harden us; we did not see so clear
The night we went to Bannockburn by way of Brighton Pier.

My friends, we will not go again or ape an ancient rage,
Or stretch the folly of our youth to be the shame of age,
But walk with clearer eyes and ears this path that wandereth,
And see undrugged in evening light the decent inn of death;
For there is good news yet to hear and fine things to be seen,
Before we go to Paradise by way of Kensal Green.

Colophon

Gentle Readers is published on Mondays and Thursdays, and I want you to share it. The archives are at http://thomasthurman.org/gentle/ , and so is a form to get on the mailing list. If you have anything to say or reply, or you want to be added or removed from the mailing list, I’m at thomas@thurman.org.uk and I’d love to hear from you. The newsletter is reader-supported; please pledge something if you can afford to, and please don't if you can't. Love and peace to you all.

This entry was originally posted at http://marnanel.dreamwidth.org/307056.html. Please comment there using OpenID.

Syndicated 2014-07-24 23:56:56 from Monument

24 Jul 2014 dangermaus   » (Journeyer)

Writing for The Encyclopedia

Ilario of the Wikimedia Foundation was kind enough to travel to my small village in the mountains at the end of the world to introduce me and a good friend of mine to Wikipedia. I use Wikipedia a lot for work and fun, but I never seriously thought I could contribute. Well, I will keep an eye on whether my contributions manage to survive there :-) Maybe at least in modified form...

Recently, Toni El Suizo, the bridge builder of the poor, gave a talk here in my village, and when I visited his page on Wikipedia I was a bit disappointed because it was quite empty. In my eyes, he is a true modern hero. I therefore decided that my first contribution to Wikipedia would be improving his pages in as many languages as I can. I wrote an essay about Toni in Italian and did my best to translate it into English. However, I would be glad if a native English speaker could look over Toni's Wikipedia pages and make any necessary corrections.

Building an Ear to listen on the Magic Band

Following advice from the Internet and from some QSOs with informed people like Marco, HB9YBQ, I built a 6m delta loop at the end of May from some PVC tubes, about 6m 11cm of earth wire, 1m 12cm of very old RG59 75 Ohm television coax as the matching section (velocity factor 0.79, measured with the miniVNA), old RG58 coax of the kind used in the first Ethernets about 25 years ago as the transmission line, and plenty of BNC connectors also once used to connect computers in the very first Ethernet networks. It is fed at the top, as I think I need a high radiation angle because of the mountains. The antenna is directional and can rotate... I took instructions from this YouTube video and from various web pages.
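For anyone wanting to reproduce the cut lengths: the standard amateur-radio rules of thumb land close to the figures above. A sketch (the 1005/f full-wave-loop formula and the quarter-wave matching-section interpretation are my assumptions, not stated explicitly above):

    f_mhz = 50.125    # target frequency in MHz

    # Full-wave loop: the common 1005/f(MHz)-feet rule of thumb, in metres.
    loop_len = (1005.0 / f_mhz) * 0.3048
    print(loop_len)   # ~6.11 m -- matches the 6m 11cm of wire

    # Quarter-wave 75 Ohm matching section, shortened by the velocity factor.
    vf = 0.79
    wavelength = 299.792458 / f_mhz   # free-space wavelength in metres
    print(wavelength / 4 * vf)        # ~1.18 m (the build used 1m 12cm)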

These are the antenna characteristics straight from the MiniVNA BT Pro, which I used to cut the wire to the correct length: it is resonant at 50.422 MHz with an SWR of 1.24. On 50.125 MHz, the meeting frequency for QSOs, it has a slightly higher SWR of 1.3. Thanks to Gerry, EI9JU, QTH in Ireland, for the first QSO with this antenna :-)

Another little success: with the Fun Cube Pro Plus dongle I could download telemetry data from the Fun Cube satellite, though this only worked when the satellite was very high above the horizon (because of the mountains, of course).

Versioning any Database Type

While releasing deltasql 1.7.0, I put quite an effort into smashing bugs. I believe 1.7.0 is a good release, both for people who would like to upgrade and for people who would like to try it out. A test data set is installed by default, which should make it easier to learn how to get deltasql working well.

24 Jul 2014 vicious   » (Master)

News …

Interesting reading, comparing news coverage of Ukraine and Gaza. So apparently, according to the Russian media, people in the West have a bad opinion of Russia only because the West has a long-lived fascist hatred of Russians. According to the Israeli media, the rest of the world (except the US) has a low opinion of Israel only because they are all anti-Semites. It can't be because of anything we do or say, because we are perfect: even our farts do not smell [citation needed].

Rather interestingly [1], percentage-wise almost half as many Russians view Israel negatively as Americans do. So Americans are almost twice as anti-Semitic as Russians (which might be news to Jews living in Russia).

If negative opinion of a country was purely based on racism, then it means that racists are able to distinguish between for example North and South Koreans. Also South Koreans are really really racist, and they really really irrationally hate the North Koreans. Essentially as much as Egyptians irrationally hate Israelis. By the way notice that the survey asked if the country (not its people) has a positive influence on the world, but we are assuming I guess (at least in Russian and Israeli media) that nobody can tell the difference between the country and its citizens.

Also the French and the Germans seem to really really love each other. I mean … get a room you two. I mean the French and the Germans have always liked each other. Good thing the survey did not ask about Belgians, because those guys are terrible, we all hate the Belgians.

An interesting piece of information from that study is that Nigerians pretty much have a positive view of the world. Most countries they overwhelmingly love. And even Israel and North Korea manage to get over half of Nigerians to like them. They aren’t too crazy about Iranians and Pakistanis, but it’s not too bad either.

It must be wonderful to live in the world of simple explanations that always seem to indicate that the group you belong to is somehow superior to others, and others simply hate you because you are so good. I think we had a word for that …

[1] http://www.globescan.com/images/images/pressreleases/bbc2012_country_ratings/2012_bbc_country%20rating%20final%20080512.pdf


Syndicated 2014-07-24 16:29:27 from The Spectre of Math

24 Jul 2014 Skud   » (Master)

Queer intersectionality reading list

I recently put together this reading list on queer intersectionality for a local LGBTIQ group, as part of thinking about how we can serve a wider community of same-sex attracted and gender diverse folks. I thought it might be useful to share it more widely.

For context, this is a 101 level reading list for people with a bare understanding of the concept of intersectionality. If you’re not familiar with that you might want to read Wikipedia’s article on intersectionality.

Interview with Kimberlé Crenshaw, who named and popularised the concept of intersectionality — I think it’s important that we remember and give credit to Professor Crenshaw and the black movements whose ideas we’re using, which is why I’m including this link first.

Intersectionality draws attention to invisibilities that exist in feminism, in anti-racism, in class politics, so obviously it takes a lot of work to consistently challenge ourselves to be attentive to aspects of power that we don’t ourselves experience.” But, she stresses, this has been the project of black feminism since its very inception: drawing attention to the erasures, to the ways that “women of colour are invisible in plain sight”.

“Within any power system,” she continues, “there is always a moment – and sometimes it lasts a century – of resistance to the implications of that. So we shouldn’t really be surprised about it.”

An excellent article about the New York group Queers for Economic Justice:

“You would never know that poverty or class is a queer issue,” said Amber Hollibaugh, QEJ Executive Director and founding member. She continued: “Founding QEJ was, for many of us that were part of it, a statement of …wanting to try to build something that assumed a different set of priorities [than the mainstream gay equality movement]: that talked about homelessness, that talked about poverty, that talked about race and sexuality and didn’t divide those things as if they were separate identities. And most of us that were founding members couldn’t find that anywhere else.”

An interesting personal reflection on intersectionality by a queer Asian woman in NZ:

On the other side, if I’m having issues in my queer relationship with my white partner the discourse my mum uses is that same-gender relationships just don’t work and aren’t supposed to work. Find a (Chinese) man, get married and have babies like she did. You don’t have to love him to begin with but you will grow to love him. Like my mum did, apparently. It’s like if you’re queer and there’s problems in your relationship it’s because you’re queer and the solution is to be heterosexual. If you’re Chinese and there’s problems with your family it’s because Chinese culture is just more conservative or backward and the solution is to distance yourself away from it or try to assimilate into Pakeha culture. It shouldn’t have to be like this.

An article about intersectionality and climate justice (not very queer-oriented but some interesting stuff to think about):

On a personal level, we have to slow down and educate ourselves so that we can name the toxic systems within which we exist. We have to relearn the real histories of the land, of resistance movements and what it has taken for communities survive. We must also take the time to talk through all of the connections so that we can build a deeper analysis of the crises we face. During this process, it’s important that we commit to the slow time of genuine relationship-building, especially as we learn to walk into communities that we’re not a part of in respectful ways. From there, we create space to truly hear each other’s stories and bring people together in ways that, as Dayaneni says, “we can see ourselves in each other.”

A speech about queerness and disability:

This gathering has been very white and for the most part has neglected issues of race and racism. All of us here in this room today need to listen to queer disabled people of color and their experiences. We need to fit race and racism into the matrix of queerness and disability. I need to ask myself, not only “What does it mean to be a pansexual tranny with a long butch dyke history, a walkie with a disability that I acquired at birth,” but also, “What does it mean to be a white queer crip?”

We haven’t asked enough questions about class, about the experiences of being poor and disabled, of struggling with hunger, homelessness, and a lack of the most basic healthcare. I want to hear from working class folks who learned about disability from bone-breaking work in the factory or mine or sweatshop.

We need more exploration of gender identity and disability. How do the two inform each other? I can feel the sparks fly as disabled trans people are just beginning to find each other. We need to listen more to Deaf culture, to people with psych disabilities, cognitive disability, to young people and old people. We need not to re-create here in this space, in this budding community, the hierarchies that exist in other disability communities, other queer communities.

And finally, Beyond the Queer Alphabet (ebook) — an entire book on the subject of queer intersectionality.

If you’ve got any other recommended reading, I’d appreciate hearing about it.

Syndicated 2014-07-24 04:38:22 from Infotropism

23 Jul 2014 danstowell   » (Journeyer)

Rapists know your limits

There's a poster produced by the UK government recently that says:

1 in 3 rape victims have been drinking. Know your limits.

I can imagine there are people in a design agency somewhere trying to think up stark messages to make the nation collectively put down its can of Tennents for at least a moment, and it's good to dissuade people from problem drinking. But this is probably the most blatant example I've ever seen of what people have been calling "victim blaming".

If your friend came to you and said they'd been raped, would you say "You shouldn't have been drinking"? I hope not. And not just because it'd be rude! But because even when someone is a bit tipsy, it's not their fault they were raped, it's the rapist's fault.

It sounds so pathetically obvious when you write it down like that. But clearly it still needs to be said, because there are people putting together posters that totally miss the point. They should also bear in mind that a lot of people like to have a drink on a night out, or on a night in. (More than half of women in the UK drink one or two times a week, according to the 2010 General Lifestyle Survey table 2.5c) So it's actually no surprise AT ALL that 1/3 of rape victims have been drinking. What proportion of rape victims have been smoking? Dancing? Texting?

(By the way there's currently a petition against the advert.)

On the other hand, maybe it's worth thinking about the other side of the coin. People who end up as convicted rapists - some of them after a fuzzy night out or whatever - how many of them have been drinking? Does that matter? Yes, it matters more, because rape is an act of commission, and it seems likely that in some proportion of rapes a person went beyond reasonable bounds as a result of their drinking.

So how about this for a poster slogan:

1 in 3 rapists have been drinking. Know your limits.

(I can't find an exact statistic to pin down the number precisely - here I found an ONS graph which tells us that in around 40% of violent crimes, the offender appears to have been drinking. So for rape specifically I don't know, but 1 in 3 is probably not wide of the mark.)

So now here's a question: why didn't they end up with that as a slogan? Is it because they were specifically tasked with cutting down women's drinking for some reason, and just came up with a bad idea? Or is it because victim-blaming for rape just sits there at a low level in our culture, in the backs of our minds, in the way we frame these issues?

Syndicated 2014-07-23 03:20:54 (Updated 2014-07-23 03:22:58) from Dan Stowell

22 Jul 2014 louie   » (Master)

Slide embedding from Commons

A friend of a friend asked this morning:

Hmmm trying to upload a CC0 public domain presentation from #OKFest14 by @punkish and @SlideShare don't have public domain license option :(

— Jenny Molloy (@jenny_molloy) July 22, 2014
