On Management

Throughout my professional career (a bit more than 10 years now), I’ve worked with about 20 managers at various levels (from team leader to small-company CEO). Among them, most have been between average and good, 5 were truly abysmal (i.e. actually counter-productive and detrimental to the project), and 2 were really outstanding. When I was a junior I used to think that managers were all walking examples of the Dilbert Principle (and indeed I had seen at least one good example). As time made me a tad wiser, I found that not only does management require as much talent as development, but truly good managers are even rarer than truly good programmers.

So, just for the sake of feeding this blog, here’s a short list of what I believe are the key signs of a good manager (and of a bad one):

  • A good manager will remove obstacles and try to help you work more efficiently. He will keep his door open, ask what problems you encounter, and actually try to solve them. If the problem is yourself, so be it, but before reaching that conclusion, other solutions will be tried. A bad manager will rather coerce you into working as he thinks you should, and won’t want to hear about your problems (the typical answer being something along the lines of “if you can’t do it I’ll go find someone who can”). By default, he sees you as a slacker (actual example: at the beginning of a worldwide sports event spanning several weeks, a company-wide mail sent by the CEO, reminding us that everybody should be there at 9AM and should not leave before 6PM, so don’t you people think of leaving to watch the game on TV, Big Brother is watching you).
  • A good manager will back what he says with actual actions. A bad manager may collect input from his staff, sometimes very thoroughly, but won’t act on it (actual example: a 3-hour meeting, with all recriminations scribbled on papers tacked to large boards. In the end, the manager just took a photograph of the boards, said “I’ll study this”, and nothing ever happened).
  • A good manager actually has a clue about which management techniques work and which don’t (this generally comes with experience). A bad manager will blindly try to apply whatever buzzword he may have heard of, pretty much fitting the “if your only tool is a hammer, every problem looks like a nail” stereotype.
  • A good manager will actually welcome comments and remarks on his decisions, even negative ones. He will accept debate and will even expect it, though he will have the last word, because that’s his role. He won’t have a problem admitting he is wrong, as he knows he can’t always be right. A bad manager will feel threatened by differing points of view, and will perceive them as a challenge to his authority. Basically, a good manager is secure in his position, a bad manager isn’t, and tries to prove otherwise by abusing his authority.
  • A good manager tries to encourage good behavior; a bad manager only punishes bad behavior. For instance, a good manager will publicly praise you for an achievement. A bad manager will publicly scold you for a mistake (actual example: a manager proudly exhibiting, at every weekly team meeting, a chart showing the developers with the highest number of assigned bugs).

Myspace, “worse is better”

Skimming through a blog about the Cannes Film Festival, I came across this movie website URL. Movies have long had web sites, but a myspace page? This is new.

Let’s be honest: myspace is the new geocities. Back in the early days of the web, geocities was a huge heap of ugly, boring personal web pages, generally very badly designed and using every single dumb javascript trick in the book (that, and the <blink> tag). It was the place where bright pink text on a yellow background would wrap around blurry holiday pictures scattered at random. And your mouse cursor would be followed by a trail of cute little stars. Geocities was the haven of thousands of vanity sites. Now doesn’t that make you think of something?

Why was geocities so successful? Because it offered free and easy-to-use hosting. Forget about quality, the “worse is better” principle applies. Why is myspace so successful these days, even though about 99% of its pages are eye-piercing demonstrations of ugliness? I can’t see any other reason than that it gives you all the tools a home site needs in a nicely wrapped package: blog, photo gallery, audio and video playback, and… hyperlinks. Because, really, what are “friends” on myspace if not a reinvention of hyperlinks within the realm of myspace? Yes, hyperlinks within a text offer a way to “more info”, but in a blogroll, for instance, they are an endorsement. Much like myspace “friends”.

Then of course there’s also the critical-mass factor, which myspace reached long ago. Its sheer weight makes it mandatory for media marketing droids to set up camp there whenever they have something new to promote. And so myspace becomes a facsimile of the Net, except centralized.

This is actually a global trend, as Nicholas Carr says. First came centralized, disconnected networks (AOL, Compuserve, Genie… my first online steps were on a system called Calvados, which later became Calvacom). Then the Internet wiped them out. And now we see them growing back, within the Net.

Spamming, 15 years on

How long do you think an email address can stay on the spam lists? I just quizzed a bunch of friends about this; one immediately answered “indefinitely”. He’s right.

Back in 1992-93, my first job was as a trainee at INRIA. There was no web back then; we actually witnessed its birth (and all we did was shrug). Usenet was the main channel for discussions, and I used to post in a few newsgroups. Spam was exceptional, and the people doing it (generally college kids who didn’t know better) were rebuked by the recipients of their messages, and would even apologize once they realized what they had done.

By some strange twist of fate, I find myself working at INRIA again today, some 15 years later. I started on April 2nd. When I first logged on to my office machine and looked at my mail, I was very surprised to find about 45 spams in it. Most of them were directed at my own address, rather than at one of the mailing lists I had been subscribed to by default.

Given the amount of spam we all get these days, I figured it was to be expected; it couldn’t possibly be because my address was a “once valid” one (INRIA has kept the same policy for creating logins and mail addresses, so in effect they had “resurrected” an email address that existed 15 years ago). Or so I thought. Then I noticed that some of this spam was directed at a host in the inria.fr domain that couldn’t possibly be “seen” from the outside. It was the host I used to post to Usenet from.

So it really goes like this: create an email address, post on Usenet or anywhere else it will be harvested, disable the address, wait 15 years, re-enable the address… and watch the spam pour in.

Yuck.

Concert Photography 101

After screwing up some potentially good shots at the last concert I attended, I figured this would be a nice opportunity to apply one of the principles mentioned in the original article which led me to open this blog: write for yourself, as a way to order your thoughts. So here goes.

A concert generally means fast-moving subjects in low yet high-contrast light conditions. If anybody can think of worse conditions for taking pictures, I’m seriously curious to hear about them. You therefore need wide-aperture lenses: f/2.8 is the minimum, constant throughout the zoom range if possible. Stabilised optics are even better. On my Canon 20D, I am lucky enough to carry the 17-55mm f/2.8 IS and the 70-200mm f/2.8 IS. The first is also a very good walk-around lens. The second is of the “pry it from my cold dead hands” kind.

Camera setup:

  • aperture priority (Av) mode, set to the widest aperture you have.
  • increase the ISO, depending on the light conditions. 400 is the usual minimum, 800 is more common, 1600 is tolerable. You need an exposure time of 1/25s at most (any longer and you’ll get motion blur no matter what, IS or not, unless the whole band is under heavy sedation).
  • no flash: it bothers the performers, and it’s useless (except in very small venues, in which case it will bother the performers even more). Don’t even think about it; flash is explicitly forbidden in most venues anyway.
  • spot metering mode (or the closest thing your camera has). This is very important, because your camera has very little chance of figuring out the right exposure by itself. More on this below.
  • if needed, under-expose a little. Noise or under-exposure can be fixed (to some extent), blur cannot. So if the current lighting won’t give you a short enough exposure time, set the exposure down a bit. 1.5 stops is generally the most you can afford without getting pictures you won’t be able to resuscitate (there’s a quick worked example of the stop arithmetic right after this list). Be aware that even the best image noise processors will leave that characteristic “plastic-like” look on skin, or turn hair into a blur (which is avoidable, but takes time).
  • that said, a concert is one of the situations where motion blur can actually look good. But it’s generally better when it’s on the subject you’re shooting rather than on the background.
  • standard, “one-shot” auto-focus mode. If your camera has some kind of “focus servo” mode (where the camera automatically keeps focus on a moving subject), don’t use it. You’ll be reframing often (and often significantly), and the focus will get messed up. (This is the bit which cost me a bunch of good pictures last time.)
  • burst mode: with fast-moving subjects who are, in many cases, talking (well, singing), shooting in burst mode will increase your chances of getting a picture where the subject doesn’t look goofy.
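
Since the stop arithmetic is easy to fumble in the dark, here’s a minimal sketch of how ISO and deliberate under-exposure trade against shutter speed. The metered values are made up for illustration, and it assumes a fixed aperture and plain reciprocity:

```python
import math

def shutter_time(metered_time_s, metered_iso, new_iso, underexpose_stops=0.0):
    """Estimate the exposure time after raising the ISO and dialing in
    negative exposure compensation, the aperture staying fixed.
    Every stop gained halves the exposure time."""
    stops_gained = math.log2(new_iso / metered_iso) + underexpose_stops
    return metered_time_s / (2 ** stops_gained)

# Hypothetical stage lighting: the camera meters 1/8s at ISO 400.
print(shutter_time(1/8, 400, 1600))       # ISO 1600 gains 2 stops -> 1/32s
print(shutter_time(1/8, 400, 1600, 1.5))  # plus -1.5 EV -> roughly 1/90s
```

Which is the whole point of the list above: between a higher ISO and a stop or so of under-exposure, you can claw back the distance between a hopeless 1/8s and a usable 1/25s or faster.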

Why spot metering: as I said, you’re shooting subjects in high-contrast light conditions. What’s more, the performers themselves will very often be high-contrast, typically wearing dark clothes. The part you want properly exposed is the face (a shot where the subject’s face is either under- or overexposed will generally look bad, no matter the rest). If you use any other metering mode, it’s quite likely the camera will evaluate a longer exposure than what you need, because of the subject’s dark clothes, or the dark surroundings. Not only will you get motion blur, you will also get overexposed faces.

So when shooting, you need to lock the light metering on the subject’s face first. Only then go ahead with focusing, reframing if needed (it often is), and shooting. This is a quick reflex game, and it does take some practice.

Finally, some more general tips: ear plugs, a small torch light (always comes in handy), a high-capacity memory card. Behave nicely and try to be as inconspicuous as possible (don’t ever try to attract the performers’ attention: you’ll be thrown out, and if you’re not, you should be).

(Added: a compilation of links regarding concert photography.)

No choice is bad, some choice is good, more choice is…?

…paralysis, frustration, and general unhappiness, as explained in this very interesting post from Garr Reynolds’ Presentation Zen blog (the video of the TED presentation is worth your time).

Two equally interesting applications of what is explained there:

  • a political/economic one: the current trend of offering customers a plethora of choices is actually alienating, and only meant to drive consumption. Another paradox is that behind all these choices stands a diminishing number of actual makers and suppliers. These choices are really “more of the same”, or more precisely “an ever-growing more of an ever-diminishing same” (cf. Naomi Klein’s No Logo).
  • a UI design / programming one: whenever you want to add another user-configurable option to your program, think hard. Very, very hard. In this case, the frustration will come not from the fear of missing a hypothetical “even better” choice, but from being asked questions you don’t care about, or really don’t want to deal with.
    To wit, a few hours after reading this post, I was looking at akregator’s configuration panel. In it, there’s an option to set the archive back-end. It’s a combo with only two choices: “metakit” or “no archive”. That this option is clearly unfinished is one thing; the real problem is that it should never have been implemented at all. Which end user wants to deal with that? This is really geek stuff, and even then, only for those who get a kick out of trying every single program in their favorite linux distribution, fine-tuning their environment down to the pixel, while never actually producing anything useful. (A toy sketch of the saner alternative follows below.)

    This is a spot-on answer to one of my favourite pet peeves: wannabes who claim they can’t live without such and such tool or feature, so removing it would be extremely bad, so please leave it in or make it an option. They appeal to freedom while actually being selfish morons. This post is part of the clue bat they should be beaten with.
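
To make the akregator example concrete, here’s a toy sketch of the two approaches (the names are invented; this has nothing to do with akregator’s actual code): asking the user a developer’s question versus deciding once, in code.

```python
# The "configurable" version: a settings combo surfaces an implementation
# detail the end user has no basis for deciding on.
def create_archive_from_settings(backend_choice):
    if backend_choice == "metakit":
        return {"kind": "metakit"}   # stand-in for a real storage back-end
    if backend_choice == "no archive":
        return None
    raise ValueError("unknown back-end: " + backend_choice)

# The sane version: the developer picks a sensible default, and the
# settings dialog is one combo box shorter.
def create_archive():
    return {"kind": "metakit"}

print(create_archive_from_settings("metakit"))
print(create_archive())
```

And the second version loses no flexibility that matters: if “metakit” ever proves to be the wrong choice, the fix belongs in a code change, not in every user’s configuration panel.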

Miguel has finally seen the light :-)

I had the privilege of meeting Gnome’s founder, Miguel de Icaza, at the first GUADEC back in… oh my, 2000, has it been that long already? I was still co-leading gtk-- at the time. Miguel is a brilliant fellow, full of energy and spirit, but at the time I disagreed with most of his views. So it is with some amusement that I caught the following bit in a recent interview:

derStandard.at: What would – in retrospect – be the one piece of software that you most regret having written in plain C because C# was not around at that time?

Miguel de Icaza: Everything that I ever wrote for the desktop.

And just before that:

I would not waste time in the 99% of the application that copes with high-level issues doing it in a low-level language.

You don’t say.

Gee, Miguel, C++ was there at the time too. It’s far from being as high-level as C#, but it’s still better than C. (That said, believe it or not, I very much hope that Mono will grow to be a commonly used platform for desktop apps on Linux, because C# is just so much better than C++. But please drop GTK as a graphics toolkit; that can only be a temporary solution. Debugging through two levels of completely different object models will always be a nightmare.)

Poisonous users, a live example

It’s ironic that a couple of weeks after this silly display of zealotry, this interesting video on how to deal with poisonous users would be highlighted on slashdot. With Rosegarden we’ve been quite fortunate in that we’ve almost never had to deal with this problem. I can think of only one occurrence, and the guy actually turned into a valuable contributor once we explained the problem to him.

The thread linked above on the merits of KDE’s new file manager, Dolphin, has pretty much all the standard features of the clueless user who can’t tolerate having to change his ways. His point is that his current way of working is the best one and must not be altered, and more specifically must not be simplified or “dumbed down”. I’m pretty sure there’s a strong correlation between how much a user wants his environment to be configurable and how unproductive he actually is. I.e., the more you care about fine-tuning your tools, the less you actually use them. The extreme case being, you guessed it, PC tuners :-). View it as a form of procrastination if you will; I’ve yet to see someone who’s adamant about “having choice” (which really means wanting to use his pet apps and considering anything else crap) also produce anything useful.

That won’t prevent him from demanding to see the data from usability tests, but only when those contradict his own usage patterns, of course (patterns which are completely geekish, though he can’t realise that).

On a side note, you have to be impressed by Aaron’s tactful and patient behavior throughout the thread.

Free software as a political paradox

This slashdot story prompted me to write a post about free software and politics, which I had hinted at in a footnote to my first post.

Before carrying on, for the sake of clarity I should state that in France I’m center-left; in the US I’d be a dangerous leftist.

The author of the slashdot story is puzzled to see that right-wing people are more likely to use free software than left-wing ones. Given that free software is generally considered a “left” value, one would have thought it would be the opposite (and, as some comments explain, I believe that is the case in France). However, there are two ways to see the problem:

You can consider the software industry to be the perfect example of capitalism and free enterprise in action, and therefore free software, which aims to destroy it, to be anti-capitalist (thus left-oriented).

Or, you can consider free software, being the empowerment of the individual, to be an even better illustration of free enterprise, set against the state-like monopoly of the software industry. One of the basic postulates behind right-wing politics is that the free market always finds the best solution, eventually, a solution which state-driven economics (i.e. socialism) cannot hope to find. And that’s exactly what free software postulates: give free rein to developers and they will create software that the industry (which is a state-like structure) can’t possibly produce. And there you have why libertarians such as ESR, who believe in minimal government, also support free software.

At this point it’s hard to resist indulging in an obvious statement: the little I’ve learned about economics clearly shows that the free market (or, the combination of everybody’s selfish motives) does reach an equilibrium, only the worst one. And you have a perfect example of this in free software: people prefer starting their own project to collaborating with an existing one. The very philosophy of free software gives them a perfect reason to do so: “let the users decide”. Only the users don’t have perfect market knowledge, and they don’t evaluate different pieces of software rationally, but emotionally. Because they liked what the author said in an interview, because they like the looks, or some specific feature, but not because it’s really better written or designed (they may sometimes argue so, but honestly, have they really looked at the code?). So sometimes there is indeed stabilisation around a few pieces of software, each fitting its own niche. Most of the time, however, you get a big old duplication of effort, with zealots on each side arguing about how their own favorite is really the best one and the other side is just a bunch of morons.

So there. If the free market really worked so well, we wouldn’t have two incomplete desktop frameworks, and Microsoft would be long gone. Instead, it seems that the State as a system of government is still the best working solution: letting devs do what they like whenever possible, and constraining them to do what’s needed the rest of the time. And that’s why I’d rather pay taxes than let the market decide whether a school or a road should be built.