Thoughts on WWDC 2013

As usual, some thoughts on this year’s WWDC.

I found the keynote and sessions much more stimulating than last year. Things that particularly stood out:

First, Apple is not abandoning the Mac, nor professional users. There’s still a lot of work being done on OS X, and not just spit and polish. Given the architecture of the new Mac Pro, with one of its GPUs not dedicated to graphics (see this article on how the new Mac Pro’s design really is new, and not just because of its cylindrical shape), low-level stuff like OpenCL and Grand Central Dispatch appears to be part of a much wider strategy, not just cool new tech. Even the venerable AppleScript platform is getting new features, much to the relief of many who expected it to be abandoned as inherently incompatible with the Mac App Store requirements.

There’s also the announcement of a long overdue update of iWork for OS X. I really hope it will be more than cosmetic, which is all the previous one was. At least it’s nice to see Apple putting more effort behind its own OS X software.

Second, obviously, iOS 7. The new “flat” design may look off-putting at first, but I find it quickly becomes very familiar, and visually restful compared to the crowded look of iOS 6. I hope the welcome restraint in graphic appearance won’t be spoiled by an overuse of the physics engine, though. Be wary of the “new tool” syndrome here.

Regarding iOS 7’s visual appearance, it is quite interesting to see that it was indeed pioneered by Android (and even Windows Phone to some extent), as a means of differentiation from iOS’s skeuomorphic style. Given that such an abstract style appears to have been Jony Ive’s intent all along, starting from the first iPhone prototypes, I wonder whether the iPhone would have been such a success if Ive’s design had been favoured over Forstall’s. I feel that the original iPhone opened the door with a familiar enough UI, then Android proved that something much more abstract would work, thus clearing the path for iOS 7.

A side note on skeuomorphism compared to, well, “non-skeuomorphism” (what would be a better name for that? “free-form”?). Skeuomorphic UIs may look cool and instantly familiar, but they do constrain what you can do with them. The best example is Calendar in OS X Mavericks compared to Mountain Lion. The Mountain Lion version looks like an actual calendar, and you can turn the pages as you would with one. And just like a real calendar, you can’t really see the last half of a month followed by the first half of the next one. In Mavericks, you just have a continuous scroll which may be less impressive visually but is way more useful and practical, because it lets you see any portion of a week or month no matter where it starts.

Update Aug. 2nd 2013: as for the new look being an improvement or not, I was mostly positive when I first saw the screenshots; after a week of using it on my iPhone and several weeks on my iPad, I think it’s way better. This article by Matt Gemmell summarizes the evolution very well.

iTunes library tagging

A long while ago I looked into solutions to help me clean up and complete the mp3 tags of my iTunes music library. The problem is that many of my CDs were ripped on Linux, with a Ruby script I had written which used FreeDB (an alternative to CDDB) to fetch tags.

(Actually I’ve just dug out the sources of that script and it was more sophisticated than I remember – thanks to Ruby’s DRb (Distributed Ruby) module, it had a ripper client feeding the wavs to an encoder server using lame and freeDB – not bad)

Anyway, FreeDB didn’t really have the quality of CDDB, so many of my tags are lacking data such as genre or year. Having now migrated to OS X, I started searching for a convenient way of fixing this.

About the only solution I could find is SongGenie by Equinux (whose name sounds awfully like a Linux shop 🙂 ). It looked like a good solution but I quickly ran into serious limitations. One is that, when a tag has multiple choices, it’s very tedious to undo the choice you’ve picked. But the main one is precisely that many songs will yield several possible tags. SongGenie treats your library as a list of tracks, not a list of albums. Each track is therefore seen independently of any other, and if a given song has been featured on both an album and a compilation, for instance, SongGenie will show both possibilities and ask you to decide. It gets worse for jazz or classical tracks, where the “packaging” of songs into albums is much looser than in rock/pop. So forget about using it as an automatic solution: you have to make sure you pick the right choice for each track, and it also occasionally offers only one choice which is still wrong. One particularly annoying detail: this applies to album titles, and also to track numbering.

To sum up, I’d often start with an album lacking genre and year, but otherwise properly named and numbered, and SongGenie would turn it into a bunch of tracks apparently coming from several different albums, each with its own numbering. So much for a hassle-free solution.

When iTunes Match was introduced, I hoped it could provide a solution, since the iTunes Store obviously has properly tagged files. I thought iTunes would get an “update tags” feature, but no. I also tried matchTag, which does pretty much that, fetching tags from the iTunes Store (only for tracks which have been redownloaded via iTunes Match), but the track vs. album problem remains. Also, matchTag often fails to find info (I just tried with Pink Floyd’s Dark Side of the Moon).

I also tried MusicBrainz’s Picard, but the UI is very cumbersome, and while it does seem to be built around the concept of albums rather than tracks, I couldn’t get it to work reliably and simply.

Finally I found iTunes scripts, in particular this one which copies track info to CD tracks (for when you need to re-rip a CD). I simply reversed the source and destination, and now I can insert a CD and copy its CDDB info onto the library tracks. The main problem is that it’s very tedious (I have to manually select the corresponding tracks in iTunes), but it’s simple and does the job.

A remaining lead would be to use Gracenote’s SDK to see if I can do without inserting a CD, or having to manually select the tracks in iTunes.

Miscellaneous bits

It’s been quite a while since my last post, and I missed a few news topics I felt like commenting on, so here goes, all in one block.

Canon 5D Mark III

Boy, has this one kept us waiting. The main hope was even better low-light performance than the Mark II, and the first samples were indeed astounding. But that’s because heavy noise reduction is applied to the JPEGs; the raw files show a much smaller improvement over the Mark II. What remains is a much improved autofocus, something I could certainly do with, given that all the focus points on the Mark II except the central one can be pretty stubborn.

The Linux desktop is dead, and it finally knows it

When someone like Miguel de Icaza publishes a post titled What killed the Linux desktop, it’s safe to assume the idea has gained wide recognition. It did get a serious backlash from no less than Alan Cox and Linus Torvalds, but neither claimed the premise was untrue; they only disputed the causes Miguel invoked.

For one thing, Alan Cox’s response is spot on: Miguel helped create the confusion he laments by launching Gnome (though he fails to acknowledge that he was himself once an active member of the project, albeit not a very enthusiastic one, as I remember). Gnome certainly helped kill any hope of Linux ever making a dent on the desktop, because third-party app devs would be confronted with a choice no dev wants to make: which platform to code for. The only thing worse would have been to offer a “choice” of different C libraries.

Moreover, I really can’t see how Linus Torvalds’s character or his stance toward Linux ABI compatibility can be seen as part of the problem. Linus certainly did not “invent” the “tough geek” persona; it existed long before him.

And the reason behind the constant breakage was not that we had a culture of “engineering excellence”, as Miguel stated in his original post (though we certainly liked to think we had). Constantly breaking APIs is not a sign of good engineering; engineering is also about pragmatism, not just lofty ideals. We saw ourselves as programming prodigies able to code better and faster than the grumpy old suit-and-tie corporate engineers, but that was the arrogance of inexperience.

The real reason is that everyone still wants things the way he likes them, and nobody is willing to give up his own preferences for the sake of the common good. We had the moral backing to keep doing so from the old Cathedral and the Bazaar manifesto, since we believed the Right Solution would always impose itself in the end. It never did.

iPhone 5 and iOS 6

It’s already a commercial success, while the press consensus is that it’s boring (“no vision”, “no creativity”, etc., with the recent addition of “you can see Steve Jobs is dead”). It was the same for the 4S, perhaps not so much for the 4 given the redesign, but it was also the case for the 3GS… Anyway, the best description I’ve read so far is from John Gruber: “this is (still) how Apple rolls”. No, we won’t ever again feel the same sense of wonder and history-in-the-making that the initial 2007 Macworld keynote created. The iPhone is an established product; it will only get incremental improvements. Remember that even the iPad was met with collective yawns from the press: “It’s just a big iPhone”. It took a while to understand that it was yet another whole new market.

The only comment I have is that it may actually be too tall for me. The 3.5″ screen fits my hand perfectly: I don’t have to reach much with my thumb to activate any control, though the top ones are a bit hard to reach. I haven’t handled an iPhone 5 yet, but I doubt it will be as comfortable for me, and I actually hope Apple will keep maintaining models with both screen ratios, though I think that’s very unlikely.

About iOS 6, the big disappointment of Plans overshadowed almost everything else. Yes, Apple shouldn’t have bragged so much about it during the presentation. That said, the application itself is way better than Google Maps: it displays much faster, zooms and rotates much more smoothly. It’s just the data that sucks, though from what I’ve seen with the satellite tiles around here (south of France), which have been updated twice already since the first iOS 6 beta, Apple is hard at work improving it. It’s still remarkable that Tim Cook wrote an apology about it; that’s not a common thing in Apple’s history.

I joke, however, that Plans is actually a ploy to divert attention from the real fiasco, namely Podcasts. The new app which is supposed to handle that very important functionality of iPhones (the term ‘podcast’ derives from ‘iPod’) may be pretty (if you like skeuomorphic UIs – I don’t, and I find it idiotic that a device like an iPhone should present the appearance of a four-decade-old reel-to-reel tape player), but it can’t handle playlists and, worse, does not properly sync episode status with iTunes. Like many, I had a simple “unplayed podcasts” playlist (a smart playlist, actually, giving me all unplayed podcasts from the French national radio France Inter), and the whole thing was maintenance-free. Refresh daily in iTunes, sync the iPhone, then in the car ask Siri to play the smart playlist, and that was it. iOS 6 has broken this: even after removing the Podcasts app and having podcasts back in the Music app, iTunes still fails to sync the episode status, and I have to mark them as played manually. It’s hard to think of something more stupid than this. I hope the next release will fix it.

Objective C and C++ verbosities compared

In my previous post I mentioned Steve Jobs’s choice of Objective C over C++ for development, based on his assertion that he wanted “to eliminate 80% of the code you have to write for your app”. To which Chris commented that in this older post, a piece of C++ code that I had converted to Objective C was actually twice as long as the original.

I probably should have gone into a bit more detail in that initial remark on Objective C vs. C++, but that would have been rather off-topic. Chris’s comment calls for more discussion, though.

The quick reply:

The C++ code I converted uses a simple STL-based data structure; the Objective C version uses Core Data.

There’s no question that, for simple operations, Core Data is more verbose than the STL equivalent. In the latter case you’re iterating over a simple container; in the former you’re actually querying a DB (an SQLite DB, by the way – yet another example of a well-reused piece of technology).

So, even though both versions do the same thing conceptually, the underlying technology is completely different. Yes, the ObjC version is much longer; however, in the C++ case there’s the whole definition of the STL data structure, which is not shown in the example and which you have to write. In the Core Data case, well, there isn’t: you simply design the data model with Xcode’s Core Data modeler :-). So the total number of lines of code is actually smaller in the ObjC case.
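To illustrate what that unseen STL definition typically looks like, here’s a hypothetical sketch (the names are invented for the example, nothing like the actual code): the event type, the ordering comparator, and the container all have to be written and maintained by hand, which is exactly the part Core Data’s modeler absorbs.

```cpp
#include <set>
#include <string>

// A hypothetical event container of the kind the C++ example relies on.
// With the STL you write (and maintain) all this plumbing yourself; with
// Core Data the equivalent model is described in Xcode's data modeler.
struct Event {
    long absoluteTime;   // position of the event in the timeline
    std::string type;    // e.g. "note", "rest", "timesignature"
};

// Order events by absolute time, as a sequencer needs.
struct EventCmp {
    bool operator()(const Event* a, const Event* b) const {
        return a->absoluteTime < b->absoluteTime;
    }
};

// The container itself: a sorted multiset of event pointers.
using Segment = std::multiset<Event*, EventCmp>;
```

Iterating over a `Segment` is then a one-liner, which is why the fetch-request side looks so verbose by comparison.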

The longer reply:

Core Data didn’t exist yet at the time Jobs chose Objective C as the base language for NeXTStep, and Objective C does tend to be more verbose in its APIs than C++. The best way to demonstrate this is a basic example with an array of ints:


C++

// declare an array of ints
std::vector<int> arrayOfInts;

// add one element
arrayOfInts.push_back(1);

// get the element's value
int i = arrayOfInts[0];

Objective C

// no array of ints, only arrays of NSObject* so :
NSMutableArray* arrayOfInts = [NSMutableArray array];

// add one element
[arrayOfInts addObject:[NSNumber numberWithInt:1]];

int i = [[arrayOfInts objectAtIndex:0] intValue];

If that reminds you of Java, you’re right. Thankfully, autoboxing is being added to Objective C (better late than never).

Edit Jan. 12th, 2013: the above code would now be written as follows:

// no array of ints, only arrays of NSObject* so :
NSMutableArray* arrayOfInts = [NSMutableArray array];

// add one element
[arrayOfInts addObject:@1];

int i = [arrayOfInts[0] intValue];

// or, even simpler :

NSArray* arrayOfInts = @[ @1 ];

int i = [arrayOfInts[0] intValue];

end Edit

So why choose Objective C over C++ at the time?

Because Objective C, as verbose as its basic APIs are, is actually closer to Python than to C++. The object model is much more elaborate than in C++ (where you have nothing but virtual methods): classes are first-class objects, an object can be asked whether it handles a method or not, you can add methods to a class at runtime without having to derive it, and the language’s stance on type safety is much more relaxed – it’s essentially duck typing. You don’t have to declare a method in the interface to implement it, which is very convenient for internal methods. You don’t have to declare a method that you override either; again, simply implementing your class’s own version is enough.

@interface MyClass : NSObject
// nothing
@end

@implementation MyClass

// this overrides NSObject's -init
- (id)init
{
    // some code
    [self moreInit]; // moreInit not declared in interface
    return self;
}

- (void)moreInit
{
    // some more code
}

@end
In C++ that would be:

class MyClass : public Object
{
public:
    virtual void init();
    virtual void moreInit();
};

void MyClass::init()
{
    // some code
    moreInit();
}

void MyClass::moreInit()
{
    // some more code
}
All this makes Objective C almost a scripting language in disguise. While in C++ you will spend a lot of time getting your types right, and any non-trivial refactoring will take a whole lot of time, Objective C lets you code much more freely.


Steve Jobs’s 1997 WWDC Q&A

(yes, this is an old topic, but I’m a slow blogger. Anyway…)

Of all the material that came under the spotlight shortly after Steve Jobs’s death, the most interesting piece I’ve seen by far is his WWDC keynote from 1997:

Let’s recap the situation: Apple is months away from bankruptcy, Gil Amelio is the current CEO, they’ve just bought NeXT, and Steve Jobs has returned as an “advisor”.

At WWDC (the worldwide Mac developers conference), Steve Jobs walks on stage and, instead of doing a presentation, offers to take questions from the audience. His answers made me realise why the guy really was completely different from the CEOs running other IT companies.

Here’s a breakdown of the most interesting moments (time indications are approximate):

4:00 – he explains how he thinks there’s a market for great products. Not “fancy products with an Apple logo on them” – great products. Products that stand above the others in terms of quality. All other companies do market studies and try to offer a variety of products tailored to each market segment… He wants Apple to do things differently.

5:00 – “I know some of you worked on stuff that we put a bullet in the head of”: the way he acknowledges that is rather uncommon in my experience. Most would try to weasel around the issue and soften the blow. The reason he doesn’t is not his usual callous approach; it’s that he’s able to justify it with what comes next: “Focus is saying no”. This will ring true for any developer who’s been involved in a project that has fallen prey to feature bloat. Raise your hand if you’ve met many managers with this kind of mindset. I haven’t.

10:00 – at this point he acknowledges that Apple should no longer reinvent everything, as it had in the past. Pick the right elements (i.e. the Unix core technologies), figure out what is needed to turn them into a product that is really better than the competition. And they did just that.

13:00 – “using computers not for computation-intensive tasks, but as a window into communication-intensive tasks” – coming from NeXT, he describes his experience of using an OS which had networking built in from the start. A familiar vision to any Unix user, but something very remote to Mac users at the time. In more ways than one, he also describes what cloud computing is aiming to bring to everybody now.

He also mentions gigabit ethernet, which would only be deployed three years later.

16:00 – “what is really exciting to me is to look at that personal computer, and take out every moving part except the keyboard and the mouse”. That’s the MacBook Air, right there, which would be released 11 years later, in 2008. There was also the failed Sun/Oracle network computer in between, but Apple pulled the idea off.

19:00 – “Apple is vertically integrated – makes the hardware, the software, the marketing experience”. To this day, nobody other than Apple has this, and few still understand how fundamental a strength it is for them. He got that while the PC world has the advantage of economies of scale, it can’t match Apple’s reactivity and ability to provide a much more seamless experience.

22:00 – let’s not forget this is a developer conference – here he explains how cool the NeXTStep development platform is. Nothing special in itself, except that I don’t know many CEOs of IT companies who can convincingly sell a development environment to an audience of experienced developers. The part about “managing complexity” (at 25:00) really hits home.

41:00 – “the way you get programmer productivity is not by increasing the number of lines of code per programmer per day. That doesn’t work. The way you get programmer productivity is by eliminating the lines of code you have to write. […] the goal here is to eliminate 80% of the code you have to write for your app”. Another thing that not many tech managers get (although more do nowadays than back when this was recorded). That’s why he chose Objective C over C++.

1:01:00 – about the Newton. “Most companies can be successful with 1 stack of system software. Rarely can they manage two and we are going to succeed at managing two during the next several years with MacOS and Rhapsody. I cannot imagine being successful at managing 3”. Let’s recap: this is still the classic Mac OS. Rhapsody, which will become Mac OS X, is in its infancy. So Apple will have to manage those two. The third one is the Newton OS, therefore it will have to be shut down. Again, focus.

A few years later, once OS X was well established, they would still release the iPod, which did have its own (very simple) OS.

“Do you have a Newton?” asks a guy. He replies that he bought one of the early ones, thought it was a piece of junk and threw it away; same with a Motorola Envoy. He grants that the new Newtons may be a lot better, and the guy suggests he try one, but he stops the argument with this: “the high-order bit is connectivity. It’s being in touch, connected to a network”. He then explains that using infrared to transfer data from your organizer to your computer when you get back is not what he wants. “If somebody would make a thing where you’re connected to the Net at all times… I’d love to buy one”.

Again, this is 1997. The Net is mostly accessed through modems, DSL is in its infancy, and wireless data access hardly exists at all. Yes, the concept itself is obvious, but at this point it’s clearly many years away… 10 years, to be exact, until he took the stage at Macworld and started with these words: “we’re gonna make some history today”. Others had implemented the concept before (Treo, BlackBerry), but Apple set the bar on how to do it.

Name one IT company which could see and plan 10 years ahead, and successfully achieve those plans. That’s focus.


An eventful August

Usually, the month of August is a quiet one as far as news is concerned. Not this year. In a few short weeks, Google buys Motorola (the mobile division), HP gets out of the hardware business (and axes its WebOS products), and, last but certainly not least, Steve Jobs resigns as Apple CEO.

So I guess I should indulge in a post about those.

In chronological order:

The Google deal: people can argue all day long on whether it’s a blunder or a masterstroke; I remain very curious as to how Google can “integrate” a hardware company with such a different culture. Even if the plan is to let it run independently, this is not going to be straightforward.

HP’s strategy change: apparently they’re trying to “pull an IBM” and turn themselves into a pure software/services company. OK, good for them; IBM’s got one more direct competitor. I hope they’ll licence WebOS and still actually try to do something with it.

Finally, Steve. More telling than anything is that the news of his resignation has all but eclipsed both of the other items. It made the front page of even French newspapers. Dozens of homages, timelines and anecdotes about him have been posted. I have to say I actually feel sad about it, for two reasons: I admire the guy, as he’s been the only “bad ass” CEO in IT, and the Apple saga is quite unique in this industry. As such, I can’t see him go without feeling some regret.

There’s another reason, however. Maybe it’s me getting old, but I feel the IT industry has gone pretty dull in the past 10 years, except for one thing: Apple. Back in 97–98, when Jobs went back to Cupertino (and people didn’t care so much at the time), what was really exciting was the rise of Linux and free software. Years before that, it was Amiga vs. Atari ST. Oh, and the BeBox was fun for a short while. OK, the advent of the Internet for all and the World Wide Web was pretty exciting too. But nowadays, what? Linux failed to deliver on its promise as a credible alternative to Windows; OS X achieved that instead. Smartphones have gone from ugly, button-laden bricks to slick pieces of glass and metal thanks to the iPhone, and tablets are becoming ubiquitous thanks to the iPad. Without Apple, we’d have Windows, an ever-growing bunch of Linux distribs, Palm Treos and thick, heavy laptops. Apple is the only company that successfully challenges the overall boring uniformity and makes things interesting and fun. And they raise the quality bar quite a few notches in doing so. Should they stop, I can’t see any replacement.

Thoughts and questions after the WWDC 2011 keynote

I haven’t finished watching the keynote yet but have read most of what there is to read about it, so here goes. Also, it will be a nice change not to write about something five months after the fact.

First, no question about it: as far as iOS 5 is concerned, the new features were all lifted (copied, stolen…) from elsewhere: Android, RIM, and the jailbreaking community (which Apple should properly acknowledge and let work unhindered, IMHO – add a proper “hack me” mode to iOS and see what happens). Good thing they did; all were sorely lacking.

More interesting are the new features of OS X Lion. Many will seem unimportant, but they may profoundly change the way we work. Like full-screen apps, for instance (which people challenged in their sense of observation have dismissed as “finally OS X has window maximize”). We all take multi-tasking OSes for granted, all too often forgetting that we users aren’t good at multi-tasking. There’s enough literature on how computers and the Net make it hard to focus, as we are constantly solicited by dozens of attention-grabbing sources of interruption: mail, IM, Twitter, etc. Reverting to a screen which shows only what pertains to the current task is a nice change. Note that this has already been touted as a feature by some word processors (WriteRoom, OmmWriter and, of course, Pages).

But the most interesting part was definitely iCloud. Assuming they can pull it off properly, it will change a lot of things. I had to look for hackish solutions to keep my iTunes libraries in sync between my Mac Pro and my MacBook Pro. Cultured Code has spent an untold amount of effort deploying sync for their nice todo app Things, often to the frustration of their users, and all this has just been made obsolete by the fact that iCloud will have an API usable by third parties (something I recall wishing for for MobileMe). And as a photographer, Photo Stream and the prospect of being able to do my first culling pass on my iPad is a huge boon.

I just wonder how flexible the settings will be. Apple usually takes care of its ‘pro’ customers, although often with a delay. I don’t think I’m going to sync my 300 GB photo library to any “cloud” yet. Not so much a question of space as of how long it would take through my ADSL connection (100 kB/s upload). Movie makers and musicians are going to have the same problem. Of course, “cloud computing”, no matter who provides it, won’t really achieve its full promise until we all have fiber connections. DSL just won’t cut it.
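The back-of-the-envelope arithmetic is brutal. A sketch (reading the upload figure as 100 kilobytes per second; if it is kilobits, the result is eight times worse):

```cpp
// Rough sync-time estimate: library size divided by upload rate.
// Assumes the "100kb/s" figure means 100 kilobytes per second.
double uploadDays(double sizeBytes, double bytesPerSec) {
    double seconds = sizeBytes / bytesPerSec;
    return seconds / 86400.0;  // 86400 seconds per day
}

// 300e9 bytes at 100e3 bytes/s -> about 34.7 days of continuous upload
```

Call it five weeks of saturating the uplink around the clock, which is why fiber, not DSL, is the real prerequisite for this kind of cloud use.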

Finally, iTunes Match. Now that’s a cool move from Apple, but no word yet on how widely available it will be. I’m guessing it will take a while before we have it in France.

edenx update

In the last few months I’ve had enough free time (i.e. fewer concerts to shoot) to resume work on edenx. Long before that, I had converted most of Rosegarden’s Rulers code. This wasn’t a straight port: since I’m using Core Data structures, all the STL stuff had to be converted to Core Data calls.

For instance, this (full version):

Event dummy("dummy", 0);
dummy.set(BarNumberProperty, n);

ReferenceSegment::iterator j = std::lower_bound(m_timeSigSegment.begin(), m_timeSigSegment.end(), &dummy, BarNumberComparator());
ReferenceSegment::iterator i = j;

if (i == m_timeSigSegment.end() || (*i)->get(BarNumberProperty) > n) {
    if (i == m_timeSigSegment.begin())
        i = m_timeSigSegment.end();
} else ++j; // j needs to point to following barline

becomes this (full version):

NSManagedObjectContext* moc = [self managedObjectContext];
MyDocument* document = [[NSDocumentController sharedDocumentController] currentDocument];

NSEntityDescription *timeSignatureDescription = [NSEntityDescription entityForName:@"TimeSignature" inManagedObjectContext:moc];

NSFetchRequest *timeSignaturesRequest = [[[NSFetchRequest alloc] init] autorelease];
[timeSignaturesRequest setEntity:timeSignatureDescription];

// order by absolute time
[timeSignaturesRequest setSortDescriptors:[[document coreDataUtils] absoluteTimeSortDescriptorArray]];

NSError *error = nil;

NSArray *allTimeSigs = [moc executeFetchRequest:timeSignaturesRequest error:&error];
if (allTimeSigs != nil) {

    if ([allTimeSigs count] > 0) {

        // there are time signatures - find the one right after bar n
        BOOL (^checkBarNumberBlock)(id, NSUInteger, BOOL*) = ^ (id obj, NSUInteger idx, BOOL *stop) {
            if ([[obj barNumber] intValue] > n) {
                *stop = YES;
                return YES;
            }
            return NO;
        };

        NSUInteger idxOfTimeSigAfterBarN = [allTimeSigs indexOfObjectPassingTest:checkBarNumberBlock];
        // ...
    }
}

More recently, I’ve improved the drawing of the segments on the canvas (and actually introduced the concept of segments, which at first I had discarded, figuring that just using tracks as event containers would be enough; but since GarageBand apparently has those too, I changed my mind).

This led me to learn about Core Animation, which is a pretty darn cool framework. I’m a bit annoyed by the C-based API (mostly out of principle), but given the data structures in play, it’s good enough. What is cool about Core Anim is the wealth of effects readily available, and that every change in a CALayer (say, a resize) is, well, animated.

When I moved to OS X one of the things I liked was that UI changes never happen instantly, there’s always a quick animation smoothing the change from one state to the other. I found this to be much easier on the eye, because you actually perceive the change and can follow what happens, rather than having to find your way in the UI again. I always wondered how this was done, if all apps had intricate code taking care of those changes. Well, turns out it’s automagically handled by Core Animation.

For instance, if you have a CALayer with a bounds rect of width 100.0 (all Core Anim coordinates are floats – another huge bonus), and want to set it to 200.0, simply writing

rectLayer.bounds = CGRectMake (0.0, 0.0, 200.0, rectHeight);

will result in a rectangle smoothly expanding from 100.0 to 200.0. This is the default behavior, but it can be altered any way you want (faster, slower, grouped with other anims, etc.). Likewise, adding a layer will not flash it instantly into view, but will fade it in.

So I tuned this for edenx. In this screencast I create several segments and resize them:

Finally, I’ve looked into how to display notes. At first I anticipated that I’d have to “manually” draw the notes through a bunch of Core Graphics commands. Then I remembered that Rosegarden was using LilyPond’s note font. Things became much more interesting when I found out about CTFontCreatePathForGlyph(): take a font glyph, get a CGPath, which you can draw, transform, etc. to your heart’s content. That offered me a quick solution to the note drawing problem (although there is still some fine-tuning to do).

Here’s a short screencast:

So, to get back to edenx’s dev status: I’ve just reached the point where I’m about to dive into layout code. Looking at Rosegarden’s, it won’t be easy. I think I’ll write my own, very basic version instead. Then I’ll see about adding/removing notes, after which I may have something engaging enough for volunteers to join in.

One more thing I wanted to talk about. In Rosegarden, we were quickly confronted with the problem of letting some part of the code know that the data has changed. For instance, if you have two editors opened on the same segment and make a change in one of them, you need to tell the other editor about it. Or you’re recording and you want the notes to appear immediately in the editor. This led us to write a bunch of code following, more or less, the Observer pattern, or to use Qt signals/slots in some cases. But we never got around to making something generic that would be used throughout the code. So I was happy to find that Cocoa has a standard solution for this problem: notifications. Yet more code I don’t have to worry about, which is a good thing given how little time I can spend on this.
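For readers who haven’t hand-rolled this before, here’s a minimal sketch of the kind of notification hub we kept reimplementing piecemeal, in the spirit of Cocoa’s NSNotificationCenter (hypothetical names and API, not Cocoa’s and not the actual Rosegarden code): observers register a callback under a notification name, and posting that name invokes every registered callback.

```cpp
#include <functional>
#include <string>
#include <unordered_map>
#include <vector>

// A bare-bones notification center: callbacks keyed by notification name.
// Hypothetical sketch of the Observer pattern, invented for this example.
class NotificationCenter {
public:
    using Callback = std::function<void(const std::string&)>;

    // Register a callback for a named notification.
    void addObserver(const std::string& name, Callback cb) {
        observers_[name].push_back(std::move(cb));
    }

    // Invoke every callback registered under that name.
    void post(const std::string& name, const std::string& payload) {
        for (auto& cb : observers_[name])
            cb(payload);
    }

private:
    std::unordered_map<std::string, std::vector<Callback>> observers_;
};
```

With something like this, one editor posts a “segment changed” notification and every other open editor redraws, without either knowing about the other; that decoupling is exactly what Cocoa’s notifications give you for free.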

Update on iPhone vs. Android

So where are we now? Android has taken over in terms of number of handsets running it. That was rather predictable.

The consensus is that it’s “Mac OS vs. Windows all over again”. That’s also hard to escape: a “luxury” product being offset by a cheap, not-quite-as-good but “good enough” alternative surely rings a bell. But it’s a bit more complicated.

For one thing, the original Mac never had a market share comparable to the iPhone’s. And, despite what old Apple fanboys would say, Windows quickly became better than Mac OS, if only because it had more useful programs (boring, but useful: Lotus 1-2-3, Word, that kind of stuff). The original Mac was hardly a serious office machine, and Apple failed to turn it into one back then.

The other thing is that, in today’s PC market, the Mac is gaining market share, with Mac sales growing much faster than PC sales. Also, Apple makes quite a bit of profit on Mac sales, while most PC makers have paper-thin margins.

This clearly demonstrates that there’s money not only in being the absolute market leader, but also in being able to offer better service and quality. Customers are willing to pay for that. The pre-sales numbers of the iPhone 4 on Verizon seem to corroborate this as well.

So a likely outcome is that, even though the iPhone’s market share will diminish compared to Android as a whole, it will still be much larger than that of any single Android handset, and also much more profitable (except maybe for Google, which will rake in the cash from ads and licenses).

On a related matter, there’s the issue of “user freedom”. iOS is a walled garden, no contest. But the claim that Android would set users free is not being realised so far. Quite the contrary: all the mobile providers are taking advantage of Android’s openness to do precisely what they could no longer do with the iPhone: rebrand it. They have regained control over software updates and the services being provided. When Apple releases a new version of iOS, every iPhone upgrades in a matter of days (except for that big fuck-up that iOS 4 was on the 3G model), no matter the carrier. When Google releases a new version of Android… not much happens. There’s no telling when your handset will actually run it, because you don’t know when the handset maker or the carrier will decide to push the update, and they need time to port their own added layer too.

There may be pressure from users against that, but I doubt it will change anything. Only geeks will have a problem with this situation; regular users won’t care or even know about it.

And then there are the new Nokia/Microsoft partnership and HP’s WebOS. It’s tempting to dismiss them as too little, too late, but who knows…