Of Jobs, rms and Ritchie

The last two weeks were a complete train wreck for people in IT. First, Steve Jobs died on October 5. I thought I should write something to “enlighten” some people saying things like “Oh, he made some gadgets, so what?” But while I was waiting for the dust to settle, rms came out with a statement that inflamed a lot of people, including some asking “What has this Stallman ever written, after all?” Le sigh. And, again while waiting for the dust to settle, another loss: Dennis Ritchie died a week after Jobs. And the press barely noticed.

But let’s close those points:

First, we have Steve Jobs. Yeah, he was a capitalist who commanded one huge company that cared a lot about creating devices that were under its total control.

But, at the same time, he was the helmsman of a company that disrupted everything other companies were doing. In IT, we have the expression “disruptive technology”: things created to shake the market completely. Linux is one such thing. And Apple products are another. Tablets changed and got accepted when Apple launched the iPad, even though the idea of such devices predates it by a long time. Smartphones weren’t really smart till Apple launched the iPhone; MP3 players only became a mass-market thing after the iPod; and suddenly we have a whole new wave of ultra-thin notebooks coming around after the MacBook Air.

Apple did hold control of everything they did, yes. But you can’t deny how disruptive the company was for the IT market.

But there is one thing that tops it all: HTML5. When HTML5 was just coming out of draft, Jobs was the voice saying “Kill Flash, ’cause HTML5 is the future”. Sure, he had his motives (the fact that his company held a stake in the video standard was one of them), but HTML5 became a big thing when Apple started pushing it. Or would you prefer Adobe’s position of, for example, saying “Linux is hard, so no acceleration there” over Mozilla’s position of “We recognize there is a problem with accelerated WebGL/Canvas on Linux, but we are willing to work with developers to improve this situation”?

And then Richard Stallman, also known as “rms” (yes, lower case), came out with some… erm… “rude” words about Steve Jobs. And then people came back with “And what has this Stallman ever done? Nothing.”

Well, dear sirs, Stallman is the helmsman of the Free Software Foundation and the creator of the GNU project. The very first thing rms created was the Emacs editor, which influenced the readline library, which influenced, for example, the OS X keyboard shortcuts. Oh, and his second project was a compiler called “gcc”, which was the only compiler in Xcode till version 3 (version 4 also uses LLVM, but that’s a completely different beast). And what’s Xcode? It’s the official application for creating iOS applications. Oh yeah, ladies and gentlemen, part of the success of Apple can be attributed to the guy who dissed Jobs days after his death.

Not only that, but gcc was the official compiler for NeXTSTEP, the operating system Jobs started after being fired from Apple. And hey, guess what? NeXTSTEP is the reason Apple bought NeXT and, as Jobs himself said, “was the base of the renaissance of Apple”. Would NeXTSTEP have been such an interesting operating system for Apple if NeXT had had to take the time to design its own compiler?

And, after you brush off the politics of rms’s message… can you point to anything wrong in it? Isn’t Apple the company that pushed a completely controlled environment? Isn’t that the complete opposite of the vision rms has for software?

And then, on October 12th, Dennis Ritchie died. Ritchie was the co-creator of the Unix operating system and the C language.

“Whateves” is what those attacking rms and saying “meh” to Jobs are probably thinking now. Unix is the power behind OS X today (with the Mach kernel), and its main development language is Objective-C, which is C with some added features (the Wikipedia page describes it as “adding Smalltalk-style messaging to the C programming language”).

Not only that, but rms’s idea with the GNU project was to create a completely open source Unix — which was later achieved with the help of the Linux kernel — including a C compiler (the gcc I mentioned before).

If Apple got far, it’s because it stood on the shoulders of giants like rms. If rms managed to fulfill his dream of an open source operating system, it’s because he stood on the shoulders of giants like Dennis Ritchie.

The multiple faces of nothing

[… or “C, variants and the NULL”]

In C, you have a way to represent nothing: NULL (all caps). NULL points to nowhere, and it’s usually defined as 0 cast to a pointer (“((void *)0)”). Why would someone use it? Well, if you have a list and some of its elements aren’t valid, you make them NULL. Since NULL is not a valid pointer, your application will (usually) crash if you try to access it. The whole point of NULL is to provide a way to represent the nothing. There is also a “nothing” type, “void”: you can’t declare a variable of type void, but you can declare a pointer to it. Since (on most platforms) all data pointers have the same size, a “void pointer” is, basically, a pointer to anything.

Also, C has the idea of “nul-terminated strings” (yes, with just one “l”). The “nul” character is written “\0”, which, in practical terms, is a space of memory the size of a “char” with the value 0 in it.

When you go down to the very bits, NULL and nul look almost the same — both are just zeros — differing only in their size (and type).

C++ was built on top of C, but it can’t define NULL the C way: a “void *” doesn’t implicitly convert to other pointer types in C++, so NULL there is just the integer 0. That works because a literal 0 used in a pointer context is converted to a null pointer, but it’s fragile on CPUs where “int”s and pointers have different sizes (pointers are usually as big as a “long int”, or even bigger) — which is why C++11 later introduced a dedicated pointer literal, “nullptr”.

Objective-C is a variant of C that adds support for objects in a different way, and the biggest “user” of Objective-C is Apple. Apple’s version of Objective-C provides some basic types, like lists. But, because you can’t leave an empty spot in a list (which I think is similar to the way we deal with nul-terminated strings), they created an NSNull object: a valid object that represents the null (which, by the way, is called “nil” in Objective-C). It’s not an invalid memory address — it points to a real object. NSNull provides just one method, “null”, which returns the shared NSNull singleton (are you confused already?)

Now, the fun part: most list operations (dictionaries, actually, but the process is almost the same) return nil when you try to access an object that doesn’t exist. But remember that the only way to leave an empty spot in a list is to add an NSNull object. So, to be really sure that something is not there, you need to check both that the result is not nil and that it is not “[NSNull null]”.

That’s too much stuff for nothing…