Posts for Sunday, February 14, 2010

package managers

i’ve been into package managers for some time, and i always wondered why developers have to define dependencies manually instead of being supported by a program using a syscall interface to monitor all open() calls, combined with something like ldd on the built binary.

the idea is pretty simple:

  1. first try to compile the program you want to package. if everything compiles, then you probably have all required libraries installed on the system. (i assume PATH and LD_LIBRARY_PATH weren’t altered.)
  2. next create a package, an ebuild for instance. the ‘configure’ and ‘make’ steps can be called by ‘emerge’, and emerge could then monitor everything, as
  3. at build time all OPEN syscalls are monitored, and every single file which is opened (and which in theory belongs to a different installed ‘known’ package) is checked for its owner package. this yields the group of packages the ‘build step’ depends on, and
  4. finally, after the source has compiled successfully, the binary is processed by ‘ldd’, whose output usually looks like:
    # ldd `which htop`
        => (0x00007fffce0a7000)
        => /lib/ (0x00007f305f907000)
        => /lib/ (0x00007f305f684000)
        => /lib/ (0x00007f305f328000)
        => /lib/ (0x00007f305f124000)
        /lib64/ (0x00007f305fb64000)
    now every library the binary depends on is expected to come from a specific package (again – one which in theory belongs to a different installed ‘known’ package)
  5. finally, all these automatically detected dependencies are reported to the package maintainer, and one could also check them against the existing dependencies of the package to see if they match.
  6. another important point: if an upstream package removes support for a feature which required an external library, the now-unused dependency could be dropped automatically as well.
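as a rough sketch of steps 3 and 4: run the build under something like ‘strace -f -e trace=open’, collect the successfully opened files, and map each one to its owning package. the owner_of() lookup below is hypothetical (on gentoo it could be backed by ‘qfile’); the log format assumed here is strace’s:

```python
import re

# matches strace lines like:
#   open("/usr/include/ncurses.h", O_RDONLY) = 3
#   openat(AT_FDCWD, "/usr/lib/libm.so", O_RDONLY|O_CLOEXEC) = 4
OPEN_RE = re.compile(r'open(?:at)?\([^"]*"([^"]+)"[^)]*\)\s*=\s*(-?\d+)')

def opened_files(strace_lines):
    """collect every path that was successfully opened during the build."""
    paths = set()
    for line in strace_lines:
        m = OPEN_RE.search(line)
        if m and int(m.group(2)) >= 0:   # skip failed opens (= -1 ENOENT ...)
            paths.add(m.group(1))
    return paths

def build_deps(strace_lines, owner_of):
    """map opened files to their owning packages. owner_of is a stand-in
    for a real file-to-package query such as gentoo's 'qfile'."""
    deps = set()
    for path in opened_files(strace_lines):
        pkg = owner_of(path)
        if pkg is not None:
            deps.add(pkg)
    return deps
```

the ldd step works the same way: parse the library paths out of the ldd output and feed them through the same owner_of() lookup.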

i did not test whether this would actually work, but i can’t think of any reason why it shouldn’t.

i’ve had some other ideas as well, if you are curious you can read the article in my wiki [1]. if you want to contribute please don’t use the comment field, just write me an email or ask me for a wiki account and you can change it directly.

my goal is to evaluate a mixture of the concept of ebuilds – everything built from source – with binary packages, combined with a p2p network for both hosting ‘already compiled packages’ and acting as a distributed compile farm. currently the article is a mixture of german and english but i might fix that soon. another important goal is to merge package management at some yet unknown point, to remove redundant work and to have a common package manager for all linux distributions as well as other platforms such as mac os x and windows.

spoken in analogies: such a cross-platform package manager would be to package management what ‘cmake’ is to cross-platform build system generation, where it was (or currently is) replacing ‘autotools’.



Review: Parsing Techniques: A Practical Guide

For Christmas this year, I received a shiny hardback copy of Parsing Techniques: A Practical Guide by Grune and Jacobs. It's a thrilling book, if you want to learn parsing, which I do.

Where most books proceed in a sort of linear fashion, this book teaches parsing in layers. First you learn what a grammar is. Then you learn what it means to parse: what's a parse tree? What's bottom-up vs. top-down? What's a leftmost vs. rightmost derivation?

Next you get some general ideas and methods for parsing, e.g. CYK and Unger, and then you dive into the implementations of parsers (in pseudocode and in C) in great detail. This is about as far as I've gotten, before having to go back and figure out what the heck I just read. But it's an interesting progression. Reading the book, I feel like I'm constantly revisiting things I learned a few chapters ago, but this time in more detail. The book kind of does a breadth-first traversal of the world of parsing.

Be warned however: this book is not easy reading. It's dense, heavy on the info, light on the entertainment. Unless you really get a kick out of parsing, this will probably put you to sleep if taken in large doses. But it is a trove of information, and I couldn't put the book down during certain chapters.

In fact there's so much information in this book that it's almost depressing. The bibliography alone takes up 1/4 of the book, and lists 1,500(!) authors. It'd take me a week to read the bibliography, and probably many years to read every book listed there. Parsing could easily consume a lifetime of study, and I'm saddened that I'm probably never going to find the time to master all there is to know. But such is life.

If I had one quibble with this book, it'd be the same quibble I have with most math papers. The notation is horrible. Say what you will about programmers, most of us know that code is written for humans, not for machines, and we give our variables descriptive names. In math it's all single-letter variable names.

When the authors of this book run out of single letters, they use letters with bars over them, or bold letters vs. normal typeface letters, or they do things like this:

...whenever a non-terminal A is entered into entry R_{i,l} of the recognition table because there is a rule A -> BC and B is in R_{i,k} and C is in R_{i+k,l-k}, the rule A_{i,l} -> B_{i,k} C_{m,n} is added to the parse forest grammar, where m = i + k and n = l - k.

This is the first paragraph of a section; those variables are not mentioned before this sentence. This is certainly not a style of writing that I'm used to reading, and it took me a good dozen tries to understand. (Using lowercase i's and l's right next to each other should be prohibited by law.)
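To make that R_{i,l} table concrete: here is a minimal CYK recognizer for a grammar in Chomsky normal form (my own sketch in Python, not code from the book), where R[i][l] is the set of nonterminals deriving the length-l substring starting at position i:

```python
def cyk_recognize(grammar, terminals, sentence, start="S"):
    """grammar: binary CNF rules, e.g. {"S": [("A", "B")]};
    terminals: unit rules, e.g. {"A": {"a"}};
    returns True if `start` derives the sentence."""
    n = len(sentence)
    if n == 0:
        return False
    # R[i][l]: nonterminals that derive sentence[i:i+l]
    R = [[set() for _ in range(n + 1)] for _ in range(n)]
    for i, tok in enumerate(sentence):            # substrings of length 1
        for A, toks in terminals.items():
            if tok in toks:
                R[i][1].add(A)
    for l in range(2, n + 1):                     # longer substrings
        for i in range(n - l + 1):
            for k in range(1, l):                 # split point
                for A, rhss in grammar.items():
                    for B, C in rhss:
                        # A -> B C applies when B covers (i, k)
                        # and C covers (i+k, l-k)
                        if B in R[i][k] and C in R[i + k][l - k]:
                            R[i][l].add(A)
    return start in R[0][n]
```

Building the parse forest the quote describes amounts to recording, at the moment of that `R[i][l].add(A)`, the rule A_{i,l} -> B_{i,k} C_{i+k,l-k} instead of only the membership fact.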

In any case, this book is good. One of my favorite tools has always been Perl-style regular expressions, and I feel like this book has expanded my understanding of how they work. Learning to write a recognizer, learning how things are implemented under the hood, you couldn't ask for a more interesting topic. I can't wait to try writing a toy parser generator or regex recognizer in Clojure once I've solidified my understanding of some of these concepts.

UFOs, or the Joy of Skepticism (part one of two)

One topic I don't write about enough is skepticism and science. It's something I find interesting (in fact something I try to live my life by) but up to now I've never really come up with anything I thought I could write that would be worth reading.

Today I thought of something, so I'd like to remedy this situation. Here is a fun story (part one of two) about strange experiences I've had while flying. It may involve alien spacecraft!

Space aliens!

In this incident, I was on a normal commercial flight somewhere north of Washington state. I like to have a window seat and sit and stare out the window when I fly. Flying is an amazing experience that has never quite worn off for me, and I like to enjoy the sights.

On this flight, as I stared out the window, I saw some kind of orange sphere go whizzing backwards past the window. The plane was well above the cloud line at this point, and the "orb" was between the plane and the clouds, so I knew it wasn't on the ground. I thought to myself that this was very strange, but maybe my eyes were playing tricks. I was looking carefully out the window at this point, when I saw a second, similar orange sphere go flying past (again in the reverse direction of the plane).

How easy, how tempting it would be to say "UFO! Aliens! Experimental government aircraft! Ghosts! Demons! Sasquatch!" Maybe not sasquatch, but still. This thought immediately occurred to me as I sat on that plane. (More on this later.) This is the kind of experience wacky UFO beliefs are made of.

Or maybe not

I'm happy to say this line of thought didn't last long in my mind. The first question I must ask myself is how genuine my experience actually was. What did I really see?

  1. I have no idea how big the orange globules were. It's extremely difficult to judge the size of something in the air if there's nothing close to it for reference. Scroll down a bit and look at the picture of two planes on this site. I don't know if the photo is genuine or photoshopped (yay skepticism) but I don't have much reason to doubt, and you can find many similar photos all over the internet.

  2. I likewise have no idea how fast these things were going. To me they appeared to be going backward, but so did everything else, because the plane was going forward at many hundreds of miles per hour. Maybe these things were stationary and the forward motion of the plane made them look like they were going backward. Maybe they were even going forward, but more slowly than the plane; they would still appear to be going backward to me.

  3. More importantly, did I see something that really existed? It's very possible it was a trick of light. Maybe a reflection in my window. Maybe a play of sunlight on the clouds. Interestingly, no one else on the plane seemed to see anything. I didn't ask around, but I didn't hear anyone yell "Oh wow, look!" This leads me to suspect that maybe I was just seeing things that weren't there.

  4. If I did see something real, did I see it accurately? At the time this happened, I was very excited about the plane trip I was taking. It's not hard to imagine that I was so excited as to be distracted. I was also pretty hungry (I never eat when I fly, as a rule); my brain may not have been at full operating capacity.

  5. I also must consider that my eyesight is terrible. (I asked my eye doctor what my vision was once, and he laughed at me and said "blind".) Maybe my coke-bottle glasses caught a reflection. Maybe my sucky eyes see spots sometimes.

  6. Perhaps most importantly, maybe none of what I remembered ever even happened. This was years ago. I am telling this story from memory. Is my memory real? Maybe I did see something, but I'm misremembering important details? Maybe there was only one sphere, not two? Maybe they weren't orange? Maybe I have all kinds of details wrong. Maybe I dozed off for a minute and I'm remembering a dream. I have had dreams before that I later remembered as "real" but that the people involved tell me never happened.

The human mind is a highly fallible piece of hardware and memory is a lossy storage system. A lot of people greatly overestimate how accurate their eyes and their memories are. Eyewitness testimony and anecdote are the least valuable form of evidence, for good reason. The entirety of the scientific method, testability and repeatability of testing, is intended in one sense to make up for the deficiencies of a single human brain. Lots of brains testing a theory lots of times gets us close to answers, but one experience in one brain, even my own, is highly suspect.

But what was it?

Suppose I really did see something, and I really am remembering properly. I have no idea what these things could've been. But I can think of a lot of things more likely than space aliens. "Balloons" is the obvious first guess.

In any case, I'm perfectly comfortable settling on "I don't know". I don't have a need to know. Bad answers are worse than not having answers.

Who cares about this?

This story is important to me partly because of my childhood. As a young child I had a lot of superstitious beliefs, and I was plagued with fear of a lot of silly things. One of those things was UFOs. I slept with my window closed so I couldn't see the night sky, and I never looked up when I went outside at night. I was terrified I'd see something scary in the sky. I had nightmares for years.

I cured myself of this fear by educating myself. I learned that the chance of there being aliens flying around earth is very close to zero. The physics of space travel nearly precludes any such thing from happening. The logic of a bunch of super-advanced beings coming all this way just to poke people up the bums with pointy rods and leave doesn't really make any sense. There is no actual evidence that UFOs are anything other than natural phenomena, urban myth and the misunderstandings of a gullible public.

One day I read Sagan's The Demon-Haunted World: Science as a Candle in the Dark, which is an awesome book for many reasons, one reason being the serious treatment of "space aliens" by a scientist who apparently would really love them to exist on Earth, but is forced by lack of evidence to say they probably don't.

Sagan makes some excellent points in the book. He mentions for example that many people claiming to be abducted by aliens would contact him and offer to ask the aliens questions for him. Sagan would ask some people to ask the aliens "Should we be kind to other humans?" (and the answer was always "Yes"), and others he would ask to solve unsolved (at the time) mathematical problems like the Poincaré conjecture. It stands to reason that any race of beings advanced enough for interstellar travel would have answers to such mundane mathematical problems. Sadly Sagan never got an answer to the latter question.

In any case, via countless similar little data points collected in my 29 years of life, via a lot of thought and reading the thoughts of others, today I'm no longer afraid of the dark. Science was certainly a candle for me, and continues to be one. As I sat on the plane that day, I enjoyed the puzzle of thinking about what I saw (or didn't see?) and what it could be. I enjoy living in a world without demons.

Part two will follow. It's about rainbows.

Posts for Saturday, February 13, 2010

FCron + Postfix

In the late hours I got FCron and Postfix to work together (problem: stupidity on my part) so I can finally have a nice cron system that reports to my normal (non-root) user's inbox what the system crontab is doing.
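For anyone after the same setup: fcron can mail a job's output to a non-root user via its mailto option, as long as a working MTA (Postfix here) accepts local mail. A minimal system fcrontab might look like this (the user name and script path are placeholders of my own):

```
# mail all output of the jobs below to a regular user via the local MTA
!mailto(myuser)

# nightly maintenance at 03:15; stdout/stderr lands in myuser's inbox
15 3 * * * /usr/local/bin/nightly-maintenance.sh
```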

Should make my life a lot less hectic now that I've mastered how to make tedious repetitive tasks a part of what my laptop does for me and not what I should do for it.

...more on FCron to come. Need sleep. Badly...

hook out >> Vancouver over, just bed now...

Exaile: The best Amarok since Amarok 1.4

Like a sad dumb dog who still hopefully visits the grave of his dear, departed master, every once in a while I try Amarok 2 again. Unfortunately, there has been no improvement in usability since the last dozen times I checked.

But have you seen Exaile lately? This is what the bleeding edge version looks like:


It's pretty nice. It's about as close as you can get to a stable, fully-functional Amarok 1.4-ish player nowadays.

Aside from looking good, Exaile is good at handling ID3 tags (a few Japanese tags that Amarok 2 displays as ????????, Exaile displays properly) and it's pretty fast to rescan my collection nowadays, which is nice. It does fairly sane grouping of multi-artist albums under "Various artists". It supports moodbar and song lyrics and cover art fetching and such, if that's the kind of thing you enjoy. It even splits the library display by the first letter of the artist names, just like Amarok 1.4 did, which is awesome.

I did have some problems installing the dependencies (python bindings for webkit?) for some of the plugins, but oh well. I figured it out.

Today I went so far as to install gnome-settings-daemon and gnome-control-center just so Exaile wouldn't look like crap. I use KDE4, and I haven't touched Gnome or any Gnome libs in a few years, so this is saying something.

Mark Kretschmann, an Amarok dev, recently wrote an article about the paradox of choice, in which he said (probably correctly) that being presented with too many options and too many choices ends up paralyzing people and making them miserable.

Sorry, but the irony was overwhelming...

Exaile Explained

Amarok2 Explained

I really do believe there's a good program buried somewhere in that mess of controls, desperately wanting to be free.

2nd meeting of FSFE (Fellowship group) Slovenia

Yesterday we held our second meeting of (soon to be) FSFE Slovenia / Fellowship group Slovenia — and it was quite a success.

First, some stats: at the first meeting there were 9 of us, at the second there were 14, and the three-day-old mailing list already has 20 members!

Apart from figuring out that we need an FSFE group and are willing to run it, we already made some sound plans on what we plan to do this year, and I'm very happy that the group includes so many experienced people from all sorts of backgrounds.

It's quite late and I've spent too much time behind the keyboard today already, so I'll just keep it short — bullet-time style!

FSFE Slovenia's plan for 2010:

  • found FSFE Slovenia — start as a Fellowship group, and in a few months' time develop an FSFE team from that;
  • Document Freedom Day — we've yet to decide if we want to make it small or already huge in the first year;
  • translate FSFE's site — put together a team to translate it;
  • FOSS in schools — promote GNU/Linux and other FOSS to pupils directly and also by collaborating with the profs. There are some pretty good and detailed ideas on how to do that already;
  • Windows/Apple-tax — devise a system (probably via sample e-mails etc.) to make Windows- and Apple-tax refunds easy, bring it to the masses and possibly even try to work with the competition and consumer protection offices to solve the problem on a big scale;
  • si2010 — we plan to take apart the national implementation of the EIF, see how much of this strategy has really been implemented, and write a report on it. We know this could be the motherlode and should therefore be taken under serious scrutiny; it would be a terrible mistake to let such an opportunity to leave a mark pass us by!

Personally, I think the plan is just right — ambitious enough to change something, yet not too big, so it stays manageable.

hook out >> maybe Vancouver, most definitely bed...

Posts for Friday, February 12, 2010

Technology ain't everything

Let's discuss can openers.

Growing up, my parents would often invest in electric can openers. These things never worked. Some of them sat robot-like on top of the can and walked themselves around the top while chopping the metal. Some of them were mounted on the wall, and you somehow got the can to hang in a harness while the device spun the can around. It takes a PhD and double-jointedness to get the can set up in these devices properly. And then you push a button, a lot of noise happens, and usually the can ends up half-open, half-bent to the point where it's un-openable short of dynamite.

When I open a can, I use one of these. You jam the metal bit into the can and turn the crank; the can spins in a circle and 10 seconds later, off comes the razor-sharp top. The one I own was probably manufactured in the 1980s and it's still sharp enough to open a can with minimal effort.

Is it really that hard to turn a handle for 10 seconds? Do we really need computer-controlled robotic can-opening devices?

Consider books. I still buy and read all of my books in the form of compressed wood pulp. There are newfangled e-book readers, but I don't want one. Why? Because the only places I read are 1) in the bathtub, and 2) lying in bed. Taking a computer into the bathtub is generally not a good idea, and holding a Kindle above my head for 3 hours is awkward compared to laying a (3-D) book on the bed beside me with one page bent up so I can read it. (Note: I have dropped a book in the bathtub on more than one occasion, and contrary to my expectations, once it dried it was still perfectly readable, no ink runnage at all.)

I know some day, maybe soon, paper books are going to be gone and we're all going to read books from digital devices. But I like my books. I know there are benefits to having electronic books instead of paper ones. But even though they're a waste of space, even though they can have pages ripped out, even though they can burn up or smudge or age and become brittle, I like paper books better.

Mostly I like paper books because they're simple, analog devices. I don't have to mess with any kind of user interface. Books don't have battery life. Books don't have copy protection. Books don't require me to sign up for user accounts at some website and worry about having an internet connection. I can flip through the pages with my fingers. I can tell how many pages are left by the thickness of the pages that are left. I have actually never comfortably finished a long e-book, not even books about programming, where you'd think the ability to copy/paste code would be a boon. I'll pay good money for a paper copy of a book even if the electronic version is free.

This is probably the most banal thing I've ever written about. But there is such a thing as too much technology. I say this as a person who spends all day trying to get people to use databases instead of keeping drawers full of paper records. Technology for the sake of technology is a waste of time.


Not working for Facebook

In November last year, I was contacted by Facebook HR. They found my background interesting and thought I might be a good fit for an "application operations engineer" position in Palo Alto, California (basically the link between their infrastructure engineering and operations/support). I did a few technical interviews over the phone with other app ops and engineers from CA (about the Linux kernel, low-level userspace, MySQL, memcached, networking, programming, scalability, etc.) and solved one of their optimisation puzzles (I picked usr bin crash; actually I wanted to do something with Thrift, but I couldn't get it to compile). The technical interviews went well, but then I had another interview which was about handling support. As I have no experience in setting up support frameworks and procedures to hand off to separate support teams, I was/am not good enough for this position.

Then they suggested a role as site reliability engineer for the office in Dublin, which is more about troubleshooting, monitoring and systems management/automation. So I did some more interviews with SREs and engineers from the London office and from Palo Alto: similar subjects as before, but with more of an operations/support touch. These also went well, except the last one, which was more about things less related to high performance and scalability, such as NFS, PAM and LDAP. I think I missed too many questions on that last interview. I could come up with some excuses, such as me being tired (it was the evening before our Kangaroot showcase event, and the call was late - Facebook HR messed up a timezone conversion), but the fact of the matter is: I have little experience with such "office ;-)" stuff.

So after 8 interviews over the phone (each one about 40-60 minutes), spanning about 2 months, they let me know they would not go forward with me. That was late December; I asked for some feedback but haven't heard from them since.

Bottom line: it sounded quite nice, but I'm pretty happy with my current life in Belgium.

"Every person needs a project"

A few weeks ago after lunch, my coworkers and I were sitting in the cafeteria of our university having a post-lunchy coffee. We were basically all just digesting, and I don't actually remember what we were talking about, but I can remember one quote by my professor: "Every person needs a project."

For a few weeks this has been at the back of my head now, because it kinda fit some feeling of unrest I had been having lately. Not the kind of unrest that makes you run up walls or buy a motorbike or get your ears pierced, but the kind that makes you wanna create. Since I started working out at a local gym (just to stay in shape and to avoid turning into the stereotypical computer scientist) I have found myself with a lot of extra energy that I couldn't get rid of.

"Every person needs a project." I do have projects. I have a relationship. I have this blog. I now have a podcast. I have a dissertation to work on. I have a job. But it didn't seem to be enough, there always seemed to be more.

We live in times of great freedom. There's so much to learn and so much to know and that can, as I wrote about a year ago, become a burden, can become more of a curse than a blessing.

A similar thing is true for being creative. We used to have so many excuses for not creating things: The tools were too expensive. Or you needed other expensive resources. You could spend time writing a story but you wouldn't ever get it published. Nowadays a blog is free and after a while you will find people reading your stuff. You can create images, movies and whatever you can think of and have thousands of people watching it. And again all this freedom can make you freeze.

"Every person needs a project." And I agree. Every person does, there needs to be something that you put your heart into, some tiny, microscopic space within the universe that you change, that you form. That little piece of everything that you focus on to form the world. But it's not all.

Every person also needs a vision. I recently realized that the only people I pity more than those without any project are those without a vision. The German politician Helmut Schmidt, who was German Chancellor for a while, once said: "If you have visions you should go and see a psychiatrist." And that idea has been floating around for a long time. For years we've been dissing people who said they were dreaming of something, telling them to "grow up" or "get real".

Because we, as a society, hate visions. Because visions change the world. Because they challenge the status quo. Because it might mean that we might lose the status or position we have. That we might have to adapt.

And where has it led us? To a world of two extremes: "robots" and "losers". The "losers" are those not willing or not capable of playing the "make yourself a stupid machine so you are predictable and can be properly used for work" game; society either laughs at them or treats them even worse, threatening them with forced labor and other dehumanizing concepts. The "winners", or "robots", made themselves uncreative so they wouldn't disturb the system.

People aren't machines. People are not just "functional units"; they are more than a set of skills or a job description. People are creative and brilliant and a source of unlimited entropy (and that is awesome!). People need their own projects and visions that they follow. There's more than one world; there should be unlimited ones: the one we can call our "shared reality" (however that thing is actually built) and all the worlds that the different people around you want the "real world" to be.

Find a vision. Find a project. In fact, find many. And if you already have something, talk to me; we'll set up a date and talk about it. Let's make the world wonderful.

WIPUP February release date revealed. Ooooh.

Now that all the hubbub over the KDE SC 4.4 release and KDE website redesign is over it’s back to regular blog posts and other pet projects. This, some of you would’ve realised by now, includes WIPUP – which I’ve really tried to turn into an incremental release project. So yes, I’m announcing the February release date: 21.02.10.

Read the full news here.

Oh, and happy Chinese New Year!

Related posts:

  1. After the WIPUP release, the stats are in.
  2. Countdown to KDE 4.4 and the new KDE website
  3. WIPUP 14.01.10 released!


Setting up Cacti SNMP Monitoring on a Windows 2003 Server

So this week I've been taking a break from planning our Exchange 2010 migration and have been playing around with Cacti. Currently we have very little data on things like network and server usage, short of a couple of key websites being monitored by an external site to track uptime, but absolutely nothing to tell us if servers are being overloaded or our internet connection is being saturated.

For those who haven't heard of Cacti before, it's an open-source PHP-based frontend that can be used to graph pretty much any data source you can feed it, with the most popular source being SNMP, which pretty much any business-class network-enabled bit of electronics supports these days. Even if you only have quite a small network like ours, it can be very useful to actually visualise what's going on, and it's a lot easier to show your boss a graph of how your internet connection is maxed out and needs replacing/upgrading than to make the case any other way!

Rather than re-write an existing guide, the easiest and quickest way to get Cacti running is to follow this guide written by a very helpful Cacti user over on the Cacti Forums. Below are a few additional tips that should help you avoid some of the problems I ran into when setting up Cacti on a Windows 2003 server.

  1. Don't use PHP 5.3; stick with 5.2, as 5.3 doesn't yet include the SNMP module and so won't work with Cacti.
  2. Do install Cygwin. It's only an optional step, but for the little extra work needed it will make patching Cacti with updates a lot easier (bug fixes only get released as .patch files, so having Cygwin installed allows you to run the patch command just like on Linux).
  3. Don't install Apache unless you plan on using it in place of IIS; if you already host sites using IIS then there is no need to install Apache at all.
  4. If you are using IIS, make sure you install FastCGI before you install PHP. FastCGI is the recommended way to run PHP as of 5.2, and it's as simple as running the installer, so there is no reason not to use it.
  5. When you reach the IIS instructions, if you are using FastCGI then skip to step 8. Don't miss this out, as step 8 onwards talks you through setting the file permissions correctly, and Cacti will not work properly without them!
  6. When setting the permissions, I always had to use the Advanced > Find option to select the IUSR_ user, as it could not be found when typing in the name as normal.
  7. If you are polling other Windows-based machines, make sure you increase the SNMP timeout value when adding the device; for some reason Windows takes longer to respond to SNMP queries. Setting it to 5000 works fine for me.
  8. By default SNMP isn't enabled in Windows; see this Knowledgebase article for how to enable it.
  9. Finally, if SNMP is enabled on a machine and it still isn't sending any SNMP data despite you being positive that it is set up OK, try removing the SNMP feature and re-installing it. I've had this happen on a couple of Windows 2003 boxes, and after reinstalling the SNMP service it started working.
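One more habit that saves time: before adding a device in Cacti, check from the command line that the box answers SNMP at all. With net-snmp installed (under Cygwin, for example) something like this works; the community string and address here are of course placeholders:

```shell
# walk the system subtree; any reply means the agent is up and the
# community string is right (replace 'public' and the address)
snmpwalk -v 2c -c public -t 5 192.168.0.10 system

# or fetch just sysDescr (numeric OID) with a longer timeout,
# mirroring the timeout advice in tip 7
snmpget -v 2c -c public -t 10 192.168.0.10 .1.3.6.1.2.1.1.1.0
```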

Posts for Thursday, February 11, 2010


New phone

Since my old phone sometimes shut down without me telling it to (or had an empty battery forcing it to shut down), I decided to get myself a new phone: the HTC Tattoo.

One of the things I missed on my Nokia N73 was the ability to sync properly with my Linux environment. The Tattoo runs Android and syncs perfectly with Google, which in turn syncs nicely with other things. Of course the downside is that it syncs with Google, but for now that is a necessary evil.

So far I am pretty impressed: mobile e-mail, Facebook, blogging etc. all work like a charm. Even Eduroam, the WiFi used at the university, works out of the box! That is not even the case on my laptop!

This message was of course written on my Tattoo.

Lucky Day...

So...  This week has been just one blessing after the other.  I mean, seriously, it's been an amazing week!  I've been recording this great band in my studio, I've been rockin' the code at my day-job, and things are just great.

So, to celebrate my happiness, and general good mood, I decided to take myself out to lunch.  I went to Del Taco and got 10 49-cent hard-shell tacos.  Oh yeah... I'm living life in the fast lane, buddy...

Well...  On my way to Del Taco, I happen to notice a certain type of car, against which I've come to hold a grudge....  Okay, okay... Not just a grudge.  I flat-out want to massacre this particular type of car, no matter who's driving it.  Yes...  You guessed it, it's a Neon SRT-4.  Don't ask why...  It's too painful a story.  Well, okay... Here's the skinny of the history.

When I had first purchased my 2007 Mustang GT, I was putting 85 octane gas in it, unknowingly crippling its potential for power.  In this state, I was on I-80, when an orange Neon SRT-4 pulled up and wanted to race.  I was positive I was gonna blow his doors off.  heh... yeah...  He beat me 2 out of 3 times, and I was left dumbfounded.  So now that I'm wiser, and have my mustang properly tuned, modified, and raging like a racehorse, I never let an SRT-4 go untouched, nor unhumiliated.

So, anyway...  Earlier today, I'm on my way to Del Taco on the freeway when I notice this Neon driving behind me, and come to find out it's a 20's-ish-looking girl driving who looks like she's ready to take on the world - with guns-a-blazin'!

So, we play cat-and-mouse a bit on the freeway (keeping the speeds reasonable, because freeway racing is just stupidly dangerous), and turns out, we both get off the same exit....  and also turns out that we get stopped at the same red stoplight, first in line.

So, with engines roaring, as soon as the green hits, we take off like bats out of Hades, and within seconds, I'm at least 2 car lengths ahead of her, as well as about 50 MPH over the 45 MPH limit...  So, I slow down, and we give each other a "thumbs-up", and then, as we're casually passing over a little overpass, she speeds up slightly to get ahead of me, and since the race is done, I see no problem, so I gracefully slow down and let her pass by.

It's about this time, I notice the cop sitting in the drive-way of a residence right off the overpass...  Ouch...

I look at my speedometer, and I'm doing about 10 over.  She must have been doing 20.  We both pass the police officer, and after I pass him, he immediately turns on his lights, and exits the driveway.

I, of course, immediately pull over...

The cop then passes me, and about this time, I just can't believe my luck.  As I turn into the parking lot of a Walgreens, I happen to catch the glare of my newfound friend in her nice shiny red Neon SRT-4, and I can't help but chuckle as her anger must have been boiling over.  Rightly so!  I mean, she was being pulled over after having been totally speed-beaten by the guy who is getting away scot-free.

I hope she didn't see me chuckle.  That would have been just rude.

Posts for Tuesday, February 9, 2010

ogre 3d

i’ve been using ogre 3d lately and here are some issues i had:

  • in gentoo there is no way to install ogre WITH the samples in one go; if one wants the samples as well, one has to install the library from source (bypassing the package manager) or hack the ebuild
  • to update ogre i had to uninstall ‘ogre’ with emerge -C ogre first and then install the new version with ‘emerge ogre‘ in a second go. if not done so, ogre would screw up the build; if i recall correctly it was a linker issue
  • autotools is so ugly, i don’t see any way to compile the samples without running ./configure from the base library directory, which means compiling the whole library as well…
  • i think that writing all the ogre documentation must have taken a lot of time – and i honor that – but i prefer how the trolltech guys do it. the trolls use an ‘in-code documentation’ syntax (probably doxygen with some markup) and no external documentation. in contrast the ogre developers use a mixture of mediawiki/html documentation/library doxygen/internal samples and external samples. if they would at least use templates consistently in the wiki it would help; currently it is very confusing to find related things.
  • most of the ogre examples don’t include any screenshot of what is actually done, this makes searching much more ‘fun’…
  • they use autotools for the library and they use autotools for the samples. i used cmake for my project instead. since i don’t know much about autotools i had problems finding out what linker and compiler switches i needed, and since my package manager removed all Makefiles after installation of the samples (or probably never built them at all) i had to do ugly things to find out how things work.
    FIX: so if you want to find that out i urge you to download the ogre release by hand (bypassing the package manager) and use ./configure; make, then have a look at the samples directory
  • the ‘package manager’ of gentoo (portage) has the latest ogre included but kubuntu didn’t, so we had to use a custom build script there by hand…
  • using gentoo’s package manager i set the use flag for double-precision, which resulted in linker errors. ‘our’ codebase worked on a different machine using a custom build of ogre (probably not using ‘double-precision’ at all, with mesa for 3d rendering), but even after i fixed the linker error it did not work on my laptop: it just showed a black screen. it took me hours and lots of library recompiles to find out why…
  • the kubuntu defaults for the ogre library are acceptable but they had only a very old version; in contrast gentoo didn’t provide good defaults (cg and devil were disabled) but had a very recent version of ogre. funny, isn’t it?
    this is how i have it working now:
    % equery u ogre
    [ Searching for packages matching ogre... ]
    [ Colour Code : set unset ]
    [ Legend : Left column  (U) - USE flags from make.conf              ]
    [        : Right column (I) - USE flags packages was installed with ]
    [ Found these USE variables for dev-games/ogre-1.6.5 ]
    U I
    + + cg               : NVIDIA toolkit plugin
    + + devil            : image loading support with DevIL
    + + doc              : Adds extra documentation (API, Javadoc, etc)
    - - double-precision : more precise calculations at the expense of speed
    + + examples         : Install examples, usually source code
    + + gtk              : Adds support for x11-libs/gtk+ (The GIMP Toolkit)
    + + threads          : Adds threads support for various packages. Usually pthreads
    you might wonder why there is an ‘examples‘ use flag, since all it does is install the example sources but not the compiled examples NOR any Makefile, so… again this autotools issue: i don’t know how to compile them then… it’s not covered in any documentation i could find either.
  • documentation on how to use custom shaders is really missing from ogre. i managed to compile my own shader (cg file) with cgc (that is the nvidia shader compiler) but the library wasn’t able to use it no matter what i tried…
    it seems that most developers use windows and directX anyway which makes the situation on linux even worse.
  • in ogre one has several cfg files which are essential to run a sample application, but one has to create these files individually for every sample. so mixing a ‘package manager ogre install’ with a ‘from-source ogre build’ makes the situation even worse. the files are:
    • ogre.cfg (can be deleted, since it can be generated at program start with a gui)
    • plugins.cfg (i hate this file since it is required but no sample includes it after build); basically this file contains paths to the shared object (.so) files which are used by the ogre core backend
    • resources.cfg (this file contains a list of used textures, shaders and other graphics related stuff); this needs to be done per application. there is a collection of Media files for the samples, but the copyright is somehow strange since parts of it may not be redistributed… so why did they include it at all?
  • ogre contains OIS for input handling but i couldn’t figure out how to get joystick support working. this might well be broken completely. so instead of using OIS for joystick handling i had to include libSDL for that. i wrote a libSDL joystick example some time ago; it can be found at [1] on my wiki. i wonder what that joystick support in OIS is all about…?
  • in general there are tutorials in the wiki about how to use ogre from an ide. there are also tutorials on how to use ogre and how to import objects from blender and such. finding these resources is very time consuming. they have the tendency to be for ‘pro users’ only and tend to be incomplete as well.
  • there is an ogre documentation which covers the basic concepts. i’ve been reading some of it and it seems to be very good. however i can’t tell if it’s complete as my current use case is quite limited.
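
since no sample ships a plugins.cfg, here is a sketch of what a minimal one can look like. note that this is an assumption on my part: the plugin folder depends on where your distribution installs the ogre plugins, and the plugin list depends on which use flags were enabled:

```
# plugins.cfg – tells the ogre core which render system and plugin
# shared objects to load and from where (adjust the path to your install)
PluginFolder=/usr/lib/OGRE

Plugin=RenderSystem_GL
Plugin=Plugin_ParticleFX
Plugin=Plugin_OctreeSceneManager
Plugin=Plugin_CgProgramManager
```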

my project

you can find the code at [2]. but i’d like to warn you since:

  • we didn’t have much time so guess how the code looks
  • has no documentation
  • might not compile
  • violates many coding standards (especially c++ class/file naming schemes)

but it might still be useful for some coders to look at, for instance the joystick integration or how i used cmake with ogre…

so why do i write this post at all?

  • i hate the fact that there are multiple distributions – all packaging the same ogre release – one more incomplete than the other…
  • the samples are not pre-built – if included at all
    i found a forum thread where developers did not see the samples as part of the ogre core distribution at all – how backward is that? examples and documentation are basically everything when one distributes a ‘library’!
  • the buildsystem of ogre and the samples is very confusing and badly documented
  • there is no qtdemo-like program included with ogre, although a summer of code project for one was assigned and completed recently; it’s referred to as a ’sample interface’ and will be included in ogre 1.7 – i love that!
  • despite the fact that there is no ’sample interface’ yet, the samples are hard to build, and no documentation about ‘how they can be built’ is included in the release

summary: i don’t like the way documentation is handled in the ogre project, and i don’t like the build system either. my main criticism, however, is that the distributions do a very bad job of packaging the library (with all the samples missing & misc quirks).

possible fix: take a look at how trolltech did it with the qt release. they have assistant for the documentation (no need to open a web browser with a distribution-specific installation of the doxygen and html documents). they have qtdemo and they build all the examples while also keeping the source around. they don’t have any external documentation as in a wiki for instance.





avatar relaunch with a brand new design!

I think I can safely assume that although the latest mockup hinted at the design that was released today, nobody really expected what actually came out. Well, the final layout is out and recorded as per protocol in the WIPUP project. It can be seen in detail here. As usual the full project timeline can be seen in my WIPUP profile page.

Well, go check out the website to see the real deal. For those interested in more details they can read the KDE Dot article about the relaunch. Pretty awesome, except that I was originally credited as "Doin Moult" (now fixed). Hopefully I should be poking my nose into more KDE www projects in the future.

Also happy to see a spike up to almost 1000 views in the past week of WIPUP updates. I know it’s unfinished and all, but I must say I quite like using the system. Perhaps in the future I should try out some other media types in my updates to see how well they fare.

Related posts:

  1. Countdown to KDE 4.4 and the new KDE website: 4 days left
  2. Countdown to KDE 4.4 and the new KDE website: 2 days left
  3. Countdown to KDE 4.4 and the new KDE website: 1 day left

KDE SC 4.4 – Steady, Incremental Improvements

I haven’t noticed any killer features in KDE SC 4.4 and I’ve been running it since Beta 1. I’ve noticed a lot of subtle improvements.  Things like app stacking and selection in the task bar seem much more responsive.  All around, plasma looks subtly better and my favorite KDE apps seem to just keep getting better.

KSysGuard is really impressive and now has the ability to connect to remote hosts for monitoring.  However, the biggest change is in the greater ecosystem.  It seems all the external apps like Amarok, K3b, and digiKam are coming to fruition.

Other than that, this is a smooth release and shows that the platform is starting to mature.  I think the Summer release distros will be able to do a good job delivering a nice desktop experience based on KDE 4.4.  I’ll end with my obligatory “try KDE 4.4 if you had previous bad KDE4 experiences”.


Related posts:

  1. KDE 4.2 beta 1 on Gentoo KDE 4.2 is set for release on January 27th.  Eager...
  2. KDE4 on Gentoo So I bit the bullet and installed KDE 4.0 on...
  3. One Small Step for QT, One Giant Leap for Free Software QT Software, under the graces of Nokia, has released the...

Posts for Monday, February 8, 2010


Countdown to KDE 4.4 and the new KDE website: 1 day left

The new KDE website redesign is due any day now (with the release of KDE SC 4.4) and when it’s released you will be able to see how ideas were amalgamated from many different mockups and some which I’ve not had the records to post. The final design is different, much more aligned with the KDE "Air" branding, and most importantly a shared effort, like what open-source is meant to be. So don’t be too shocked if what is released is completely different.

Check out the latest mockup update here.

Past mockups viewable in full on my WIPUP profile.

Related posts:

  1. Countdown to KDE 4.4 and the new KDE website: 2 days left
  2. Countdown to KDE 4.4 and the new KDE website: 4 days left
  3. Countdown to KDE 4.4 and the new KDE website

Posts for Sunday, February 7, 2010

What does all that crap actually mean?

I run open source software. A lot of it, I guess it comes close to 95% of my daily software usage. I run Firefox, Linux and a bunch of other free software things. I also support Creative Commons by making all my stuff licensed under free and simple licenses. But I am somewhat of a techy which admittedly most people ain't.

For most people the whole discussion about free software and open source and open knowledge and participatory culture is just hogwash: Why would they care about whether their software comes from some company or some guy in his/her basement? It's not like commercial software costs money (it doesn't, you just download it). What people care about is that the tools work.

This often leads to discussions about "freedom" and other fluffy, abstract words that lead absolutely nowhere due to lack of understanding, different definitions of words or general not-giving-a-shit-iness. Why is it so hard to explain the situation for the free software people and why is it so hard to understand as the average guy?

The problem is actually quite simple: For the tech person that says "but you can edit the source code!" this doesn't just mean that you can edit the source code, it means that you have regained access to the important means of production. In a world where software rules (like the virtual/online space) software is the same as a big ass factory is in the real world.

For the non-programmer, non-techy person that is sometimes hard to understand: What is the use of having source code if you cannot program anyways? That doesn't help, does it?

In fact it does. Let's try to milk this a little more till we get something clearer (and yes, I did just intentionally kill that metaphor ;-)).

Using free software is like getting back to a somewhat pre-industrialized world. No longer do you have to buy things at a big ass store, you can make things yourself. Imagine that when you need a certain tool, you can just build it yourself.

A reply could be: "But I don't want to have to make all my tools by myself!". Another one "But I don't have the skills!". And both points of view are absolutely valid, in fact, I agree with both of them. I don't have the skills to build all I need and I don't want to have to. Industrialized production has its merits! On the other hand, that is a somewhat wrong conclusion.

Having access to the source code does not mean that you have to do everything yourself. Let's try to bring the example to the real world.

Imagine a workshop. Not just one like the one you might have at home in your basement with a few basic tools to fix your bike. A bigger room with "real" machines, machines like say laser cutters and stuff like that. Machines experts use in their workplace. Of course you do not have the skills to use those machines. Maybe some but probably not all of them. But on one wall of the room is a big shelf, a library of "recipes" that allow you to build stuff that other people (that did in fact know how to deal with the machines) designed: You need a special screw so you go to the library, take out the construction manual and just walk through it. You take a piece of metal, you put it into a lathe and program it like the recipe says, a few minutes later you have your screw.

"But", I hear some people say (this is in fact just a figure of speech, I am not hearing voices ;-)), "if the screw I need isn't there, I am [pun alert] still screwed!". Not at all my friend. You take the recipe of something similar, look who wrote it, go to that person and ask him/her: "Hello, you designed this thing which is kinda like the thing I need, could you maybe design something similar to it with the following properties?". And in order to show the other person that you respect their time, you can also add "I'd pay for your time.".

That is open source. It does not mean that you have to do everything yourself. It does also not mean that you cannot just build stuff and sell it to people: Building a bicycle is a pain in the ass and if you can do it well, you can sell those products to people that just don't want to do it themselves. The point is that people can modify it, change things, build on top of the knowledge of others; there is always money in betting on people being lazy or short of time ;-).

So don't focus on open source software being about some kind of "freedom", nobody cares about freedom (look at how many devices Apple sells!). Show them that open source is basically just the old tradition of science and engineering applied to real life (well in the case of software the "virtual life" obviously). It's basically just what you do with your neighbors: Each has something to offer, you throw all different abilities into a pot and see what kind of soup comes out.

Creating multi-page PDF files with GIMP and `convert`

Occasionally I have to sign some document (old style, with a pen) and send it electronically. Sometimes those are multi-page documents. Since it is uncommon to send it back as multiple image files after scanning, and multi-page image formats are uncommon as well, I’d like to send them as PDF file. Before I discovered this method, I used to insert the scanned images into OpenOffice Writer, and then create the PDF with it. This works, but it is a bit cumbersome to tell OpenOffice Writer to maximise the images (eliminating page borders, etc.), especially when there are a lot of pages. It just doesn’t feel like a real solution.

So, here we go:


  • GIMP (I’m currently at version 2.6.8, but this will probably work with older versions as well)
  • GraphicsMagick (tested with 1.3.8) or ImageMagick (tested with …)


  1. Get the scanned pages opened as layers of one image in GIMP. If they are available as files already, you can use File / Open as Layers….
  2. Make sure that the layers are ordered in the following way: Page 1 must be the bottom layer, the last page must be the top layer. You can reorder them via the “Layers” dialogue (activate it via the Windows / Dockable Dialogues menu if you don’t see it)
  3. Save As… and choose “MNG animation” or just add “.mng” to the filename. (In case you are wondering, MNG is the animated counterpart to PNG).
    A dialogue window saying “MNG plug-in can only handle layers as animation frames” will come up – choose “Save as Animation” here and press the Export button. In the next dialogue you don’t need to make any changes to the defaults, just press the Save button.
  4. Now, open a console window and simply enter
    convert document.mng document.pdf

That’s it – you now have your PDF file ready for sending!
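
As a side note (an untested sketch with made-up file names): if the scanned pages already exist as separate image files and you don’t need GIMP for reordering, convert can take several inputs at once and write a single multi-page PDF. The command is echoed here rather than executed, since the page files are hypothetical:

```shell
# Hypothetical page files; with ImageMagick or GraphicsMagick installed,
# the real invocation would be run without the echo:
echo "convert page-1.png page-2.png page-3.png document.pdf"
```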

Update (2010-02-08):
As chithanh pointed out in comment 1, there is another convenient way to accomplish the same. It does not involve GIMP, but instead requires pdftk to concatenate PDF files. Please see comment 2 for details.


Countdown to KDE 4.4 and the new KDE website: 2 days left

Only 2 days left until the KDE SC 4.4 is released, but apparently the website design is due out on the 8th! Yes, that’s tomorrow. Today’s update shows a stage in the mockup which is natural to designers – the rejection stage. A new idea (in this case, minimalism) is chosen and we try out something new to see if we like it.

View the full update here. Full progress can be seen on my WIPUP profile.

Related posts:

  1. Countdown to KDE 4.4 and the new KDE website: 4 days left
  2. Countdown to KDE 4.4 and the new KDE website: 3 days left
  3. Countdown to KDE 4.4 and the new KDE website: 5 days left

Posts for Saturday, February 6, 2010


Countdown to KDE 4.4 and the new KDE website: 3 days left

Yep, it’s just 3 days left until KDE SC 4.4 is released and we see even more polish on yesterday’s design for the KDE website. We’ve now shaped it into a full design and we’re debating which ideas we like from this and which should be thrown away.

Click here to check it out. As usual the full progress can be seen on the WIPUP profile.

Related posts:

  1. Countdown to KDE 4.4 and the new KDE website: 2 days left
  2. Countdown to KDE 4.4 and the new KDE website: 4 days left
  3. Countdown to KDE 4.4 and the new KDE website: 1 day left

Posts for Friday, February 5, 2010


I missed this perfectly healthy rotating kitchen?

[embedded video player]

Related posts:

  1. TEDIndia: The thrilling potential of SixthSense technology by Pranav Mistry – how could I have missed this?


Countdown to KDE 4.4 and the new KDE website: 4 days left

Another day and we continue to see development on the KDE website redesign. We’re fleshing out the KDE webdesign mockup seen yesterday into a full page and it’s taking shape slowly but surely.

Check it out here.

As usual you can see the full series (3 in total now) on my WIPUP profile.

Related posts:

  1. Countdown to KDE 4.4 and the new KDE website: 2 days left
  2. Countdown to KDE 4.4 and the new KDE website
  3. Countdown to KDE 4.4 and the new KDE website: 3 days left


Syslinux from Linux!

This post tells you how to launch syslinux from a Master Boot Record (MBR).

Recently I was locked out of a customer-provided laptop with their development environment, and access to their source code repository via vpn.  I suspect their domain controller propagated an update last time I was on the vpn which has locked me out.

They're overseas and about 12 hours flight time away, so with their permission I used ntpasswd to reset the Administrator password.  The boot CD (downloadable as an iso) uses syslinux, which is fine, except that instead of wasting CD-Rs I like to use USB keys.

I copied the contents to a blank FAT32-formatted usb key, but it has no boot sector yet.  I installed grub and tried to make a grub menu file from the syslinux.cfg with these tips for converting a syslinux .cfg file to a grub .conf file.  I failed because the syslinux.cfg has the line:
 append rw vga=1 initrd=initrd.cgz,scsi.cgz

And I don't know how to append the two cgz's into one grub initrd line.  Normally grub uses an initrd like this:
 initrd /initrd.cgz
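
One workaround I did not try (so treat this as an untested sketch): the Linux kernel accepts concatenated cpio.gz initramfs archives and unpacks them in order, so syslinux's comma-separated list could probably be reproduced for grub by merging the two files into one. Illustrated with stand-in files, since the real archives come from the boot CD:

```shell
# Stand-ins for the real initrd.cgz and scsi.cgz from the ntpasswd image:
printf 'first'  > initrd.cgz
printf 'second' > scsi.cgz

# The kernel unpacks concatenated initramfs archives one after another,
# so one combined file should act like syslinux's "initrd.cgz,scsi.cgz":
cat initrd.cgz scsi.cgz > combined.cgz

# grub would then load it with:  initrd /combined.cgz
cat combined.cgz    # -> firstsecond
```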

So I decided to install syslinux from linux.  The man page makes it look easy:
 syslinux [-sfr] [-d directory] [-o offset] device

When I ran "syslinux /dev/sdd1" and booted the laptop with this usb key, it just gave me a blinking cursor.  This is because the laptop is looking in the MBR of the usb key and finding nothing.  The syslinux man page shows some hints:
Booting from a FAT partition on a hard disk
SYSLINUX can boot from a FAT filesystem partition on a hard disk (including FAT32). The installation procedure is identical to the procedure for installing it on a floppy, and should work under either DOS or Linux. To boot from a partition, SYSLINUX needs to be launched from a Master Boot Record or another boot loader, just like DOS itself would. A sample master boot sector (mbr.bin) is included with SYSLINUX.
Well, that's nice to know, but how do I put that on my usb key?  Follow these steps:

1. Start with a FAT32 formatted usb key (it can have other data on it) and some syslinux-based boot image.  I'm using the latest ntpasswd iso cd080802.  Unpack the contents to the root of the usb key.

2. Copy a boot sector to the code image of the MBR of your to-be-booted usb key:
sudo dd if=/usr/share/syslinux/mbr.bin of=/dev/sdd bs=440 count=1
  • /dev/sdd is my usb key
  • Look at Wikipedia for an explanation of the MBR layout
  • your mbr.bin might be in a different location.  It should come installed with syslinux

3. Run syslinux to make the partition bootable:
 sudo syslinux /dev/sdd1

4. Mark the partition as bootable (may not be necessary)
 sudo fdisk /dev/sdd

Select a, 1, w to make the first partition bootable.  CHECK THESE OPTIONS FIRST!

5. Insert the USB key into your PC / laptop and boot (so long as your BIOS is set up and capable!)

Posts for Thursday, February 4, 2010


Countdown to KDE 4.4 and the new KDE website: 5 days left

Right, it’s a day later and it’s time to see where we’ve gotten with the redesign of the KDE website.

As most of you noticed yesterday, the main gripes were with the header of the design. Today’s update is a small but necessary one – one that shows the start of a new design idea. Very often the start of a design determines its success at the end, and people may or may not take an instant like/dislike to it.

I’d like to stress to those that perhaps didn’t quite get it that this work has already been done, I’m simply posting it out of good humour and I thought people would be interested.

Well, here it is.

Previous entries can be viewed in the WIPUP profile page.

Related posts:

  1. Countdown to KDE 4.4 and the new KDE website: 4 days left
  2. Countdown to KDE 4.4 and the new KDE website: 3 days left
  3. Countdown to KDE 4.4 and the new KDE website: 2 days left

Planet Larry is not officially affiliated with Gentoo Linux. Original artwork and logos copyright Gentoo Foundation. Yadda, yadda, yadda.