Lispian: Random meanderings on whatever catches my fancy

Lasagna Code: Redux

I write here as sort of a pressure release valve. It seems that my little rant on Lasagna Code got some attention.

I read through the comments. It seems most get what I’m on about. But I figure I might as well be a bit clearer, in case any of those posters revisit.

Yes, I’m against object oriented programming. I’ve been against it for years. I find it an obtuse and bloated way to code. And beyond Smalltalk, I’ve really not found another decent object-oriented language in which to code. This obsession language designers have with wedging an object system into a language “just because” is rather stupid. After all, objects are just formalized data structures. It’s really that simple.

For those about to jump up and down and scream that they aren’t, I won’t argue because it’s pointless. I’ve used OO since the early 80s, initially in Smalltalk; later, much to my horror, C++; and then, for a brief period, Java. But I refuse to ever touch Java again. Too horrible and, happily, I’m old enough that I no longer have to do what I don’t want to do.

Python, as I said, is a very elegant language. I especially like some aspects, such as the forced indentation rules, mostly because so many people seem incapable of comprehending how to use the tab key and keep their code blocks indented correctly. After 30+ years in this business I have grown weary of staring at unindented, improperly indented, uncommented, single-character-variable-name code that was probably comprehensible to the person who wrote it, but to anyone else is utter and complete gibberish.

I’m also not against JavaScript. JavaScript, for those who haven’t followed its history, is very much an Algol-like interpretation of Scheme

http://en.wikipedia.org/wiki/Javascript

– which is itself a Lisp variant. Obviously, tacking “Java” onto the name was brilliant marketing, but it’s not Java. And you can write some pretty nifty code with it. Although it has an object model embedded, you can ignore it pretty much all the time. I do have problems with JavaScript, but those would require a longer rant than I’m prepared for at this time. Linked with HTML5, though, JavaScript provides the ability to view the browser as a platform, and that is a game changer. A complete reset, akin to what the PC did to programming back in the 70s and 80s.

As may be gathered from the languages I like, I’m a functional guy. The reason is that over the past 30+ years I’ve noticed that functional programming is just way easier for a novice to pick up. I’ve seen this recently: students given a pure functional language were immediately productive, while other students forced to use C++ or Java were hopelessly lost, constantly searching for objects and abstractions that would save them time. It was the difference between applying the necessary math and fighting with a system that seems more intent on arguing with you than helping you get the job done.

Some numbers may be instructive here. In the functional language we were using, which is effectively a Scheme derivative, we were able to code up a page caching system in about 30 lines of code, including comments. In Java, it was 800. And that was after a couple of senior, very good Java blokes were finished with it. Similarly, I wrote a tokenizer for a DSL effort we also have underway. The first version I wrote was in C. I have 30+ years of C experience, and I wanted something to compare against and test against. That tokenizer is 1200 lines long. The one in the functional language? 80, again including comments.
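To make the conciseness point concrete, here is a minimal sketch of the kind of tokenizer that stays short in a high-level language. The token kinds and patterns below are my own assumptions for a hypothetical little DSL, not the actual one from the project, and Python stands in since the Scheme derivative isn’t named; the point is how little code a declarative, table-driven approach needs.

```python
import re

# Token kinds and patterns for a hypothetical little DSL (assumptions,
# not the author's actual grammar).
TOKEN_SPEC = [
    ("NUMBER", r"\d+(?:\.\d+)?"),           # integers and decimals
    ("IDENT",  r"[A-Za-z_][A-Za-z0-9_]*"),  # identifiers
    ("OP",     r"[+\-*/=()]"),              # single-character operators
    ("SKIP",   r"\s+"),                     # whitespace, discarded below
]

# One master regex with a named group per token kind.
MASTER = re.compile("|".join(f"(?P<{name}>{pattern})"
                             for name, pattern in TOKEN_SPEC))

def tokenize(text):
    """Yield (kind, lexeme) pairs, skipping whitespace."""
    for match in MASTER.finditer(text):
        if match.lastgroup != "SKIP":
            yield (match.lastgroup, match.group())

print(list(tokenize("x = 3 + 41")))
```

The whole thing fits in about twenty lines because the token table is data, not control flow; extending the language means adding a row, not a branch.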

Since bugs are, and have always been, measured in bugs per line of code, it should be painfully obvious that the fewer lines of code you hammer out, the fewer bugs you’ll have. It’s just common sense. Thus, any system that shrinks your code size by 10:1 or more will mean at least an order of magnitude fewer bugs per system coded. For those who prefer metrics, Steve McConnell stated that the industry average is 15 to 20 errors per 1000 lines of code. Thus, if you have a system that is 100,000 lines of code you’re looking at 1500 to 2000 bugs. However, if you can code that same system in 10,000 lines you’d be looking at 150 to 200 bugs. If it can be done in 1000 lines, you’d be better off still.
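The arithmetic above is simple enough to sketch directly. Assuming McConnell’s 15 to 20 errors per 1000 lines as the defect rate, the expected bug counts for each system size fall out of one multiplication:

```python
# Back-of-the-envelope bug estimates using McConnell's industry
# average of 15-20 errors per 1000 lines of code (KLOC).
def expected_bugs(loc, rate_per_kloc=(15, 20)):
    """Return the (low, high) expected bug count for a given LOC."""
    low, high = rate_per_kloc
    kloc = loc // 1000
    return (kloc * low, kloc * high)

for loc in (100_000, 10_000, 1_000):
    low, high = expected_bugs(loc)
    print(f"{loc:>7} LOC -> {low}-{high} expected bugs")
```

Nothing here depends on the language; it only assumes the defect rate per line stays roughly constant, which is exactly why cutting line count by 10:1 cuts expected bugs by the same factor.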

And smaller is less complex, in the sense that it’s easier to comprehend since you’re probably more focused on the language’s capability as opposed to some class library. And the brighter bulbs can keep the code and concepts in their heads. And the more code you can keep in your head the more likely you’ll see something stupid. If you’re staring at hundreds of thousands of lines of code, you’re just lost in the morass.

The promise of OOP was that you could leverage other people’s code and reap those advantages. But that’s not what happens. Instead you find code that “almost” does what you want, and then tweak and adjust it. Most of the time, it would have been faster to simply rewrite it, and the code base would have been smaller.

Thus, if programmers started with a richer, more capable language and actually used intelligence instead of code grazing to develop software, we’d all be better off. We’d have better programmers and better code. Instead we have irritated, bored programmers and horrible code. Worse still, I see many very talented computer scientists simply up and leaving the profession because of the “productivity tools” they’re saddled with.

As might be obvious by now, my problem is that we’ve gone from people who knew how to program on small machines, and who comprehended those machines, to folks spoiled by large, complex systems with lots of resources who understand neither the environment, the language, nor the underlying machine. The former comprehended the language-machine interconnection. They knew what clock cycles were. They understood that bad code would hurt performance. Much of that has been lost. There’s this utter disconnect.

It’s so bad that we’ve gotten used to how badly systems perform. We should be wondering why this computer, which is infinitely more powerful than the one I used 30 years ago and running nothing more than a variant of an OS I used then (UNIX), is so much slower. You can’t blame the pretty colours. There’s more going on. There’s just way too much sloppy code.

Some may figure I’m just getting old, and worry about what it’s going to be like when they get old. I think it’ll get better. Why? Mostly because there seems to be an awakening that what we’ve been doing for the past 20 years has been wrong. Even some of the fathers of OOP are coming to the conclusion that functional is the way to go. For those who prefer more proof than my blathering, watch Dave Thomas discuss OOP and functional languages, and offer similar laments, in his recent SPLASH talk. And for those unfamiliar with Dave, let me just point out that he is the godfather of OOP. As he says, OOP is a huge commercial success; too bad it’s an utter practical disaster.
