Sunday, August 31, 2008

Why I hate Idea Men

Ever shoot a rifle and miss the target? Well, if you haven't, here's how you miss: your aim is off by just a little bit.

The same thing happens when you're building software - or anything, for that matter.

I've spent about 40 years building systems of various kinds - mostly software implementations to 'solve' some problem. The process is pretty straightforward and goes like this:

1. Try to figure out what you're going to do.
2. Pick the most important parts and the most important actions and interactions between the parts.
3. Pretend that is all there is in the world and build some software which mimics everything you've thought of. By the way, these parts, actions and interactions are the 'model'.
4. Test it until it works well enough to do the job.
5. Knowing what you now know, go back to step 1 and do it all over again.
6. Repeat step 5.
7. Repeat step 6.

You get the idea.

Nobody is ever right the first, second, third, or even the 'last' time.

Models aren't reality - we just work on them until they are 'close enough'.

What's this got to do with Idea Men?

Idea Men do steps 1 and 2, then they get somebody else to do step 3.

Then they blame the guys who built the thing in step 3 because they don't want to do step 4.

Finally, they won't do step 5 because they're 'right' and the guys who built it are all incompetent and that's the reason it doesn't work like they said it would.

Then the Idea Men get promotions and raises.

Crap

Tuesday, July 1, 2008

Singleton Patterns in PHP 5

The Singleton Pattern is one of the few things I really like from Design Patterns. While the original Gang of Four book has some very good stuff in it, later works seem like Software Industry Marketing Fluff [SIMF] - but I've already bitched about that.

The Singleton is GREAT because it can replace Global Variables - if not everywhere, at least in a lot of important cases.

In case you don't know what a Singleton is, it's an Object which has only one instance. Ideally, you would like to write something like
$foo = new Thing();
and get a reference to the one instance of Thing.

The "normal" way to implement the Singleton Pattern is to hack up the class definition in an unnatural way [sketched in Ruby right after this list]:
  • Disable the normal constructor by making it private
  • Define a class variable to hold the One Instance
  • Create a different public function which is used to instantiate the first instance and return it on all subsequent 'instantiations'
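Here's a minimal sketch of those three steps in Ruby [the class name is mine; the PHP version has the same shape, with a private __construct(), a static $instance, and a static getInstance()]:

class Registry
  class << self
    # the public access point: creates the first instance and returns
    # it on all subsequent 'instantiations'
    def instance
      @instance ||= new   # a class-level variable holds the One Instance
    end

    private :new          # disable the normal constructor
  end
end

Registry.instance.equal?(Registry.instance)   # => true - the same object
Registry.new                                  # => NoMethodError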
This is because none of the Object Oriented Languages directly support Singleton - except Ruby, which ships a Singleton mixin in its standard library, so you can make one with a plain include.
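For example, straight from the standard library [the class name is mine]:

require 'singleton'

class AppConfig
  include Singleton   # makes 'new' private and defines AppConfig.instance
end

AppConfig.instance.equal?(AppConfig.instance)   # => true
AppConfig.new                                   # => NoMethodError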

I was so taken with Ruby's Singleton mixin, I decided to write one in PHP 5.

Drum Roll........

Didn't work. Failed Miserably.

It turns out that in PHP 5.2, static class variables [that is, Class Variables] belong to the class which declares them - subclasses don't get their own copy, and there is no late static binding until PHP 5.3 - so a Singleton base class can't keep a separate instance for each subclass. I won't go into the details here. Instead, take a look at this test I wrote.

Anyway, just because we can't build a base class for Singleton Objects doesn't mean we can't clean up their creation. Here is an example of a pseudo-Singleton which allows me to write
$foo = new Thing();

and have a $foo object which acts just like a Singleton even though it has many instances.

The reason this works is that we don't actually need a single instance of the Object. What we need is a reference to common data: the methods act on the common data, so a modification made through any reference is seen by all the other references.
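A pseudo-Singleton like this keeps the common data somewhere class-level [static properties, in PHP]; here's the idea as a minimal Ruby sketch [names made up]:

class Thing
  @@data = {}   # one hash shared by every instance

  def set(key, value)
    @@data[key] = value
  end

  def get(key)
    @@data[key]
  end
end

a = Thing.new
b = Thing.new        # a different instance...
a.set(:answer, 42)
b.get(:answer)       # => 42 ...but the same common data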

The Pseudo-Singleton looks, acts, and smells like a Singleton, except it is more convenient and natural to use.

The only thing better would be a 'singleton' keyword prefix to use in 'class' definitions.

Saturday, June 21, 2008

Reality and Stuff Like That

Believe it or not, I spend far too much time thinking about what is real and what isn't.

It seems a lot of people think that there are lots of realities and lots of equally valid versions of Truth. This seems like fuzzy thinking to me. So fuzzy that it just doesn't make any sense. [Where are you going to put all these versions of Truth, if there isn't a universal Truth which contains them all?]

Anyway, here's what I've come up with: (nothing original here - much stolen from Lao Tzu's work)

The first thing to be clear about is 'how do we think'. 'Cause if we understand that, then we can understand what we think is 'real'.

As nearly as I can tell, we start out by classifying stuff as 'this' and 'everything else'. Some people call this 'dualistic thinking', but my opinion is that it's simply a way to simplify the cacophony of sensory input we are inundated with. It also lets us find 'patterns' so we can form 'habits' to simplify plodding through daily life.

Anyway, when you add some properties to these 'classifications', you get a 'model'. It's a model because the 'thing' has behaviors associated with it. Like a window lets in light and a tea pot holds tea.

The BIG CATCH is when we confuse the 'models' we construct with the real world we live in. The 'model' world is really an illusion we've created - but which we treat as real.

What is 'reality'? That's tough because we're so used to classifying, finding patterns, and creating models that we aren't equipped to perceive - let alone process - what's actually out there.

Anyway - that's what I think is going on. It also answers the question of why we argue so much:
we're all living in different illusions - not different realities - and our illusions are only loosely based on 'reality'.
There are a lot of conclusions you can draw from this - but this is plenty for now.

Wednesday, April 23, 2008

ORM's or not?

In my quest for a CMS I can not only stand but want to embrace, I've spent the last couple of weeks messing with Drupal.

Drupal is impressive - at the very least in scope and participation. I like quite a bit of it. In contrast with Rails, Drupal releases seem to be Designed rather than Grown - which is a very strong plus. [I'm not sure I can stand it, but that's another story.]

But I'm not writing to Praise Drupal - I'm writing to make an observation about Object Relational data Mappers.

Drupal through 6 doesn't have an ORM. They have two 'database adapters' and opt to use SQL. Most of the SQL used for CRUD operations is standard, so that works just fine. I'm told there is discussion in the development forums about stuffing a layer of abstraction in the database code - which I read as 'create an ORM.'

A look over the landscape littered with the corpses of ORMs in almost every conceivable (computer) language should give pause - at least it does to me.

Why do we ORM?

Well, we have to have the data tables in the Database match up with the Attributes in the Objects, don't we?

Well, No we don't.

There's no reason for the Data Tables to model our 'business process'. That's a knee jerk reaction taught as gospel in C.S. courses.

Couple of Facts:
  • Databases are really good at holding, retrieving, searching, and sorting vast quantities of data. Programming Languages aren't.
  • Programming Languages are really good at manipulating, displaying, inputting, validating, mashing, spindling, and folding data. Databases aren't.
Now for a Web Site, much of the content doesn't have a long life, is retrieved by simple, well known searches, ordered by well known sorts, and is displayed in small chunks.

It seems to me that much of the data used in a web site could be modeled exclusively in the Programming Language and then serialized to XML or JSON for storage in the database. That separates the representation of the data in the application from its representation in long term storage and obviates the need for an ORM.
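Here's a minimal Ruby sketch of the idea [the class and the table are made up - any language with a JSON library works the same way]:

require 'json'

class Article
  attr_accessor :title, :body, :tags

  def initialize(attrs = {})
    @title = attrs['title']
    @body  = attrs['body']
    @tags  = attrs['tags'] || []
  end

  # the model serializes itself for dumb storage...
  def to_blob
    JSON.generate('title' => @title, 'body' => @body, 'tags' => @tags)
  end

  # ...and rehydrates itself on the way back out
  def self.from_blob(blob)
    new(JSON.parse(blob))
  end
end

# storage is then just an id plus a text column - no mapping layer:
#   INSERT INTO documents (id, blob) VALUES (?, ?)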

I like that idea so much, I'm building one now just to see how it works in practice.

Stay Tuned

Friday, March 28, 2008

DRY versus WET

You've got to have either a Snappy Acronym or a Fancy Vague Phrase (FVP) in order to be credible. So when I decided to write about what I think is wrong with DRY, I came up with 'DRY versus WET' - but I didn't know what WET meant.

DRY - Don't Repeat Yourself - is one of the Mantras of the Current Age of Programming Wisdom.

In case you don't know what a mantra is, it's a sound or phrase which you repeat over and over again while meditating. Meditating is something you do with your mind which brings peace and quiet . . . by avoiding all thought.

The problem I have with DRY is that you have to Think in order to write Good Code. In other words, you need Well Examined Thought - that's where WET comes from. [So now I've got an Acronym that Actually Means Something].

For Example:

Take a look at the Rails source code to see what over-DRY-ness does. After a while you'll see that (almost) every piece of code which could be repeated is turned into a method call. Even stuff which isn't more than a fraction of a line! This is Really DRY - so It Must Be Good. Right? Wrong!

This is Spaghetti [see The New Spaghetti for more ranting . . I mean 'detail'].

Historical Note: Spaghetti code originally referred to the overuse of GOTO (or branch) statements in FORTRAN or Assembler programs, but any program which has a high ratio of Control Transfer statements to Useful Work is really Spaghetti - because it reads like all the code was written on the sides of noodles and then all mixed up.

So . . . Don't DRY up. Get WET!


Think it's worth doing a book about?

Thursday, March 13, 2008

The New Spaghetti

Back before Structured Programming - which everyone now thinks is as natural as breathing - we had a thing called 'Spaghetti Code'.

You wrote Spaghetti Style by using lots of conditional and unconditional GO TO statements to re-use every bit of code you'd written. It was the primitive precursor of DRY [Don't Repeat Yourself].

In those days it kind of made sense - for a couple of reasons:

1. we didn't know any better

2. memory was tiny - we measured big machines in KiloBytes and wrote programs using 'overlays' [in case you don't know, an overlay is a chunk of executable code which 'overlays' another chunk in memory. We used to do that because (a) we didn't have enough memory to do the job and (b) there wasn't any automatic memory management]

The result of Spaghetti Style is that nobody could ever figure out how the programs worked and modifying them was next to impossible.

What brought this up?

I've been trying to 'learn Rails' - which is a pretty large Ruby application. Rails has close to 10,000 methods defined, about 1,500 classes, and God only knows what else. The Vocabulary is huge - and so it's a daunting task to say the least.

On top of the huge vocabulary, it's DRY.

That seems to mean that every chunk of code which is used two or more times is ripped out and turned into a method. The result is that in order to do anything, somewhere near one and a half gazillion methods are called. In order to figure out how almost anything works you have to work through a call stack which makes the old FORTRAN Spaghetti look like a 2nd grader's maze.
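A made-up illustration of the shape [not actual Rails code]:

# four hops to do one piece of work
def display_name(first, last)
  formatted_name(first, last)
end

def formatted_name(first, last)
  full_name(first, last)
end

def full_name(first, last)
  join_names(first, last)
end

def join_names(a, b)
  "#{a} #{b}"
end

Nothing repeats - and nothing is any easier to follow.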

Why is this happening?

I'm guessing, but I think there are three factors driving this result:

1. DRY - This is a misunderstanding. DRY is a hip way of saying 'we want to maintain a single point of control so that we don't have to repeat bug fixes in the code'. DRY shouldn't be a mantra, because it's not a Principle of Good Programming. It's a method to implement an idea which is good programming.

Prior to formulating 'Single Point of Control' as DRY, programmers used some judgement in the size of the methods/functions they wrote. Typical size was 5 to 25 lines or so - enough to do useful work and small enough to be easily understood. Rubyists seem to be addicted to one-line methods which call other methods which call other methods which ... - but they don't Repeat!

2. Agile Development - This seems to go hand in hand with Test Driven Development. Both sound like good ideas, and they seem to work well for 'throw away' applications - which is what most web sites are. After crawling through the Rails Association Code, I've come to the conclusion that something critical has been lost in the Agile-ness of the Development: There's no Design Phase.

Test Driven Development has proven you don't have to have an overall plan to build code which works. You can build any twisted mess and if you continuously test, it will pass the test.

But it won't make sense, won't be cohesive, will probably be inefficient, and it will be hell to modify or learn.

Everything has to be designed if it is going to work well and be maintainable. That's true for Buildings, Cars, Motorcycles, Bridges, and, believe it or not, Software.

3. Ruby itself.

I love Ruby - almost - because it's a beautiful language and it makes otherwise hard things easy. The fact that Ruby is a scripting language means we get instantaneous feedback - and I love that too.

But both of those things - along with Ruby's openness, which allows all kinds of self-modifying code [which seems to get confused with meta-programming] - make it easy to get the illusion of 'doing something' when the programmer is actually just churning out different implementations with little impact on the goals of the application being developed.

Does that make sense? If not, read it again.

Activity != Progress

It takes discipline and experience to handle as powerful a tool as Ruby is without creating a mess.

Don't believe me?

Just take a look at the source for Mongrel and compare [or rather Contrast it] with a bunch of the Rails code. Zed Shaw writes awesome, easily understood code. Somebody in the Rails Core doesn't.

Friday, February 22, 2008

Hey Ruby: Where's the Main Program?

So as I get deeper into Ruby and Rails, I've found I've been going absolutely nuts trying to understand how the programs work.

I think I finally figured out my problem. (That's right, it's a problem with ME for a change!)

There's no 'main' in a Ruby program.

Most executables - as in /usr/local/bin/foo - are two to ten line Ruby scripts which set up some environment variables and 'require' the 'real' script. The 'real' script usually 'require's a bunch of other scripts, instantiates an object, and calls what appears to be a random method.
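A typical specimen looks something like this [made up, but true to the shape]:

#!/usr/bin/env ruby
# set up the environment...
$LOAD_PATH.unshift File.expand_path(File.join(File.dirname(__FILE__), '..', 'lib'))

# ...require the 'real' script...
require 'foo'

# ...instantiate an object and call what appears to be a random method
Foo::Application.new.dispatch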

That's it.

Step back and contrast this with C, Java, Python*, FORTRAN, COBOL, the Boot Sequence of a computer, ... Need I go on? Everyplace else there is (or pretty much is) a well known function, procedure, 'program', file, or something where execution starts for every program which is written. Even Javascript kicks off with an 'onload' function (except in jQuery, but I think that's really a wrapper).

Notice I've left out Lisp, Scheme, Smalltalk and a whole bunch of other languages.

I don't know much about them - from a practical point of view - except that they are interpreted, don't (to my knowledge) have a 'main' entry point, and allow programmers to modify the base system so that it is unique to their program.

Now, I know someone might argue that all Ruby programs start by firing up the interpreter and loading the environment, so I'm wrong - as usual. But that's not what I'm talking about.

'main' is the start of the logic of the program. Knowing this gives (the generic) you a place to start unraveling the logic of a program. This seems like a 'good thing' (trade mark), but is it?

The whole idea of 'main' was really invented so that other programs (loaders) would know where to start executing a program. This is a 'convenience' with compiled programs: programs where the source code is processed into (possibly virtual) machine instructions and then further processed into an executable image saved in a file. This executable image is then loaded, source code agnostically, into the execution environment of a computer and run. It's a simple fact that the 'loader' needs a place to start and it doesn't know from beans about the source code - hence the well known 'main'.

You don't need this for interpreted, dynamic languages, but the S/W community seems to have carried this structure forward.

I think Ruby represents a significant break from this architecture.

Ruby 'programs' seem to be a bunch of symbiotic objects, executing in a common (probably specialized) runtime environment. Various behaviors (programs) can be achieved by instantiating different objects and calling appropriate 'starting' methods. This is similar to, but much more fluid than, writing a program within the constraints of a published API for, say, Windows, or a purchased library.

I like it.

I don't know how to understand it easily yet, nor do I think anyone really knows how to document it yet, but I think it's an advance.



* Python doesn't really have a 'main' either, but it has taken on the 'flavor' of 'main': code can be conditionally executed if the file is 'run' as though it were the 'main' program AND a chunk of code of this form is included:

if __name__ == '__main__':
    # some code

This chunk typically goes at the bottom and is encouraged when the code in the file can serve either as the basis for a stand-alone program, as an importable unit for another program, or for testing.

Sunday, January 27, 2008

DSL Design - that's Domain Specific Language

A Domain Specific Language - for those who don't know - is a bunch of functions named so that writing function calls 'reads' like natural language. It seems to be sprouting wildly in Ruby - most probably because Ruby doesn't require parentheses around function arguments and Ruby programmers are kind of rebels anyway.

For example:
rabbit_jumps_in_the_hole :hole_size => 10

Even if you don't understand Ruby Syntax, you know what the function call does.
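The defining side is just an ordinary method taking an options hash [made up, obviously]:

def rabbit_jumps_in_the_hole(options = {})
  size = options[:hole_size] || 1
  puts "a rabbit jumps into a hole of size #{size}"
end

rabbit_jumps_in_the_hole :hole_size => 10   # reads like a sentence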

Think of it this way:
  1. Function/methods are really 'verbs'.
  2. Function Options are 'adverbs'
  3. Object Identifiers are 'nouns'
  4. Object Attributes are 'adjectives'
  5. Class Names are 'class names' [gotcha!!!!]
I think we should think of learning one of these DSL things the same way we think about learning a new Language.

This can be either a good thing or very bad.

Size Matters

Which is easier to learn: A language with 10 verbs or one with 1,000?

Just for the heck of it, I recently tried to get a count of the 'verbs' in Rails 2.0.2. I ran egrep -r 'def [a-z]' on all the lib directories and came up with:
  • actionmailer/lib 694
  • actionpack/lib 1393
  • activerecord/lib 1134
  • activeresource/lib 125
  • activesupport/lib 577

  • Total 3923
In contrast, the Merb Framework is much smaller:
  • merb 713
  • merb.rb 19
  • tasks.rb 0

  • Total 732
Of course, this isn't fair because Merb doesn't come with an ORM [Object Relational Mapper library (a bunch of database access functions - for those really out of it)] [or Active Record Pattern Implementation, for those . . . - well, you know who you are], so you have to add that in.

But Merb gives you a choice of ActiveRecord - with its 1,100 verbs; DataMapper - with about 500 methods; or Sequel - with about 600.

So learning Merb should be easier than Rails because the vocabulary is about 1/3 to 1/2 the size.

Synonyms are Bad

Which is easier to Learn: a language with one word for each concept or with two or more?

A programming language or environment isn't meant for composing poetry, novels, or movies. It's supposed to precisely express a procedure. Period. It should be concise. That makes it easier for Programmers to understand.

Case closed. DSL's should be concise, singular, and boring - but very, very accurate.

Corollary:

The Rails Inflector is a mistake in every possible way:
  • It Expands rather than Tightens the vocabulary of the Rails DSL
  • It injects confusion because Programmers now have to worry about singular and plural forms depending on context
  • It doesn't work - the round trips don't survive:
    • 'XMLClass'.underscore # => 'xml_class'
    • 'XMLClass'.underscore.camelize # => 'XmlClass' [not 'XMLClass']
    • 'slave'.pluralize # => 'slaves'
    • 'slave'.pluralize.singularize # => 'slafe'
  • It wastes lots of cycles doing it - machine, programmer, and learning
Distance is Good

I'm talking about the distance between words. For example frog is very close to frogs but far from toads. That makes it easier to tell a frog from a toad in print than in real life.

Good DSL design should not only use expressive and concise identifiers, but should also keep them far apart, especially when the referents do significantly different things.

Again, picking on Rails: the methods update_attribute(name, value) and update_attributes(attributes) are very close together, but one of them bypasses attribute Validation. Can you tell which one from the names? Don't you think it's important to know?
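For the record [Rails 2 behavior; 'user' here is a hypothetical record]:

user.update_attribute(:email, 'not-an-email')      # writes without running validations
user.update_attributes(:email => 'not-an-email')   # runs validations - returns false here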

DSL is Not Documentation

Most DSLs seem to grow more or less organically. The Ruby universe is filled with a lot of apparently useful packages with virtually no documentation. Almost all of them have fairly reasonable API documentation - which allows 'one' to learn what each of the 'verbs' in the DSL does, but that's like learning to drive a car by reading a Glossary of the Parts! It Just Don't Work.

It's hard as hell to learn a system without some sense of what the thing is supposed to be doing and how it's put together.

Don't believe me?

Figure out a Car from stuff like this:

Wheel - 1. circular object in contact with ground; 2. circular object interfacing driver to directional controls.
Nut - 1. Device for attaching wheel; 2. driver in other automobile; 3. nutritious snack
etc.

That's API doc and that's what you've got when all there is is the DSL.

'nuff for now

Friday, January 25, 2008

Is Java Bad for You?

How do Heavily Constrained programming environments - such as Java, C#, and friends - affect our thinking and creativity?

I think the goal of heavy constraints and requirements started out as a way to get better code by automatically checking as much stuff as possible mechanically. It all started with compile time type checking and has extended into things I don't want to know about.

Anyway, the result is that it's hard to write code with these tools. From what I hear - and I don't do Java, C#, and friends - the 'programmers' spend most of their time figuring out what API's and Design Patterns to use. I don't find that fun at all.

I usually spend most of my time trying to better understand the problems I'm trying to solve and creating software structures which mimic the nature of the problem. After I've coded and tested one of these structures, I call it a 'solution'.

In other words: I don't use Design Patterns and don't think in terms of API's.

Does that make me a dinosaur?

I don't think so.

I gravitate toward unrestricted programming environments. My first introduction was SCO Xenix around 1986 or 87. I became really excited as I realized how easy it was to do mundane tasks by stringing filters together in pipelines. I could accomplish more useful work in 1/10th the time [or so it seemed] than I could writing special programs to do the same thing.

In addition to being faster, it was more fun. I spent more and more of my energy solving problems rather than conforming to code writing rules.

The same thing happened when I discovered Python - and now to a similar extent Ruby. Scripting languages with good support for dynamic strings, arrays, hashes and objects are wonderful. They handle all the details of what I need - as a coder - to do the job.

How do I keep from hurting myself when the Programming Environment doesn't keep tabs on me?

Well, it's not a problem. I just test as I go and keep rewriting my code so it works, is more succinct, and tighter. [I guess you call that Refactoring now - we used to call it rewriting]

The facts are that Anyone can write bad code - and restrictive frameworks don't stop them. Anyone can also write good code - if they take the time to learn how and pay attention to what they are doing. And restrictive languages don't help with that either.

When I need a solution - I just think one up (or two or three) and try it out. It's easy to write the code, change it, test it, and refine it. In a verbose, API laden environment like Java, that cycle isn't so easy - or at least is a heck of a lot more verbose.

I suspect that restrictive environment programmers get dulled down by the drudgery of just writing the code and learning all the API's. There is so little room for creativity that they lose it - creativity is something you have to practice and cultivate.

As a result, they get used to solving problems by applying packages and patterns. They don't really design: they apply old designs to new problems and hope that they work. [BTW, that's the reality behind Anti-Patterns]

So, I guess it makes sense: if everything you do is a cut-and-paste of something somebody else thought up, you would apply that to Design as well.

God, that's boring!

If I'm right, then Design Patterns are a result of Boring Programming Tools which create Bored, Dull Programmers and more Bad Code.

I don't think that's a good thing.

What do you think?

Wednesday, January 9, 2008

Code Efficiency in Ruby

I got interested in the quality of Ruby code in Rails when I noticed what appears to me to be a useless method in the ActiveRecord code. Specifically, ActiveRecord::Base#save is a public method which does nothing but call the private method ActiveRecord::Base#create_or_update. That doesn't make a lot of sense to me, because it could be replaced by making 'create_or_update' public and then aliasing 'save' to it.
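In other words, something like this [a sketch with a stand-in method body - not the actual Rails source]:

class Record
  # stand-in for the real persistence logic
  def create_or_update
    @saved = true
  end

  # 'save' becomes another name for the same method - no extra call frame
  alias_method :save, :create_or_update
end

Record.new.save   # goes straight into create_or_update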

So I decided to check what the superfluous method call costs.

I wrote a test which performed a simple task [incrementing an instance variable by a random number between 1 and 10] using five (5) different ways of accessing the instance variable and invoking the action.

The class definition is at the bottom of this post.

I then ran these methods 10,000,000 times using four different ways of invoking the methods:

  • directly calling the methods - e.g. foo.inc_var_as_instance

  • invoking via a Method object's 'call' - e.g. foo.method(:inc_var_as_instance).call

  • invoking via 'send' - e.g. foo.send('inc_var_as_instance')

  • invoking via 'eval'ing a string - e.g. eval 'foo.inc_var_as_instance'
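The harness looks roughly like this [a sketch using Ruby's standard Benchmark library - the full program is linked below]:

require 'benchmark'

N = 10_000_000
foo = Foo.new
meth = foo.method(:inc_var_as_instance)

Benchmark.bm(10) do |b|
  b.report('direct') { N.times { foo.inc_var_as_instance } }
  b.report('call')   { N.times { meth.call } }
  b.report('send')   { N.times { foo.send('inc_var_as_instance') } }
  b.report('eval')   { N.times { eval 'foo.inc_var_as_instance' } }
end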

The Percent Results are simply each run time divided by the minimum run time over all the tests, converted to a percent increase.

Here's the summary:

  • Invoking via an Alias doesn't cost anything

  • Unnecessarily accessing an instance Variable via an accessor slows down about 20%

  • The unnecessary method/function call slows down by about 25%

  • Combining the unnecessary call with accessor access slows down about 45% - so the effect is linear

  • Invoking by the 'call' method slows it down by about 13%

  • Using 'send' slows down about an additional 45%

  • Using eval slows the process down by something on the order of 400%, but the effect is not linear, so 'eval' must be doing some additional mucking about.

So, what's the point? None, if you're satisfied with glacial execution speeds.

On the other hand, it's something you should know if you are writing critical code and have to make choices about how to implement it.

As usual, your mileage may vary. The full program code is at http://www.clove.com/downloads/method-call-timing-tests.rb.

Here are the detailed Percentage Results

Percent Results for direct method of invocation
foo.inc_var_as_instance 0.99
foo.inc_var_as_instance_alias 0.00
foo.inc_var_as_method 19.89
foo.inc_var_as_func_and_instance 25.28
foo.inc_var_as_func_and_method 45.17

Percent Results for call method of invocation
foo.inc_var_as_instance 14.06
foo.inc_var_as_instance_alias 12.93
foo.inc_var_as_method 32.53
foo.inc_var_as_func_and_instance 42.19
foo.inc_var_as_func_and_method 59.52

Percent Results for send method of invocation
foo.inc_var_as_instance 42.05
foo.inc_var_as_instance_alias 46.02
foo.inc_var_as_method 60.65
foo.inc_var_as_func_and_instance 75.57
foo.inc_var_as_func_and_method 94.03

Percent Results for eval method of invocation
foo.inc_var_as_instance 393.89
foo.inc_var_as_instance_alias 410.94
foo.inc_var_as_method 420.03
foo.inc_var_as_func_and_instance 458.10
foo.inc_var_as_func_and_method 577.84

Here are the raw timing results [columns are user, system, and total CPU seconds, with wall clock seconds in parentheses]:

Result using Direct Calls
foo.inc_var_as_instance 7.070000 0.040000 7.110000 ( 7.306318)
foo.inc_var_as_instance_alias 7.000000 0.040000 7.040000 ( 7.155283)
foo.inc_var_as_method 8.390000 0.050000 8.440000 ( 8.585970)
foo.inc_var_as_func_and_instance 8.780000 0.040000 8.820000 ( 8.950350)
foo.inc_var_as_func_and_method 10.160000 0.060000 10.220000 ( 10.378127)

Result using .call
foo.inc_var_as_instance 7.990000 0.040000 8.030000 ( 8.204646)
foo.inc_var_as_instance_alias 7.910000 0.040000 7.950000 ( 8.061320)
foo.inc_var_as_method 9.280000 0.050000 9.330000 ( 9.502992)
foo.inc_var_as_func_and_instance 9.940000 0.070000 10.010000 ( 10.249010)
foo.inc_var_as_func_and_method 11.170000 0.060000 11.230000 ( 11.439927)

Result using 'foo.send '
foo.inc_var_as_instance 9.950000 0.050000 10.000000 ( 10.220120)
foo.inc_var_as_instance_alias 10.220000 0.060000 10.280000 ( 10.462063)
foo.inc_var_as_method 11.250000 0.060000 11.310000 ( 11.514329)
foo.inc_var_as_func_and_instance 12.290000 0.070000 12.360000 ( 12.583919)
foo.inc_var_as_func_and_method 13.590000 0.070000 13.660000 ( 13.891333)

Result using 'eval '
foo.inc_var_as_instance 34.550000 0.220000 34.770000 ( 35.387702)
foo.inc_var_as_instance_alias 35.740000 0.230000 35.970000 ( 36.670451)
foo.inc_var_as_method 36.390000 0.220000 36.610000 ( 37.296297)
foo.inc_var_as_func_and_instance 39.050000 0.240000 39.290000 ( 40.033456)
foo.inc_var_as_func_and_method 47.420000 0.300000 47.720000 ( 48.638035)

Here's the class definition:

class Foo
  attr_accessor :var

  def initialize
    @var = 0
  end

  # access the instance variable directly
  def inc_var_as_instance
    @var = @var + 1 + rand(10)
  end

  # access the instance variable directly, but through an alias
  alias_method :inc_var_as_instance_alias, :inc_var_as_instance

  # access the instance variable via its accessor method, even though we're inside the class
  def inc_var_as_method
    self.var = self.var + 1 + rand(10)
  end

  # add an extra method call on top of direct instance variable access
  def inc_var_as_func_and_instance
    inc_var_as_instance
  end

  # add an extra method call on top of accessor access
  def inc_var_as_func_and_method
    inc_var_as_method
  end
end

Saturday, January 5, 2008

Design Patterns and the Fall of S/W

One nice thing about having a blog nobody reads is that I can say anything I want without worrying about it biting my butt.

I hate Design Patterns.

It's that simple.

I hate the guys who promote them.

Most of all, I hate the s/w industry - especially the programmers - for being duped by these guys.

And I'm qualified.

I received an engineering education and am a self taught computer 'something'. I'm not exactly a programmer, although I've written an awful lot of code in a variety of environments - all the way from machine code on microprocessors up to hokey database/user interface stuff. I've done device drivers and created my own little languages using lex & yacc and 'in the raw' in C, Python, and awk. And I've been doing this for over 40 years - so I've earned the right to be a grouch.

The Design Pattern guys are the current generation of Yourdon, Coad, and Booch: Consultants who watch other people create software while telling them how to do it right. None of them actually do anything, but they sure create a lot of bad advice.

I remember buying three (3!) books by Peter Coad on Object oriented programming and design, only to find out that he admitted he didn't really know anything about it. The idiot had gotten excited about the idea, so he and his group spent a year puttering with it and writing books - probably giving lectures and doing expensive consulting with BIG companies at the same time.

It takes years of experience doing something to understand the concepts. [things move quickly, but our brains take a while to catch up]. Hell, it takes 10 years plus to design, implement, and knock most of the bugs out of any programming language.

The Design Pattern guys are the absolute worst! They claim that they are defining these things to add clarity to the software process, and then they write the vaguest crap imaginable. Don't believe me?

Grady Booch: "In the world of software, a pattern is a tangible manifestation of an organization's tribal memory." - Core J2EE Patterns (introduction) [clipped from PHP 5 Objects, Patterns, and Practice - by Matt Zandstra]


Merriam-Webster Dictionary: "1. an ideal model. 2. something used as a model for making things. 3. Sample"


Which is clearer? If you like Booch - then you need to join a consulting company and stop pretending to write code.

I'm starting to froth at the mouth, so it's time to simplify things. Here's a simple procedure to see for yourself.

1. go to a book store and pick up one of Martin Fowler's many books on patterns. WARNING: Do not buy the book.

2. Select a pattern at random and read the first paragraph describing it.

3. Answer yourself honestly: Do I know what this 'pattern' is well enough to describe it in a single sentence?

if No - read the rest of the description and try again

if still No - replace book on shelf.

if Yes, please send that sentence to me.

Thanks,
Mike