Tuesday, November 27, 2007

Here's a thought... ActiveRecord in PL/SQL

With all the talk about putting logic in the DB and hiding SQL behind stored procedures, wouldn't it be really intuitive if Oracle simply provided a default ActiveRecord implementation for every table in the DB? So, instead of returning a vanilla record type, FOO%ROWTYPE would return an Oracle-generated Object type that had accessors/mutators (getters/setters) for all the columns in table FOO and methods to UpdateRecord and DeleteRecord, along with a static method to AddRecord. To avoid any collisions with existing syntax, we could simply have a new attribute, something like FOO%OBJECTTYPE.

My experience shows that most reluctance to encapsulate SQL behind PL/SQL programs stems from the fact that people actually have to write (and subsequently maintain!) that wrapper code. People just don't want to do the donkey work (write the same boilerplate code over and over again for each Table). And personally, I don't think a code generator is very good in this case either. The generated code simply breaks down under maintenance (add/modify/drop a column from the table).

But what if all that were done by the database? CREATE the Object Type when a Table is CREATEd and propagate changes whenever the table is ALTERed; plus maintain the accessors (which could very well enforce the integrity constraints declared on the Table)!
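For illustration, here's roughly what such an auto-generated wrapper would buy you, sketched in Python over SQLite rather than PL/SQL (the function name, the first-column-is-the-primary-key shortcut, and the use of sqlite3's PRAGMA for metadata are all mine, purely illustrative): a class built from the table's own metadata, with AddRecord/UpdateRecord/DeleteRecord equivalents.

```python
import sqlite3

def active_record_for(conn, table):
    """Generate an ActiveRecord-style class from a table's own metadata,
    the way the post imagines Oracle generating FOO%OBJECTTYPE."""
    # PRAGMA table_info rows are (cid, name, type, notnull, dflt_value, pk)
    cols = [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]
    pk = cols[0]  # illustrative shortcut: treat the first column as the key

    class Record:
        def __init__(self, **values):
            for c in cols:
                setattr(self, c, values.get(c))

        @classmethod
        def add_record(cls, **values):  # the static AddRecord
            placeholders = ", ".join("?" for _ in values)
            conn.execute(
                f"INSERT INTO {table} ({', '.join(values)}) VALUES ({placeholders})",
                tuple(values.values()),
            )
            return cls(**values)

        def update_record(self):
            sets = ", ".join(f"{c} = ?" for c in cols if c != pk)
            conn.execute(
                f"UPDATE {table} SET {sets} WHERE {pk} = ?",
                tuple(getattr(self, c) for c in cols if c != pk)
                + (getattr(self, pk),),
            )

        def delete_record(self):
            conn.execute(
                f"DELETE FROM {table} WHERE {pk} = ?", (getattr(self, pk),)
            )

    Record.__name__ = table.capitalize()
    return Record
```

Because the class is rebuilt from the catalogue every time, an ALTER TABLE is picked up on the next call; that's exactly the maintenance burden a static code generator can't shed.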

It'd be a great way for the Oracle community to show some benevolence towards (lowly! :P) application developers who can't write PL/SQL even if their life depended on it, let alone SQL!

Also, if you left the <Table>%OBJECTTYPE as NOT FINAL, those who can write PL/SQL could extend it to add even more (business) logic in the database! Whoopee and hooray!!!

Friday, October 12, 2007

Easy ain't Good!

Just last week, I finally managed to put down Donald A. Norman's The Design of Everyday Things (DOET), a fascinating read about the little frustrations of everyday life inflicted on us by impatient capitalists and disconnected designers (or any combination thereof!). However, as I was reading the last chapter, the case study entitled How Writing Method Affects Style got me thinking about how we tend to ignore the flip side of advances in technology in favour of immediate benefits.

A little background- Donald Norman's case study describes an accelerated timeline of writing techniques from the quill-and-ink up to voice-recognition. He observes that with advances in technology making it easy to...

  • write almost as fast as you think; and
  • correct mistakes
... our writing has become more colloquial and unstructured. [However, he does go on to say that newer tools such as voice-based word processors help maintain structure; but the colloquial nature still remains]

To study a more concrete subset, parallel observations can be drawn from the advances in written communication, as opposed to just writing techniques.

We have come a long way from traditional letter writing to recent voice-based solutions. The pen-and-paper letters were well structured. They would be drafted and then... re-drafted! Sentences would be rephrased and reworded to present the most content using the least amount of text. A lot of thought went into such communication, because you knew this wasn't a dialogue and you wouldn't have a chance to explain any misunderstandings, at least not very soon.

Then, somewhere along the way, we invented the Memo. In our effort to be short & pithy, we (mostly) ended up vague & ambiguous. Soon, computers brought us email, letters in binary; and it was good for some time. That is, till IRC invaded the web. Slowly email turned into a sort of e-Memo and chat took communication by storm. Now people consider it normal to be bombarded with a string of electronic communiqués. The fact that the line between chat and email is growing thin is evidenced by people who choose to carry on lengthy one-line-conversation chains over email.

To consider a similar progression in an alternative technology, how about publishing and retrieval of online content? Online search and indexing services such as Yahoo and Google proved to be a boon. They helped us sift through the glut of information dumped onto the web as a result of years of indiscriminate publishing by people who finally found a medium to vent their 'creative' outbursts! Somehow, I always thought that search engines were invented out of the necessity to find signal in existing noise. However, they ended up becoming invitations to generate EVEN MORE noise. Armed with the knowledge that some web crawler will eventually index their site/blog/micro-app, the Web 2.0 community got busy building more and more 'information' into the web in the form of opinions, reviews, cross-blogs and flame wars!

As if that wasn't enough, Google gave us GMail! A direct invitation to actually CREATE a greater waste of resources. Just think of the volume of useless email still sitting around in your GMail account, just because Google gives you 2 GB of storage space. In this day and age of environmental preservation, I shudder at the sheer waste of resources: mail storage space, connected storage media, CPU cycles, electrical power, heat emanated by these devices, energy wasted on the resulting cooling... and Google itself isn't far behind in the waste with its ultimate redundancy matrix!

The following line pretty much sums it up.
Ease can also lead to callousness.

Seems like every day we're making it easier to waste some more.

Update 24-October-2007: Great! Another invitation to create greater waste!

Wednesday, September 12, 2007

The Purpose of Software

I've been meaning to write about this for quite some time and now I've finally found a forum.

This post on Dratz's Confessions of an IT Hitman isn't the first place where I've heard this said. Although I wouldn't go so far as to say that Chet is completely biased towards a data-centric thought process and Databases, his letter to the CIO was a little one-sided; probably because he was trying to drive his point home. I am more inclined to lean towards Ralph Wilson when he reiterates the old adage that if a hammer is your only tool, every problem is a nail.

I'll admit- Data is the most important artifact that business generates. But, IMHO, the most important output of the software development activity is, by far, the simulation of a business process.

This follows the analogy that every program is a process-

a systematic series of actions directed to some end.

A program/system 'does' things; most evidently, it facilitates, automates or accelerates a given business function. Well, at least good software does.

Data generated by a system changes as much as the business processes that generate it; in fact, data goes out of date faster. Today's data will be archived and stored for 'future reference', but will rarely be in 'active duty' as long as the processes that manipulate it. For all Thomases doubting my experience in this matter: I work in a team that maintains a Mortgage Origination System used in North America by a MNC bank. We recently wrote a job that archives data pertaining to about 8,000 mortgages every month; and that's just unprocessed loans. Add closed loans to that and the monthly figure would skyrocket.

Anyway, getting back to the point. I believe there is some merit to Chet's point of putting logic in the database, but...

you should put as much of the logic in the database as humanly possible

is a little extreme! I mean, I don't want to run an INSERT and have an ORA-06502: PL/SQL: numeric or value error thrown at me just to learn that I entered one-too-many zeros or typed a date in the wrong format! Trust me, there are dudes who stretch the idea this far.

Sidenote: Several situations warrant the temporary storage of data that doesn't meet all the integrity constraints. Well-designed UIs would allow users to set arbitrary savepoints, even when ALL required data has not been entered. Think of all the times you were filling up a 53-field web form and realized, at field 24, that you need to run down to the drawing room to pull out the referral code for a vendor discount printed on a coupon that you got in the mail 3 months ago; but you don't want the form to time out while you go fish the coupon out of the bowl in which you keep all of your discount coupons. So you're sitting there gritting your teeth, seething away, wondering why the fr*cking developer didn't think of putting a SAVE button there! And when you do put one in, if the application is tied to the database, you either have data in an inconsistent state or you end up designing a schema that doesn't enforce all the integrity constraints declaratively. That's where a Domain model (sprinkled with a healthy dose of the NullObject pattern) comes into the picture.
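A minimal sketch of that Domain-model-plus-NullObject idea (the class names and form fields here are hypothetical, not from any real system): a draft can be saved half-filled, a NullObject stands in for the missing values so downstream code doesn't null-check every field, and full integrity is enforced only at final submission.

```python
class MissingValue:
    """NullObject: a stand-in for fields the user hasn't filled in yet.
    It answers politely instead of forcing None checks everywhere."""

    def __str__(self):
        return ""      # renders as blank in the UI

    def __bool__(self):
        return False   # "if draft.referral_code:" just works


MISSING = MissingValue()


class DraftForm:
    """A domain object that can be saved at any point, even half-filled.
    Constraints are checked at submission, not on every intermediate save."""

    FIELDS = ("name", "email", "referral_code")  # hypothetical form fields

    def __init__(self, **values):
        for f in self.FIELDS:
            setattr(self, f, values.get(f, MISSING))

    def is_complete(self):
        return all(getattr(self, f) is not MISSING for f in self.FIELDS)

    def submit(self):
        if not self.is_complete():
            missing = [f for f in self.FIELDS if getattr(self, f) is MISSING]
            raise ValueError(f"still missing: {missing}")
        # only here does the data go to a schema with full declarative
        # constraints; drafts never touch those tables
```

The draft lives outside the strictly-constrained schema, so the SAVE button never trips an integrity violation, and the real tables keep their declarative constraints intact.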

Good design is about tradeoffs. I'm not promoting 'Rail'ed webbers who think ActiveRecord is a Silver Bullet, but I'm also not saying that the ActiveRecord Pattern doesn't have any merit. IMHO, it can be implemented as well in PL/SQL as it can be in Ruby/Java/C#. The point is- where does it make the most sense given the constraints (environment) you are working with?

The notion that data is the MOST important thing in an application leads to the development of webapps that babysit single dimensional databases.

That's just saaad! Where is the CREATIVITY?

Update (04-Oct-2007): This just hit me: given that replacing an application (medium-to-large-scale purpose-built, not some COTS product) almost always involves a data migration effort, the representation of data (the database) is as transient as the process (the application)!

Monday, September 03, 2007

Fragments of a wanderer

It's been too long since my last post. In my meanderings through the lonely forest, I stumbled upon a familiar voice, words from an old friend.

My mind wanders past the signs in my subconscious–

The world is an alien place; ideas I don't need, much less want, are forced onto me; my dreams come not to me anymore; tranquil sleep is but a distant memory.

Every time I think about that, I wonder if this is the world we leave for the generations to come; will this be our legacy? Have we failed our unborn children, just as we accuse those before us?

Or... could I just be looking at it all wrong?

I think it's going to be a while before I can write straight again.