Wednesday, March 26, 2008

The next step in Operating Systems

Last week's column resulted in some interesting feedback, thank you! One reader pointed me to a 16-year-old demo of NeXTSTEP Release 3 in which you can clearly see that the graphical user interfaces and easy-to-use applications in today's operating systems like OS X, Ubuntu, Windows Vista and BeOS are nearly the same as they were 16 years ago in NeXTSTEP. So what can you learn from this? That the operating system itself doesn't matter. Because if it did, we would have seen some more advancement in the past 16 years.

What does matter, both in the enterprise environment as well as to consumers, are applications. To business people this is a no-brainer. So why is nearly everybody in the IT industry talking about the operating system? And why do people in IT departments get upset that Microsoft will stop selling Windows XP while they are not ready to upgrade to Vista? To be honest: do we care?

From a user perspective we shouldn't care. An operating system's primary function is to hide the details of the unintelligent hardware from the application, so that an application developer can focus on real functionality and not waste time and money getting the basics of the application working on the hardware. If you look at the average level of sophistication in enterprise applications in use today, it is no surprise that "the computer says no" from time to time.

Life would be simple if there was one universal operating system, free of charge. Quite a few people believe that Linux is that operating system and are putting a lot of effort into pushing in this direction. But such a universal operating system is nearly impossible to create. The huge variety of applications running on different hardware form factors demands such a wide range of options that an OS developer could only create such software with a myriad of configuration and tuning options. Hardly anything that follows the "keep it simple" principle, as you need the equivalent of a rocket scientist to get it working properly.

We users should push the industry in a different direction: what I call the "software appliance road". We are likely willing to pay for a software application, because an application can add value to what we are trying to achieve. The application comes packaged with an operating system that allows it to run on any hardware form factor the application was designed for.

How can we achieve this goal? I believe the answer lies in hardware virtualization. Virtualization will allow us to separate software from hardware. Once we keep pushing in that direction, I bet you will see that hypervisor software (which allows the execution of standardized virtual hardware) will be implemented in hardware. The device and hardware component builders of this world can continue building more efficient hardware (energy waste comes to mind) and application developers can pick the operating system they think is best suited for the application. This also means that not every application will be available to run on every piece of hardware. But once we all get into "appliance mode" we won't care.

A challenge? You bet it is. But money rules. Don't spend anything on "the latest and greatest" if you can't qualify and quantify tangible results and added value. At some point the industry will listen.

P.S. Yesterday on Dutch news there was an item on a poll sponsored by Logitech about the amount of technology purchased by consumers who don't know how to operate it. In 29% of cases only one member of the household is able to operate it. The amount "invested" is 8 billion euros (12.5 billion dollars) in the Netherlands, which has 7.2 million households. Food for thought.

© Peter Bodifée 2008. All rights reserved

Wednesday, March 19, 2008

Are you on the right road (map)?

Robert X. Cringely once wrote in InfoWorld:

"If the automobile had followed the same development cycle as the computer, a Rolls-Royce would today cost $100, get a million miles per gallon, and explode once a year, killing everyone inside."

Robert brings up a point worth thinking about when you create and implement solutions using IT. I am not sure what his thoughts were when he wrote this, but I see a parallel with people jumping on the latest and greatest computer technology because it has the status of a Rolls-Royce, procurement costs are peanuts and the claim on resources is also not worth talking about. But there is also a "subtle" statement that this "latest and greatest" doesn't give us the promised or implied benefits.

Lately I have more questions than answers about whether all these technological "advancements" in IT are still "pushing" us in the right direction. Don't get me wrong; having a strong engineering background I do appreciate technology in general, in particular when mankind benefits from it.
But making things faster and bigger may not be a benefit.

With the current state of the art and common practices in Information Technology I start to have my doubts. Therefore I would like to appeal to all digital architects (enterprise, integration, software, solution, infrastructure) AND users (both organizations as well as individuals) to challenge your thinking with the question: Are we on the right road?

The following experience could illustrate the off-track route.

You stand in front of a counter talking to a representative who stares at his/her computer:
Rep: "How can help you?"
You state your problem or question.
Rep: "What is your name?"
You give your name (or any other relevant information).
Rep: "You are not in the computer (or system)."
You think: "No, I am standing in front of you."

In the above example: how come you as a person are perceived to be in a computer? What went wrong? More seriously: when information is not in sync with reality, why do users tend to take the information for the truth? Because a computer doesn't make mistakes? Did we lose our ability to observe and use our brains?

A more current example is the phenomenon of "social networks" on the internet. While being a fan of networking - using IT and digital communication can make it easy to do so - I think that human individuals should be able to stay in control of their personal information. It turns out that some makers of social networking software don't want users to permanently delete data they have reconsidered and no longer want to be public. And I haven't even started to talk about what governments and commercial organizations do with data about you.

My feeling tells me that it is time to rethink where we are heading in the information age. Just plainly buying the latest and greatest hardware and software offered to us by the developer is certainly not the right direction. Why are so many people and organizations going down that road? Blissful ignorance?

More people and organizations are starting to become aware of the security-related threats as well. And this isn't unrelated to the above-mentioned "ownership of information". Doc Searls, Senior Editor of Linux Journal, recently wrote about "Who is in charge of security?". Related to this are the efforts on VRM, which stands for Vendor Relationship Management, the opposite of CRM. Recommended reading for inspiration!

I'd like to hear/read your comments!

© Peter Bodifée 2008. All rights reserved

Wednesday, March 12, 2008

Tips for the architecture documentation process

In IT architecture, documentation serves several purposes. While the younger generations seem to ignore the written word and drawings, it remains an important tool for communication in many organizations. Not only between the architects and the developers, but also between the sponsors and the architect or architecture team. And let's not forget the remaining stakeholders.

The challenge here is to create a controllable process without putting too many limitations on the creativity needed during the creation of the architecture. So first of all you shouldn't try to implement a process. Confused? Read on.

What really matters is the documentation trail and approvals. Haven't you seen those documents labeled with statuses like: draft, concept, preliminary, to be reviewed, final, or any arbitrary combination of those labels? Was it immediately clear AND certain to you where you stood with the documentation? My experience so far is that nobody takes notice of the status, as there are no universally accepted values for these labels. In other words: useless. Oh, by the way, quite a few implemented document management systems are dysfunctional within the IT organization.

You don't need to look far for a solution. You just take version management and apply it to documents as software engineers do with code. But don't go down the road of releases with major and minor versions plus revisions. That complexity is not needed here. A simple release and version number in the format release.version is sufficient (e.g. 0.3, 1.1).

And now the "trick" to get to a documentation trail and approval without really trying to implement a process.

Give version numbers a meaning (or status if you will). Be rigid with the definition, don't tolerate deviations. It is amazing how quickly word of mouth will spread; the acceptance of the "process" is almost viral. We started this at the corporate level in a large organization with several autonomous divisions, and within a few months all people involved with architecture documentation were aware and acted accordingly. And this was achieved without any systems or software for document management! Just "free flowing" documentation, usually shared by e-mail. Oh, it helps to ignore the whiners.

Based on our experience, give the following values (read: meaning or status) to version numbers.

x.1 - x.4: Internal versions for the team who creates the documentation
x.5: First version to be shown to the lead architect in the applicable domain
x.6 - x.7: Additional internal versions
x.8: Version to be formally reviewed by the architecture board (or whatever group is responsible for signing off on the content)
x.9: Version to be approved by the sponsor (e.g. CIO, business management)
x+1.0: Final (approved!) version

x is the release number and should start with 0.

Those who observe closely will notice that the version number also represents the approximate percentage of completeness: 0.5 is 50% ready, 0.9 is 90% ready, etc.
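
Purely as an illustration, the convention can even be written down in a few lines of code. The sketch below is in Python and entirely my own; the function name and messages are made up, and the convention itself of course needs no tooling at all.

```python
# Hypothetical sketch of the versioning convention described above.
# Nothing here is required: the scheme works fine on paper, this just spells it out.

STATUS_BY_MINOR = {
    1: "internal version", 2: "internal version", 3: "internal version", 4: "internal version",
    5: "first version shown to the lead architect",
    6: "internal version", 7: "internal version",
    8: "formal review by the architecture board",
    9: "approval by the sponsor",
    0: "final, approved version",
}

def document_status(version: str) -> str:
    """Translate a release.version string (e.g. '0.8') into its agreed meaning."""
    release_str, _, minor_str = version.partition(".")
    release, minor = int(release_str), int(minor_str)
    status = STATUS_BY_MINOR[minor]
    completeness = 100 if minor == 0 else minor * 10   # 0.5 ~ 50% ready, 0.9 ~ 90% ready
    return f"{version}: {status} (~{completeness}% complete)"

if __name__ == "__main__":
    for v in ("0.3", "0.5", "0.8", "0.9", "1.0"):
        print(document_status(v))
```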

That's all! Enjoy the results.

© Peter Bodifée 2008. All rights reserved

Wednesday, March 5, 2008

User Experience overlooked in IT architecture?

User Experience is often thought to be a topic for the User Interface designers. If you leave it up to them you may be heading for disaster, because chances are high the IT solution becomes a monster. Not that there aren't any good UI designers, even though there are many bad ones. Rather, it is because the User Experience is probably the "architectural glue" between the human and the machine.

While the relevance of the user experience seems so obvious, IT developers and IT operations staff are notorious when it comes to considering the needs of users. They often assume that their own perception of IT matters is the same as anybody else's. To illustrate this, one can easily fire up a lively conversation on how users are treated at help desks when they have IT issues getting their work done.



Can the developers be blamed? Not really, when you see the premises on which they do their work. There are often no decent guidelines for the user experience, as architects tend to think of such guidelines as something for the UI designers. In my view the ones to blame are the architects, and certainly not only the application architects. Any organization implementing IT systems involving humans should take a very serious look at the User Experience. And this starts with doing away with the notion of "the user". Because he or she really doesn't exist!

A good way of getting the discussions concerning user experience in the right direction is to create personas. In short, personas are fictitious characters created to represent the different user types within a targeted demographic that will or might use an IT solution. Personas are said to be cognitively compelling because they put a personal human face on otherwise abstract data about users. When you look around carefully it will not be very difficult to describe and name the relevant user types.
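
To make this a bit more tangible, here is a minimal sketch of what a persona could look like as a simple record. The language (Python), the class name and all the details are my own invention for illustration only; a persona on a slide or a poster works just as well.

```python
from dataclasses import dataclass, field

# Illustrative only: a persona is a short, structured description of a user type,
# given a memorable human face. Every detail below is made up.
@dataclass
class Persona:
    name: str                                          # memorable, human name
    role: str                                          # the user type this persona represents
    goals: list = field(default_factory=list)          # what this person tries to achieve
    frustrations: list = field(default_factory=list)   # what gets in their way today

sandra = Persona(
    name="Sandra",
    role="Front-desk representative",
    goals=["answer a customer's question during the conversation itself"],
    frustrations=["the customer 'is not in the system' even while standing right there"],
)

print(f"{sandra.name} ({sandra.role}) wants to: {', '.join(sandra.goals)}")
```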

But there is more to User Experience than just personas. Without elaborating here at length on how architects could deal with the User Experience, please take a look at Simon Guest's talk on this topic, titled "Putting the User back into Architecture". I am not an advocate of Microsoft, but I like the framework Simon created. You can find his recorded talk at the Canadian Strategy Architecture Forum 2007 here.

In the first 10 minutes he presents the framework architects at Microsoft use; after that he goes in depth explaining this framework, often in relation to Microsoft products. On a funny note: about 5 minutes into the talk he has the "mandatory crash" of their software, in this case PowerPoint. So much for User Experience ;-)

P.S. The example of a human "talking" to the IT systems using IM (Instant Messaging) is a very interesting concept. My bank currently allows its internet banking customers to use Microsoft's IM to obtain balance and transaction information. Any question that cannot be answered is offered to be passed on to the real people at the internet branch of the bank. While it is still very simple, it is definitely a very elegant experience to get this information without having to log in to the internet banking site.

© Peter Bodifée 2008. All rights reserved