Wednesday, September 17, 2008

Virtualization - Ready for take off?

In the past couple of years quite a few organizations have used virtualization techniques to gain significant benefits, including financial ones. But studies tell us that the majority of the IT market is still not virtualized. At this moment VMworld is taking place in Las Vegas and the blogosphere is full of buzz about the new, sometimes free, products and possibilities. It is no longer the exclusive domain of VMware. Microsoft, Citrix and Sun, together with their ecosystems, are now also playing their music very loud.

The big question most organizations probably face: is this the next fad of the IT industry and what's in it for me? Other questions might be: what has virtualization to do with cloud computing? And what the heck is cloud computing? And what is she talking about?



OK, back to the topic. As written before in this column, virtualization has many benefits. But the biggest benefit is that it could relieve us from the importance of the operating system, which now serves as the glue between the applications and the hardware needed to run them. This is important, as fewer dependencies mean more choice. Basically we don't want to have only one choice of operating system on a PC, server, router, storage controller etc. Because end users really don't care what the OS is, they only care about the applications.

Back in April I announced an experiment on end-user device virtualization. We ran the experiment using VMware Pocket ACE to find out what the user experience is. The client is an educational institute; the end users are students, teachers and staff. The result is that despite some minor glitches, the end user has no objections to running his applications in a virtual machine on his physical device (read: PC, laptop). Admittedly, the technology is still in its early stages, but the experiment proved the fundamental principle that the OS is not relevant to the end user. We ran several applications only available for XP or Linux on Vista and XP machines. That is, in a virtual machine with Windows XP or Linux as the guest operating system and the host machine running Vista or XP. Users saw no real issues. Just make sure the USB storage device is not too slow. The advantage of this set-up is that the additional hardware required to make this happen is minimal: you need a USB-based storage device to store the virtual machine(s) and potentially some more main memory in the PC or laptop. Typically costs that don't create a headache.
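If you want a rough idea up front whether a particular USB stick is fast enough to host a virtual machine, a quick sequential read/write test already tells you a lot. A minimal sketch in Python (the drive path and the 512 MB test size are assumptions for illustration, not part of the experiment described above):

    import os
    import time

    TEST_FILE = "E:/vm-speed-test.bin"   # hypothetical mount point of the USB stick
    SIZE_MB = 512                        # arbitrary test size; a VM image is usually larger
    CHUNK = 1024 * 1024                  # read/write in 1 MB chunks

    def write_test():
        data = os.urandom(CHUNK)
        start = time.time()
        with open(TEST_FILE, "wb") as f:
            for _ in range(SIZE_MB):
                f.write(data)
            f.flush()
            os.fsync(f.fileno())         # make sure the data really hits the stick
        return SIZE_MB / (time.time() - start)

    def read_test():
        # Note: the OS file cache can flatter this number right after writing.
        start = time.time()
        with open(TEST_FILE, "rb") as f:
            while f.read(CHUNK):
                pass
        return SIZE_MB / (time.time() - start)

    if __name__ == "__main__":
        print(f"write: {write_test():.1f} MB/s")
        print(f"read:  {read_test():.1f} MB/s")
        os.remove(TEST_FILE)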

This is definitely different from the view of the big enterprise PC vendors when asked about the future of the desktop and desktop virtualization. They immediately point out that VDI (Virtual Desktop Infrastructure) is the road ahead. What they are basically saying is that they want to sell you tons of servers and centralized storage to run the applications in virtual machines on SERVERS. Now this may be appropriate (and possible) in some cases, but are we not wasting our current investments in end-user devices? Some are quick to point out that you should use thin clients as they consume less energy. I just see more dollars/euros in their eyes. The energy companies are already laughing all the way to the bank. It is probably smarter to shut down end-user devices automagically when not in use. Verdiem comes to mind.

The whole idea of virtualization is that it doesn't matter where you run your application. Ideally the virtual machine specification is standardized (we are not there yet!), you can pick the OS that matches the application(s), and anyone who can provide you with a platform that can run this virtual machine is a candidate for execution. This is where the link with cloud computing can be made. In the "cloud" (which derives its name from the symbol used to obscure the real thing) there is technology (servers, storage, networks) to host your application and to let you access it. Joe Weinman wrote a nice article on "The 10 Laws of Cloudonomics" for further reading.

You just need to worry about where you store your virtual machine, which is basically a set of a few files. This is important, as current legislation will force you to be particular about it. Some countries won't allow you to store your company's financial data (which can be part of your application environment) in another country. Other countries provide you very little privacy protection and some don't even have laws against data theft. And that is a real concern. More about that in the future.

© Peter Bodifée 2008. All rights reserved

Wednesday, September 10, 2008

Data security - state of affairs

Regularly we read in the news about data security breaches. What is happening? Are the criminals getting better at it? Probably. But what about defense?

Sadly it looks like organizations have no compelling reason to protect the data. Bruce Schneier, chief security technology officer at BT Group, said while being interviewed by the Wall Street Journal: "For the most part a company doesn't lose its data, they lose your data" (source).

Which brings me to the point of how data from or about an individual is dealt with. And even about laws on data. A law like "those who hold any data on someone else can be held liable for theft of this data". And "stealing data is a criminal offense". This may not be the correct wording - bear with me - it is meant to provoke a thought about data ownership.

We live in the information age and data has value. Stealing anything of value is a criminal offense, right? So we made laws based on the idea "thou shalt not steal". But it still happens, as there are individuals who feel they will not get the punishment they deserve when stealing physical goods. We even went to the point of making laws against stealing ideas and inventions. But where are the laws against stealing data? Like Bruce Schneier I am in favor of laws that allow for real punishment in case of data theft.

A law like "a business has to protect the data of its customers" doesn't help enough, because there will always be loopholes. Or even worse: a law that requires businesses to disclose data security breaches. What is the use if the damage is already done to the victim? And if disclosure only costs the business money?

Why do I bring this up? Because I think it is time to rethink where we keep our person-related data, but also who exclusively holds the access control. Personally I don't have a problem with physically storing my data, or data about me, outside of my personal environment (so with a trusted party), but I would love to be in full control of who has access to it. Not only that, I would also need to be able to maintain this data, in order to keep it synchronized with reality.

To be honest, this "world upside down" idea is definitely challenging to implement given the current state of applied information technology. I don't have any solutions, not even a vague idea about guidelines for how to proceed to this new order. So I challenge everyone to think about this. Because it affects us all. How would you feel if something that belongs to you were stolen and you had no means to prevent it from happening?

Love to hear from you!

© Peter Bodifée 2008. All rights reserved

P.S. I was away from writing my weekly column for personal reasons. More news at eleven.

Wednesday, May 7, 2008

Control vs management dilemma

Most organizations struggle with how much management should be applied to IT, in particular when it comes to supporting end users in their daily needs. Back in the old days when all IT was central (mainframe or minicomputer) and users had only terminals, IT organizations were in full control and needed little management attention. Shops that still have mainframe(s) are still operating this way. But when control is relinquished to end users, more management attention is needed. In those cases the annual TCO per end-user device easily runs into multiple thousands of dollars/euros.

The dilemma is that you seem to have to choose between users having control, and thus high costs for system management, or unhappy users because they have no control. Is there a best of both worlds? Users still in control as much as desired, without the added management costs? I believe that striving for solutions which support this is the way to go. But why is this so hard?

We probably should redefine the way we all deal with IT components. Take for example the software stack that is ultimately presented to the user as the desktop. It includes the operating system, the GUI and some tools to configure it. Why should IT have control over this? No one in their right mind thinks of telling employees to put their phone in the upper right corner of their desk or to dictate the order in which the pens should be arranged. Maybe we should stop doing this when it comes to the virtual desk on the user's PC (keep in mind what the P stands for!).

Now we all know about the implications of leaving it up to the end user. Security risks and undesired help desk calls, to name a few. How about educating the user on how to maintain his own PC? We are able to do that when it comes to houses, cars, etc., so why not personal computers? The only thing organizations have to worry about is how to provision safe access to the organization's applications and data.

Isolation is the key word. This always comes up when people talk about application virtualization. Strictly speaking applications are not virtualized, as they can't appear to be present while not actually being there. But they can be isolated in a restricted operating environment. The big question out there still is: what is the isolated operating environment and how can it still interact with others? Because many applications on end-user devices interact with each other in what I believe are undesirable ways: by directly using each other's APIs. When applications are isolated, those APIs won't be usable. For example, corporate bookkeeping software won't be able to export directly into Excel if the bookkeeping software is isolated from Excel - which, by the way, is something you don't want to do anyway if you care about the (referential) integrity of your data.

Data exchange between applications should ideally be done through shared databases or through messaging interfaces (a.k.a. middleware). Those are guaranteed to work when applications are isolated in a restricted operating environment. When the application infrastructure is set up like that, we can then easily classify all applications in terms of the level of control for the user and the required management. And thus create a "best of both worlds".
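To make the idea a bit more concrete, here is a minimal sketch - the application names and the in-process queue are purely illustrative stand-ins, not any particular middleware product - of two isolated applications exchanging data through a message instead of calling each other's APIs directly:

    import json
    import queue

    # Stand-in for a message broker; in a real set-up this would be middleware
    # that both isolated operating environments can reach over the network.
    message_bus = queue.Queue()

    def bookkeeping_export(invoices):
        """The bookkeeping side publishes a message instead of driving
        the spreadsheet's API directly."""
        message = {"type": "invoice_batch", "payload": invoices}
        message_bus.put(json.dumps(message))

    def reporting_import():
        """The reporting side consumes the message whenever it runs;
        neither application needs to see the other's internals."""
        message = json.loads(message_bus.get())
        return message["payload"] if message["type"] == "invoice_batch" else []

    if __name__ == "__main__":
        bookkeeping_export([{"invoice": "2008-001", "amount": 1250.00}])
        print(reporting_import())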

© Peter Bodifée 2008. All rights reserved

Wednesday, April 30, 2008

ESX getting competition from Hyper-V?

Recently I attended an information-sharing afternoon organized by two of the big names in the IT industry who like to maintain excellent relationships with Microsoft. One is a computer manufacturer and the second makes chips. Guess who. But that is not the reason for this week's column.

The speakers were all technical: consultants, a CTO. The marketing folks stayed home and the sales people didn't show up, as this was about a beta product which cannot be sold yet. The expected shipping date is apparently August 2008.

So what did we learn? That Microsoft is finally realizing that its previous strategy on virtualization is in the trashcan and that its previous products with virtual in their name didn't cut it. Going the hypervisor route has finally arrived in Redmond. I think a bit too late. One of the speakers (an independent consultant who used to work for Microsoft and is an MVP) believes that with the Microsoft name (and therefore re$ources) behind it, it will get a big push and catch on.

Bob Muglia (SVP @ MSFT) stated in a company announcement that the adoption rate of virtualization so far is limited because it is "simply too cost-prohibitive and complex".

Well, I partially disagree. Yes, today nothing in IT infrastructures is as easy as a walk in the park. Just as you shouldn't ask a cabin attendant to take over the pilot's job, you shouldn't expect the "setup.exe - next - next - finish" generation to create "Dynamic IT provisioning Datacenters" either. But virtualizing a server park using VMware ESX isn't cost-prohibitive. Those who think it is should redo their math. The main reason why many IT organizations haven't virtualized their production servers is that Microsoft support is a pain when it comes to those environments. And when something goes bad, the IT department doesn't want to be reminded they were running an unsupported setup.

When you try to follow the discussions on the internet forums (e.g. Ars Technica) you will quickly find yourself lost if you don't know both products very well. And to me they seem more like flame wars.

Back to the subject. Yes, VMware will feel some heat from Microsoft in the near future. But the first release of Hyper-V is running years behind the capabilities of ESX. And let's not forget that the extensive testing of all the hardware still has a long way to go. So if you need to standardize on virtualization technology (which I think you should), the safe bet will still be ESX for a while. I was disappointed with what I saw and heard from the experts. An "all Microsoft" shop will have to wait longer before it can start virtualizing, because the first release of Hyper-V will not be compelling enough to make the switch to virtualization. And let there be no mistake about it: virtualization has so many benefits that they will dwarf most costs to get there.

© Peter Bodifée 2008. All rights reserved

Wednesday, April 23, 2008

Realistic user requirements?

Users always expect information applications and their underlying infrastructure to perform without a glitch and produce the expected results immediately. You sometimes wonder why you have to answer requests with "the impossible will be done immediately, miracles take a little longer".

First, there is nothing wrong with users setting the bar high in order for IT to provide meaningful stuff. But IT people are like the users of the systems they provide: lazy. So getting into action to solve a requirement involves work. You get the picture. If a machine can do the work, why should a human? As long as it is mechanical work, I don't mind. Although using physical work to exercise and thereby stay fit and healthy isn't bad either.

Why do IT people find it difficult to solve user requirements, besides the fact that it involves work? I believe a lot of users don't fully understand the complexity of designing, developing and delivering IT solutions. And because the movies have spoiled Joe Average's brain with ideas about the capabilities of computers:

1. Word processors never display a cursor.
2. You never have to use the space-bar when typing long sentences.
3. Movie characters never make typing mistakes.
4. All monitors display inch-high letters.
5. High-tech computers, such as those used by NASA, the CIA or some such governmental institution, will have easy to understand graphical interfaces.
6. Those that don’t have graphical interfaces will have incredibly powerful text-based command shells that can correctly understand and execute commands typed in plain English.
7. Note: Command line interfaces will give you access to any information you want by simply typing, “ACCESS THE SECRET FILES” on any near-by keyboard.
8. You can also infect a computer with a destructive virus by simply typing “UPLOAD VIRUS”. (See “Fortress”.)
9. All computers are connected. You can access the information on the villain’s desktop computer even if it’s turned off.
10. Powerful computers beep whenever you press a key or the screen changes. Some computers also slow down the output on the screen so that it doesn’t go faster than you can read. (Really advanced computers will also emulate the sound of a dot-matrix printer.)
11. All computer panels operate on thousands of volts and have explosive devices underneath their surface. Malfunctions are indicated by a bright flash of light, a puff of smoke, a shower of sparks and an explosion that causes you to jump backwards.
12. People typing on a computer can safely turn it off without saving the data.
13. A hacker is always able to break into the most sensitive computer in the world by guessing the secret password in two tries.
14. You may bypass “PERMISSION DENIED” message by using the “OVERRIDE” function. (See “Demolition Man”.)
15. Computers only take 2 seconds to boot up instead of the average minutes for desktop PCs and 30 minutes or more for larger systems that can run 24 hours, 365 days a year without a reset.
16. Complex calculations and loading of huge amounts of data will be accomplished in under three seconds. Movie modems usually appear to transmit data at the speed of two gigabytes per second.
17. When the power plant/missile site/main computer overheats, all control panels will explode shortly before the entire building will.
18. If you display a file on the screen and someone deletes the file, it also disappears from the screen (See “Clear and Present Danger”).
19. If a disk contains encrypted files, you are automatically asked for a password when you insert it.
20. Computers can interface with any other computer regardless of the manufacturer or galaxy where it originated. (See “Independence Day”.)
21. Computer disks will work on any computer that has a floppy drive and all software is usable on any platform.
22. The more high-tech the equipment, the more buttons it will have (See “Aliens”.)
23. Note: You must be highly trained to operate high-tech computers because the buttons have no labels except for the “SELF-DESTRUCT” button.
24. Most computers, no matter how small, have reality-defying three-dimensional active animation, photo-realistic graphics capabilities.
25. Laptops always have amazing real-time video phone capabilities and performance similar to a CRAY Supercomputer.
26. Whenever a character looks at a monitor, the image is so bright that it projects itself onto their face. (See "Alien" or "2001")
27. Searches on the internet will always return what you are looking for no matter how vague your keywords are. (See “Mission Impossible”, Tom Cruise searches with keywords like “file” and “computer” and 3 results are returned.)

Source: found somewhere on the internet, assumed to be public domain
Advice: whenever a user expresses a request or requirement, keep on asking until you find and can agree on the real needs of the user. The "fantastic demand" is just a way of starting a conversation. Take it with humor and we'll all have a good time!

© Peter Bodifée 2008. All rights reserved

Wednesday, April 16, 2008

Benefits of end-user device virtualization

In many organizations managing the desktop and laptop PCs seems like a bottomless pit. It doesn't matter how many resources you throw into it, the situation barely improves. This makes you think that you have reached a point where no further improvement is possible. If end users thought this was acceptable, we would all be happy. But the truth is very far from that.

The fundamental problem is that full standardization of the software stack on the end-user device has become nearly impossible. While some organizations go to great lengths to reduce the number of applications in order to get support costs down to acceptable levels, the hardware manufacturers make it difficult to maintain a stable operating system software stack (more commonly described as an "image" by system managers). IT support organizations face difficult challenges, which are usually barely understood by end users. As a result they often have to sell "no" to end users who require more flexibility to adapt to changing business needs.

The solution is to marry the operating system stack to the underlying hardware, like Apple does with Mac OS X on Apple hardware. Apple fans will tell you at great length that after they switched to this platform, the horror and nightmare of keeping the chosen operating system stack running on a variety of hardware was over. But not every organization is willing to pay the premium that Apple charges. And serious alternative vendors are not available. Unless you take a different perspective.

It would be easy to marry the operating system stack to the hardware if the underlying hardware interface weren't so volatile. This can be achieved by using a virtual machine. Using a hypervisor, which decouples the actual hardware interface from the virtual hardware interface, relieves some of the burden of maintaining the operating system.

Most end-user device virtualization environments are set up for two major reasons as far as I can observe:

1) provide multiple machines on one physical machine
2) provide a secure virtual machine on an insecure physical machine or the other way around.

The first reason is very popular with software developers and testers. It is actually so popular that many people have the perception that this is the only useful application of client-side virtualization. But the second reason is equally powerful. So why is this not ubiquitous?

An explanation could be that this set-up adds another layer to the stack, which is perceived by both end users and support staff as added complexity instead of simplification. Well, I don't know for sure, because I am neither an average end user nor an IT support person. In the coming months I plan to run an experiment with one of my clients to see whether this perception is true. In this particular environment the organization faces the challenge of allowing more unmanaged (and therefore insecure) physical devices on its network while guaranteeing secure access from certain corporate client applications to centralized corporate data. More news in the future.

© Peter Bodifée 2008. All rights reserved

Wednesday, March 26, 2008

The next step in Operating Systems

Last week's column resulted in some interesting feedback, thank you! One reader pointed me to a 16-year-old demo of NeXTSTEP Release 3 in which you can clearly see that the graphical user interfaces and easy-to-use applications in today's operating systems like OS X, Ubuntu, Windows Vista and BeOS are nearly the same as they were 16 years ago in NeXTSTEP. So what can you learn from this? That the operating system itself doesn't matter. Because if it did, we would have seen some more advancement in the past 16 years.

What does matter, both in the enterprise environment and to consumers, is applications. To business people this is a no-brainer. So why is nearly everybody in the IT industry talking about the operating system? And why do people in IT departments get upset that Microsoft will stop selling Windows XP while they are not ready to upgrade to Vista? To be honest: do we care?

From a user perspective we shouldn't care. Because an operating system's primary function is to hide the details of the unintelligent hardware from the application. So that an application developer can focus on real functionality and not waste time and money getting the basics of his application working on the hardware. If you look at the average level of sophistication in the enterprise applications in use today, it is no surprise that "the computer says no" from time to time.

Life would be simple if there were one universal operating system, free of charge. Quite a few people believe that Linux is that operating system and are putting a lot of effort into pushing in this direction. But such a universal operating system is nearly impossible to create. The huge variety of applications on different hardware form factors imposes such a wide range of requirements that an OS developer could perhaps only cover them with a myriad of configuration and tuning options. Hardly anything that follows the "keep it simple" principle, as you need the equivalent of a rocket scientist to get it working properly.

We users should push the industry in a different direction. What I call the "software appliance road". It is likely that we are willing to pay for a software application, because an application can add value to what we are trying to achieve. The application comes packaged with an operating system which allows it to run on any hardware form factor the application was designed for.

How can we achieve this goal? I believe the answer lies in hardware virtualization. Virtualization will allow us to separate software from hardware. Once we keep pushing in that direction, I bet you will see hypervisor software (which allows the execution of standardized virtual hardware) being implemented in hardware. The device and hardware component builders of this world can continue building more efficient hardware (energy waste comes to mind) and application developers can pick the operating system they think is best suited to the application. This also means that not every application will be available to run on every piece of hardware. But once we all get into "appliance mode" we won't care.

A challenge? You bet it is. But money rules. Don't spend anything on "the latest and greatest" if you can't qualify and quantify tangible results and added value. At some point the industry will listen.

P.S. Yesterday on the Dutch news there was an item on a poll sponsored by Logitech about the amount of technology purchased by consumers who don't know how to operate it. In 29% of cases only one member of the household is able to operate it. The amount "invested" is 8 billion euros (12.5 billion dollars) in the Netherlands, which has 7.2 million households. Food for thought.

© Peter Bodifée 2008. All rights reserved

Wednesday, March 19, 2008

Are you on the right road (map)?

Robert X. Cringely once wrote in InfoWorld:

"If the automobile had followed the same development cycle as the computer, a Rolls-Royce would today cost $100, get a million miles per gallon, and explode once a year, killing everyone inside."

Robert brings up a point worth thinking about when you create and implement solutions using IT technology. I am not sure what his thoughts were when he wrote this, but I see a parallel with people jumping on the latest and greatest computer technology because it has the status of a Rolls-Royce, procurement costs are peanuts and the claim on resources is also not worth talking about. But there is also a "subtle" statement that this "latest and greatest" doesn't give us the promised or implied benefits.

Lately I have more questions than answers about whether all these technological "advancements" in IT are still "pushing" us in the right direction. Don't get me wrong; having a strong engineering background I do appreciate technology in general, in particular when mankind benefits from it.
But making things faster and bigger may not be a benefit.

With the current state of the art and common practices in Information Technology I am starting to have my doubts. Therefore I would like to appeal to all digital architects (enterprise, integration, software, solution, infrastructure) AND users (both organizations and individuals) to challenge your thinking with the question: are we on the right road?

The following experience could illustrate the off-track route.

You stand in front of a counter talking to a representative who stares at his/her computer:
Rep: "How can I help you?"
You state your problem or question.
Rep: "What is your name?"
You give your name (or any other relevant information).
Rep: "You are not in the computer (or system)."
You think: "No, I am standing in front of you."

In the above example: how come you as a person are perceived to be in a computer? What went wrong? More seriously: when information is not in sync with reality, why do users tend to take the information to be the truth? Because a computer doesn't make mistakes? Did we lose our ability to observe and use our brains?

A more current example is the phenomenon of "social networks" on the internet. While I am a fan of networking - using IT and digital communication can make it easy to do so - I think that human individuals should be able to stay in control of their personal information. It turns out that some makers of social networking software don't want users to permanently delete data they have reconsidered and no longer want to be public. And I haven't even started to talk about what governments and commercial organizations do with data about you.

My feeling tells me that it is time to rethink where we are heading in the information age. Just plainly buying the latest and greatest hardware and software offered to us by the developers is certainly not the right direction. Why are so many people and organizations going down that road? Blissful ignorance?

More people and organizations are starting to become aware of the security-related threats as well. And this isn't unrelated to the above-mentioned "ownership of information". Doc Searls, Senior Editor of Linux Journal, recently wrote about "Who is in charge of security?". Related to this are the efforts on VRM, which stands for Vendor Relationship Management, the opposite of CRM. Recommended reading for inspiration!

Like to hear/read your comments!

© Peter Bodifée 2008. All rights reserved

Wednesday, March 12, 2008

Tips for the architecture documentation process

In IT, architecture documentation serves several purposes. While the younger generations seem to ignore the written word and drawings, it remains an important tool for communication in many organizations. Not only between the architects and the developers, but also between the sponsors and the architect or architecture team. And let's not forget the remaining stakeholders.

The challenge here is to create a controllable process without putting too many limitations on the creativity needed during the creation of the architecture. So first of all you shouldn't try to implement a process. Confused? Read on.

What really matters is the documentation trail and the approvals. Haven't you seen those documents labeled with statuses like: draft, concept, preliminary, to be reviewed, final, or any arbitrary combination of those labels? Was it immediately clear AND certain to you where you stood with the documentation? My experience so far is that nobody takes notice of the status, as there are no universally accepted values attached to these labels. In other words: useless. Oh, by the way, quite a few implemented document management systems are dysfunctional within the IT organization.

You don't need to look far for a solution. Just take version management and apply it to documents as software engineers do with code. But don't go down the road of releases with major and minor versions plus revisions. That complexity is not needed here. A simple release and version number in the format release.version is sufficient (e.g. 0.3, 1.1).

And now the "trick" to get to a documentation trail and approval without really trying to implement a process.

Give version numbers a meaning (or status if you will). Be rigid with the definitions and don't tolerate deviations. It is amazing how quickly word of mouth spreads; the acceptance of the "process" is almost viral. We started this at the corporate level in a large organization with several autonomous divisions and within a few months all people involved with architecture documentation were aware of it and acted accordingly. And this was achieved without any systems or software for document management! Just "free flowing" documentation, usually shared by e-mail. Oh, it helps to ignore the whiners.

Based on our experience, give the following values (read: meaning or status) to version numbers.

  • x.1 - x.4: Internal versions for the team that creates the documentation
  • x.5: First version to be shown to the lead architect in the applicable domain
  • x.6 - x.7: Additional internal versions
  • x.8: Version to be formally reviewed by the architecture board (or whatever group is responsible for signing off on the content)
  • x.9: Version to be approved by the sponsor (e.g. CIO, business management)
  • x+1.0: Final (approved!) version



x is the release number and should start with 0.

Those who observe closely will notice that the version number also represents the approximate percentage of completeness: 0.5 is 50% ready, 0.9 is 90% ready, etc.
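Purely to illustrate how mechanical this convention is (the labels are simply the ones from the list above), a few lines of Python can translate any version number into its agreed status and approximate completeness:

    def document_status(version: str) -> str:
        """Map a release.version number (e.g. '0.5', '1.8') to its agreed meaning."""
        release, minor = version.split(".")
        minor = int(minor)
        if minor == 0:
            return f"release {release}: final, approved version"
        if minor in (1, 2, 3, 4, 6, 7):
            status = "internal working version"
        elif minor == 5:
            status = "first version shown to the lead architect"
        elif minor == 8:
            status = "under formal review by the architecture board"
        elif minor == 9:
            status = "awaiting sponsor approval"
        else:
            raise ValueError(f"unexpected version number: {version}")
        return f"release {release}: {status} (~{minor * 10}% complete)"

    for v in ("0.3", "0.5", "0.8", "0.9", "1.0"):
        print(v, "->", document_status(v))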

That's all! Enjoy the results.

© Peter Bodifée 2008. All rights reserved

Wednesday, March 5, 2008

User Experience overlooked in IT architecture?

User Experience is often thought to be a topic for the User Interface designers. If you leave it up to them you may be heading for disaster, because chances are high the IT solution becomes a monster. Not that there aren't any good UI designers, even though there are many bad ones. It is because the User Experience is probably the "architectural glue" between the human and the machine.

While the relevance of the user experience seems so obvious, IT developers and IT operations staff are notorious when it comes to considering the needs of users. They often assume that their own perception of IT matters is the same as everybody else's. To illustrate this, one can easily fire up a lively conversation about how users are treated at help desks when they have IT issues getting their work done.



Can the developers be blamed? Not really, when you see the premises on which they do their work. There are often no decent guidelines for the user experience, as architects often think of such guidelines as something for the UI designers. In my view the ones to blame are the architects, and certainly not only the application architects. Any organization implementing IT systems involving humans should take a very serious look at the User Experience. And this starts with doing away with the notion of "the user". Because he or she really doesn't exist!

A good way of getting the discussions concerning user experience going in the right direction is to create personas. In short, personas are fictitious characters created to represent the different user types within a targeted demographic that will or might use an IT solution. Personas are said to be cognitively compelling because they put a personal, human face on otherwise abstract data about users. When you look around carefully it will not be very difficult to describe and name the relevant user types.
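For illustration only - the names and attributes below are invented, not taken from any real project - a persona can be as simple as a small structured record that everyone in the discussion can point to:

    from dataclasses import dataclass, field

    @dataclass
    class Persona:
        """A fictitious but concrete user type that design discussions can refer to."""
        name: str
        role: str
        goals: list = field(default_factory=list)
        frustrations: list = field(default_factory=list)
        it_skill: str = "average"

    # Invented examples for an educational institute; replace with your own observations.
    personas = [
        Persona(name="Sanne", role="first-year student", it_skill="basic",
                goals=["hand in assignments from any PC", "reach course material at home"],
                frustrations=["a different login for every system"]),
        Persona(name="Erik", role="teacher",
                goals=["publish grades quickly", "reuse last year's course set-up"],
                frustrations=["help desk tickets that take days"]),
    ]

    for p in personas:
        print(f"{p.name} ({p.role}): goals={p.goals}")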

But there is more to User Experience than just personas. Without elaborating here at length about how architects could deal with the User Experience, please take a look at Simon Guest's talk on this topic titled "Putting the User back into Architecture". I am not an advocate of Microsoft, but I like the framework Simon created. You can find his recorded talk at the Canadian Strategy Architecture Forum 2007 here.

In the first 10 minutes he presents the framework architects at Microsoft use; after that he goes in depth explaining this framework, often in relation to Microsoft products. On a funny note: about 5 minutes into the talk he has the "mandatory crash" of their software, in this case PowerPoint. So much for User Experience ;-)

P.S. The example of a human "talking" to IT systems using IM (Instant Messaging) is a very interesting concept. My bank currently allows its internet banking customers to use Microsoft's IM to obtain balance and transaction information. Any question that cannot be answered is offered to be passed on to the real people at the internet arm of the bank. While it is still very simple, it is definitely a very elegant way to get this information without having to log in to the internet banking site.

© Peter Bodifée 2008. All rights reserved

Wednesday, February 27, 2008

IT Standardization is the Result of a Principle

When asked, smart people will tell you that technology independence is a very important principle for IT architecture. The most important reason is that technology changes over time. Mostly the changes are fundamentally minimal; paradigm-shifting changes occur less frequently. But even when the changes seem minimal, it remains necessary to define a standardization policy with its associated guidelines to create technology independence.

Please note: IT architecture is mainly about principles, policies and guidelines and less about designs and/or solutions.

The whole purpose of such a standardization policy is to define the right standards. Taking the wrong components as standards can actually cripple the IT organization and make changes very expensive, even cost-prohibitive. With all the devastating effects for the user community.

What are the right standards? Even though a lot of organizations pick strategic solutions and solution providers as their standards, this is not the right starting point. A good starting point is the interfaces. Your guideline for standardizing could be qualities like "open" (that is, not proprietary, not bound to royalties and/or licenses), well accepted, and so on. If you concentrate on those qualities you can be assured that many people were involved in making sure that the interface has the necessary attributes to make it useful.

Look for anything with "protocol" in its description (or title). For example it is perfectly OK to say that IP is your standard for networking. You may laugh - since this is a no-brainer today - but some 20 years ago people would look at you as if you were nuts, because in those days a lot of IT shops opted for IBM's SNA. SNA stands for Systems Network Architecture, but if you take a closer look it was a proprietary design of a set of solutions for networking systems together. Not even close to an architecture. Which brings me to the point that architecture is probably the most ill-used keyword in the IT world. But that is a subject for a different post.

If you focus on standardization of interfaces, you and the IT organization will be able to keep things under control while remaining flexible. The problem is that you won't find many people today who understand the purpose of a standardization effort. A lot of people, in particular in the user community, think that standardization is all about limiting choices to reduce costs. And yes, if you standardize on products, services and solutions you will have to fight battles. Battles, because the organization will feel that it is being limited. Isn't IT about supporting the mission of the organization?

Make it part of your plan to explain first what standardization is all about. After sometimes lengthy discussions within the IT organization, people turn to me and ask for my preference or opinion. I usually get only one question: is solution A better than solution B? That question is obviously the conclusion of the wrong discussion. My usual answer is "It doesn't matter what you choose, as long as you choose." Which leads to restarting the discussion, but now on a different route.
If there is no time to lose, I use one of my favorites: "Standardization is not about the color of the PC, the color of the cables, the operating system or the application software to be used for office automation. Standardization is needed for how applications and systems communicate with each other, so that the users can maximize the information derived from the data stored in these systems. When you take this into account, which solution would best fit according to you?"

© Peter Bodifée 2008. All rights reserved

Wednesday, February 20, 2008

Implementing IT Architecture is all about change

Implementing an IT Architecture means changing the way we look at IT systems, the way we deal with them, and so on. A change management approach appears to be the best method to drive the implementation. Now change management in IT is a subject for a book of its own, so for now I will only give some practical suggestions for IT architects on how to position themselves during the implementation phase. Because this is where 90% of the architect's work is done.

As many can attest, change also seems to cause resistance. Almost like Newton's third law, "actio et reactio". Why do people seem to resist change? Well, they don't; people change all day. Who sticks to his plan every day? Lack of insight into what is going to happen, or even worse lack of influence on what is going to happen, results in a lack of control, which creates the anxiety and stress that people hate so much.

There are typically many roads that lead to Rome. The choices often make implementers nervous about whether their choice was right. The desire of people to control things in life, and the fact that there are many ways to reach a destination, is something you should take advantage of. Let the people involved decide for themselves how to get to Rome. Appeal to their own brain power. Don't worry about where they currently are and what they are doing. Waste of energy.

The lead IT architect should be very clear about what the destination (the reference architecture) is; this is not a subject for debate. Those in doubt about the validity of the destination should do their own searching, but not argue with the IT architect. But how one gets to the destination can be very different for various stakeholders. And when they can create their own marching orders, they feel much more confident pursuing the change instead of resisting it. The lead IT architect should just join the program leader in his helicopter and make himself available to anyone who is or seems to be lost.

Now I don't want to suggest that the destination is final. IT systems are very dynamic in nature and therefore an IT architecture is not as rigid as in the physical world of architecture. People may get confused when, while they are heading in a certain direction, the destination moves. Instead of allowing the concept of a "moving target", which creates a lot of uncertainty, I prefer to explain that it is a journey. We therefore have intermediate destinations. And let the organization decide for itself if and when it is worth continuing the journey.

© Peter Bodifée 2008. All rights reserved

Wednesday, February 13, 2008

To virtualize or not: that is the question



Virtualization is hot. Vendors are screaming that their product is "virtualized". Hype? Curse or blessing? Questions which will give you tons of answers depending on who you ask.

Let's first address what virtualization is. "Virtually", according to the dictionary, means "in essence or effect but not in fact" or "seems to be present". The best description I ever found on the internet of what virtualization is in an IT environment was written by Willem Joustra and can be found on his blog. He does away with application virtualization for good. That leaves us with virtual hardware in the form of virtual CPUs, disks, networks, etc. and virtual operating systems.

So am I now adding more words to the (mostly technical) discussion already taking place? No, I want to see if this notion of virtualization can ultimately solve all the challenges the IT organization faces that are directly or indirectly related to physical instances of hardware and operating systems. Challenges in maintaining them throughout their operational life. Because I envision that the flexibility made possible by virtualization will enable totally new concepts in IT, having a dramatic impact on how we view hardware and operating systems. In the end, the user's only concern is application functionality.

The effect of virtualization on the user is already visible in networking, data storage and to some extent in servers (mainly in the high-end and mid-range, less in the low-end). The only area which seems to be untouched is end-user devices. Yes, seems, because it does exist; only due to the huge number of end-user devices is it not visible on a large scale.

Virtualization on end-user devices (the most common devices today are still desktop PCs and laptops) gives you the possibility to take your created environment - your set of applications customized to your liking - from one device to another without major effort. You can actually keep the virtual hardware in your wallet! Just think of the freedom you now have. I am currently involved in creating a major change for schools in how to implement the IT-based learning environment. Using this virtualization concept makes it possible to innovate on when and where children use their own learning environment without having to supply everyone with a laptop, which would introduce a new series of problems instead of solving the existing ones. This is just one example.

Any discussion of the pros and cons of a particular virtualization technique in a product should be left to IT engineers to challenge their thinking, and to product marketing managers who have nothing better to do. It is not something a user community has to worry about. If you have no need to be on the (b)leading edge, just wait 6 months, observe, and you will be able to tell which product was hype and which ones stayed around. What the user community should worry about is how to break the conventional thinking on how the use of machines fits into the total IT Architecture. Because machines can easily be virtualized without losing functionality. That was discovered more than 35 years ago and still applies today.

© Peter Bodifée 2008. All rights reserved

Wednesday, February 6, 2008

The board's view on IT Architecture

The usefulness of having an IT Architecture (I prefer value over usefulness) for an organization is often subject to debate. To end the debate, it usually helps if the board (or executive team) fully supports the IT architecture, and not only the existence of IT architect(s) in the organization.

Some people claim that a typical board can't understand the IT architecture by itself (who can?) and is therefore not likely to support or enforce an IT Architecture. In this case it is very likely that the organization's IT architects are sitting at the top of their own Tower of Babel. Not necessarily a smart thing to do, as it will not lead to wide acceptance of IT Architecture, certainly not with business executives.

Some of you have had the privilege of meeting with board members in order to present a proposition on which they should or could take a decision. For those who have not: the following questions are in the heads of the board members:

  • What is in it for me?
  • Why should we (read our organization) do this?
  • How much will this contribute to the bottom line this year and the next years?

Yes, it is that simple. They are held accountable and need to justify their decisions to the shareholders. Even if the board is one person and the only shareholder!
No, "Partnering within our ecosystem will greatly benefit the IT mission to simplify the use of the Enterprise Messaging Bus using broadcast protocols on the secured transparent network" or similar wording will not make their day.

In quite a few organizations the responsibility for IT matters within the board is taken by the CFO. Unfortunately many CIOs - despite their title - are not members of the executive team and report to the CFO. If your CFO is the person to give the final nod, make sure that (s)he gets the view on IT architecture in a language (or numbers, to be specific) that is clearly understood.

Financial people love ROI: Return On Investment. With the formula for arithmetic ROI it is simple to see whether an investment is profitable or not. But calculating the investment value of a complex IT infrastructure or an elaborate software platform is not a piece of cake for everybody. So IT has to satisfy the board with something else.

We are talking about the cost necessary to implement the change towards the IT solutions according to the IT architecture, the current cost of the IT landscape (usually already known) and the anticipated cost of the future IT landscape. And let's not forget the cost of creating the IT architecture and maintaining it. Good architects charge a fee, not an hourly rate.

Keep in mind that I am talking about all costs*: both capital and operational expenditures (CAPEX and OPEX). I always suggest using only the hard facts: the future combined CAPEX and OPEX should be lower than the current one. Divide the cost of change by the annual savings achieved and you have the time it takes before the savings become effective. If this period is longer than the life span of the solutions, then you should think twice before going ahead. It certainly will not impress the CFO. A tip if your initial numbers don't look good enough: check whether you included the cost of all relevant IT staff.
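A back-of-the-envelope sketch of that calculation, with all numbers made up purely for illustration:

    def payback_period_years(cost_of_change, current_annual_cost, future_annual_cost):
        """Years before the combined CAPEX/OPEX savings have paid for the change."""
        annual_savings = current_annual_cost - future_annual_cost
        if annual_savings <= 0:
            raise ValueError("no savings: the future landscape is not cheaper")
        return cost_of_change / annual_savings

    # Made-up example: 1.2M to implement the change, while the annual cost of the
    # landscape drops from 3.0M to 2.5M.
    years = payback_period_years(1_200_000, 3_000_000, 2_500_000)
    print(f"Payback period: {years:.1f} years")  # 2.4 years; compare with the solution's life span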

So how much detail should the IT architect bring to the picture? Not too much, because it will sidetrack the audience and get IT into slippery territory. If the IT architect is trusted and uses credible sources, the consolidated numbers should give the board enough insight. And as for the cost of change: be bold when quoting numbers. Better too high than having to say "oops" at a later stage.

As you can see, IT Architecture for board members is all about a financial view of the IT landscape. IT architects should demonstrate the financial viability of creating, implementing and maintaining an IT Architecture.

*) Doing a TCO analysis - Total Cost of Ownership - is worth the effort if IT also wants a good financial understanding of the stuff it implements and maintains.

© Peter Bodifée 2008. All rights reserved

Wednesday, January 30, 2008

IT solutions being monsters?

From time to time we run into IT solutions that look like monsters. And after the irritation is over (if that ever happens), you start to think about what caused the creation of such a beast. I have found that a lot can go wrong on the path from defining the problem to setting the requirements for the engineer(s) who create the solution. And according to Murphy: anything that can go wrong... will go wrong.

So preventing monsters means making sure that nothing goes wrong in this phase. A lot of people argue that this is impossible. Without claiming that perfect execution can be guaranteed, positive results can be achieved by using a process leading up to the phase where the engineers start doing their thing. A process that is actually understood by everyone involved.

Looking at how the physical world is created, you often see that this part of the process is driven by someone who calls him/herself an "architect". Now why should this be any different in the IT world?

So what do architects do? Let's start with what they don't do:


  • Create a detailed list of observations of the current situation.
    Why would you be interested in looking at something you don't want to see anyway?

  • State what the technical problems in the current situation are and how they should be addressed.
    A detailed analysis in order to address the issues is work for those who develop the solution: the engineer.

  • Produce complicated drawings which only highly experienced engineers can understand.
    No matter how much you trust the architect, you still want to get a feeling what you will end up with.

Architects create views of the solution in order to communicate with the people who want the solution in the first place, to see whether the solution meets their expectations. In the process of creation the architect actively listens to a wide range of users. And when the engineers start doing their part, the architect is available to them if something is not completely clear. I mean, architects are human too.

An architect in the IT world isn't any different. Looking at what people with "architect" in their title say and do will sometimes give you reason to believe that in the digital world things are different. The way I see it, the architect function is exactly the same.

Now "views" created by architects come in all sorts of shapes and sizes. In future posts I will address in more detail which views are helpful and which are a waste of time and money. What is important is that these views (which are often referred to as "the architecture") have a function and are not a goal in themselves: they allow for communication between the users and the engineers. So that in the end you won't have a monster!



© Peter Bodifée 2008. All rights reserved

Wednesday, January 23, 2008

All right, let's get started!

It was already a while ago that I first thought about sharing my experiences with the world using the latest communication stuff, blogging being one of them. So what am I up to? Well, not something that can be written in one article. I have this vision that creating IT-based solutions should not be such a pain as it has been up to now. With pain I mean that the solution is not ready on time, costs way more than originally anticipated and still doesn't do what it is supposed to do. I figure that you know enough examples in your own environment.

So how can you and I contribute? By giving insight into what really matters when people work together to solve a challenge. And this insight is, in a few words: applying architecture to create a thing (called an artifact in architecture terms) that is useful to the humans who would like to use it.

I realize that this is still pretty high level and for many humans very obscure. This is my challenge: to share with the world, using all possible means of communication, that it is not as hard as it seems. Starting with this blog I will use common language understood by most humans, and "infotainment", to make it happen. So feel free to comment if something is not clear.

I think of this as a journey. While I have a picture in my head of how we will interact in the future so that we have a better life on earth, it is not easy for me to just write the story down. And certainly not in English, which is not my native language. So what am I going to do on this journey? Well, I want to coach anyone who wants to be on his or her own journey to reach the same destination. A destination that is not clear yet. Confused? I get it; it will not be a short trip.

Let me conclude this initial article with the wish that many people will work and experience this journey with me. While I realize that this all is still very vague, my intentions are very clear: be very practical about making progress and getting results. I don't do this just for my own pleasure or to boost my ego.

P.S. I also see this endeavour as a large-scale experiment to test my hypothesis that "every problem is in the end caused by a communication problem". This is something that motivates me and not something you have to agree with.

© Peter Bodifée 2008. All rights reserved