Wednesday, April 30, 2008

ESX getting competition from Hyper-V?

Recently I attended an information-sharing afternoon organized by two of the big names in the IT industry who like to maintain excellent relationships with Microsoft. One is a computer manufacturer and the other makes chips. Guess who. But that is not the reason for this week's column.

The speakers were all technical: consultants and a CTO. The marketing folks stayed home and the sales people didn't show up, as this was about a beta product which cannot be sold yet. The expected shipping date is apparently August 2008.

So what did we learn? That Microsoft is finally realizing that their previous strategy on virtualization is in the trashcan and that their previous products with virtual in their name didn't cut it. The hypervisor route has finally arrived in Redmond. A bit too late, I think. One of the speakers (an independent consultant who used to work for Microsoft and is an MVP) believes that with the Microsoft name (and therefore re$ources) it will get a big push and catch on.

Bob Muglia (SVP @ MSFT) stated in a company announcement that the adoption rate of virtualization so far is limited because it is "simply too cost-prohibitive and complex".

Well, I partially disagree. Yes, today nothing in IT infrastructures is as easy as a walk in the park. Just as you shouldn't ask a cabin attendant to take over the pilot's job, you shouldn't expect the "setup.exe - next - next - finish" generation to create "Dynamic IT provisioning Datacenters" either. Virtualizing a server park using VMware ESX isn't cost-prohibitive. Those who think it is should redo their math. The main reason why many IT organizations haven't virtualized their production servers is that Microsoft support is a pain when it comes to supporting those environments. And when something goes bad, the IT department doesn't want to be reminded they were running an unsupported setup.
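To put some numbers on the "redo their math" remark, here is a back-of-the-envelope sketch in Python. All figures are made up for illustration; substitute your own hardware, licensing and power numbers before drawing any conclusions.

# Back-of-the-envelope server consolidation math (illustrative figures only)
physical_servers       = 20      # boxes due for replacement
cost_per_server        = 5000    # price of a new physical server
power_cooling_per_year = 400     # per physical box per year

consolidation_ratio    = 10      # VMs per ESX host (conservative)
esx_host_cost          = 12000   # bigger box with more RAM
esx_license_per_host   = 6000    # hypervisor plus management licensing

hosts_needed = -(-physical_servers // consolidation_ratio)   # ceiling division -> 2

physical_refresh = physical_servers * (cost_per_server + 3 * power_cooling_per_year)
virtual_refresh  = hosts_needed * (esx_host_cost + esx_license_per_host
                                   + 3 * power_cooling_per_year)

print("3-year physical refresh:", physical_refresh)   # 124000
print("3-year virtualized     :", virtual_refresh)    # 38400

Even with these deliberately conservative, invented figures the hardware and power side of the equation shrinks dramatically; migration effort and training are of course not in this sketch.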

When you try to follow the discussions on internet forums (e.g. Ars Technica) you will quickly find yourself lost if you don't know both products very well. And to me they read more like flame wars.

Back to the subject. Yes, VMware will feel some heat from Microsoft in the near future. But the first release of Hyper-V is running years behind the capabilities of ESX. And let's not forget that the extensive testing of all the hardware still has a long way to go. So if you need to standardize on virtualization technology (which I think you should), the safe bet will still be ESX for a while. I was disappointed with what I saw and heard from the experts. An "all Microsoft" shop will have to wait longer before it can start virtualizing, because the first release of Hyper-V will not be compelling enough to make the switch to virtualization. And let there be no mistake about it: virtualization has so many benefits that they dwarf most of the costs to get there.

© Peter Bodifée 2008. All rights reserved

Wednesday, April 23, 2008

Realistic user requirements?

Users always expect information applications and their underlying infrastructure to perform without a glitch and produce the expected results immediately. You sometimes wonder why you have to answer requests with "the impossible will be done immediately, miracles take a little longer".

First, there is nothing wrong with users setting the bar high in order for IT to provide meaningful stuff. But IT people are like the users of the systems they provide: lazy. So getting into action to solve a requirement involves work. You get the picture. If a machine can do the work, why should a human work? As long as it is mechanical work, I don't mind. Although using physical work to exercise and thereby stay fit and healthy isn't bad either.

Why do IT people find it difficult to solve user requirements, besides the fact that it involves work? I believe a lot of users don't fully understand the complexity of designing, developing and delivering IT solutions. And the movies have spoiled Joe Average's brain with the capabilities of computers:

1. Word processors never display a cursor.
2. You never have to use the space-bar when typing long sentences.
3. Movie characters never make typing mistakes.
4. All monitors display inch-high letters.
5. High-tech computers, such as those used by NASA, the CIA or some such governmental institution, will have easy to understand graphical interfaces.
6. Those that don’t have graphical interfaces will have incredibly powerful text-based command shells that can correctly understand and execute commands typed in plain English.
7. Note: Command line interfaces will give you access to any information you want by simply typing, “ACCESS THE SECRET FILES” on any near-by keyboard.
8. You can also infect a computer with a destructive virus by simply typing “UPLOAD VIRUS”. (See “Fortress”.)
9. All computers are connected. You can access the information on the villain’s desktop computer even if it’s turned off.
10. Powerful computers beep whenever you press a key or the screen changes. Some computers also slow down the output on the screen so that it doesn’t go faster than you can read. (Really advanced computers will also emulate the sound of a dot-matrix printer.)
11. All computer panels operate on thousands of volts and have explosive devices underneath their surface. Malfunctions are indicated by a bright flash of light, a puff of smoke, a shower of sparks and an explosion that causes you to jump backwards.
12. People typing on a computer can safely turn it off without saving the data.
13. A hacker is always able to break into the most sensitive computer in the world by guessing the secret password in two tries.
14. You may bypass a “PERMISSION DENIED” message by using the “OVERRIDE” function. (See “Demolition Man”.)
15. Computers only take 2 seconds to boot up instead of the average minutes for desktop PCs and 30 minutes or more for larger systems that can run 24 hours, 365 days a year without a reset.
16. Complex calculations and loading of huge amounts of data will be accomplished in under three seconds. Movie modems usually appear to transmit data at the speed of two gigabytes per second.
17. When the power plant/missile site/main computer overheats, all control panels will explode shortly before the entire building will.
18. If you display a file on the screen and someone deletes the file, it also disappears from the screen (See “Clear and Present Danger”).
19. If a disk contains encrypted files, you are automatically asked for a password when you insert it.
20. Computers can interface with any other computer regardless of the manufacturer or galaxy where it originated. (See “Independence Day”.)
21. Computer disks will work on any computer that has a floppy drive, and all software is usable on any platform.
22. The more high-tech the equipment, the more buttons it will have (See “Aliens”.)
23. Note: You must be highly trained to operate high-tech computers because the buttons have no labels except for the “SELF-DESTRUCT” button.
24. Most computers, no matter how small, have reality-defying three-dimensional active animation, photo-realistic graphics capabilities.
25. Laptops always have amazing real-time video phone capabilities and performance similar to a CRAY Supercomputer.
26. Whenever a character looks at a monitor, the image is so bright that it projects itself onto their face. (See “Alien” or “2001”.)
27. Searches on the internet will always return what you are looking for no matter how vague your keywords are. (See “Mission Impossible”, Tom Cruise searches with keywords like “file” and “computer” and 3 results are returned.)

Source: found somewhere on the internet, assumed to be public domain
Advice: whenever a user expresses a request or requirement, keep asking until you find and can agree on the actual needs of the user. The "fantastic demand" is just a way of starting a conversation. Take it with humor and we'll all have a good time!

© Peter Bodifée 2008. All rights reserved

Wednesday, April 16, 2008

Benefits of end-user device virtualization

In many organizations, managing the desktop and laptop PCs seems like a bottomless pit. It doesn't matter how many resources you throw into it, the situation barely improves. This makes you think that you have reached a point where no further improvement is possible. If end-users thought this was acceptable, we would all be happy. But the truth is very far from that.

The fundamental problem is that full standardization of the software stack on the end-user device has become nearly impossible. While some organizations go to great lengths to reduce the number of applications in order to keep support costs at acceptable levels, the hardware manufacturers make it difficult to maintain a stable operating system software stack (more commonly described as an "image" by system managers). IT support organizations face difficult challenges, which are usually barely understood by end-users. As a result they often have to sell "no" to end-users who require more flexibility to adapt to changing business needs.

The solution is to marry the operating system stack to the underlying hardware, like Apple does with Mac OS X on Apple hardware. Apple fans will tell you at great length that after they switched to this platform, the nightmare of keeping the chosen operating system stack running on a variety of hardware was over. But not every organization is willing to pay the premium that Apple charges. And serious alternative vendors are not available. Unless you take a different perspective.

Marrying the operating system stack to the hardware would be easy if the underlying hardware interface weren't so volatile. This can be achieved by using a virtual machine: a hypervisor decouples the actual hardware interface from the virtual hardware interface, which takes much of the pain out of maintaining the operating system.
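To make the decoupling idea concrete, here is a toy Python sketch. It is purely illustrative: the device names and the structure are invented, and this is not how any real hypervisor is programmed.

# Toy model: physically different machines, one stable set of virtual devices
physical_fleet = {
    "laptop-vendor-A":  {"nic": "vendor A gigabit", "disk": "SATA controller X"},
    "desktop-vendor-B": {"nic": "vendor B gigabit", "disk": "SATA controller Y"},
}

# The virtual hardware every guest sees, regardless of the host it runs on
virtual_hardware = {"nic": "emulated standard NIC", "disk": "emulated SCSI controller"}

def guest_view(host_name):
    # The guest's drivers never touch the physical devices, only the virtual ones,
    # so a single operating system image runs unchanged across the whole fleet.
    return virtual_hardware

for host in physical_fleet:
    print(host, "->", guest_view(host))

The point of the sketch: the "image" only has to be validated against the emulated devices, not against every new laptop model the hardware vendors ship.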

Most end-user device virtualization environments are set up for two major reasons as far as I can observe:

1) provide multiple virtual machines on one physical machine
2) provide a secure virtual machine on an insecure physical machine or the other way around.

The first reason is very popular with software developers and testers. It is actually so popular that many people have the perception that this is the only useful application of client-side virtualization. But the second reason is equally powerful. So why is this not ubiquitous?

An explanation could be that this set-up adds another layer to the stack, which is perceived by both end-users and support staff as added complexity instead of simplification. Well, I don't know for sure, because I am neither an average end-user nor an IT support person. In the coming months I plan to run an experiment at one of my clients to see whether the perception is true or not. In this particular environment the organization faces the challenge of allowing more unmanaged (and therefore insecure) physical devices on their network while guaranteeing secure access from certain corporate client applications to centralized corporate data. More news in the future.

© Peter Bodifée 2008. All rights reserved