As many of you know, we've recently done a deal with Fujitsu to refresh our desktop estate. We're going to have the latest Office, the latest Windows, and all of it will be delivered via the network to clever thin client terminals or thin-spec notebooks. Considering we're presently struggling along with XP, this will be quite a leap.
I'd imagine we'll have this new environment - or an incremental iteration of it - for maybe a decade. Perhaps a bit more. It takes that long for a desktop operating environment to get so long in the tooth that, no matter what you do, you can't keep it going any longer. And we'll keep it that long because the costs of getting a working desktop in the first place are so absurdly high that doing anything else would be completely irresponsible.
Personally, I think it likely this is the last version of Windows anyone ever widely deploys, though.
The reason? I think there'll be fewer workloads that actually require a heavy desktop stack. Today, of course, we have all this legacy that's coupled to the desktop, but in a decade I really doubt that will be the case. Most stuff will arrive via the browser.
Furthermore, it's not impossible to imagine that there'll be ubiquitous wireless networking everywhere, even in those difficult places outreach workers sometimes have to go. So we won't need a heavy desktop stack just to make sure offline working is possible.
And, in a decade, our security colleagues will no doubt have found clever ways to let us do all this without the fortress stuff they presently require of us.
Can you really imagine we'd need much more than a tablet-like device in such an environment? I can't.
Oh, I know there'll still be occasions when you'll need everything local, developing apps for example, but by far the greatest majority of our workers won't need that stuff.
The real question, I think, is what Microsoft will do to restore the value of Windows. In the past, their strategy has been to shift the operating system up the value chain, taking more specialised functionality from apps and embedding it in the base platform. They did that with Internet Explorer, for example, and with message queueing, transaction services, and a whole pile of other things that were once separate apps.
That's obviously over because most of the action is now happening in the datacentre (or the cloud).
From a strategic point of view, if you're designing the future technology estate of a large organisation, the last thing it makes sense to do in this kind of context is build stuff that depends on a desktop stack. Furthermore, decoupling legacy from the desktop stack also has to be on the agenda, because you just can't count on that stack being relevant in 10 years' time.
It feels funny, doesn't it, thinking about Windows becoming irrelevant after all these years we've relied on it. I guess it proves, again, that change is the only constant.
It's about time too. The complexity keeps growing for ever-diminishing returns. Funny how "agile" and "flexible" are used as marketing mantras to distract from this fact.
Great article, and I hope it makes CEOs adopt a more challenging approach to where they are and where they need to be going with their technology investments, and to start demanding a much higher return.
Posted by: William D | November 01, 2010 at 10:30 AM
I had exactly this conversation here a couple of weeks ago.
The complexity of the desktop OS now is too great, most users probably only use 10% of it.
Posted by: Daniel James | November 01, 2010 at 11:04 AM
VDI - I'm currently PMing this for another organisation (albeit a much smaller one), with a six-month deadline to go from nothing to a working, scalable, rolled-out solution. I agree the possibilities here are great, especially with the UK's network infrastructure getting a decent upgrade.
There are cost savings with future-proofing hardware too, the average VDI terminal lasts 7-10 years, compared with 3-5 for a desktop.
Application virtualisation means that even if there are new Windows editions in the future, the refresh may simply involve a quick install onto a gold image back on the server, and a simple group policy change to get users onto Windows 8. Simplicity itself (almost).
Posted by: Glasgow_Red | November 01, 2010 at 11:40 AM
Here's an interesting take on this from Ray Ozzie - http://ozzie.net/docs/dawn-of-a-new-day/
Posted by: Nick | November 01, 2010 at 06:10 PM
Yes, I already saw that... it's a very interesting parting note, isn't it.
Posted by: James Gardner | November 01, 2010 at 06:17 PM
I can certainly see the move to thinner clients and cloud/datacentre computing as a fairly likely future model for how many people and staff access the various apps they use, and a valid one for many use cases. It harks back to the old school of mainframes and client terminals.
However, one thing that rarely gets much mention is that this places massively increased importance on the reliability, capacity and availability of the network. When the network goes down, all your staff are suddenly unable to do most of their useful work until it's back up and running again. And what about the classic lunchtime slowdown in network access? When it's just slow loading of the BBC News website it's not so important, but when it's access to your word processor, or a business-critical app, it becomes a huge problem.
Yet when I see all this talk about cloud computing and thin clients and cost savings, there rarely seems to be much talk about work to significantly improve the network. What are your thoughts on this?
My personal prediction for the future of computing is more of a hybrid model, with fairly capable but self-contained end-user devices which run apps and save data locally, but sync data and state with remote servers in the background. Thus you get much of the benefit of a thin client but still have usability when network access goes down.
These end devices (which may well be tablets or laptops or whatever) would then be very well secured but would VPN across insecure carrier networks back to "home". Home here could either be multiple places for a consumer device (i.e. Google, Flickr, etc. today) or a small number of cloud/datacentre-based locations for corporate devices.
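That local-first pattern can be sketched in a few lines. This is only an illustration of the idea, not anyone's actual implementation - all names here are hypothetical, and the "remote home" is just a callable stand-in for whatever sync endpoint a real device would use. The key property is that saves always land locally and never block on the network; a background worker drains a queue of pending changes whenever connectivity happens to be up.

```python
import queue
import threading
import time


class LocalFirstStore:
    """Save data locally first; sync to a remote 'home' in the background."""

    def __init__(self, push_remote):
        self.local = {}                 # local persistence (stand-in for disk)
        self.pending = queue.Queue()    # changes not yet pushed to 'home'
        self.push_remote = push_remote  # callable; may raise when network is down
        threading.Thread(target=self._sync_loop, daemon=True).start()

    def save(self, key, value):
        # The user is never blocked on the network: write locally, queue the sync.
        self.local[key] = value
        self.pending.put((key, value))

    def _sync_loop(self):
        while True:
            key, value = self.pending.get()  # block until there is work
            while True:
                try:
                    self.push_remote(key, value)  # e.g. over a VPN to 'home'
                    break
                except ConnectionError:
                    time.sleep(1)  # network down: retry later, keep working locally
```

A device built this way stays usable through an outage; the pending queue simply drains once the link comes back.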
Posted by: David | November 02, 2010 at 11:20 AM
That's a good point - and it actually goes to something else we've been considering of late, which is the long-term future of private networks. De-perimeterisation is coming of age, you know, and on a decade timeframe it's really rather likely that all this fortress stuff will be substantially over. It's the bottleneck of fortress architecture that makes all this stuff hard today, I think.
Posted by: James Gardner | November 02, 2010 at 11:57 AM
People tend to use the fortress model because it's mostly well understood and is comparatively "easy". The hybrid model I suggested above (and I have it more fully fleshed out in my head) is not easy either, and requires a fair amount of expertise to implement properly. "Hard" and "expertise" usually translate to "expensive".
I do think that a good prototype could be put together with today's technology though.
Posted by: David | November 02, 2010 at 02:22 PM
To be honest, this applies today with thick desktops. If the network goes down at our place, all the network shares, internal websites and email go with it - and that takes 80% of our work too. People end up playing solitaire or chatting until it's fixed. Think about that - it's not just the public internet that we're reliant on.
I guess the answer is greater redundancy - multiple network feeds to the internet infrastructure. The data network is becoming just like the phone and electricity feeds.
Posted by: AndyB | November 02, 2010 at 11:25 PM
The need to move away from the fortress approach is paramount. Five years back GSi / GSx / N3 etc. seemed excellent ideas - they are now nothing more than a barrier to getting things done and do so much to prevent collaboration, even within the public sector.
Posted by: Phil | November 03, 2010 at 01:27 PM
With that I can agree completely.
Posted by: James Gardner | November 03, 2010 at 01:30 PM
What is the browser going to run upon? How are you going to install and upgrade the browser as time goes by?
And then, if you want access to the GPU so that you can do anything nice, what are you going to do? Install a graphics driver?
Oh, then you need to control your network interface.
Ah, you also need to display stuff on the screen, therefore you need a display driver.
Oh, and then you are going to have some people with different monitors, sizes and resolutions. So you will need some mechanism to update or change those drivers.
Then you need somewhere to persist those drivers, and something to safely manage and secure that persistence.
.....
....
Guess what, you need an operating system.
Posted by: ross | November 05, 2010 at 05:13 AM
Can you be fully productive in the browser today? I don't think so. Even HTML5 is not enough... We will need to make the browser much more capable.
Posted by: Konstantin | November 08, 2010 at 08:51 AM
Maybe not today, but what about in 5 years... or ten?
Posted by: James Gardner | November 08, 2010 at 09:30 AM
To my mind, dumping the desktop and moving to 'the cloud' is a massive step backwards. It takes us back to the type of computing environments we had 30 years ago, where all processing is done 'somewhere else' and locally you just enter data and look at the results. Exactly what Microsoft strove to get us away from by introducing MS-DOS and then Windows.
Posted by: Timbo | November 09, 2010 at 08:22 AM
The problem has already moved on from the desktop IMO.
The problem for most large organisations is how to move on from IE6 or similar to the latest IE, Chrome, FF, etc., when all their browser-based applications don't run on the newer browser version.
It is the age-old cyclic problem. Whatever platform you move to - thin, thick, hybrid, etc. - will not be quite as standard as it promises. Nor will it last for more than a few months before something new trumps it and we're back in the cycle of "the other way is better".
At the end of the day, it is what you run, not what you run it on that counts (and that's the bit people get wrong time and again).
Regards
The Enterprising Architect
Posted by: Jon H Ayre | November 10, 2010 at 07:24 PM