Friday, August 22, 2008

Published prediction - LINUX

What Linux Will Look Like In 2012

Our open source expert foresees the future of Linux: By 2012 the OS will have matured into three basic usage models. Web-based apps rule, virtualization is a breeze, and command-line hacking for basic system configuration is a thing of the past.

By Serdar Yegulalp, InformationWeek, Aug. 14, 2008
URL: http://www.informationweek.com/story/showArticle.jhtml?articleID=210002129

What will desktop Linux be like four years from now?

In the time it takes most college students to earn an undergraduate degree -- or party through their college savings -- Linux will continue to mature and evolve.
The gOS "Space" desktop distills the Linux desktop to its bare, Web-driven essentials.

The single biggest change you'll see is the way Linux evolves to meet the growing market of users who are not themselves Linux-savvy, but are looking for a low-cost alternative to Microsoft (or even the Mac). That alone will stimulate enormous changes across the board, but there are many other things coming down the pike in the next four years, all well worth looking forward to.

Over the course of the last four years, Linux has taken enormous strides in usability and breadth of adoption. Here's a speculative look forward at what Linux could be like a few years from now -- or, maybe we could say what Linux ought to be like.

For-Free Versus For-Pay

Expect to see a three-way split among different versions of Linux. Not different distributions per se, but three basic usage models:
1. For-pay: Ubuntu's in-store $20 boxes are a good example. For a nominal cost, you get professional support for Linux as well as licenses to use patent-restricted technologies (e.g., codecs for legal DVD playback).
Expect this to gain at least nominal momentum, especially if the cost is no more than an impulse buy and people understand that Ubuntu can non-destructively share a machine with Windows. Also expect at least one other Linux company to pick up on this model (openSUSE, for instance), and expect preloads on new systems to incorporate such things if they don't already.

2. Free to use: This is the most common model right now -- a free distribution with optional paid support, plus optional closed-source components such as proprietary, binary-only device drivers.

3. Free/libre: These distributions contain no components with patent encumbrances or other issues, in any form. Distributions like gNewSense or Blag Linux already do this, and an upcoming version of Ubuntu (8.10 / "Intrepid Ibex," due in October) will also feature a wholly free installation option.

What's also important is that over the next few years, the distinctions between these three licensing models will become heavily accentuated by both the Linux community and the creators of these distributions themselves. This should help solidify for many non-technical people the distinction between free-as-in-speech and free-as-in-beer.

KDE 4's new desktop metaphor promises to give Linux users a radically new desktop experience.

The Desktop

This year we've seen the appearance of a number of possible models for the Linux desktop of four years from now. One is KDE 4, which despite a rocky first release is quickly drawing attention for its forward-looking approach to desktop management. Its new desktop metaphor, named "Plasma," has just started to strut its stuff. After four more years and a bit more third-party development, it stands to become far more than a visual curiosity: an actual way to get work done.

If KDE 4's new approach is too daunting, the Mac OS X-inspired gOS desktop -- especially in its "Space" incarnation -- distills the Linux desktop down to its bare essentials. The gOS interface also serves as a front end for many common Web applications, which is one of the biggest ways people will do work on Linux in the first place. Expect to see many more variations on these kinds of stripped-down, click-and-go interfaces as a way to bring a growing base of non-technical users on board with Linux. Pros will still be able to drop to a command line, though.

Hardware

Right now, in 2008, Linux is present in a great many hardware devices without most people ever knowing about it. By 2012, it'll be a brand name unto itself, thanks to the exploding netbook market, where Linux has proven itself to be a solid way to build an inexpensive computing platform. By that time, many first-tier manufacturers like Dell ought to be offering such devices -- and those that already do (like HP) will probably be looking seriously at offering more Linux-based gear. (As of this writing, Lenovo's just announced the IdeaPad S10 netbook with Linux in certain territories.)

Phones are already among the devices now using Linux, and they're a growth market as well. ABI Research projects that by 2012, Linux will power something like 40 million mobile devices shipped in that year alone. The definition of "mobile devices" is also expanding: in addition to netbooks, look for a great many Linux-powered devices with open architectures (the OpenMoko FreeRunner, for instance) that are designed to move between niches and fill more than one need at once.

No discussion of Linux hardware would be complete without some discussion of hardware compatibility. Obviously there will be increased attention to open-source device drivers for existing hardware, but another trend is the growth of hardware with open accessibility and standards. If any major hardware maker doesn't have Linux drivers for its products by 2012, either as a first-party offering or as a community effort, it can expect to be singled out for it almost immediately.

Asus's "netbook" Eee PC and similar machines are just the first of what promises to be a healthy platform for Linux.

Applications

What'll you be running on Linux in four years? Chances are you'll be running a lot of what you have now, just with a new revision to the left of the decimal point. OpenOffice will be either in or fast approaching its fourth revision, with features like interoperability with Microsoft VBA macros, a native 64-bit edition and quite possibly an entirely new interface that isn't hidebound by the program's legacy requirements.

Another important thing to expect is the use of the browser as an application deployment framework, or at least attempts at same. This is already happening to a great extent on multiple platforms -- e.g., Gmail instead of Outlook or even Thunderbird -- but projects like Google Gears are aimed at making the desktop, the browser and the network work in both connected and disconnected ways.

Storage

As of this writing, a 1-terabyte consumer-grade drive has hit the market for about $175. In four years, a terabyte will easily be half that much, and a home media server with an array a few terabytes in size wouldn't be out of the question. One possible way to organize all of that space is Sun's recently open-sourced ZFS file system, which allows easy growth and management of file systems.
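As a rough illustration of the kind of management ZFS makes easy, here's a sketch of building and growing a pool on a hypothetical home media server. The pool name and device names are assumptions, and these commands require root on a system with ZFS available:

```shell
# Create a mirrored pool named "media" from two spare drives
# (device names are hypothetical -- substitute your own)
zpool create media mirror /dev/sdb /dev/sdc

# Carve out a compressed filesystem for video -- no mkfs, no fstab editing
zfs create -o compression=on media/video

# Months later, when space runs low, grow the pool with another mirrored pair
zpool add media mirror /dev/sdd /dev/sde

# Check health and capacity at a glance
zpool status media
zpool list
```

The point is that growing the array is one command against a live pool, rather than a repartition-and-restore exercise.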

Right now, however, the licensing for ZFS only allows it to be used in Linux's user space -- not an impossibility, but perhaps over the next few years Sun will relicense ZFS in a more GNU-friendly fashion, allowing it in as a kernel add-on. (It's also possible to run ZFS in an OpenSolaris implementation such as Nexenta, along with all your other favorite Linux-y apps.)

System Configuration

Is it optimistic to expect that by 2012, command-line hacking for basic Linux system configuration will be a thing of the past? One can hope, especially for things like display configuration, which should be auto-detected and configured touchlessly. This is crucial if Linux is to make headway with regular users, although putting Linux on devices like netbooks, where the hardware is a predictable and controllable factor, should help.

If there's any system configuration issue that divides Linux devotees, it's package management -- how to handle the wealth of software installed in a given Linux distribution. It's probably unrealistic to expect the plethora of distributions out there to consolidate on a single, one-size-fits-all package-management system, especially since each distribution tends to be married to its particular package management system. That said, the use of something like PackageKit as a packaging-neutral front end for a distribution might make transitions easier.
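To make the "packaging-neutral front end" idea concrete, here's roughly what PackageKit's command-line client looks like in use. The same commands work whether the distribution's backend is apt, yum, or something else; the package name is just an example:

```shell
# Find a package by name; PackageKit translates the query
# into whatever the distribution's native backend understands
pkcon search name abiword

# Install it -- no need to know whether apt or yum is underneath
pkcon install abiword

# Fetch and apply pending updates the same distribution-neutral way
pkcon get-updates
pkcon update
```

A user who learns these commands (or the graphical front end built on the same daemon) can move between distributions without relearning package management.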

Also, the Conary package manager project offers some possibilities that deserve broader adoption, such as the ability to download and apply only changes to a particular package. That saves on bandwidth, which in turn ought to be a bonus for Linux users in developing countries where bandwidth is at an extreme premium.
Virtualization

Virtualization in the Linux kernel -- either in the form of KVM or Xen -- will make it that much easier to run Linux side by side with any other operating system, either as a way to migrate non-destructively from an existing Windows installation or as a way to expand Linux's own native functionality (for instance, by running multiple kernels, each tailored for different needs).
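A minimal sketch of that migration scenario with KVM and QEMU, assuming a CPU with virtualization extensions and the kvm kernel modules loaded (the file names are hypothetical):

```shell
# Create a 20 GB copy-on-write disk image for the guest
qemu-img create -f qcow2 windows.img 20G

# Boot the old Windows install media inside a KVM-accelerated VM,
# giving it 1 GB of RAM and the ISO as a CD-ROM to boot from
qemu-system-x86_64 -enable-kvm -m 1024 \
    -hda windows.img -cdrom windows-install.iso -boot d
```

The existing Windows environment keeps running in a window while the host machine itself runs Linux natively.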

Another possibility is to let Windows apps run side by side with Linux apps, using something like ReactOS as a Windows-compatible container under this scheme.

Linux On Servers

It's almost foolish to expect Linux's dominion on the server side to wane -- servers are where Linux has fared best, and all the signs point to that becoming only more the case. The real question is: in what form?
A major part of the answer lies, again, in virtualization: Linux's mutability allows for its use not only as a server platform but as hypervisor and container for other operating systems. That said, there's more than one way to do such things -- KVM and Xen are two major contenders, but both function in markedly different ways and are probably best suited to different types of work. Xen's best for running as close to bare metal as possible, but KVM lets a particular Linux instance function as a container for other OSes. To that end, over the next four years, the question won't be "Which one's the winner?" but "Who's using each for what?"

Conclusions

The difference between the Linux of four years ago and the Linux of today is striking enough -- not just in its diversity, but in the way it has consolidated its strengths as a server platform and as an OS for portable devices and emerging hardware markets. Expect it to make the most of whatever else comes along in the next four years, too.

6 comments:

edcs855 said...

How do you think microkernel operating systems (MINIX, QNX, and Midori) will compare to monolithic ones (Windows, UNIX, Linux) in 2012 and beyond? Initially, Linux began as a microkernel before incorporating portions of FreeBSD and other UNIX variants. Both Mac OS X and FreeBSD employ portions of the microkernel approach in their current OSes for inter-process communication (IPC). Where do you see Microsoft's Midori OS in the next four years, given that it is potentially slated to replace the Windows OS line? Do you believe third-party software vendors will be up to the task of rewriting their products to fit the Linux and Midori architectures?

Jason said...

"KDE 4's new desktop metaphor promises to give Linux users a radically new desktop experience."

Yup, we'll go from functional and plain to useless and fancy, just like Microsoft!

askill said...

There seems to be a trend toward bloated software. I don't see the predicted trends as necessarily an advance. I could run my first PC on an OS that fit within 64K; now the OS fits on 6 DVDs. The original OS had everything I needed, but then I'm used to using a command line.

As Linux and Windows development tends toward a common point, I think they will both lose their true value (other than being the source of constant security patches).

Chris' Blog said...

In the future I would like to have access to the Internet at any time and location. My hope is that the devices that allow me to do this would use open source software like Linux. I foresee a future where I can have access to any song, movie, or television show on my mobile device at any time. I would like to see on-demand television: I could set this mobile device next to a flat-panel display and have access to movies and television shows whenever I want to see them, on my own schedule, using Internet technologies and open source software/operating systems.

Currently, if you purchase a computer, the software can cost more than the hardware. Consider that MS Office can cost in the neighborhood of $400, antivirus $50 per year, Vista $120, etc. On one of my computers I use Ubuntu Linux and OpenOffice, so the software didn't cost me anything. My hope is that when the above technologies become reality, the software will be open source, which will allow for more creativity.

askill said...

Open source software forces a shift in the way money is made. The source of funding would have to be in consulting and specialized packaging of the software.

Looking at the desktop environment, it would appear that the majority of Linux distributions have been pointed toward the server market and the geeks. Ubuntu and some of the other distributions are finally starting to move into the desktop environment; however, there are still major obstacles that need to be addressed before even Ubuntu is ready for the prime-time desktop (read that as addressing the needs of the technologically challenged).

Steve's CS855 Blog said...

The one problem that I see with this is in getting mainstream (a la Grandma) adoption. I was in a computer shop not too long ago, and there was an obviously non-technical lady asking about Linux. The store had the cheap (but not free) versions, and she was arguing with them because it was advertised as open source and free. She wound up buying a copy of XP for considerably more money, but it was purely the perception that kept her from changing.

Non-tech people are willing to spend extra money, as the recent explosion in Mac sales has proven, but if they're told something is free and the store then tries to charge them even a small amount, it becomes a rip-off in their minds.

I think that is going to be the big thing to overcome, not the technology.