[Microsoft's New Win-Win Strategy: Post 3 of 5]
My last posting, "Scrap The Windows Codebase," explained some of the reasons why the current Windows codebase might be better off in the trash bin. My mention of "Linux" caused a flood of knee-jerk reactions from other bloggers (see the Channel 9 forums, for example). But this post will talk only a little about Linux, and a lot about what might be needed for the future, and why the operating system has become too central a focus for Microsoft.
What IS Microsoft's Business?
I'm not sure Microsoft ever wanted to be in the operating system business. There has been a lot of revisionist Microsoft history floating around, and the original goals of Gates, Allen, and eventually Microsoft are sometimes lost in the rhetoric of their successful business strategies.
Today, Microsoft knows their operating system is the cement that glues their business strategy together.
Microsoft executives describe how desktop applications "widen the 'moat' that protects the operating system business". At the same time, Joachim Kempin in his 1999 testimony said, "We are not in the operating system business. We are in the computing business." While at first glance this seems like defensive rhetoric to distract Jackson's team from their OS focus, keep in mind that Kempin had been with Microsoft since 1983, and his chief responsibility was developing their relationships with PC manufacturers and OEM distributors. Kempin, perhaps more than anyone else, saw the operating system as a useful business tool, one that wielded legendary power in extending Microsoft's computing products to every desktop.
While there is no doubt that this strategy has yielded tremendous financial rewards for Microsoft, it is nonetheless a strategy based not upon technologies or innovation, but upon tie-ins, bundling deals, and partnerships. In this sense, it limits Microsoft because, unlike other companies, they do not need to create the best products, only the most viable ones suitable for their (so far) successful business strategies.
But when Gates is at his best talking about Microsoft's strategies, there is no mention of predatory business practices, or of the "moat" that protects the operating system. Gates, in a notable 2003 interview with Fortune, said, "One was our vision, which has not changed since the day the company started." The vision was, according to the interview, "the idea that you could buy PCs from many different hardware companies, and yet they would all run the same software".
C'mon Bill! Let's look back briefly. Bill originally thought computer languages were the company's business. IBM came to him for an operating system. Only when IBM couldn't come to terms with Digital Research for an OS license did Gates see an opportunity, buy QDOS, and license it to IBM. This sounds a bit closer to what Kempin might describe as "the vision": bundling and aggressive licensing.
All in all, I still believe Gates wants to serve the needs of consumers. One of the advantages Apple has over Microsoft is that their vision of ease-of-use and style is strong compared to Microsoft's muddled one. This wavering statement of vision, and the inability to reconcile what Microsoft says with what it does, is one of the reasons so many Microsoft-bashers conclude that Microsoft is simply greedy and predatory. Without clear vision, any successful business might appear so.
At his best, Gates talks of the dream "of a PC on every desk and in every home". I think he still believes that computers are good for people and that the business mission is to do whatever is necessary to enhance the computer experience.
What does all this have to do with operating systems and my Windows TNG idea? Well, two things.
First, I believe Microsoft has become so distracted by the importance of their operating system as the glue for their predatory business practices and the "moat" that they have stopped innovating. Worse, the size and complexity of the operating system itself has slowed progress even on Microsoft's business strategies. If ever Microsoft needed innovation, now is the time.
The second point Bill makes better than I can. In the Fortune interview, when asked "What can Microsoft do for small business?", his answer, in full, was:
Making our software simpler will probably have more dramatic impact with small business than anywhere.
Anyone who is following the current Vista releases knows that Microsoft is not moving toward simpler, but toward a far more complex, multi-faceted operating system. The OS has taken over the company, and it's taken over the consumer's view of the product line. Windows TNG is one perspective on how Microsoft might change that.
A New Platform
If Microsoft is going to return to the goal of extending the power of the PC and creating greater consumer value, they need a new platform. What might that platform look like?
For a moment, let's speculate about the feel of a new product line, and the technical underpinnings. Consider the profile of the products and technologies I'm suggesting, and don't get hung up on a particular flaw or inconsistency, as there will be many. This is, essentially, a "White Paper".
It's first worth considering exactly what the criteria might be for a new Windows.
Goal 1: Unbundling
Windows may not be too big if you consider all that it does. And Windows may not be as monolithic as it seems. Internally, there are many layers and boundaries. But it is delivered as a monolithic product, and because of that, organizational dependencies have been allowed to remain. High-level application changes, UI changes, and kernel changes all end up on one huge Gantt chart. Despite attempts to avoid them, true development dependencies exist between layers that simply should be separate.
What if, instead of a 50 million line Vista, we had the following:
- A Windows TNG Kernel OS, which has a separate release cycle, is developer-configurable, and is used in desktop, embedded, and special-purpose applications by Microsoft and third-party developers. It would have drastically simplified security and a lightweight process model, and could be built in custom configurations by developers (similar to the way Windows CE is delivered). Small configurations might have only a 400K footprint; large ones would have a memory footprint no larger than 1.5MB. (1-3 million lines of code)
- A Windows TNG UI (essentially "Avalon-in-a-box"). Again, a separate release cycle, and custom-configurable. This is a developer's product. Because it is separate from the kernel, competing UI models can be developed and delivered separately. Much of the "compatibility" with older Windows applications lies here. (3-5 million lines of code).
- A set of UI applications, bundled with installers, which is what users see as "Windows". It installs and runs very much like Windows does now. It is completely separate from the other two components, has its own schedule, and can even be purchased in its unbundled form. This is especially appealing for corporate customers who may want the important security or functional benefits of a new kernel but do not want to retrain users until later. (5-10 million lines of code)
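To make the unbundling idea a bit more concrete, here is a minimal sketch in C of what a developer-selectable build profile might look like. Everything in it is invented for illustration (the TNG_COMPONENT_* names, the wearable-device profile); the point is only that the kernel, the UI layer, and the bundled applications become separately selectable pieces rather than one monolithic delivery, much the way a Windows CE image is assembled from selected components today.

```c
/* Hypothetical sketch only: a made-up "component manifest" showing how a
 * developer-configurable Windows TNG build might select features at
 * compile time. None of these macros or components exist today. */

#include <stdio.h>

/* An imagined wearable-device profile: kernel only, no GUI, no legacy VM. */
#define TNG_COMPONENT_KERNEL     1
#define TNG_COMPONENT_NETWORKING 1
#define TNG_COMPONENT_UI         0   /* "Avalon-in-a-box" left out entirely */
#define TNG_COMPONENT_LEGACY_VM  0   /* "Windows Legacy" not installed */

int main(void)
{
    printf("Windows TNG build profile:\n");
    printf("  kernel:     %s\n", TNG_COMPONENT_KERNEL     ? "in" : "out");
    printf("  networking: %s\n", TNG_COMPONENT_NETWORKING ? "in" : "out");
    printf("  UI layer:   %s\n", TNG_COMPONENT_UI         ? "in" : "out");
    printf("  legacy VM:  %s\n", TNG_COMPONENT_LEGACY_VM  ? "in" : "out");
    return 0;
}
```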
The new product has a technology profile shown in the (highly speculative) drawing at right. Notice that:
- The OS is unbundled, configurable, and separately shipped across all versions.
- The Microsoft UI is optional, but shipped with the consumer product as standard.
- On embedded devices, there is a more compelling case for vendor-specific UIs.
- Specialized embedded devices, such as wearable devices, benefit from a very lean kernel with no GUI overhead.
Most importantly, all variants use the same kernel.
What happened to the other 32-41 million lines of code? Well, they may exist somewhere, for example:
- A product that might be called "Windows Legacy". It's a virtual machine that runs a version of "Vista Minus" (or even "XP Minus"). Maybe it comes free with Windows, and is even installed by default for a while. It will run 100% of all old Windows applications. Though it provides compatibility, it also sends a strong signal to customers that there is a dividing line beyond which compatibility may not be guaranteed in the future, and it serves to define where that line is.
- Specialized Server applications such as IIS, remote management software, etc.
Most of the savings comes from the next goal...
Goal 2: Reduced Complexity
Windows has become far too complex, and needlessly so. For example, the past 15 years have seen four different phases of graphics support:
- The pre-NT GDI model present in products such as Windows 95 and Windows 98.
- The user-mode GDI model present until Windows NT 3.51.
- The kernel-mode GDI and GDI+ model of NT 4.0 and XP.
- The new Avalon framework for Vista.
While all of these are improvements, the compatibility requirement means that the legacy of these four models will exist for a long time. This makes the product tremendously complex, holds back progress, and adds to the number of details applications programmers must learn.
Similarly, the Windows security model is orders of magnitude more complicated than that of Unix-based systems. While Unix-based systems are clearly overly simple and not a good model either, the Windows model provides a detailed object-level security model which is far too expensive and yields few real benefits. Rather than being used by developers, most security settings on most objects created through the Win32 API are left unchanged. On a more pragmatic level, the weak passwords of most Windows users, coupled with the tendency of most non-corporate users to run as Administrator, have rendered any security scheme irrelevant.
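To illustrate the point, here is a minimal sketch (file names are arbitrary) of what a typical programmer actually writes on each platform: the Win32 call passes NULL for the security attributes and accepts whatever default descriptor the system supplies, while the "primitive" Unix call at least forces an explicit choice of permission bits on every create.

```c
/* A minimal sketch of the claim above. The file name is an arbitrary example. */
#ifdef _WIN32
#include <windows.h>

int main(void)
{
    /* NULL lpSecurityAttributes: the object just inherits a default
       descriptor; most real-world code never sets anything finer. */
    HANDLE h = CreateFileA("example.txt", GENERIC_WRITE, 0,
                           NULL,               /* security left at default */
                           CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
    if (h != INVALID_HANDLE_VALUE)
        CloseHandle(h);
    return 0;
}
#else
#include <fcntl.h>
#include <unistd.h>

int main(void)
{
    /* The nine Unix mode bits are crude, but they are stated explicitly
       on every call, so they actually get used. */
    int fd = open("example.txt", O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (fd >= 0)
        close(fd);
    return 0;
}
#endif
```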
Vista is about to add yet further layers of complexity on top of the already existing layers. Localization, security, object management, and graphics are all about to be reinvented in Vista. And yet, the old way will still remain.
Reducing the complexity of features such as these is essential, especially if upward compatibility is important. The worst thing you can do for future systems is saddle them with complicated features that are constantly superseded and replaced as new versions are released.
Reduced complexity can also have performance benefits. For example, the Windows process model has always been criticized as "heavy". Creating a process takes at least 10 times longer than creating a thread, and frustrations over a complex process model can hamper efficient use of resources. A new process model, where threads and processes have the same weight and processes can be created cheaply and easily, would create greater performance opportunities in server-based applications. This becomes especially important in real-time and embedded applications, where lightweight processes are almost essential to development.
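A minimal sketch of the difference in ceremony, not a benchmark (the "10 times" figure above is an assertion this code does not measure): on Unix a new process is a single fork() call and the child keeps running the same program, while on Win32 the programmer fills in STARTUPINFO and PROCESS_INFORMATION structures and points CreateProcess at a separate executable (notepad.exe here is just an arbitrary stand-in).

```c
/* Sketch of the "heavy process" complaint: contrast the ceremony of
 * CreateProcess with a single fork(). */
#ifdef _WIN32
#include <windows.h>

int main(void)
{
    STARTUPINFOA si = { sizeof(si) };
    PROCESS_INFORMATION pi;
    char cmd[] = "notepad.exe";           /* arbitrary example child program */

    /* The child must be a separate executable; there is no Win32 fork(). */
    if (CreateProcessA(NULL, cmd, NULL, NULL, FALSE, 0, NULL, NULL, &si, &pi)) {
        WaitForSingleObject(pi.hProcess, INFINITE);
        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
    }
    return 0;
}
#else
#include <stdio.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    pid_t pid = fork();                   /* child starts as a copy of us */
    if (pid == 0) {
        printf("child %d running\n", (int)getpid());
        _exit(0);
    }
    waitpid(pid, NULL, 0);                /* parent reaps the child */
    return 0;
}
#endif
```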
What about Compatibility?
Compatibility is one of the most difficult constraints in moving forward. While Microsoft wants to say with 100% certainty that "Your application will run", the continual requirement of compatibility hampers progress on newer, superior technologies. In addition, compatibility makes Windows itself less flexible. If compatibility issues extend deep into the kernel and user APIs, then trying to deploy the Windows kernel in tiny embedded products will be almost impossible.
Microsoft will have to draw a dividing line in the sand, as Apple has done. Applications on one side of the line will run with few if any changes, and this should represent about 90% of the applications developed in the past 10 to 15 years. This means that some kind of "compatibility library" will need to be developed.
Isn't Microsoft Already Doing This? Why start over?
Well, yes, and no. While many features in Vista are targeted at these problems (such as the "Server Core" version of Vista), there is no true unbundling, and Vista will remain a monolithic product. It will take years, or even decades, to gradually pull the pieces of Vista apart, and while those attempts are made, developers will continue to add more. Attempts by Microsoft to truly create a layered operating system out of XP with no layering violations have been difficult.
Make or buy?
In many ways, what I'm suggesting is obvious. Microsoft knows they'll have to replace Windows. That's why projects like Singularity exist. And, the goals I've spelled out above (including unbundling) may already be on the drawing board.
In theory.
In practice, the sheer size of Windows, and the compatibility juggernaut, will make everything take longer. If Microsoft wants to replace Windows by 2015, it will take until 2025. If they want it to be "fully compatible", then even Singularity will be hampered by the very same issues outlined in my previous post and this one.
Can Microsoft, and their customers, wait until 2025 until repeated evolutionary steps solve all the problems I've mentioned? Will Microsoft continue to have such dominance that people will wait? Unless Microsoft acts more quickly, it is inevitable that some competitor, probably Apple, will finally be able to attract large numbers of Windows users with an offering that, to users, appears similar enough to Windows from a purely functional perspective.
Maybe it's time for Microsoft to do what they've done so often before: Acquire technology which solves the problem.
Here comes the L-word
Everything I've said until now has tried to make a case that:
- The liability of the Windows codebase, including Vista, will slow Microsoft's progress to the point where vulnerability to competitors becomes threatening to Microsoft within the next 5-10 years.
- Microsoft is solving these problems, but not fast enough for their users or shareholders.
- It's worth considering whether there are potential technologies which can be acquired to solve the problem.
Linux has the potential to solve Microsoft's problems, but it's important to look at the potential of Linux rather than the current reality. Consider that Microsoft Visual Basic started life as Alan Cooper's Tripod, purchased by Gates, and that SQL Server was originally written by Sybase until Microsoft negotiated exclusive rights to the code on Windows. What those products are today bears only a slight resemblance to what they were on the day they were purchased.
So, rather than look at Linux the way it is today, imagine what it would be like if Microsoft were to adopt the Linux platform, participate in and fund development, and drive the direction of Linux forward to meet Microsoft's own needs.
Technically, Linux has the following strengths:
- It is a successful platform in use today, which is benchmarked and compared side-by-side with Windows. In server-based applications, it often comes out ahead of Windows in some performance and security benchmarks. Rather than being an "idea", it is an actual contender and is being used side-by-side with Windows in many corporate production environments.
- Its security model is remarkably simple and even could be called antiquated. Yet, for some reason, it has held up very, very well and is considered at least as secure as Windows. Because it is so simple, it will be easy to upgrade and replace. But, because it is working adequately now, replacement can be done carefully and cautiously while focusing energy on more important issues such as desktop innovation.
- It has an efficient lightweight process model that is a superset of the one provided by Windows (that is, the Windows process model can be built on top of the Linux process model).
- It has been competing vigorously with Windows, and there is already a large device driver base. In fact, hardware vendors consider Linux to be second only to Windows in their priority for releasing device drivers, and many vendors already do.
- It has an entrenched development model which is popular in universities and many businesses. Thus, it is not necessary to spend significant time or effort on development products, especially for low-level drivers and server applications.
- It has a configurable kernel which can be used in everything from tiny embedded devices up to very large multiprocessor systems. The kernel is small, modular, and extremely robust.
- It is much newer than Windows, and has very little legacy code by comparison.
- An enormous amount of effort has already gone into creating Win32 compatibility layers. WINE, CrossOver Office, and Xen are three specific technologies designed to run MS applications under Linux. Rather than criticize these as inadequate and lame (they are), consider what would happen if Microsoft were to take over development of one of these. Progress would be rapid and the problem of compatibility would be cleanly isolated.
- It has a shared library model which allows multiple concurrent and incompatible versions of software to co-exist simultaneously without the need for extensive additional technology investment or developer education.
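As a small, simplified illustration of that last point: the soname convention gives each incompatible library version its own file name, so old and new applications keep working side by side, and a single program can even load two versions at once. The sketch below uses an invented library name ("libwidget") and made-up version numbers; dlopen, RTLD_LOCAL, and versioned sonames themselves are standard Linux practice.

```c
/* Sketch: two incompatible versions of the same (hypothetical) library
 * coexisting on one system, and even within one process. Build with -ldl
 * on glibc systems. */
#include <dlfcn.h>
#include <stdio.h>

int main(void)
{
    /* Each versioned soname is a distinct file (libwidget.so.1 and
       libwidget.so.2), so upgrading one application never breaks another. */
    void *v1 = dlopen("libwidget.so.1", RTLD_NOW | RTLD_LOCAL);
    void *v2 = dlopen("libwidget.so.2", RTLD_NOW | RTLD_LOCAL);

    printf("version 1 %s, version 2 %s\n",
           v1 ? "loaded" : "missing", v2 ? "loaded" : "missing");

    if (v1) dlclose(v1);
    if (v2) dlclose(v2);
    return 0;
}
```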
Linux has several weaknesses to consider:
- The X-Windows platform is interesting, but outdated. While a client-server windowing system has clear advantages, its API is more arcane and complex than Win32. Microsoft (and the market) would be served well if Microsoft were to build (or buy) a new, more modern and capable graphics application framework for Linux.
- The Open Source model requires a substantial investment in legal work and planning. While some of the technologies would clearly remain Open Source, Microsoft would want to engineer as many components to be proprietary as possible. This may dictate packaging and delivery "mechanics" in some ways. Since many companies are already combining Open Source with proprietary products in their deliveries (such as Red Hat), I am confident Microsoft can negotiate this minefield in an aggressive and innovative way that would impress us all.
But, the biggest benefits of adopting Linux aren't technical at all...
Linux: The Business Reasons
While the technical issues can be argued to death, the business reasons for adopting Linux give Microsoft significant advantage.
Keep in mind that Microsoft has always been excellent at the adoption and assimilation of technologies that are already in the marketplace. I remember when the Web was something Microsoft said they weren't interested in! (Yes, it's true.) Despite the Java legal debacle (which Microsoft could have avoided, I believe), their adoption of Java was highly successful and, if managed properly, could have avoided the need to "reinvent" replacements such as C#.
So, here are what I believe are the most compelling reasons why Linux is a good "buy" choice:
- Microsoft completely eliminates open source as a competitor. By embracing Linux, almost all open source efforts suddenly lose most of their "shared mission" to compete with Microsoft.
- Microsoft extends dominance of the Office applications onto every desktop. Efforts like Open Office, Sun Office, etc. become truly jokeware.
- Competitors like IBM, RedHat and Sun start to shake in their boots as they realise that multi-million dollar investments they've been making in Linux have now directly benefited their most feared competitor.
- Microsoft takes firmer control over how the GPL is applied to products. Keep in mind that many products you buy contain both open source as well as proprietary components. Windows TNG would be no exception. But, because it would be bread-and-butter to MS, they would apply their significant legal capability more productively.
- The Open Source "religion" would become diffused. Most open sourcers would be horrified to think that MS is "taking over". As more and more MS successes occurred in the Open Source arena, the "we love Linux and software should be free" crowd become more and more marginal.
The Bottom Line
Sure, it's a minefield. But Microsoft is on a slow burn right now toward less and less competitive products, while others are creating more innovative products with shorter delivery times. By unbundling, creating leaner development strategies, adopting some proven technologies, and dominating the open source space, Microsoft can reinvent the entire industry.
That leaves Microsoft more time to outdo the very competitor who is making the greatest advances toward Microsoft's market: Apple.
The next segment of this post will explore how Microsoft might refocus and use their time to create a truly next generation desktop by creating a proprietary application layer on top of Linux and OS X.
Strange use of the word "technical".
1. It is a successful platform in use today, which is benchmarked and compared side-by-side with Windows. In server-based applications, it often comes out ahead of Windows in some performance and security benchmarks. Rather than being an "idea", it is an actual contender and is being used side-by-side with Windows in many corporate production environments.
Not sure how this is "technical". "it often comes out ahead of Windows in some performance and security benchmarks". You may want to add something like " in certain limited circumstances." ;)
As a takeaway, can we assume that by this you mean Windows beats Linux in performance and security in most of Windows' core areas, but there are some specialty environments where Linux outperforms Windows?
2. Its security model is remarkably simple and even could be called antiquated.
I will leave this alone.
3. It has an efficient lightweight process model that is a superset of the one provided by Windows (that is, the Windows process model can be built on top of the Linux process model).
My understanding is that this is just plain wrong. Not my area of expertise, but the lightweight process design is considered a flaw in Linux (I have a vague memory of Linus admitting as much in forums -- that his was a project to get the simplest implementation rather than the state of the art). I am not sure what is meant by "the Windows process model can be built on top of the Linux process model" -- certainly it can -- but not efficiently.
4. It has been competing vigorously with Windows, and there is already a large device driver base. In fact, hardware vendors consider Linux to be second only to Windows in their priority for releasing device drivers, and many vendors already do.
And how is this supposed to be a "technical" advantage for Linux over Windows?
5. Does not seem technical, and I am not sure you want to inherit an "entrenched" development environment.
6. It has a configurable kernel which can be used in everything from tiny embedded devices up to very large multiprocessor systems. The kernel is small, modular, and extremely robust.
This is a legit point, and this is one of the reasons that Linux does well in certain small specialty roles, but it is likely the reason that Windows is better in the vast majority of real server, real desktop cases.
7. It is much newer than Windows, and has very little legacy code by comparison.
Again questionably technical. More importantly, legacy code is nothing but positive. It is pure gravy if old code works on your new system. If Microsoft wanted to not have legacy code, they could simply abandon their declaration and work on making it worthwhile, and tweak their already superior kernel and OS. If you like the Directory/File Naming conventions in Unix, AND you think compatibility is a negative thing, make some rather trivial changes (in the grand scheme of things) to the security model and the file system. If you want to drop compatibility, drop compatibility. If you think having the new code loosely based on the old code creates a psychological reason MS is unwilling to drop compatibility, then it is a psychological reason, not a technical one.
Early in the talk of Vista, Microsoft considered making a lot of things VISTA only, but they got pressured by customers who wanted support for the new features in WinXP. A lot of these customers have a pretty good idea of what is going to be good for them.
8. An enormous amount of effort has already gone into creating Win32 compatibility layers. WINE, CrossOver Office, and Xen are three specific technologies designed to run MS applications under Linux. Rather than criticize these as inadequate and lame (they are), consider what would happen if Microsoft were to take over development of one of these. Progress would be rapid and the problem of compatibility would be cleanly isolated.
In number 7, legacy is bad. Number 8 is that we can run emulation. OK, fine, but how is that better than offering native support, the current status quo?
9. It has a shared library model which allows multiple concurrent and incompatible versions of software to co-exist simultaneously without the need for extensive additional technology investment or developer education.
It is unclear to me, certainly no expert on Linux, in what sense you mean this. Version compatibility is a hard problem. Shared library compatibility is a much harder problem. This type of problem is probably also exponentially related to the size of the installed user base. Again, I do not know Linux well, and I am not sure exactly what you mean by this, but my intuition tells me that if you scaled Linux installations out to match Windows installations, coupled with the same level of administration that you are actually going to get on all those desktop installations, you will resurface the same problems (and I cannot see how open source does not make this significantly worse by giving too much control to too many developers, or by limiting innovation by creating excessively strict interfaces).
I know you think you have already made the case for dumping the Windows codebase, but I am pretty sure you have not. Most if not all of the reasons you suggest for moving to Linux must already assume that dumping the codebase is a good idea, because most of the reasons for Linux that you suggest actually favor the existing Windows codebase. I would love to hear from some of the OS kernel guys about the comparative value of the lightweight vs. Windows process model (definitely not my area).
As for the business reasons, I am not sure they are actually much of a benefit. Microsoft is better off with some competitors, and open source is a great complement to Windows; it fills niches which do not have financial incentives, and a great deal of open source is written for Windows.
Apple has made some advances against Microsoft, especially in compositing (a simpler trick for them than for MS because they control the hardware), but they are not a serious threat. With Vista, Microsoft will have surpassed Apple for good in all areas except Unix compatibility, and it is not clear that MS would even like to crush Apple any more than they already are, given that monopoly status and restrictions would be sure to follow.
Posted by: theCoach | June 14, 2006 at 01:07 AM
You can't take over the Linux kernel development process and steer it in your direction. Not possible.
Posted by: Joseph | June 14, 2006 at 03:37 PM
The problem with your argument is that you're looking at this from an engineering standpoint rather than from a user standpoint. As an engineer, you want things to be sleek, clean, and elegant. This requires redesigning things so that they don't have all that nasty backwards compatibility to deal with, and anything that does implement it is easily contained. Modularity is not a feature that users care about, it's a feature that engineers care about because of its aesthetic value.
As a user, I just want everything to work. The system must be adaptable to new uses and features, but why would I upgrade if everything I have now doesn't keep working? I don't want the programs I'm using now to not work as well just in order to get some new programs to work. What do I care if some module is optional? That just means it might not be there when I need it.
So what would happen if MS rewrote Windows so it's small and modular and drastically simplified? Well, they already did that, and it's called Windows CE. It runs on just about any processor you can find and can fit in just 350k! Of course you have to sacrifice a lot in the design of a system that can be that small, so you would probably want to use Windows XP Embedded; it can get down to as small as 5MB.
It turns out that Windows is already modular in a sense, so why can't we pare Windows down to the bare minimum ourselves? Because it wouldn't really be that useful. Disk space is cheap, and I doubt you could even find a company making drives smaller than 40GB anymore. Leaving features out just makes it harder for a software vendor to support their products and makes it harder for users to use software. I mean, you would probably say "I don't plan to run any games or office apps, so I can easily save a few megabytes by not installing OLE and DirectX". But then Java applets wouldn't work anymore because it turns out that they activate using OLE interfaces and display via DirectX. Or perhaps you would leave out ODBC because you don't plan on using any databases, but then mail merge wouldn't work. Would you want to have to support that?
Now, look at the Linux commercial desktop application market. You'll notice that there isn't much to be found. Without going into all the reasons why, one biggie is that Linux is just too expensive to support as a platform. Your user could be running any of a hundred or more distros, each with a hundred different installation options. How do you even know where to install? (/usr/bin? /usr/local/bin? /opt?) Ordinarily install locations are just compiled in, but most ISVs don't want to be distributing source code. Once you start compiling, now you're in dependency hell because any non-trivial app is going to use all kinds of libraries.
Java was supposed to solve all these problems, but you'll notice that Java apps haven't exactly taken the desktop market by storm the way people thought they would 10 years ago. Why? Well, as it turns out, the object-oriented way to write a GUI app is to define GUI objects like buttons and textboxes, then tell the system "execute this function when somebody clicks on this object". Unfortunately Java didn't include any way to do that (i.e. Java has no function pointers). Microsoft saw this shortcoming and modified Java to add a feature called "delegates" that would enable simple drag-and-drop GUI creation. Sun didn't like that idea, so they sued, and now we're stuck with "inner classes" and "listener patterns". I won't even go into JNI here, but it essentially means that it's hard to interact with the system in ways Sun doesn't want you to. That's why C# is so popular now.
That's right; MS didn't just reimplement Java and call it C#. They took the basic idea (OO language, VM, C-based syntax), and designed it correctly so that it would be useful to Windows programmers (and from looking at Gnome, Unix programmers too). That means a C-based OO language that compiles to a portable VM, but it includes things like events and delegates to make it easy to write GUIs, it makes it easy to call system services (P/Invoke), it encourages interoperability with other languages (as opposed to the JVM, to which most languages cannot directly compile), and it allows for expansion (generics were easy to add to C# and VB, hard to do right for Java).
Linux would be the same way. The only way to make it run Windows apps is to build Windows on top of it. You would have to extend it so much that it would barely be recognizable as Linux. At that point, why even bother starting with Linux? Why not start fresh with something new? But then you're just reinventing NT.
And quite honestly, I don't understand what your big hangup is with lightweight processes. Who cares if it takes an extra 2.7ms to create a process? The only time you should really care about process creation overhead is when you're creating lots of them. The only reasons I can think of that would cause this would be some large shell scripts or an old daemon like inetd that spawns a process for each connection.
The old inet daemons had to fork for every connection because Unix didn't have async IO or multithreading. Since NT has had those for 13 years, there's no reason you would write a Windows service like that. That leaves only shell scripts, which are generally written differently on Windows -- I prefer Perl, myself.
I'm still waiting to be convinced.
Posted by: Gabe | June 15, 2006 at 06:43 PM
Hi Gary, interesting article.
I would have thought the first article in the series would have given the evidence for Windows architectural complexity causing a significant number of security problems? Otherwise you are solving a problem that does not exist. Looking through all of the Windows vulnerabilities, how many can be attributed to, what I think you suggest might be, an overly complex architectural design of Windows? To me it seems most are buffer overflows, etc that are being addressed in Vista through efforts such as the SDL. *Some* of the exploits can get further because of the design of Windows, but is it truly because of a possible lack of proper separation between different layers?
Also, with any Consumer "OS" (OS/platform/applications), there will be vulnerabilities. It will never be truly secure. Considering the popularity of Windows, do you really think there is any other product in this category that has fewer vulnerabilities? How much real benefit are you going to get out of a completely new OS when there will always be bugs? Will it cut the number of bugs down by an amount above the threshold required to make it of any significance to the end user? And, even since XP SP2, it seems that social engineering has become the main focus, and possibly will always be the weakest link that attackers will target. And what about the applications? What is the point of making an ultra-secure, perfect Windows, when there will still be applications that can be exploited? The exploits may not get far, but how far will they get with Vista anyway? Where is the evidence that architectural complexity comes into it?
Also, there are other ways to mitigate threats, such as two-factor authentication, backing up data, securing data, encryption, etc.
All these points suggest that, even if Windows is not completely perfect from an architectural standpoint, making it better will not be of any significance, and just means more problems and a new learning curve for Microsoft. Who knows how Linux will go when it stands up to the same scrutiny Windows gets.
Unless your proposal addressed bugs that make up a large percentage of security issues with Windows, it does not seem like there is an actual basis for this idea.
These are my initial thoughts anyway.
Posted by: BobTurbo | June 24, 2006 at 05:47 PM
I would also like to add: how can you tell that the architecture of Windows is irreparably complex? It would have to be fundamentally flawed to need changing, and I don't know if that is the case. While I know there are difficulties in rewriting something, it depends on how flawed that something is to begin with. Modularity makes changing these things a lot easier.
Either way, you could be right in regards to needing to find a different architecture. It is difficult for me to tell from an outsider's perspective.
Posted by: BobTurbo | June 24, 2006 at 10:00 PM
there are more untrue statements in this blogpost and comments than true ones. get the facts, not FUD.
the linux kernel contains only a small part of the POSIX/Single UNIX Specification. linux does not need to carry legacy code for compatibility with apps based on a screwed API.
and it is not much newer. it started in 1991, based on POSIX standards and 2 decades of UNIX tradition.
lightweight processes - one of the BIGGEST ADVANTAGES of UNIX systems. simple, efficient. i bet you heard about fork(). that's why threads are not used as much as in windows. spawning a process is not a problem - it's really simpler to fork than to create threads and care about them.
simple security - complex enough in most cases. more advanced ACLs can be installed.
WINE, CrossOver Office, Xen - very little overhead. some people say that even some games run better under WINE than on windows (they say something about better memory management). you don't have to build windows on top of it. simple libraries and a program which executes a non-native binary file.
OpenOffice - good enough for most people. OO Calc inferior to Excel, but gnumeric is close enough.
linux filesystem hierarchy standard - something wrong with it? more standard than My Something directories not used by all software
where to install (/usr/bin, /opt etc) - wtf? just install the package. no one's asking where it should go. installation from a package manager and even sources is trivial.
> Microsoft completely eliminates open source as a competitor.
no comment
> By embracing Linux, almost all open source efforts suddenly lose most of their "shared mission" to compete with Microsoft.
the world is not about microsoft, certainly not the opensource world.
Posted by: none | June 25, 2006 at 12:31 PM
You got to be joking me, the GPL is the product of Stallman and the Free Software Foundation, and MS's legal team is going to steer it in its direction? Then OpenOffice and others are suddenly going to vaporise because we can now buy Microsoft Office? I have OpenOffice because it does just about everything MS Office does and is free. Then the Open Source Movement magically dissolves as people can now buy Microsoft again? Microsoft just scraps all previous effort & backwards compatibility to crank out a nice-looking distro?
Your Windows TNG sounds like crackware to me, 'cause that is what you've been smoking, friend. ;)
Posted by: Orthuberra | July 10, 2007 at 10:58 PM