Was the .NET Framework a Failure?

I’m a big, big fan of the .NET Framework. I’ve always felt that the whole framework is a well-thought-out, organized, and incredibly stable environment. I thoroughly enjoy developing for it and know a lot of people who feel the same way. So I was surprised to see that Andrew C. Oliver thinks the .NET Framework is a failure:

Recently, I’ve been airing Java’s dirty laundry. Some folks took my position that Java was trailing .Net technically to mean that I thought .Net was winning. Nothing of the kind — in a number of important ways, .Net is a failure.

He bases his argument on three main points, none of which I agree with:

  • Marketing
  • Killing Visual Basic
  • The larger and larger emphasis on Cloud Computing

Marketing

I’m not going to refute this point in depth, other than to note that Microsoft never did extend the .NET name to Windows Server, as he says they were considering. I will agree that Microsoft has a habit of naming things very… interestingly. (Windows Communication Foundation and Windows Presentation Foundation, anyone?) But I also don’t think that putting the .NET name on everything would have been a bad thing. It would have acted as a unifying word tying each product together, much like Apple does with its products: iPhone, iPad, iMac, iCrap, iEtc. And even if it had caused a little consumer confusion, developers are the only ones who care or matter here, and they all know what it means.

The death of Visual Basic

The second point that Andrew makes is that .NET killed Visual Basic, which was one of the bigger languages at the time:

Unfortunately, .Net also cannibalized Microsoft’s most successful corporate IT development environment, Visual Basic. Look at Indeed’s Job Trends, Tiobe, and so on, and you’ll notice that .Net’s rise is all about Visual Basic’s decline.

What I don’t understand, though, is why this is (or was) even a problem. Just because the .NET Framework was released doesn’t mean anyone had to stop developing in Visual Basic. Sure, Microsoft stopped supporting it, but I believe that was a necessary move forward. .NET gives developers a safe, clean, and cohesive framework for building modern applications; classic Visual Basic doesn’t. The goal was to slowly but surely wean everyone off of Visual Basic and onto the .NET Framework. Yet if you look at the most recent Tiobe index, Visual Basic still sits in 7th place, six spots ahead of Visual Basic .NET. So much for killing it off.

The larger and larger emphasis on Cloud Computing

Andrew’s last point, and the one I disagree with most, is that Cloud Computing makes the .NET Framework obsolete:

Now it’s too late for .Net. As we move from IaaS to PaaS and SaaS, folks are simply unlikely to care about operating systems. If you don’t care about operating systems, why not code as if you don’t care about operating systems and code for the cloud? We’ve seen recently that Azure isn’t lighting the world on fire. Why would we expect that to change?

I’m not disputing that the traditional desktop as we know it is dying. Most Internet users already store their information in the cloud, knowingly or not: Email, Facebook, Dropbox, Maps. These are services everyday people use, and none of them require a desktop environment. So yes, traditional desktops are fading and the cloud is coming in, but why should that stop .NET? The beauty of the .NET Framework’s design is how easily it extends to other platforms. Eric Lippert points this out when explaining why the team chose an intermediate language (IL):

Suppose you have n languages: C#, VB, F#, JScript .NET, and so on. Suppose you have m different runtime environments: Windows machines running on x86 or x64, XBOX 360, phones, Silverlight running on the Mac… and suppose you go with the one-compiler strategy for each. How many compiler back-end code generators do you end up writing? For each language you need a code generator for each target environment, so you end up writing n x m code generators.

Suppose instead you have every language generate code into IL, and then you have one jitter per target environment. How many code generators do you end up writing? One per language to go to IL, and one per environment to go from IL to the target machine code. That’s only n + m, which is far less than n x m for reasonably-sized values of n and m.

Choosing IL allows any program a developer writes to run on any platform that has a jitter. I can write something in C# on Windows and know it will run on 32-bit Windows, 64-bit Windows, Windows Phone, the Xbox, and so on. So why couldn’t Microsoft write a jitter for Android? Or for the iPhone? Andrew claims that people will care less and less about Mono (which I don’t necessarily think is true), but that doesn’t mean Microsoft can’t write a first-party jitter for any platform it wants.
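To make that n x m versus n + m arithmetic concrete, here is a tiny C# sketch. The language and target lists are purely illustrative assumptions on my part, not figures from Eric’s post:

    using System;

    class CompilerCounts
    {
        static void Main()
        {
            // Hypothetical front ends and back ends, chosen only for illustration.
            string[] languages = { "C#", "VB.NET", "F#", "JScript .NET" };
            string[] targets   = { "x86", "x64", "ARM", "Xbox 360", "Windows Phone" };

            int n = languages.Length; // 4 language front ends
            int m = targets.Length;   // 5 target environments

            // One dedicated compiler per language/target pair: n x m code generators.
            Console.WriteLine("Direct compilation: {0} code generators", n * m); // 20

            // Every language compiles to IL, plus one jitter per target: n + m.
            Console.WriteLine("Via IL:             {0} code generators", n + m); // 9
        }
    }

Add one more target (say, that hypothetical Android jitter) and the IL approach grows by a single code generator, while the direct approach grows by one per language.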

What I understand least is the claim that the cloud will lead to .NET’s demise. The .NET languages can be used to build exactly these kinds of services, and ASP.NET is a natural way to expose them. ASP.NET is more relevant today than it has ever been; it powers some of the most feature-rich and frequently visited websites on the Internet, and it will be around for a long time. And although Azure has taken a while to get off the ground, I think it will become relevant as the need grows. For now, companies will continue to need websites and custom applications built on .NET.
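As a small sketch of what I mean by building these services with ASP.NET: in ASP.NET Web API, an HTTP endpoint is just a controller class. The controller name and payload below are made-up examples for illustration, not code from any real site:

    using System.Web.Http;

    // Minimal ASP.NET Web API controller (names and payload are illustrative only).
    // With the default route, a GET to /api/status returns a small JSON document,
    // whether the app is hosted on-premises in IIS or in an Azure web role.
    public class StatusController : ApiController
    {
        public object Get()
        {
            return new { service = "example", healthy = true };
        }
    }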

I believe the .NET Framework is far from dead. I think it has the capability, corporate backing, and community support to continue to grow, and I’m going to do as much as I can to see that it does, no matter what Andrew says.