Before I get into this post, let me make a few points very clear: I am not a fanboy of any single OS. I have both a Mac and PC at home and use both regularly. I have even used Linux from time to time and I think all three are viable choices. I am also not a huge Flash proponent. I think Flash has its place and that it's an important part of the web. But I also think there's plenty of room for improvement.
So it's important to know that I am not writing this with an agenda for or against any particular company or platform. With that out of the way, let's examine why Flash behaves so poorly on Macs and take a look at the Flash debate in general.
Flash is a Standard
Flash is an undeniable staple of the web. Adobe claims that 98 percent of the world's computers have Flash installed, and hundreds of thousands of websites use it for video, ads, interactive features, games, file uploading, photography showcases, and more. When it comes to rich media and interactivity online, Flash is king. There are, of course, alternatives gaining traction. Microsoft's Silverlight platform has a number of advantages over Flash and is used by several notable sites, including Netflix. And the new HTML5 standard includes a video tag that makes embedding and playing video in webpages easy across platforms without any third-party plugins.
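The choice sites face here (native HTML5 video where the browser supports it, Flash everywhere else) comes down to simple feature detection. A minimal sketch of that decision logic, with illustrative type and function names that aren't from any real library:

```typescript
// Sketch of the playback-selection pattern: prefer the native HTML5
// <video> element when the browser can decode the codec, and fall back
// to a Flash-based player otherwise. All names here are illustrative.

type Player = "html5" | "flash" | "none";

interface BrowserCaps {
  supportsVideoTag: boolean; // does the browser implement <video>?
  canPlayH264: boolean;      // can it decode H.264 natively?
  hasFlash: boolean;         // is the Flash plugin installed?
}

function choosePlayer(caps: BrowserCaps): Player {
  if (caps.supportsVideoTag && caps.canPlayH264) return "html5";
  if (caps.hasFlash) return "flash";
  return "none";
}

// A browser with <video> and H.264 support needs no plugin at all.
console.log(choosePlayer({ supportsVideoTag: true, canPlayH264: true, hasFlash: true }));  // "html5"
// A browser without <video> support still needs Flash for video.
console.log(choosePlayer({ supportsVideoTag: false, canPlayH264: false, hasFlash: true })); // "flash"
```

In a real page, the capability checks would come from the browser itself (for example, `document.createElement("video").canPlayType(...)`), but the decision reduces to a function like the one above.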
But Flash is still important. HTML5, while excellent and very likely the future of the web, is not ready to replace Flash yet. Not by a long shot. Plus, HTML5 is really only useful for video at the moment. For things like games and interactive media, Flash is still the best option. And if you think Flash games aren't important, think again: the most popular Flash game on Facebook, FarmVille, has over 80 million active users and the company behind it is making millions of dollars.
Admittedly, Flash is overused at times. Personally, I think websites built 100 percent in Flash are a terrible idea. Not only do they fail to load at all on most smartphones, but they make it impossible to link to individual pages. Professional photographers are a prime example: for some reason, the photography world loves building sites that rely entirely on Flash. As a visitor, there is no way for me to send someone a link to a specific page. A particular picture I like? The pricing page? Impossible to link to...my only option is to link to the homepage and include instructions for digging down to the page I want to share. Not cool.
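To be fair, this deep-linking problem had a known workaround even for all-Flash sites: mirror the application's current state in the URL hash fragment, the way libraries such as SWFAddress did. A rough sketch of that mapping (the route shape and function names are my own, for illustration):

```typescript
// Sketch of hash-fragment deep linking: the Flash app's current "page"
// is encoded into location.hash so the URL stays shareable, and the
// hash is decoded back into app state on load. Names are illustrative.

function routeToHash(route: string[]): string {
  // ["gallery", "weddings", "photo 12"] -> "#/gallery/weddings/photo%2012"
  return "#/" + route.map(encodeURIComponent).join("/");
}

function hashToRoute(hash: string): string[] {
  // "#/gallery/weddings/photo%2012" -> ["gallery", "weddings", "photo 12"]
  return hash
    .replace(/^#\/?/, "")
    .split("/")
    .filter((segment) => segment.length > 0)
    .map(decodeURIComponent);
}

console.log(routeToHash(["gallery", "weddings"])); // "#/gallery/weddings"
```

A site using this pattern would update `window.location.hash` whenever the Flash movie changed views, so every view got a linkable URL. Few photography sites bothered, which is exactly the complaint above.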
So love it or hate it, the reality is simple: Flash is a firmly established standard on the web. The average user probably doesn't know or even care about the Flash debate. They simply want to be able to visit their favorite websites and have them work. And for most people, that requires Flash.
Flash Performance on OS X
According to an Ars Technica benchmark, watching a Hulu video (which, of course, uses Flash) on a Windows machine takes about 7 percent of the CPU. Watching the same video on OS X uses almost 60 percent. And, as any Mac user will tell you, watching a Flash video will almost always cause the computer's fans to kick into high gear. Who's to blame for this disparity? Some blame Adobe for making a terrible product, which may be partly true. But ultimately the blame falls on Apple.
In Windows, Flash uses hardware acceleration to decode H.264 video when it's available. This means Flash video decoding is faster and uses considerably fewer system resources on a Windows computer. On a Mac, hardware acceleration isn't possible because Apple does not provide app developers with H.264 hardware-decoding APIs. Flash is forced to rely more heavily on the CPU, and performance suffers. Here's what Adobe's Flash Blog has to say about it:
But let’s talk more about the Flash Player on the Mac. If it is not 100% on par with the Windows player people assume that it is all our fault. The facts show that this is simply not the case ... Unfortunately we could not add this acceleration to the Mac player because Apple does not provide a public API to make this happen. You can easily verify that by asking Apple. I’m happy to say that we still made some improvements for the Mac player when it comes to video playback, but we simply could not implement the hardware acceleration. This is but one example of stumbling blocks we face when it comes to Apple.

This is also mentioned in Adobe's Flash 10.1 release notes (page 15):
In Flash Player 10.1, H.264 hardware acceleration is not supported under Linux and Mac OS. Linux currently lacks a developed standard API that supports H.264 hardware video decoding, and Mac OS X does not expose access to the required APIs.

The latest version of Flash does improve performance on Mac systems, but without access to hardware acceleration, that improvement can only go so far; Flash will always perform better on Windows.
Nobody really knows why Apple makes some of the decisions it does. They are a notoriously secretive company, so most of this is speculation, but their probable reasoning does make some sense. Most likely, Apple limits Flash's access to hardware acceleration out of concern for the user. Apple prides itself on delivering the best user experience possible, and if a plugin like Flash could access deeper system functions, a plugin crash would be far more likely to take the user experience down with it.
This is why Apple likes to control every aspect of their products. And their own video playback product, QuickTime, performs nearly flawlessly on OS X, decoding even HD H.264 video without slowing the system down. Apple argues that by tightly controlling these things, they are providing the best, most consistent user experience possible. And in that respect, they're right. Nowhere is this more apparent than the iPhone. Critics are quick to point out currently missing features (like multitasking) as a huge downside, but iPhone users will tell you that their phones are consistently fast and have strong battery life. It's a definite trade-off, but Apple's reasoning makes sense.
So in the case of Flash, Apple likely wants to control the hardware and software interaction to continue providing a good experience for users.
Why This is a Bad Idea
People could argue (and have argued) that Flash should be eliminated, regardless of performance, because it is a closed system owned by one company, and the web should be built on open standards. I agree with this in theory, but at the moment it's akin to saying cars shouldn't run on gasoline. Yes, there are alternatives to gas that are gaining traction, but they're not ready yet.
Imagine if Ford were to announce that all their cars would be fuel-cell based starting tomorrow. It would be great for the environment, save drivers money, and arguably be a better driving experience, but it would also be a huge inconvenience: fuel-cell refueling stations are extremely scarce and simply aren't ready for widespread use yet.
I realize that the analogy isn't perfect, but it gets the idea across. HTML5 is excellent and it shows what the future of the web can be. But it's not ready to replace Flash yet. No other standard can replace Flash in all uses yet. And for one company to simply decide that they aren't going to support a standard is, I think, counterproductive. Sure, the iPad may get better battery life without supporting Flash, but by not supporting it, Apple is simply not providing the "best way to experience the web" (their words). It may be the best battery experience, or the best speed experience, but the best web experience is the one where users don't have to think about what plugins are installed, because everything works. At the moment, this experience only exists on Windows systems because Flash and other plugins will continue to work poorly on OS X as long as Apple continues to limit their ability to perform.
While Apple is doing its best to eliminate Flash from the Internet, other partnerships are forming that aim to make Flash better and easier to use. The Open Screen Project, headed by Adobe, includes over 50 companies from a wide range of industries and promises to "enable consumers to engage with rich Internet experiences seamlessly across any device, anywhere."
A notable part of this partnership is the newest version of Google's Chrome browser, which ships with Flash included. This is a bigger deal than most people realize, and it's more than a simple bundling of two separate products. Google built Flash into Chrome's update engine and designed a new plugin architecture to support it. This means all Chrome users will get Flash updates automatically and silently in the background, eliminating the security risks of outdated versions. And the new plugin system means that other plugins (like Silverlight, for example) might soon be able to work the same way.
What this means for users is that plugins like Flash and Silverlight become part of the browser, much the same way JavaScript execution is part of the browser now. Ultimately, I think this is what's best for the future of the Internet: website designers can choose whatever development platform they prefer (HTML5, Flash, Silverlight, etc.) and end users barely notice the difference; their favorite websites just work.
There Are Always Trade-Offs
Must HTML5 and Flash be mutually exclusive? Personally, I think choice is always important. For example, I'm a definite Android fan, but I don't hate the alternatives. Windows Phone 7, iPhone, BlackBerry, and the rest are all excellent products with their own sets of happy users. The same holds true for the age-old Windows vs. Mac debate: both have their strengths and both have a place in the world.
Flash is a powerful tool for web designers, advertisers, game developers, and more, and there's very little reason for the outcry against it. Mac users are often very vocal about their disdain for Flash and their hatred of Adobe...and yet Apple is more to blame for Flash's poor performance. That's the trade-off for Apple's superior user experience.
If Mac users don't like Flash's poor performance, they should complain to Apple or switch to Windows, much the same way that Windows users who don't like QuickTime's poor performance should switch to a Mac.
Ultimately, yes, it would be best for all websites to work on all platforms, but as long as we have corporate interests and business partnerships, that's simply not going to happen. Apple is trying to make the web better by eliminating standards, Microsoft is trying to make the web better by creating their own standards, and Google is trying to make the web better by embracing as many existing standards as possible.
This corporate drama will likely continue for a while as standards come and go. Apple's moves make Apple lovers and Flash haters happy, Microsoft's moves make Microsoft fans happy, and Google's moves make open source advocates happy. But what about the average user? They're the ones who stand to suffer if their favorite websites don't work. Think of your non-tech-savvy relatives and imagine trying to explain that they can't play their online games because Flash Player is outdated, or because Silverlight isn't installed, or because their iPad doesn't support that particular website.
The Internet is built on multiple standards and it will always be that way. Choice is what drives the Internet, the tech industry, and the free market in general. Web developers should be able to create their websites in whatever standard they like without fear that it won't work on certain devices. I hope choice continues to be important to huge tech companies so we can all enjoy the best Internet experience possible.