The “Bad Enough” Issue
Last Tuesday at D8, the annual technology conference hosted by the News Corporation, Steve Jobs gave a rare interview to the Wall Street Journal's Walt Mossberg and Kara Swisher. Much of it was spent deflecting criticism, such as the closed nature of the App Store and the recent string of suicides at Foxconn's manufacturing facilities. When the conversation came around to Apple's exclusion of Flash on iOS devices, Jobs had this to say:
The way we've succeeded is by choosing what horses to ride really carefully, technically. We try to look through these technical vectors that have a future, and that are headed up. Different pieces of technology kind of go in cycles. They have their springs and summers and autumns and then they go to the graveyard. So we try to pick things that are in their springs. [...] And Flash looks like a technology that really had its day, but is waning, and HTML5 looks like the technology that's on the ascendancy right now.
Given all the debate around this issue, Jobs's position is obviously contentious, and not one that most of the industry shares. So, leaving Flash in particular aside, let's examine the general philosophy: is Apple's frequent exclusion of "waning" technologies in its newest products worth the loss of functionality and the decrease in the quality of the user experience?
Apple would argue that when they exclude these technologies, they actually *improve* the user experience. This can happen for any number of reasons: maybe the legacy standard has become redundant and irrelevant over time, or a widespread "better" technology now provides the same functionality as the old one. But just because a new standard is widespread and gaining popularity doesn't mean that a majority of consumers aren't still locked into the old one.
Just ask all those first-generation iMac users stuck with a giant stack of useless floppy disks. And sometimes a newer standard is actually worse than the one it replaced. When sub-$1000 HD camcorders began shipping a few years ago exclusively with USB 2.0 ports, it made sense to abandon the FireWire port on some MacBook Pro models. But since USB 2.0 is noticeably slower than FireWire in sustained transfers, the exclusion drastically impacted the performance of peripherals that do still take advantage of the faster standard, like large-capacity external hard drives.
However, abandoning a legacy standard can convince consumers who have been holding out to finally make the switch, which has a generally positive effect on innovation. Whatever your political stance on the issue, it's hard to deny that HTML5 and CSS3 simply perform better than Flash in Safari, and forcing the switch from floppy to optical discs also eliminated the worry that your data could be magnetically erased. For a company as obsessed with sublime hardware design as Apple, there are also immense practical advantages to dropping old standards. The component that lets a device support FireWire is physically larger than the one for USB 2.0. When the iPod line dropped FireWire support a few years back, Apple was able to avoid increasing the thickness of the device despite increasing its storage capacity.
In the end, it really depends on the use case. Apple was probably right to drop the floppy drive when it did, considering how utterly obsolete the format is today. And FireWire no longer makes a lot of sense when you're just loading up music onto an iPod nano or a lower-capacity iPhone. But creative professional work often requires transferring hundreds of gigabytes of media to and from a computer on a regular basis, so excluding a FireWire port on a laptop that's advertised for exactly that use is pretty inexcusable. In that light, Adobe would probably be more persuasive in its campaign to put Flash on mobile devices if it shifted its criticism of Apple from a political context, like why Apple's business decisions are hurting innovation, to a design context, like the use cases in which Flash can provide an experience that native apps and HTML5 cannot.
Comments
“...excluding a FireWire port on a laptop that’s often advertised for creative professional use…”
Apple has dropped FireWire 400, which has only minor performance advantages over USB 2.0. FireWire 800 is alive and well on all MacBook Pro models, on the iMac, the Mac Pro, and even the Mac mini! Only the MacBook and the MacBook Air are without FireWire ports. Obviously the Air has no room, and the MacBook is not aimed at professionals.
What I find “pretty inexcusable” is the enormous hole in your argument!