Friday, March 8, 2013


Just yesterday I read a post in a group where a developer proudly presented how his game runs on pretty much any iOS device. But he could not do that without adding that he would never develop for Android because of the fragmentation issue. A similar thing happened today on Twitter, where @gaminghorror wrote "App developers: stick to iOS! Seriously:" and was heavily retweeted, the one I read first prefixed with "Holy cow! 80% of active device == 156 device models".

Actually, this makes me angry. I've been developing software since I was about 11 or 12 years old, and I'm 41 now. I saw how the transition away from assembler on fixed hardware in a handful of systems (most stuck to the Commodore 64) broke the neck of many a famous game designer of that era, when home computers started to have not only an operating system but also dynamic RAM management, and, oh my god, with the advent of the later Amiga versions and the big success of the highly modular IBM-compatible PCs it was over for them. They couldn't think outside the box of full control and the comfort of knowing pretty much exactly what hardware was out there.

So gaming died out, didn't it?

Um, no. And even today's consoles, despite offering the same predictable hardware per brand and generation, are no longer programmed that way.

If you develop an AAA title today, you either build or license an engine, and it's part of said engine's job to make sure the game runs on as many systems as possible without bothering the game designers with it. Rockstar, for example, built the RAGE engine. While the engine has different code on every platform a game is available on, the parts that make up an individual game, the design, scripting and so on, are the same. And that same engine makes it possible to run the game on a 640x400 display as well as on... holy cow, are these three Full HD displays? The engine is now five years old, but it was used in games as recent (and as different) as L.A. Noire and Red Dead Redemption. That the latter is console-only is pure politics, not a technical limitation.

But then, Rockstar is hardly an indie and can throw 150 million US dollars at a project. So is that article on Flurry, "Are Indie App Developers Becoming an Endangered Species?", right, and if so, is this because of Android's "fragmentation" ((c) Steve Jobs) issue?

Darn, no! Well, maybe the simple-minded ones, the ones that retweeted, will go the same route as the 8-bit game developers. See, not only is that assumption wrong, in reality it's just the other way around. The breeding ground for applications ("apps", (c) Steve Jobs) has been the web (first with CGI, then more and more advanced languages on the server side), Java, Flash and now HTML5, and what they all have in common is independence from any particular platform on the user's side. And all of them had to cope with the fact that they cannot exactly predict the size of the screen, or even the means of input, on the user's side. Works like a charm on most systems, with the exception of one: iOS.

Meet the relevant part of the iOS SDK agreement:
"3.3.2 — An Application may not itself install or launch other executable code by any means, including without limitation through the use of a plug-in architecture, calling other frameworks, other APIs or otherwise. No interpreted code may be downloaded or used in an Application except for code that is interpreted and run by Apple’s Documented APIs and built-in interpreter(s)."

So the Java runtime, the Flash player, and anything like them are not allowed on iOS. And JavaScript (as in HTML5) only works as a Safari module, which any other browser on the system has to embed in order to use it. Development for iOS pretty much means coding in the awful Objective-C and maybe some C or C++.

Good that there are still solutions out there, like Xamarin, which compiles from C#, or game engines like Game Maker or Unity, all in price ranges accessible to indies.

The article then lists the distribution of Android phones across such-and-such a percentage of the Android mobile market and compares that with the much smaller number of Apple mobile devices. That is pretty much irrelevant on two grounds:

Different screen resolutions. Actually, I don't see the difference here. On Apple's side it starts with 320x480 on an iPhone 1 and currently ends with an iPad 4 at 2048x1536, all at 4:3; oh no, wait, there is the iPhone 5 with 16:9, so you have to take care of that too. That's quite a spread for anyone who wants the luxury of painting every pixel on the screen at an exact point, especially since the framework (the iOS SDK, see above) doesn't really support dynamic UIs. Usually developers don't develop for anything less than the iPhone 3GS anymore.

With Android you have phones starting at a 200x320 display. But users of those don't expect to get the same apps that exist for other resolutions; if you want to support anything below, let's say, 640x400 today, you have to create a UI for it, just as tablets are often better served if you shape the UI accordingly. The most common formats are 640x480 (older or very cheap phones), 800x480 (cheap phones or tablets, and some last-generation devices like the Galaxy S2), 1280x720 (or 1280x800 with extra space for the soft keys), and, in the near future, Full HD at 1920x1080. You may notice that these higher resolutions orient themselves on the standards for video, displays and TV. The whole reason the iPhone 5 broke the 4:3 ratio was exactly that, but it failed to adjust the pixel density. The very lack of support for dynamic screen sizes has now come back to bite Apple: had they changed to a video-friendly HD resolution too, the developers would have rebelled, because they would have had to redo the whole layout, not just stretch it somewhere in the middle ;-)
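Android's answer to this zoo of resolutions, by the way, is to let you specify UI sizes in density-independent pixels (dp), which the system maps to physical pixels at runtime. Here is a minimal sketch of that mapping in plain Java; on a real device the framework does this for you via DisplayMetrics, and the three densities below are just example values:

```java
// Sketch of Android's dp-to-pixel mapping:
// px = dp * (densityDpi / 160), where 160 dpi is the baseline ("mdpi").
// The same 48 dp touch target ends up a similar physical size everywhere.
public class DpConverter {
    static final float BASELINE_DPI = 160f;

    public static int dpToPx(float dp, float densityDpi) {
        return Math.round(dp * (densityDpi / BASELINE_DPI));
    }

    public static void main(String[] args) {
        // A 48 dp button on three hypothetical screens:
        System.out.println(dpToPx(48, 160)); // mdpi phone   -> 48 px
        System.out.println(dpToPx(48, 320)); // xhdpi phone  -> 96 px
        System.out.println(dpToPx(48, 480)); // xxhdpi HD    -> 144 px
    }
}
```

That is why an Android layout written once in dp stretches sensibly from a cheap 800x480 phone to a Full HD one, instead of needing a pixel-exact redo per model.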

Image: The YouTube app on an iPhone 5, before it was redesigned to use all the space, while on Android it scales pretty well, here on a Samsung Galaxy S3 in HD, like the video shown.

For games it's pretty easy on both systems: you support two or three resolutions and scale the screen accordingly.
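What such scaling boils down to can be sketched in a few lines. This assumes a fixed game canvas drawn once and letterboxed onto whatever screen the device reports; the canvas and screen sizes are just example values:

```java
// Sketch: fit a fixed game canvas into an arbitrary screen resolution
// with uniform scaling (no distortion), letterboxing any leftover space.
public class GameScaler {
    public static float fitScale(int canvasW, int canvasH, int screenW, int screenH) {
        float sx = (float) screenW / canvasW;
        float sy = (float) screenH / canvasH;
        return Math.min(sx, sy); // the smaller factor keeps everything on screen
    }

    public static void main(String[] args) {
        // An 800x480 canvas on a 1280x720 display (e.g. a Galaxy S3):
        System.out.println(fitScale(800, 480, 1280, 720)); // 1.5
    }
}
```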

But what about those many, many different devices? Android itself provides a pretty good abstraction layer, especially when you use Java, and if you need certain hardware, you can easily check for it in code. So it's pretty much a non-issue that only becomes one when you have learned to develop on a very limited platform and have never done anything else. Speaking of "fragmentation" only shows every other developer that you are very closed-minded and can only move on a very small canvas, most likely provided by a certain fruit company. It's no coincidence that the CEO of that company invented the use of the word in this context. That guy was a marketing guru, not a developer, and if you actually listen to his older speeches he once said the exact opposite. But I love the arrogance inherent in it: the whole world does it wrong, only Apple does it right. And you wonder why the rest of the world thinks you also believe in the Second Coming of Jobs?
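That hardware check really is one call on Android: PackageManager.hasSystemFeature(String). Here is a self-contained sketch of the same gating idea, with the device's feature set faked as a plain Set so it runs outside the framework; the FeatureGate class and the button labels are made up for illustration:

```java
import java.util.Set;

// Sketch: gate optional functionality on a runtime capability check instead
// of assuming fixed hardware. On a real device you would ask
// PackageManager.hasSystemFeature("android.hardware.camera") instead of
// consulting this hand-rolled Set.
public class FeatureGate {
    private final Set<String> deviceFeatures;

    public FeatureGate(Set<String> deviceFeatures) {
        this.deviceFeatures = deviceFeatures;
    }

    public String cameraButtonLabel() {
        // Degrade gracefully on camera-less devices instead of crashing.
        return deviceFeatures.contains("android.hardware.camera")
                ? "Take photo"
                : "Pick from gallery";
    }

    public static void main(String[] args) {
        FeatureGate phone = new FeatureGate(Set.of("android.hardware.camera"));
        FeatureGate tablet = new FeatureGate(Set.of());
        System.out.println(phone.cameraButtonLabel());  // Take photo
        System.out.println(tablet.cameraButtonLabel()); // Pick from gallery
    }
}
```

One check per capability, one fallback per check: that is the whole "fragmentation" problem on the hardware side.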
