How Apple's App Store review process hurt Occipital's RedLaser SDK and my very own 'avit iPhone app
Beyond that, however, Occipital is a Value Adder, just like Joe Hewitt.
Instead of being content with having one of the top-selling apps on the App Store, Occipital, as a Value Adder, decided to devote considerable time, effort, and money to extending the iPhone platform by sharing their technology and giving any developer the ability to add 1D barcode scanning to their apps via a commercial SDK called the RedLaser SDK.
(Occipital takes a well-deserved 10% cut of commercial apps that use the RedLaser SDK.)
The RedLaser SDK
The RedLaser SDK is a beautiful plug-and-play library that makes it trivial to add high-accuracy 1D barcode scanning on any generation of iPhone (not just the 3GS with its auto-focus camera). This is huge. It opens up the iPhone platform to a whole slew of new applications hitherto made infeasible by the poor-quality cameras on older iPhone models and the complexity of implementing accurate and efficient 1D barcode scanning from the ground up.
So how do you think Apple responded to such a wonderful addition to the iPhone platform? By working closely with Occipital to help them in any way they could? By buying Occipital and adding their technology to the official iPhone SDK?
Nope, you guessed it: by rejecting apps built with the RedLaser SDK.
Why? Because, in the RedLaser SDK, Occipital used an undocumented API call in a public framework (not a private API call) to bring real-time barcode scanning to the iPhone platform.
The technical details
The RedLaser SDK which Apple rejected – let's call it the RedLaser Realtime SDK – uses the undocumented UIGetScreenImage call to constantly sample the video stream, trying to detect the presence of a barcode in the image.
In my extensive tests with the RedLaser Realtime SDK while building 'avit, I did not encounter a single crash, nor did I find any other negative repercussions from the use of this call. As far as I'm concerned, there is no technical reason whatsoever for Apple to have rejected the use of this call apart from the fact that it is an undocumented API call in an otherwise public framework.
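At its core, the realtime flow is just a polling loop: grab the current frame, run the recognizer, and stop the moment a barcode is found. Here is a minimal sketch of that idea in Python. Note that `capture_frame` and `detect_barcode` are hypothetical stand-ins of my own invention for the undocumented UIGetScreenImage call and RedLaser's proprietary recognition algorithm, respectively — this is an illustration of the flow, not Occipital's actual implementation.

```python
def scan_realtime(capture_frame, detect_barcode, max_frames=100):
    """Continuously sample camera frames until a barcode is recognized.

    capture_frame: callable returning the current camera frame
        (stands in for the undocumented UIGetScreenImage call).
    detect_barcode: callable returning a decoded barcode string, or
        None if no barcode is visible in the frame
        (stands in for RedLaser's recognition algorithm).
    max_frames: give up after this many samples.
    """
    for _ in range(max_frames):
        frame = capture_frame()
        code = detect_barcode(frame)
        if code is not None:
            return code  # immediate feedback: stop as soon as we see one
    return None  # no barcode found within the sampling budget
```

Because the loop gives feedback on every frame, the user can simply sweep the camera around until the scan succeeds — no taps, no retries.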
Get it straight: This is not Occipital's fault but Apple's shortsightedness
After it came to light that Apple was rejecting the RedLaser Realtime SDK, some people mistakenly began to blame Occipital and RedLaser.
The most public of these accusations came from Nick Lansley of Tesco, in his blog post titled Barcode scanning for iPhone – anyone help?, wherein he lamented the RedLaser Realtime SDK's use of "'hidden' iPhone API calls" and put out a call for someone to create a barcode scanning component using "published iPhone/Cocoa-Touch API calls and not in any way try and use hacked internal knowledge of iPhone."
Unfortunately, what Nick didn't understand was that any real-time solution would have had to use the same undocumented API call. The only alternative was what RedLaser ended up implementing in the next version of their SDK – let's call it the RedLaser Burst SDK – a "tap to scan" flow built on the takePicture() API call.
Thankfully, Nick did post a retraction on his blog after this clear explanation by Jeffrey Powers, co-founder of Occipital:
I thought you would like to know that RedLaser, as an algorithm, has nothing to do with unpublished APIs, but that video processing in general is currently impossible without unpublished API use. We have some confidence that Apple will course-correct on this issue with their next OS update.
In the meantime, we do have a version of RedLaser that avoids this problem which became publicly available as of yesterday. It uses what we call "Photo-Burst" instead of the unpublished API, which means it takes a couple of snapshots very rapidly and then processes those momentarily. At the core it still uses RedLaser's state of the art barcode recognition, and it still works on all iPhone models.
Apple to developers: your app's UX be damned?
From the user's perspective, the realtime nature of the barcode scanner in the RedLaser Realtime SDK provides a beautiful user experience:
You simply aim your iPhone's camera at a barcode and move it around slightly until the phone recognizes the barcode. The flow incorporates elements of game dynamics: through play, the user quickly learns the ideal distance at which barcodes are recognized.
Unfortunately, since Apple decided to reject apps built on the realtime RedLaser SDK, this user experience is no longer available to iPhone users.
In light of Apple's rejection of apps built on the RedLaser Realtime SDK, Occipital was forced to devote yet more development time, effort, and money to reworking a perfectly functional SDK to create the RedLaser Burst SDK, which doesn't use the undocumented API call.
This means that the RedLaser Burst SDK doesn't perform realtime analysis of the video image. Instead, the user has to align the barcode and tap the screen (or a button). At this point, the phone takes two or three pictures of the barcode in rapid succession and attempts to detect the barcode within those images. If it cannot detect it, the user has to align the barcode again and tap to scan again.
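The "tap to scan" flow can be sketched the same way — again a hedged illustration, with `take_picture` and `detect_barcode` as hypothetical placeholders for the takePicture() API call and RedLaser's recognizer, not Occipital's actual code:

```python
def scan_burst(take_picture, detect_barcode, burst_size=3):
    """One 'tap to scan' attempt: capture a short burst of photos
    and try to find a barcode in any of them.

    take_picture: callable returning one camera snapshot
        (stands in for the documented takePicture() API call).
    detect_barcode: callable returning a decoded barcode string,
        or None if no barcode is found in the image.

    Returns the decoded barcode, or None if the whole burst failed
    and the user must re-align the barcode and tap again.
    """
    snapshots = [take_picture() for _ in range(burst_size)]
    for image in snapshots:
        code = detect_barcode(image)
        if code is not None:
            return code
    return None  # burst failed; the user has to try again
```

The crucial difference from the realtime loop is that failure here is user-visible: instead of the phone silently sampling more frames, a failed burst bounces the user back to align-and-tap.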
This, of course, is a step backwards insofar as the user experience is concerned.
In place of a playful process of continuous feedback, you potentially have a series of failures and retries.
Now, thankfully, RedLaser's barcode recognition algorithms are really advanced, so in most cases a single scan with the Burst SDK is all it takes to recognize a barcode. However, accuracy does drop on older handsets, and without continuous feedback it may take the user a little while to find the best focal distance for capturing barcodes.
The saddest part of all this is that Apple made the decision to reject the RedLaser Realtime SDK not on any sound technical grounds but on its shortsighted and inconsistent App Store review policies. Furthermore, it did so in full knowledge that this decision would lead to less-than-optimal user experiences in apps that used the non-realtime alternative.
Can you see how Apple is acting contrary to its own interests here?
Apple is all about user experience.
Here, they are saying that app developers have to sacrifice the user experience of their apps because, well…, because Apple said so, that's why!
And this decision isn't even in Apple's own best interests or, perhaps more importantly, in those of its shareholders.
How Apple's App Store review process is failing its shareholders
If you understand that Apple's bottom line is inextricably linked to sales of apps on the iPhone platform, you begin to understand how its App Store review process actually hurts Apple and its shareholders.
Let's see how Apple's decision to reject apps based on the RedLaser Realtime SDK hurts both itself and its shareholders:
- Apple takes a 30% cut of all app sales.
- Apps that use the RedLaser Realtime SDK will have the best possible user experience and will sell the most units, earning Apple the highest profit.
- Apps that use the RedLaser Burst SDK will have a sub-optimal user experience – even if it is still a good one, it will not, by definition, be the best possible one. This will likely lead to fewer units of the app being sold, and thus Apple will make less profit than it otherwise could have.
Let me state it plainly: by rejecting apps and whole SDKs based on its shortsighted App Store review policies instead of on the value they add to the platform and their user experience, Apple is reducing its own profit and thus failing its shareholders.
And I'm not the only one who thinks so.
Apple's shortsighted policy on disallowing real-time analysis of the video image on iPhones is blocking a whole range of potential augmented reality applications and thus further reducing Apple's potential profits.
Another app not created, another sale unrealized.
Just last night, when I was thinking of writing this blog post, I read a tweet from Tim Sears, author of the excellent Robotvision augmented reality iPhone app (iTunes link):
Spent a few hours last night playing around with face tracking on the iPhone. Too bad rapid image capturing isn't supported. Too bad.
Here was a developer complaining that he couldn't build a cool new iPhone app that used face tracking because of Apple's policy disallowing real-time analysis of the video image from the iPhone's camera. I decided to contact him to ask him what his thoughts were on the subject and he responded with this gem of a quote, which I wholeheartedly agree with (emphasis mine):
I believe Apple needs to change their philosophy to the App Review process, making it less about what technology you use and more about good business decisions. They should be rejecting apps that are abusing infrastructure and are creating a poor experience for the user and accepting apps that are driving innovation. It shouldn't be based around logistical red-tape such as what supported libraries you are using and what text you are entering into your marketing description.
What can I say? Tim said everything I was thinking in one succinct paragraph. I can only hope that someone at Apple reads this, prints out that paragraph in 72pt text, and pastes it all around the Apple campus.
Apple stifling augmented reality apps on the iPhone
Tim goes on to talk about how the burgeoning augmented reality industry on the iPhone is suffering due to Apple's shortsightedness on this issue:
UIGetScreenImage is just one more case for this (on top of the many others such as the CameraOverlay, and the Google Voice fiasco). The young and fertile augmented reality industry is suffering because of it, as well, as there are many more doors to be opened once we can leverage the iPhone for marker-based tracking and object recognition. These are core technical requirements critical to the continued innovation of this industry.
As you can probably tell, Apple's stance on this issue is hurting not just iPhone apps that want to perform barcode scanning but also the takeoff of what is perhaps the hottest and most exciting mobile application category of our day: augmented reality.
Why this matters to me
As you may already know, this is an issue that is close to my heart since my first iPhone app, 'avit, uses the RedLaser SDK.
Because of Apple's rejection of RedLaser's Realtime SDK, I've had to re-architect 'avit to use the RedLaser Burst SDK.
As an independent developer, I've lost unnecessary time and money to this delay.
And yet, it hasn't been all bad.
Although Apple's rejection of the SDK meant that the release of 'avit was delayed (I still haven't submitted it to the App Store – I hope to do so this week after we've signed the license agreement for RedLaser with Occipital), it also meant that I was able to use the extra time to improve the user experience of other parts of the application.
Once I've had a chance to record a new screencast for 'avit, I will write up another post comparing and contrasting the user experience between RedLaser's Realtime and Burst SDK versions in more detail.
And just in case I didn't make it painfully clear earlier, Occipital is in no way to blame for any of this. Their only crime was to try and make the best possible barcode reader for the iPhone.
If anything, Occipital have performed brilliantly and with great patience while firmly lodged between a rock and a hard place. They've also been very helpful to me throughout all of this, with Jeffrey personally keeping me updated on the latest developments and working with me to make the transition to the new SDK as painless as possible.
Also, before anyone jumps to any conclusions, Occipital in no way asked me to write any of this nor have they had prior knowledge of this post. In fact, knowing Jeffrey's humble and non-confrontational approach, he would probably have preferred it had I not written anything at all. That, of course, would not have been possible since I had to explain to you guys why the user experience changed in 'avit since the initial screencast and I couldn't do that without addressing this issue.
Beyond that, I also feel that the issue needs to be addressed because it pertains to more than just my app: it concerns the long-term health of the platform itself. Furthermore, we as developers cannot live in fear of offending the gatekeepers of the platforms that we develop for. To paraphrase Thomas Jefferson:
When developers fear their platform, there is tyranny; when the platform fears developers, there is liberty.
All this to say that I hope Apple will come to its senses and start supporting the real-time analysis of the video image on iPhones so that I can go back to enabling realtime barcode scanning in 'avit, Tim can create his awesome new face-tracking app, and so that someone out there reading this can build something revolutionary that none of us has even thought of yet.
Apple, please hear the rising chorus of discontent from developers who are unhappy with your App Store review process, stop playing gatekeeper and limiting the potential of this awesome platform, and address these issues before the cacophonous crescendo is replaced by a deafening silence.