Google's codec may be open, but it certainly is not free to end users. It comes with two caveats: hardware decoding of video streams is lost, and Flash is entrenched.
One of the key components of the H.264 infrastructure is the hardware decoding now built into tens, and quite possibly hundreds, of millions of devices. End users lose the value of that hardware if H.264 goes away. That's literally tens of billions of dollars end users in aggregate would have to put up to replace this hardware for a "free and open" codec. Perhaps in the long run all devices will come with hardware to decode WebM, but at the moment not a single device on the market supports it.
I like open source as much as the next programmer, but end users really don't care about it. I see this heading in one direction: Flash. For those who don't know, Flash supports H.264, and Google bakes the Flash plugin into Chrome. This move to drop H.264 does nothing to help WebM and everything to entrench the closed Flash plugin from Adobe. Since almost every site encodes its video in H.264, sites will use the easiest means of delivering that H.264 video to end users. Since HTML5 is no longer easy, websites will deliver H.264 video wrapped in Flash packaging, as much of the web did until recently, when H.264 delivery via HTML5 started picking up significant steam. So we end up with fewer web standards and a guaranteed proprietary mess right now with Flash, instead of a possible proprietary mess that might manifest in 2015 with H.264.
It could be that if Apple transferred the users over to the new service (if it exists), they would have to grandfather them into the program at the same rates for some time. Given that iTunes already has millions upon millions of users, Apple does not need to migrate people over to keep its user base. This appears to have been a pure technology play for Apple.
Another point to consider is the accounting system that runs iTunes. From what I have heard, the system is difficult, at best, to make changes to. Again, I heard this second hand, but supposedly this is the reason that, when in-app purchase first launched, you could not have in-app purchases in free apps. Once they corrected that bit in the accounting system, they launched it in the iPhone OS SDK.
With that in mind, moving LaLa users into the new system may have proven to be a daunting task. Why bother with the daunting when you can rebuild your user base from scratch, with ease, using your existing sales channels?
In both cases here, with Adobe's solution and the native iPhone language (Objective-C), the programs are compiled into a binary. The native iPhone apps will generally be faster because they don't have an intermediate layer of instructions executing on the processor. Since I don't know the specifics of how Adobe does its cross compilation, I can't say for sure that apps produced with its tools will be much bigger or much slower. There is a distinct chance they could be, but it's not guaranteed; depending on how Adobe has implemented this, they might just be slightly slower and slightly bigger.
As a cross-platform developer who plans to develop for Android and BlackBerry for our product, I don't find much of a problem here.
Saying you can only write in C, C++, and Objective-C leaves a whole heck of a lot of options open. C runs on just about every microprocessor on earth. Our main libraries for our apps are written in C and run fine on Android and iPhone with minor changes for talking to external accessories. Cross-platform development for us is relatively painless compared to the alternatives for our goals.
Sure, I have to write different interface code for each app, but I would do that with or without the restrictions. I am a firm believer that native UI elements provide the best experience, and the only way you get good native UI elements is to write directly against the OS's standard UI libraries. I won't disagree that you can get usable apps from an intermediate layer, but achieving a slick, polished UI through an intermediate layer is next to impossible. It's cheaper to just write directly against the built-in UI libs if you want a great experience... and anyone who disagrees with me is wrong! ;)
Lists are just specialized graphs, and the more links you add to a list, the further you tread into general graph theory. Dijkstra et al. already provided us with proofs of graph traversal algorithms that cover the most efficient methods of traversal. Oh, and Dijkstra was rather anti-hardware. He did almost all his work with a fountain pen and paper, as he realized software was most emphatically NOT hardware, but rather a mathematical construct, often symbolized in computer hardware by the switching on and off of electrical potentials. Note I said symbolized, not the same thing as. Software can also be symbolized in the English language, or in a formal mathematical specification.
Software == Math, and hardware == a Turing-complete machine that does Math.