Your analysis of the statement is incorrect. I am saying that decompiling the app is not at all a necessary step, because it is not. But the likelihood of someone finding that particular app and discovering the exploit is irrelevant to the argument.
> (As for the rest of your argument, I really can't make any
> sense of it. You seem to be agreeing with me, then claiming
If you are referring to the "actions may be defensible legally" aspect of the conversation, then yes - I am agreeing with you. I hadn't claimed (or meant to) anything to the contrary.
The simple fact is that Miller broke the agreement he signed with Apple when he uploaded a compromised application to the App Store. As a result, his license was suspended per that agreement.
Others have pointed this out from slightly different angles...
See John Fenderson's post in this thread, or the "Two things" or "He broke the App store agreement..." threads below.
> The article says he did report the vuln to Apple.
True. After he had already created and gotten an app approved into the App Store. So the vulnerability was public.
> Can you cite any evidence to show when the app was
> uploaded and when Apple was notified?
From the original Forbes article:
"Miller had, admittedly, created a proof-of-concept application to demonstrate his security exploit, and even gotten Apple to approve it for distribution in Apple’s App Store by hiding it inside a fake stock ticker program..."
> Also, how can you prove that a vuln exists without a
> proof-of-concept exploit? Is there any way to confirm the
> vuln without uploading something to the app store?
It depends on what you claim the ultimate vulnerability is. This is being discussed in greater detail in other threads where people are ripping me apart.
There are two vulnerabilities here:
1) The ability to actually run untrusted code
2) The ability to get an app approved in the App Store that exploits that.
In answer to your question for those two points:
1) Yes. This can be shown outside the App Store.
2) No, obviously not, since getting the app approved is the end-goal.
Had he made #1 public first, then demonstrated #2 after giving Apple a chance to respond, this would absolutely be all about Apple covering something up.
> Without doing so, it may be impossible to confirm and
> demonstrate the bug.
In part, I agree. To expand on something you mentioned earlier in your post...
> The ultimate demonstration is having a signed, trusted
> app do something untrusted.
I would expand on that and add that getting such an app into the App Store (i.e., getting it approved by Apple itself) is the ultimate demonstration. You may have been implying this already by "signed, trusted app".
To show that a signed app, running in the iOS sandbox, could do something untrusted does not require its presence in the App Store. You can demonstrate this on a development platform.
It becomes tricky when you add "trusted" into it -- as you point out. At what point is it "trusted"? When Apple approves it for the App Store?
I'll say "yes" to the above. The app has passed the final review, so it can't be any more trusted than that.
The bump in the road here is that the exploit was not previously disclosed. The first step is to reveal the vulnerability and give the appropriate company (Apple, in this case) time to fix it (how much time is a different discussion).
If the exploit had been disclosed, it would have provided Apple the opportunity to (1) fix it, or at least (2) watch for it and deny that "trusted" status so it never made it to the App Store.
But, okay. People don't agree with that sentiment. We'll switch up the argument -- creating the app and getting it onto the App Store to demonstrate the full vulnerability was the right course.
Miller should have, or could have: (1) pointed out the issue to Apple privately, (2) publicly exposed the issue and removed the app, or (3) discreetly waited until the security conference to expose it. Unfortunately, he made it public on Twitter well in advance.
I *don't* believe he meant this to be malicious in any way. But he isn't just a "messenger".
Miller did not inform Apple of the bug beforehand. He did not provide an opportunity to acknowledge and fix the bug.
He knowingly uploaded an app that violated the terms and introduced a vulnerability. Then, over a month later, he made a public announcement that the bug existed and that the app to prove it was already in the App Store.
Your timeline of what happened is incorrect, Mike.
The author did not report the bug and wait. He didn't tell Apple he found an exploitation and has an app waiting to demonstrate it. He didn't upload the app to the App Store after waiting a week, month... or however long.
The reality is Miller found an exploit, created an app, and uploaded it to the App Store. After that, he made public that he had found an exploit and demonstrated it using the publicly available app (which had been there for over a month).
There was no bug reported. There was no opportunity to force the corporate hand. What you don't do is exploit that bug in the working sandbox first.
Did we not read the last paragraph in my original post? The part that points out that one *SHOULD* report bugs and create something that demonstrates the exploitation?
Nice use of ellipsis -- it's like a movie quote! Looking beyond the omission of "although unlikely" in the original sentence, your analysis of the level of effort to discover such an item is over-complicated.
Understanding the nature of what the vulnerability was (in hindsight) illustrates that it could have been discovered by very simple means, as unlikely as an individual picking this particular app is.
The discussions over at Gizmodo on this story are at least somewhat insightful from a technical point of view, instead of people just running at it from a "big company keeping down the little guy" point of view.
Apple has a sandbox called the "App Store". They've made rules for playing in the sandbox, like "don't piss in the sandbox". If you piss in the sandbox, you're not allowed to play in the sandbox for a little while.
> Apple's actions may be defensible legally, but only
You're stretching the notion of "legally" a bit there. We're talking about a private company here. If Apple wanted to refuse an app because they didn't like the color scheme, they could. They don't have to have a reason to reject an app from the store -- they just can, because they want to.
This isn't a moral question. There are rules set up for developers who wish to participate in the App Store. Miller *BROKE* those rules!
Should Apple be working their butts off to fix this? Yes.
Should they have fixed it sooner? Dunno. Maybe they've been trying since it was discovered. Maybe they've been busy playing table tennis instead.
Could Apple have looked the other way? Could they have punished him a little less? Could they still reverse or revise the decision? Yes. Yes. Yes.
Do they need to? No. A developer knowingly introduced an app into the store that exploited a security flaw, and was punished according to the agreements he signed.