United Airlines Bug Bounty: An experience in reporting a serious vulnerability

About six months ago, United Airlines announced their Bug Bounty program and, almost immediately, it received a ton of press. Though I initially wasn’t interested in pursuing the program (mainly due to its popularity), I decided to take a look a week or two after its launch despite being late to the party. Shortly after, I identified (and reported) a serious vulnerability in an API endpoint that exposed the PII (personally identifiable information) of any rewards member. Surprisingly, this serious issue took United’s team just under six months to patch.

My research started with United’s mobile app. I created a MileagePlus account, launched the app, and logged in.

While proxying my requests and navigating throughout the app, I noticed an interesting request that seemed to be used to populate my upcoming flights as well as any United Club passes I had purchased. Below is a look at the request:

Note the existence of the mpNumber parameter. Since the user is already authenticated and cookies are presumably being used to track session state, this parameter looked like a candidate for an IDOR (insecure direct object reference) vulnerability. I subsequently created another test MileagePlus account and booked a cheap flight in order to test it. I ran the same request as above, but substituted the MileagePlus number with the one from my test account. Here is the response:

Though a lot of information is displayed here, I thought the most serious exposure was the recordLocator along with the customer’s last name. Using just these two values, an attacker could completely manage any aspect of a flight reservation on United’s website. This includes all departure and arrival details, the reservation payment receipt (payment method and last four digits of the credit card), personal information about passengers (phone numbers, emergency contacts), and the ability to change or cancel the flight.

Club Passes are also exposed by the same response.

Note that the customer’s email address is exposed, as well as the barcode value. This means an attacker could likely gain access to the United Club by spoofing another customer’s barcode value at the entrance, effectively stealing their purchased pass.
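To make the IDOR test concrete: it amounts to replaying your own authenticated request with someone else’s identifier swapped in. Here’s a minimal sketch of that probe in Python. The endpoint path, host, cookie name, and MileagePlus numbers below are entirely hypothetical (only the mpNumber parameter name comes from the actual request); this is an illustration of the technique, not United’s real API.

```python
from urllib.parse import urlencode

def build_upcoming_flights_request(base_url, session_cookie, mp_number):
    """Build the (hypothetical) upcoming-flights request for a given
    MileagePlus number. A correctly implemented API would derive the
    account from the session cookie and ignore mpNumber entirely; the
    IDOR exists because the server trusts the client-supplied ID."""
    query = urlencode({"mpNumber": mp_number})
    return {
        "url": f"{base_url}/mobile/upcomingFlights?{query}",
        "headers": {"Cookie": session_cookie},
    }

# Authenticated as account A, but requesting account B's data -- if the
# response contains B's recordLocator, the endpoint is vulnerable.
probe = build_upcoming_flights_request(
    "https://example.invalid",      # placeholder host
    "SESSIONID=account-a-session",  # account A's valid session cookie
    "MP00000002",                   # account B's MileagePlus number
)
```

The key observation is in the comment: authorization should come from the session, never from an identifier the client is free to change.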

I prepared my report and submitted it to United’s security team. Since I understood they were probably overwhelmed with the number of vulnerability submissions, I expected a delayed acknowledgment/response — I didn’t expect, however, that the issue would remain unpatched five months later. Believing that six months was a more than reasonable time frame to get the issue patched (likely a one-line fix), I ultimately informed them of my intention to publicly disclose the unpatched vulnerability on November 28, six months after my original submission. This gave them a few more weeks to get it patched and, hopefully, avoid public disclosure.

Disclosure Timeline

2015-05-27: Initial vuln report submitted
2015-07-13: Follow-up with United, informed it was a duplicate
2015-07-16: Follow-up with United requesting estimated patch date
2015-07-22: United replies that only original submitter will receive updates. I request that they reconsider, they agree to inform me when it’s patched
2015-08-12: Follow-up with United requesting status
2015-08-13: Received reply, “the bug is validated and awaiting implementation”
2015-09-15: Another follow-up
2015-09-16: Received reply, “submission is in queue”
2015-11-05: Email (and tweet) United informing them of decision to publicly disclose on 11/28
2015-11-06: United replies thanking me for the heads up and reiterates that the number of submissions has delayed them in fixing issues. I’m reminded that public disclosure means disqualification from the program.
2015-11-12: I engage media contact and explain the history of the issue
2015-11-13: Media contacts United for comment on situation, expecting to run story on 11/14
2015-11-14: United responds (at 1:00am) that the issue has been patched. I tested to confirm.

My report was actually marked as a duplicate — not unexpected due to my delay in getting started. This is largely irrelevant to the rest of the post, but interesting in how it relates to the terms and conditions of United’s bug bounty program. One of the terms is that public disclosure (of any kind) will result in permanent disqualification from their program and loss of any reward; indeed I was reminded of this by United’s team when I informed them of my intention to go full disclosure. Since I was not to receive an award regardless, and I didn’t have further interest in submitting to the program, I accepted the threat of disqualification.

Overall, I think bug bounty programs are a great step in the right direction, but running one effectively is critical. Though the intention to publicly disclose the vulnerability appears to have pressured United to fix it, I suspect that the request for comment by media personnel ultimately forced them to take the necessary action.

  • Binary Override

    Very nice find. I wonder if they in fact had a “duplicate” submission, or if this was just a way to get out of paying…

    • Stevie Graham

      You can still intercept even with pinned certs by using a dylib to decorate the networking APIs with code to log the args and return values.

      • Iñaki Rodríguez

        Usually setting the proxy is enough. You can do it with Burp; the only thing you have to do is install the CA on the device.

        • Stevie Graham

          Not with SSL pinning. That’s the point of pinning, i.e. to stop MITM via CA substitution.

          • Iñaki Rodríguez

            Agree!! That is the reason I said “usually”. Most of the time the CA “trick” is enough, unless the app really cares about security. But you are totally right 😀

  • Matson, Gregory P.

    Nice find!

    While 5-6 months is quite excessive, I wanted to add some thoughts here:

    With the overwhelming exposure, from both the security community and mainstream media, that United got when launching their public bug bounty program, they’ve probably received a huge number of vulnerability reports. Even with only a subset of those submissions being valid, the engineering team suddenly has a tonne of work on their hands. That’s not to justify a six-month wait, only to guess at how it must be on the other side. Also, United is the first major airline to even start a bug bounty program – so as bad as your timeline turned out, they’re actually trying to work with the security community, more so than other airlines.

    Lastly, enjoy your time in Orlando in December, all four hours!

  • JL

    Seems like large businesses like this could use a bug bounty service managed by a third party with expertise in engineering management. Rather than throwing this on their IT and public relations teams (a poor match for public scrutiny), a bug bounty program could be more effectively managed by an organization with both the public relations resources and the engineering expertise to work with the customer’s IT, identify the real vulnerabilities, and get them patched in days and hours rather than weeks and months.

    Sorta like what Kaggle has for Wisdom of Crowds for solving data science problems, someone could do this for high profile apps and websites.

    Does someone offer this type of service?

  • nikisweeting

    I’ve grown to expect a 3-4 month delay from large companies that have recently opened their programs to submissions. But 6 MONTHS?! That’s crazy, especially for a submission of this severity.

    It feels naive and unjustified to simply declare that this is easy to fix and shouldn’t take long. And I don’t think this article discusses the very real risks that this kind of threat carries. Individual researchers have no realistic perspective on the whole of something as big as United. There’s an implicit assumption that the researcher cares more about the public than United does, and that United is doing things in the wrong priority order. Basically this post asks the rhetorical question “what could possibly be more important than the bug I reported?”

    I feel like there are some unsubstantiated assumptions here. Like (a) United did absolutely nothing for the entire 6 months and suddenly rushed out a fix when threatened, because it’s just that easy. Or (b) there was nothing more urgent to do than this fix, as if bug fixes take zero time to implement and zero time to test.

    Some possibilities that aren’t even contemplated here:

    (1) United knew that it would be a while, so perhaps they put in a bunch of detective controls (e.g., throttling / rate limiting / monitoring) around the vulnerable interfaces. It wouldn’t stop the odd test from succeeding, but it would allow them to respond to wholesale abuse of the vulnerability. This post doesn’t consider that maybe they took some operational actions. They would not be a complete solution, but it’s far better than nothing if you know it will be a while before the software is actually fixed.

    (2) Perhaps there were a lot of other issues that were higher priority. This was bad, sure. But what if there were several that were worse? And what if those worse issues were complicated, so they took longer to fix? And fixes have to be tested. And tests often involve multiple teams, possibly at third party vendors and business partners. This article is written from the perspective of the researcher, for whom this is the worst thing he’s aware of at United. It’s not written from the perspective of United, for whom this is just one of many, many bugs and might not be the worst they’re dealing with.

    (3) What if responding to the imminent threat of disclosure on this vulnerability caused them to leave a worse, undisclosed bug unfixed for even longer? Maybe the researcher involved in an even worse bug was playing nicely and not threatening. But now they have to drop everything and fix the one that will be disclosed. There’s no consideration in this post for the negative possibilities that could come from threatening public disclosure. Would United have to leave something important unfixed while they respond to the media hype?

    Threatening enterprises like this seems reckless and naive. Sure, some enterprises drag their feet, try to ignore security bugs, and never fix them. But those are not the enterprises who CREATE BUG BOUNTY programmes. This kind of threat is exactly why people dread bug bounty programmes: they worry that security researchers will start bullying them and they’ll lose control of their development schedule, fixing bugs based on media crises instead of deliberate development schedules. I don’t see what this kind of bullying served other than the researcher’s own ego.

    • Randy Westergren

      You’ve made some fair points and I agree with a few of them to an extent.

      I would disagree with any sense of naivety with respect to how simple the vulnerability was to patch. Neither one of us knows for sure how much effort was required, but given that most of my experience comes from the engineering side of the equation, I think I have an idea. It’s widely accepted that patching an IDOR is very low on the difficulty scale.

      A number of your points are largely irrelevant to the fact that United operates a bug bounty program at all, at least from my perspective. If you read through most of my blog posts, I overwhelmingly work with companies who do NOT operate bug bounties, yet I hold them to the same standard.

      Your point regarding the opportunity cost of fixing my bug, as opposed to potentially something even more serious is fair. Just like bounty programs have terms, I think we have to strike a balance as researchers with our own terms. I believe six months is more than a fair deadline for something so serious and easily patched. Alternatively, if researchers wait forever and don’t pressure vendors, we’ll likely end up in a worse scenario. If you look over my other posts, you’ll see I try to work with vendors as much as possible. Drawing a line in the sand, whether you agree about where that line should be or not, hardly seems like “bullying” to me.

      “Sure, some enterprises drag their feet and try to ignore security bugs and not fix them. But those are not the enterprises who CREATE BUG BOUNTY programmes.”

      I think United does a great job of refuting that point in this example. Is a poorly run bug bounty really any better than not having one at all?

  • martin hall

    I’ve found that United’s bug bounty was created to have people find issues for free. In my experience they treat every subdomain as an “external” site, even booking.united.com.

    I’ve found over 5 bugs, all submitted, and every time I was told it was an external site.
    To me an external site is a site managed by a 3rd party; in this case, however, the sites were managed by the airline and also most likely coded internally.

    I gave up submitting to them.