366

I have found out recently that the remote assistant software that we put in a smartphone we sell can be activated by us without user approval.

We are not using this option, and it is probably there by mistake. But the people who are responsible for this system don't see it as a big deal. Because “We are not going to use it”…

Am I wrong for going ballistic over it?

What would you do about it if it was your workplace?

This question was IT Security Question of the Week.
Read the Jun 16, 2012 blog entry for more details or submit your own Question of the Week.

anonymousquery
  • 2,991
  • 2
  • 13
  • 4
  • 2
    Is there a use case for the software for situations where the legitimate user may not have control of the device? Say, is there 'wipe device' or 'hard-lock device' functionality? I don't imagine someone who _stole_ the device is going to want to accept remote actions that render the device worthless. – Clockwork-Muse May 17 '12 at 17:35
  • 50
    Which carriers use your phone, so I can switch out my provider if necessary? – Joshua Drake May 17 '12 at 18:53
  • Did your company develop the software or are you licensing it from someone else? – Paperjam May 17 '12 at 21:08
  • 5
    Screenshots or it didn't happen! ;) – lisa17 May 17 '12 at 22:03
  • 147
    one word **wikileaks** –  May 17 '12 at 22:13
  • 16
    The largest ISP in Germany got a lot of [bad press coverage](http://www.heise.de/netze/meldung/WLAN-Hintertuer-in-Telekom-Routern-1558346.html) during the last month because a backdoor account in their enduser routers was discovered. – Hendrik Brummermann May 17 '12 at 22:19
  • 1
    Do you mean [Carrier IQ](http://www.zdnet.com/blog/btl/which-phones-networks-run-carrier-iq-mobile-tracking-software/64500)? Or is this something newer and perhaps not yet widely known? – sarnold May 17 '12 at 23:12
  • 84
    I call bullshit on "we're not going to use it". You don't put that kind of backdoor in unless you absolutely plan on using it, or giving it to someone else to use. Blow the whistle. – Shadur May 17 '12 at 23:23
  • 1
    This sounds like Google's "Do No Evil" – BenjaminRH May 18 '12 at 01:51
  • 6
    @Shadur From my (limited) experience, in large companies things like this may happen purely due to the lack of organization. It doesn't make it less dangerous, but it's not necessarily because of an evil intent. – Anton Strogonoff May 18 '12 at 02:18
  • 3
    @anonymousquery, What are the implications of activating the remote assistant? E.g., will it be discoverable by user, does it allow controlling the system or reading private data? – Anton Strogonoff May 18 '12 at 02:19
  • 40
    "What would happen to our reputation if it became public that we had a backdoor on all those phones?" That question will force a patch ... – schroeder May 18 '12 at 04:37
  • 1
    ZTE recently had bad luck with a similar problem http://www.theverge.com/2012/5/18/3028207/zte-score-backdoor-vulnerability-confirmed-skate There are other recent examples where a default account and password (which cannot be disabled or modified) in physical hardware has raised lots of questions about how to protect devices like that (i.e. infrastructure equipment). – Ramhound May 18 '12 at 12:52
  • 9
    @AntonStrogonoff Well said. "Never attribute to malice that which is adequately explained by stupidity." – Frank Farmer May 19 '12 at 16:47
  • 7
    @FrankFarmer I'm pretty sure that saying was invented by malicious people to make it easier for them to play stupid when they're caught – Michael Mrozek May 19 '12 at 22:07
  • 1
    Hoping this post and the ZTE news coming in such close proximity means you got your way, grats man :) – Will Buck May 20 '12 at 03:04
  • 1
    Any chance we could get brand/model of the phone? I would like to make sure its not the phone I'm using. Does it run Android? – Nathan Schwermann May 20 '12 at 04:33
  • 32
    I need a nuclear bomb. I am not going to use it. I just need it. – Jus12 May 20 '12 at 14:53
  • @schwiz Probably not. In the event of an internal witch hunt naming names here would go a long way towards deanonymising anonymousquery. – Dan Is Fiddling By Firelight May 21 '12 at 19:26
  • .....don't see it as a big deal. Because “We are not going to use it” - they don't seem to care about their customers. They could get sued once clients find out. – Falaque May 19 '12 at 11:17
  • 1
    They may not use it, but some 3-letter organization will if they have a court order – TruthOf42 Apr 16 '14 at 13:03
  • 3
    Contact the [Electronic Frontier Foundation](https://eff.org), anonymously if need be. Ask them for help finding a lawyer who will help you establish whether - and if so, how - you can perform a responsible disclosure. – sampablokuper Apr 18 '14 at 21:44
  • 1
    You work for Samsung perhaps? http://redmine.replicant.us/projects/replicant/wiki/SamsungGalaxyBackdoor – Stephen Gornick Jul 14 '14 at 05:37
  • 1
    I wonder what eventually happened with this situation. Though, looking at the profile, the OP never logged in here again after asking the question, so I doubt he'll see my comment :( – Radu Murzea Aug 18 '17 at 14:01

22 Answers

320

Just because they won't use it, doesn't mean someone else won't find it and use it.

A backdoor is a built-in vulnerability and can be used by anyone. You should explain that doing something like this is very risky for your company. What happens when some malicious attacker finds this backdoor and uses it? This will cost your company a lot of time and money to fix. And what will your company say when people ask why the software contained that backdoor in the first place? The company's reputation might be damaged forever.

The risk is certainly not worth having it in the code.

schroeder
  • 125,553
  • 55
  • 289
  • 326
Oleksi
  • 4,839
  • 2
  • 20
  • 26
  • 76
    +1 All things should be documented and shared with the user. No backdoors should be allowed. [MSFT even went as far as banning harmless easter eggs from all software](http://blogs.msdn.com/b/larryosterman/archive/2005/10/21/483608.aspx) since it implies that code contains hidden features & functionality. Your company should follow the lead of Trustworthy Computing or risk going down a rabbit hole they may not recover from (bad PR, lost sales, etc) – makerofthings7 May 17 '12 at 18:51
  • 65
    +1 "But we're not going to use it" is the *worst* excuse for a backdoor. If you're not going to use it, then what's the point? Someone else will, almost guaranteed. At least if you plan to use it for some reason, it has a reason for being and you can weigh the risks involved. – Matthew Scharley May 17 '12 at 23:48
  • 2
    It's unrealistic to assume that everyone in a large company thinks in the same way, so one person saying "But we're not going to use it" doesn't mean much unless it is backed by explicit policy and repercussions against those who use it. Unlikely to happen if the powers that be are happy to even have it exist. – Burhan Ali May 18 '12 at 05:21
  • 10
    @makerofthings7 Just to be clear on your MSFT statement. They didn't stop it because they chose to. They were ordered by law to stop it. "to supply certain government agencies with software, Microsoft can't include undocumented features, including Easter eggs". – SpoiledTechie.com May 19 '12 at 00:28
  • 1
    @SpoiledTechie.com Thanks for the correction. I'll shift some sentiment of appreciation to the government ... for insisting on doing the right things right – makerofthings7 May 19 '12 at 01:24
  • 10
    Software is just like a contract. If you see a clause in a contract that could get you REALLY screwed over you don't sign the contract until they take it out. Them saying "but we're never going to use the clause" is just silly. It only takes one upset employee who knows about it to do mega-damage to your company. – Philip Couling May 20 '12 at 09:15
  • It's easily possible to write backdoors that can't be used by anybody else. – CodesInChaos May 10 '13 at 08:46
  • The "IT Security Question of the Week" link is broken – Pro Q Jun 13 '18 at 23:33
117

If you've informed decision-makers and they've decided not to do anything about it, then by definition your company is knowingly shipping a product with a serious security vulnerability. (And, I assume, hiding it from their customers.) This is a very serious matter. What's the worst that a malicious person with access to this backdoor could do? If it's bad enough, I would go to the FBI about it. (Or whoever has jurisdiction over computer security if you're not in the US.)

If your company knows about the problem and doesn't care, then exposing it is the only ethical course of action. And if they attempt to take retaliatory action against you, you may have legal recourses available, depending on the circumstances and the laws where you live. (Talk to a lawyer about that if you think it might apply in your case.)

Mason Wheeler
  • 1,635
  • 1
  • 11
  • 15
  • 37
    +1 Because I agree with the idea, but I think you will probably lose your job, and you will have to spend the rest of your life harvesting apples because no one else is going to hire you... This is not an easy decision to make. – yms May 17 '12 at 22:08
  • 4
    @Mason Could you reference the legal provisions that are offered to protect whistle blowers in this type of scenario? – Rob May 17 '12 at 22:09
  • 2
    @Rob: That's interesting. When I looked at it some more, it looks like whistleblower protection laws generally only apply to government employees. I'll edit my answer. – Mason Wheeler May 17 '12 at 22:34
  • 8
    There's a whole [US government website](http://www.whistleblowers.gov/) dedicated to whistleblowers, including those who are reporting on breaches of "violations of ... consumer product .. and securities laws." There's likely to be something similar, in most rich countries.. – naught101 May 18 '12 at 00:33
  • 3
    It is not obvious to me that (1) exposing it is the only ethical course of action; (2) the FBI has jurisdiction or interest; and (3) there is any legal recourses for whistleblowers. What is the difference between an undocumented feature and a poorly documented feature? – emory May 20 '12 at 20:27
  • 1
    The UK government's website for the [UK Whistleblowers act](https://www.gov.uk/whistleblowing/overview). – Greenonline Jun 27 '15 at 11:16
69

Please pardon my cynicism, but this isn't the first and won't be the last backdoor we see in our legitimate, hard-earned apps and devices. Just to refresh our memory, we can start with the most recent one, Amazon's new Big Brother Kindle [1][2].

But we have an entire plethora of backdoored software and services, such as PGP Disk Encryption [3][4], ProFTPD [5] or Hushmail [6], to name a few.

And don't forget the OSes: M$ is always ahead with its NSA_KEY [7][8], but OpenBSD [9] and the Linux kernel [10] can't be considered 100% safe either. We also have the NSA's paid attempts to gain backdoor access to Skype [11], which, however, has been assessed as "architecturally secure" [12].

Moving down to firmware, nowadays we have almost become acclimatized to people from our ISP being able to look inside our routers (yes, maybe even see our beloved WPA password), but these [13][14][15] can surely be considered backdoors too!

Finally, a few considerations on hardware and BIOSes [16], and (this is both funny and somehow dramatic) EULAs [17][18], because lawyers have their backdoors too.

OK, given this preamble, I'll try to answer the question briefly. No, you're not wrong to get mad about this, but you should focus your anger on the correct motivation. You should be angry because you have lost a piece of trust in the company you work for, not about the backdoor itself (leave that anger to the customers).

And if I were you, I'd be very cautious. First, I'd make really, really sure that what I saw was a backdoor, I mean legally speaking. Second, I'd try every way to convince the company to remove it.

You probably signed an NDA [19] with your company, so your question here could already be a violation. However, I don't know where the NDA ends and your state law begins (this could even be customer fraud), and, due to the technicality of the subject, probably only a highly specialized lawyer could help you with this matter. So, if you want to proceed, before doing anything else (even talking to the authorities) you should hire a very skilled lawyer and be prepared to lose a lot of time and money, or even your job.

blade19899
  • 3,621
  • 3
  • 13
  • 18
Avio
  • 830
  • 7
  • 7
  • 6
    The PGP case does not seem like a backdoor to me. – ypercubeᵀᴹ May 17 '12 at 23:57
  • 4
    The Skype article seems out of place. It was written in 2009 and even the article makes it sound like Skype decline offers to create a backdoor. – Ramhound May 18 '12 at 13:09
  • I wasn't saying that Skype _has_ a backdoor, but I was pointing out that the NSA, a government agency, offered money (a lot of money) to have the job done. Is it legal even to ask for such a thing? What happens if you offer to pay someone to break into someone else's telephone line and get caught? This is just to say that the bigger the company, the less likely they are going to pay for their illegal acts. – Avio May 18 '12 at 14:20
  • 2
    Not to mention all the software that offers automatic updates... – Hendrik Brummermann May 18 '12 at 16:29
  • 45
    I think your huge number of links is extremely misleading. Reading it over at first, I had the impression you were pointing out all these instances of backdoors. Then, I went back to read the links, and I found a couple instances of genuine backdoors, most of which were not secrets, a hacked server hosting a malicious version of a piece of software for a short period of time, and then a mixture of things which were all *not* backdoors: failed attempts at backdoors, falsely identified non-backdoors, unsubstantiated rumours about backdoors, and theoretical discussions of potential backdoors. – Jeremy Salwen May 19 '12 at 05:11
  • 10
    The linux kernel link from 2003 is hardly damning – jamespo May 19 '12 at 09:56
  • @Jeremy Salwen - I was just pointing out that backdoors and "undesired functionalities" never requested by users, and even working against them, have always existed and will exist for a long time. I could also bet that what I found in my half-hour search is just the tip of the iceberg. This is a controversial topic, and for my own safety I began to consider rumors as half-truths when choosing my own hardware and software. If you believe you're perfectly safe, well, good for you. – Avio May 19 '12 at 18:26
  • Ha - I love the EULAs link ... "PC Pitstop – $1,000 For Free" is definitely worth a read. – JW. May 19 '12 at 21:12
  • 6
    +1: The answer does seem to use over-linking to improve its legitimacy, which looks kinda bad. However, two points really stand out: * "You should be angry because you lost a piece of trust towards the company you work for, not for the fact of the backdoor itself (leave this anger to the customers)": couldn't have said it any better. * "I don't know where the NDA ends and your state law begins [...]": it's always nice to remember that no signed contract can violate the law of the country/state in which it is signed. IANAL, but I heard this from one. – pkoch May 19 '12 at 21:47
55

If they don't see it as a big deal, you're not asking them the right question. The question to motivate action on this isn't "is this right?" but "what happens to us when somebody finds and publishes this?" Whether you're a big or small company, you're looking at serious damage to your reputation and all the bad things that go along with it if someone outside the company discovers this before you fix it.

Fixing this issue isn't just ethical, it's essential for your company's survival. It's far, far better to fix it quietly now than a week after all your users and customers have left you because it was revealed by some online journalist.

MartianInvader
  • 659
  • 4
  • 2
  • 1
    I totally agree. They may not understand the moral/ethical arguments, but if you force them to follow the money trail, you'll get their attention. – Chris Jaynes May 18 '12 at 14:00
  • 1
    +1. And if they refuse, mind your own business. I am old enough to know that morality means nothing. They are your friends. They pay you. You are responsible to them, not society, unless your ass can get fried. – user4951 May 19 '12 at 13:50
  • 10
    @JimThio: he could be sued and possibly face criminal charges if the backdoor is used, even if (perhaps especially?) if it's not his company doing it. And morality exists, no matter your age -- it just doesn't guarantee you a happy ending. – jmoreno May 19 '12 at 15:02
  • 1
    I would be very careful in how I'd present the "what happens to us when somebody finds and publishes this?" issue, so as not to make it sound like a direct threat. The decision makers behind all this are obviously already arrogant enough (if they can brush such issues aside), and their reaching the belief that any disagreement with them is a direct threat might be only a hint of a doubt away. – TildalWave Mar 06 '13 at 02:32
35

You should seriously consider going to some governmental or regulatory authority with this, just to protect yourself.

Imagine this scenario:

  1. You inform management about the backdoor. Now they know you know.
  2. Evil Hacker ZmEu finds out about the backdoor, and puts something on pastebin.
  3. Your management finds out about Evil Hacker ZmEu's pastebin.
  4. Your management blames you, and fires you for cause, over your protestations of innocence.

Most security vulnerabilities get discovered multiple times. You won't be the only one to find it, you'll just be the most obvious one to make a scapegoat of.

Bruce Ediger
  • 4,552
  • 2
  • 25
  • 26
  • 12
    It can get even worse, he can be sued for the backdoor as one of the persons involved in the process (he knew about it but continued to be involved) – Danubian Sailor May 18 '12 at 07:34
  • 1
    @lechlukasz Yeah, he should at least quit ASAP. Later he might well be the FBI's star witness over whatever the heck this is. – Kevin Cantu May 18 '12 at 23:28
  • I think it is very unlikely that he would be sued and held culpable on this basis. For those claiming otherwise, I challenge you to identify the basis of action (e.g., the statute or tort) and justify your opinion. Right now, these claims just sound like "Internet lawyering" (uninformed guesses and speculation from folks who are not legal professionals). – D.W. May 21 '12 at 00:26
  • 1
    Dear D.W.: Have you ever been accused of something by management? It's not a legal process, you don't have representation, and usually no appeals exist. We're talking scapegoating here, not Justice. Also, Courts, not "legal professionals" determine innocence, guilt and the validity of basis of action. You're just trying to prop up the authority of legal professionals. – Bruce Ediger May 21 '12 at 13:42
  • 1
    If ZmEu just "puts something on pastebin", this is certainly not the most evil a hacker can be. – Paŭlo Ebermann Jun 09 '12 at 19:30
  • 1
    What if your company doesn't want to share with you the real reason for that backdoor? Just for a moment it pops into my mind: "government". If the government is behind it, going to some governmental authority may not solve the problem (it all depends on which country you are in). – lepe Jul 01 '15 at 02:09
27

It's ok, people will still buy the iPhones your company makes - your secret is safe. ;)

If it was my workplace, where I'm employed as a security analyst, I'd accept that my job is to identify and communicate risk; it's up to the business to accept the risk. I cannot accept risk personally, so my only real option is to ensure that I've communicated the level of risk in the proper forum to the best of my ability. So, if you are employed at a level where you can accept risk, then it's up to you to decide whether or not this is OK. Based on the post, however, you are not at a level where you can accept risk on behalf of the company. So all you can probably do is communicate the risk in a way that the business area can understand, and then let the business area make an appropriate business decision using all of the information available to them.

The thing you do have control over is accepting the risk to yourself posed by working for a company which makes decisions which you think are bad. Your available means of mitigating that risk are documented at Monster.com and friends. :)

dannysauer
  • 678
  • 4
  • 9
23

Before the smartphone era it was a standard feature of all mobile phones to have backdoors. The GSM protocol allowed the base station to update the phone's software. http://events.ccc.de/congress/2009/Fahrplan/events/3654.en.html is a good talk about how crazy the security scheme has been.

As far as I know, none of the companies involved in creating GSM got into any legal trouble over the affair. Government agencies like the NSA liked the fact that they had backdoors. At the moment there are people inside the government who want to mandate backdoors for every communication platform.

I think there's a good chance that the backdoor exists because some other entity like the NSA wants it to be there. If people higher up in your company made a deal with the NSA, they probably won't tell you when you come to them to complain about the backdoor.

For all you know it could be the Mossad that's paying your company to keep the backdoor in the software.

A clear backdoor into a modern smartphone is probably worth six figures or more on the black market. An employee could sell it, or could have been specifically paid to put it there.

On the other hand, if the backdoor really just exists because the higher-ups in your company are too ignorant, then you might be able to explain to them that it's a serious issue.

Christian
  • 1,876
  • 1
  • 14
  • 23
15

You have a professional responsibility and an ethical responsibility to ensure this is addressed, IMO. And you've stepped into a minefield. Protect yourself. Watch your step. Go slow. Think defense-in-depth. I successfully solicited a whistleblower, who has been able to maintain anonymity. The solicitation included advice on maintaining anonymity; take a look.

Check you're not re-blowing the whistle on something already known - like the Carrier IQ stuff. Sending written notification to corporate counsel could go a long way to getting the problem addressed - e.g. via an anonymous email account so you can have 2-way communication. Also: Look at archives of the now-dead Wikileaks:Submissions page I referenced.

Whistleblower.org has good info for you, even though it's government-focused.

Addendum: Have you looked through the source code version control logs to see who put the backdoor in?
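Such an archaeology session might look like the following hypothetical sketch, assuming the firmware history lives in git. The throwaway repo, the `assist.c` file, and the `remote_assist_autostart` symbol are all invented for the demo; substitute the real symbol from your codebase:

```python
import os
import subprocess
import tempfile

def git(*args, cwd):
    """Run a git command (with a demo identity) and return its stdout."""
    return subprocess.run(
        ("git", "-c", "user.name=demo", "-c", "user.email=demo@example.com") + args,
        cwd=cwd, check=True, text=True, capture_output=True,
    ).stdout

# Build a throwaway repo containing a stand-in for the suspicious flag.
repo = tempfile.mkdtemp()
git("init", "-q", cwd=repo)
with open(os.path.join(repo, "assist.c"), "w") as f:
    f.write("static int remote_assist_autostart = 1;\n")
git("add", "-A", cwd=repo)
git("commit", "-qm", "enable remote assist", cwd=repo)

# "Pickaxe" search: list commits that added or removed the symbol,
# oldest first, with author names - i.e. who put it there, and when.
culprits = git("log", "--reverse", "--format=%h %an %s",
               "-S", "remote_assist_autostart", cwd=repo)
print(culprits.strip())
```

`git log -S` lists every commit that changed the number of occurrences of the string, so the oldest hit is usually the commit that introduced it; `git blame` on the live file then shows who last touched the line.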

13

Treat it as a security vulnerability you have discovered and report it to, for example, CVE. Anonymously if you wish.

13

Your reaction is sound, and on a gut instinct level means that you care about one or more of: your customers' privacy, your company's public image, your codebase's quality, your own skin.

In my workplace, I would be senior enough to know it is a security bug (and not there by company intent, or by mandate from the government) - and to remove it. It sounds like this doesn't apply in your case, though.

If you can trace "we are not going to use it" to "we put it there for our own use, but don't need it" you can probably describe to someone high enough in the organization the dangers it poses to the company when it shows up on bugtraq / gets used for nefarious purposes by some third party, which is likely to happen if your smartphone is popular, common and valuable enough (as a target - which may translate to "used by important enough people") to attack.

If you can trace it to "it's there by government mandate" or similar, you might want to insist on internal documentation to that effect, so you can at least leave it be and know that you've done what you can to protect your company, and save other skilled coworkers of yours from the dilemma you find yourself in, as a matter of good code maintenance practices. (And ponder your options about working in an industry making tools that both serve and sacrifice their owners, if this feels deeply demotivational.)

ecmanaut
  • 233
  • 1
  • 5
10

You have a known security vulnerability, and your company is only one of an infinite number of parties which could exploit it. Any exploit of that hole, by any party, could reasonably result in a liability the scale of Sony's after the rootkit fiasco. Their cost in both dollars and reputation soared into the hundreds of millions of dollars, in a directly analogous situation.

Make the case by drawing direct parallels to Sony, with your user base in ratio to Sony's as a gauge to calculate potential liability when this hole is exploited.

anildash
  • 201
  • 1
  • 4
7

It's OK to be upset about it; don't worry, your reaction is normal ^^

I would do one of two things:

  • I would update the user agreement to explain that this possibility exists, thereby asking for the user's consent (if the people in charge really don't want to take the backdoor away)
  • I would remove the backdoor completely (better option in my opinion)

Also, if it is true that this backdoor is never used, why leave it there?

user1301428
  • 1,947
  • 1
  • 23
  • 29
  • I strongly doubt the asker is in a position to do either of those without risking trouble. The UA is something a lawyer should create, and no mere software developer is allowed to touch it. The second option, removing the backdoor, can lead to trouble if this feature was actually intentional. I'm far from saying it is morally acceptable to leave it, but simply removing features without consent from above is an easy way to get into trouble. – mafu May 22 '12 at 12:37
  • @mafutrct sure, I was implying that the user had to ask his superiors before doing any of those things: approval from the upper levels is mandatory. Thanks for pointing this out though. – user1301428 May 22 '12 at 14:47
7

I would seriously counsel against immediate whistleblowing. Not least because there's a good chance this happened because someone from the CIA/FBI had a little chat with the head of the company, who ordered it through trusted management channels - which is why it persists even though everyone should recognise that it's a shitty excuse.

You are rightfully recognising this as a shitty excuse. The problem is that other people also should have. Somewhere someone with power must have decided that this would happen. The construct that it's "OK because we won't use it" is then perpetuated.

That means that if you whistleblow (and get sacked) and launch a lawsuit, a) you might find it very difficult to get another job, and b) your lawsuit might not go anywhere, because it could turn out (hey, I'm not a lawyer, but it's plausible) that it wouldn't be recognised as "misconduct" by the firm. If I were the federal government, I would go to great lengths to protect people doing my dirty deeds.

On the other hand, if you have a trust fund and want to catapult yourself to 15-minute fame, you can go public about it. Just have an alternative path staked out in "Alternative Computing" or the Free Software Movement. There's no private jet down that route.

What I advise is getting a decent amount of detail on it, finding new employment, then anonymously contacting a security researcher or journalist and saying you want to go public/whistleblow through them. The first person you contact will probably comply. The company might try to go after you, but they SHOULD have no definite proof from any logs or search warrants, and it's not "official" in the sense that new employers would be forced to recognise it.

6

OP: You know what happened in the case of ZTE? Go on pastebin and make a full and comprehensive security advisory. Needless to say, cover your tracks. That's that: if you were unsure to the point that you asked the question here, you can benefit us all by making the advisory.

dhillonv10
  • 91
  • 2
4

As mentioned elsewhere, what would scare me is usage of the interception functionality even without any bad intention on the part of its original developers/installers. The so-called "Athens Affair" comes immediately to mind and underlines this concern. For a technically informative and at the same time exciting read, check:

The Athens Affair: How some extremely smart hackers pulled off the most audacious cell-network break-in ever

George
  • 2,813
  • 2
  • 23
  • 39
3

It's probably there to allow government agencies to access your cell phone and listen in on whatever you're doing at the time. It's required by law.

Peter Mortensen
  • 885
  • 5
  • 10
GregT
  • 155
  • 2
  • Fascinating answer. However, I don't think the "It's required by law." is an accurate statement of current law. I suggest editing your answer to remove that sentence. – D.W. May 21 '12 at 00:36
  • 3
    Which law? Do we know from which country we're talking? China? – user unknown May 21 '12 at 09:52
  • 2
    This answer seems like total fluff. The user does not quote what the law is. – Ramhound May 30 '12 at 13:02
  • We are talking about the USA with this Remotely activated mobile phone microphones stuff. I would post the law but I have no idea where it is. If anyone is more legislatively skilled, please feel free to help out. – GregT Jul 10 '12 at 23:12
3

This was reported earlier this week in China's ZTE Ships Smartphone with Backdoor to MetroPCS, but finally some are seeing it. It seems to have gone unnoticed by many...

Peter Mortensen
  • 885
  • 5
  • 10
Phillipe
  • 31
  • 1
  • 2
    Interesting, but I don't know why you claim it is the same backdoor as the one mentioned in the question. The ZTE one you link to is just a setuid-root program, not something that can be remotely activated (not without something more on the phone). – D.W. May 21 '12 at 00:38
0

It really depends on the nature of the back door. Is it worse than Carrier IQ?

I know the cell phone carrier can intercept all my voice and data transmissions and turn them over to the government. In fact, the NSA can intercept it all, too. I also expect the carrier and phone manufacturer can, given physical access to the phone, completely read all the data on it without my approval. So if the back door is limited to these kinds of things, and you personally cannot use it to get into someone else's phone due to some other security constraint, then I would not get too bent out of shape about it. Raise your concerns to your supervisor in an email and save a copy of it on paper at home.

Now if it is the kind of back door a hacker can use to steal passwords saved on the smartphone without being detected, it's time to go full whistleblower if you cannot get them to lock it down. Access to a customer's phone should require specific security authorization within the company, combined with full logging, so that people who abuse their security privileges can be identified and have those privileges revoked (at the very least).
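That "authorization plus full logging" control can be made concrete. Here is a minimal sketch in Python; everything in it (the `remote_access` function, the ticket field, the in-memory log) is invented for illustration, not taken from any real system:

```python
import time

AUDIT_LOG = []  # in practice: append-only storage the operators cannot edit

class AuthorizationError(Exception):
    """Raised when a remote-access attempt has no authorization ticket."""

def remote_access(device_id, operator, ticket):
    """Open a remote session only when an authorization ticket is presented.

    Every attempt is logged *before* the decision is made, so refused
    attempts are just as visible to an auditor as granted ones.
    """
    AUDIT_LOG.append({
        "ts": time.time(),
        "device": device_id,
        "operator": operator,
        "ticket": ticket,
    })
    if not ticket:
        raise AuthorizationError(f"{operator} has no ticket for {device_id}")
    return f"session:{device_id}:{ticket}"
```

With this shape, "we are not going to use it" becomes verifiable rather than a promise: an empty audit log is evidence, and a non-empty one names the person to hold accountable.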

If it is somewhere in between, well, maybe leak it to a security researcher....

Major Major
  • 492
  • 2
  • 9
  • 1
    In no circumstances should a discovered backdoor go unreported. Government backdoors still violate my right to privacy IMO. Also, if the backdoor is intended for State use with the proper checks in place, making it publicly known shouldn't hinder its use. Always report these kinds of things... – Chris Frazier May 31 '12 at 12:51
  • 1
    @Chris, not all back doors are created equal, and there is a serious risk of "crying wolf." It's also important to weigh the personal consequences of violating the confidentiality of your employer against the level of violated expectations the "back door" represents. A carrier being able to push a software update to your mobile phone is almost a given even though it could be considered a back door. – Major Major Jun 03 '12 at 18:32
  • good point on the confidentiality part. – Chris Frazier Jun 13 '12 at 13:21
0

Well, what you really should be after is their motive for putting in a backdoor. As the highest-voted answer has said, just because you won't use it does not mean it won't be found. If it were me, I'd find out the motive behind said backdoor, then anonymously alert a major tech publication about it. Then again, this depends on whether you feel a responsibility towards the general public, or whether you feel it's up to the company to clean up its own act and let someone else hold them to account for their misdeeds.

Dark Star1
  • 131
  • 5
0

Of course the back door didn't appear there without serious forethought. National security agencies are involved, and they paid to have it done, hopefully without anyone noticing. Cellular engineers designed in the capability to turn on a cellular microphone without the owner knowing it (keeping indicator lights off). I expect GPS locations can be broadcast even when an owner has the GPS feature turned off, and pictures can be taken and broadcast without the owner knowing it as well.

  • There is no evidence that any "national security agencies" had anything to do with the feature being added. – Ramhound May 30 '12 at 13:06
-1

Hmm, by posting it here after alerting them, I'd say you should think about actually going public, because you kind of already have.

Backdoors are pretty commonplace in emerging technologies, as it usually takes legislation a while to catch up. Also, the development communities are not fully converged. For instance, the Internet was notoriously unsafe until ISPs had to build in direct access for government bodies via legislation, licensing and exchanges. Only then did security become important, as the 'right' people had secured all the access they needed.

If it were my workplace I'd STFU... But it might be a bit late for that.

Peter Mortensen
  • 885
  • 5
  • 10
Alex
  • 305
  • 1
  • 3
  • 7
-1

If you have signed a non-disclosure agreement covering the type of work and products you deal with at your company, then you shouldn't even be asking this question or posting it here. If not, you can post your company name and the software installed in the phones to alert everyone.

Subs
  • 109
  • Disclosure of the vulnerability will get you into court if it is found out, regardless of whether an NDA was signed or not. – Andrei Botalov May 22 '12 at 07:30
  • Not in the USA, and this is not a vulnerability. It is the purposeful action of knowing that a backdoor exists and not disclosing it to the customers. – Subs May 22 '12 at 07:41