
I noticed a comment on this answer where another user said

...but it requires risking burning a 0day, which people are not always all that willing to do.

I did an internet search for the phrase "burning a 0day" (and similar permutations like "0 day", "zero day", etc.) and not much came back. It's obvious that "burn" means "use up" in this case. I understand most of what the user meant, but probably not all of why it was important (i.e., the context). I'm looking for a canonical answer, with some reasoning about why "burning a zero-day" is an expensive thing.

Mr Robot s01e06 touches on this when Elliot and Darlene start to argue about what went wrong in their attempted hack.

I found some other people on this SE using the same terminology:

YetAnotherRandomUser
  • Just as a "thought experiment": if you really wanted to bring doom and destruction to the world of computers, here's how to do it: have a virus which propagates itself, but doesn't do any actual damage, to other countries, but not to North America or Europe, and not to the same country it's propagating from. After a few days, remove those exceptions. After a day or so more, turn on the damage. By this time it will be all over the place requiring a massive cleanup effort. Just sayin'. – Jennifer Jun 25 '18 at 06:08
  • @Jennifer It would still be detected and eradicated quite quickly. A better solution would simply be a worm that spreads quickly and clogs up the network with the sheer bandwidth taken up by scanning for vulnerable targets. This has actually happened before and, within 10 minutes of the first infection, slowed down the _entire_ internet and infected a substantial portion of networked devices. – forest Jul 16 '18 at 06:43
  • @forest what's the name of that worm or that event? Sounds like a good plot for a hacker movie. – YetAnotherRandomUser Jul 16 '18 at 11:41
  • @YetAnotherRandomUser SQL Slammer and the Morris Worm both had such effects. – forest Jul 16 '18 at 22:48

4 Answers


I was the one who wrote the comment you quoted.

Quick answer: A 0day is burned when the exploit is used too often or too haphazardly, resulting in it being discovered and patched. Virtually every time a 0day is used, it risks being burned. Using a 0day more sparingly and cautiously can increase its shelf life. The idiom compares a 0day to a non-renewable resource, like combustible fuel, that loses its value when used up.
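A toy model makes the "use sparingly" point concrete. If each use carries some independent chance of detection, the odds of the 0day staying private decay exponentially with use; the 2% per-use detection probability below is an invented figure, not a real-world estimate:

```python
# Toy model: if each use of a 0day is independently detected with
# probability p, the chance it is still private after n uses decays
# exponentially. The 2% figure used below is an invented assumption.

def survival_probability(n_uses: int, p_detect_per_use: float) -> float:
    """Probability the 0day remains undiscovered after n_uses."""
    return (1.0 - p_detect_per_use) ** n_uses

sparing = survival_probability(5, 0.02)      # used a handful of times
haphazard = survival_probability(200, 0.02)  # sprayed everywhere
print(f"sparing: {sparing:.2f}, haphazard: {haphazard:.2f}")
```

Under these made-up numbers, a handful of careful uses leaves the 0day very likely still private, while a few hundred uses makes burning it a near certainty.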

This likely originates from the idiom burn your bridges:

To destroy one's path, connections, reputation, opportunities, etc., particularly intentionally.


What is a 0day?

A 0day is an exploitable vulnerability that is not publicly known. When a 0day is discovered, it can be turned into a working, "weaponized" exploit. Like all vulnerabilities, once it becomes public, it will usually be patched and fixed, making the exploit no longer work. Every time you use an exploit, you necessarily transmit valuable information to a system that you do not control (yet). If the system is being extensively monitored, the exploit technique may be discovered and, with it, the knowledge necessary to fix the vulnerability and roll out patches to all affected systems.

What does it mean to "burn" one?

To "burn" a 0day is slang for using it either too often or in a high-risk situation where its use makes discovery likely. Like combustible fuel, once used up or "burned", a 0day no longer holds the same value (in both monetary* and tactical terms). It stops being a 0day once it is no longer in private hands.

Friends may let you "borrow" a 0day to use yourself, under the condition that you do not burn it. They are telling you that you can use it, but they are trusting you to be very careful not to use it in a way that makes it likely to be found and fixed, which would deprive them of access to it.

Someone might also decide to disclose a 0day suddenly in public. Especially when it is not done through coordinated disclosure, this is often called dropping a 0day, which also burns it. This is a bit uncommon, but not unheard of. A few years ago on IRC, a guy joined and informed us of a remote code execution vulnerability in TeamViewer that involved sending malicious WinHelp files (which contain Visual Basic code), or something along those lines. Since the first place he disclosed it was in the middle of a general security-related IRC channel, he was burning the 0day by dropping it.

* 0days usually have literal monetary value. A 0day can range from a thousand dollars to upwards of a million, depending on a variety of factors such as exclusivity, applicability, reliability, specificity, conditionality, etc.

How much are 0days worth?

Exploit brokers often buy or sell bugs with promises of exclusivity. For example, you can sell a bug under the condition that it goes to only one buyer, for the highest price, rather than to multiple people. That reduces the chance that it will be discovered, but it means you only get paid once. Alternatively, you could sell an exploit to as many people as want to buy it. You would have to sell it for a lower price, because it will be burned very quickly, making its shelf life rather short. Obviously, once a 0day is burned, it is nowhere near as useful, since it will only work on outdated systems. The actual value depends on quite a few factors. 0days are worth more if they:

  • Work on a variety of systems.

  • Do not depend on a specific configuration.

  • Are reliable and work every time.

  • Are silent and do not leave traces in the logs.

  • Are sold to only one or a limited number of buyers.

Many contractors that deal in exploits will pay you the full price in small sums over a period of time. If the 0day is discovered before you are paid in full, they stop the payments. This acts as a disincentive to selling it to multiple contractors or using it frequently yourself. It essentially forces you to guarantee that it will remain a 0day, or you simply will not be paid in full.

Additionally, government contractors pay more for 0days than random exploit brokers do. You might be able to sell a complete Chrome exploit chain, complete with sandbox bypass and local privilege escalation (LPE), for hundreds of thousands of dollars to Raytheon SI. The same exploit would net only a fraction of that price if you sold it to J. Random Broker on IRC. The reason is simply that corrupt governments want to be non-competitive and have ample taxpayer money to obtain exclusive vulnerabilities so they can ~~drone strike journalists~~ export democracy.

How do 0days get burned?

There are many activities that can risk burning a 0day. A few examples:

  • Using an exploit that is unreliable and may create a coredump.

  • Using an exploit that is conditional and only works for some configurations.

  • Using an exploit that results in an event being logged.

  • Using an exploit against a sophisticated and paranoid target.

  • Simply using an exploit too often, increasing its exposure.

  • Giving it to or trading it with a friend who is not responsible with it.

  • Revealing too much about the exploit, allowing others to find the vulnerability themselves.
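The coredump and logging items above are worth making concrete: a defender does not need anything sophisticated to notice a crashy exploit, just crude pattern matching over system logs. The log lines and patterns below are fabricated for illustration:

```python
# Why unreliable exploits burn themselves: a defender only needs a
# crude pattern match over system logs to notice a service crashing.
# These log lines and patterns are fabricated for illustration.

import re

CRASH_PATTERN = re.compile(r"segfault|dumped core|stack smashing detected")

def suspicious_lines(log_lines):
    """Return lines suggesting something crashed the target process."""
    return [line for line in log_lines if CRASH_PATTERN.search(line)]

log = [
    "sshd[812]: Accepted publickey for alice from 10.0.0.5",
    "kernel: httpd[2214]: segfault at 0 ip 00007f1c... sp 00007ffd...",
    "systemd-coredump[2215]: Process 2214 (httpd) dumped core.",
]
for line in suspicious_lines(log):
    print(line)
```

Once a crash like this is flagged, the dumped core can be analyzed, which is exactly how the "necessary knowledge to fix the vulnerability" ends up in the defender's hands.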

I do not condone selling 0days to governments or government contractors.

forest
  • To burn a zero day is also comparing it to the similar use in espionage (at least from what spy-fic tells me) consider "Burn a cover", or "burn an asset". (which likely also indeed do come from the use of burning combustible fuel which decreases its value, as you say) – Frames Catherine White Jun 18 '18 at 04:42
  • 10
    Indeed. The term "burn" is not unique to software exploitation. – forest Jun 18 '18 at 04:45
  • 31
    Nor to exploitation at all. It's a common English idiom for using something in a manner such that it is not usable again. It's commonly used for all kinds of things, including even money. – reirab Jun 18 '18 at 07:24
  • 6
    Interesting use of "responsible" here. :-) – LarsH Jun 20 '18 at 15:26
  • @Lyndon Yes, I'm fairly certain that the etymology is the term's use in spycraft. – Riking Jun 20 '18 at 16:19
  • @LarsH If you burn a vuln that someone trusts you with (even if it's to do something ethical like report it to the affected vendor), you'll quickly find your access to tasty 0days cut off. As much as I hate the idea of hoarding bugs (burn them or fix them, I say), this is the sad truth. – forest Jun 22 '18 at 02:49
  • Doesn't 0-day mean specifically such a critical vulnerability that it must be fixed the same day as reported? Hence the zero word, meaning number of days it can stay unpatched? – Haspemulator Jun 22 '18 at 07:24
  • 5
    @Haspemulator it refers to its freshness. When the vulnerability has been discovered it is 0 days old, the next day it is 1 day old and so on. If it was of low-risk (e.g. only allowed slightly elevated privileges and only applied in obscure and unlikely cases so not actually that critical) it would still be 0-day, but people just wouldn't care that much about it so you wouldn't be hearing as much news about the 0-day vulnerability as you would if it was critical. – Jon Hanna Jun 22 '18 at 10:44
  • @JonHanna A 0day is _before_ it's discovered (in public). – forest Jun 22 '18 at 17:00
  • 1
    LOL to "giving it to or trading it with a friend who is not responsible with it". At college, I wrote a program called CONINSIM (CONsole INput SIMulator) that made my program look like a fully-authorized operator's console (we're talking about a mainframe over 40 years ago). I did it for the pride of knowing I could; I never intended to use it destructively. I gave a copy to my friend *******, who immediately used it to cancel JES, without which the system immediately crashed. We both got in big trouble for it. When I asked why he did it, he said, "I didn't think it would work". – Jennifer Jun 25 '18 at 06:03

A zero-day is a vulnerability that is unknown by the software manufacturer and for which no patch exists.

Using a zero-day vulnerability against a remote server may give away how it works. The administrators of the application may notice they have been hacked, look in the logs, and discover the vulnerability that was used to hack them. Once they fix the vulnerability, it is no longer a zero-day, and knowledge about it is useless.

For example, this Flash zero-day was actively being exploited and that was how Adobe learned about the vulnerability.

In some cases, using a zero-day does not expose the vulnerability. For example, zero-days to root a mobile OS may be used without the software manufacturer learning about the vulnerability.

Sjoerd
  • 83
    It's worth noting that some zero days are extremely valuable, so there's a very real price to burning one. – BlueRaja - Danny Pflughoeft Jun 17 '18 at 20:56
  • Comments are not for extended discussion; this conversation has been [moved to chat](https://chat.stackexchange.com/rooms/79046/discussion-on-answer-by-sjoerd-what-does-it-mean-to-burn-a-zero-day). – Rory Alsop Jun 18 '18 at 17:25

Security researchers find exploits. The day one is reported is day zero, because that is when developers start work on patching it.

Good security researchers (as in white hat) will disclose the zero-day to the developers before publishing it to the rest of the community. In many cases, they only publish it to the community because the people in charge of the code have otherwise ignored them.

Bad security researchers (as in black hat) will archive the exploit for a rainy day. Burning a zero-day is tapping that rainy-day archive. This is usually only done for high-value targets.

Whether disclosed or used, the exploit's time is finite from the moment the developers in charge of the code realize what is going on. So, in a sense, the exploit is used up, or burnt.

CaffeineAddiction
  • I think making this distinction between good and bad security researchers is inaccurate. Nation-state attackers may keep zero-days, and they may be considered morally good by the country they work for, and morally bad by opposing countries. – Sjoerd Jun 17 '18 at 17:41
  • 20
    It doesn't matter who you are working for, if you find an exploit and don't report it ... your doing it wrong. If you are collecting a paycheck for not reporting it ... your doing it worse. – CaffeineAddiction Jun 17 '18 at 17:56
  • 10
    And if you are selling it to a corrupt government who uses it to spy on its citizens and enforce its own ideas of morality, you are doing it worse still. I personally have never sold a 0day to a government or government contractor, and I try to avoid business with those who do. – forest Jun 18 '18 at 00:16
  • 1
    +1 for explaining the meaning of the name, regardless of debate over whether "good" is always synonymous with being a white-hat (disclosing vulnerabilities). – Peter Cordes Jun 18 '18 at 16:18
  • 2
    You don't "find" an exploit, you develop and test it carefully not unlike other pieces of software. What you may find are bugs that allow exploitation. – Niklas B. Jun 19 '18 at 22:59

Specifically focusing on your question about how burning a 0day can be an expensive thing:

As for literal monetary value, CaffeineAddiction's answer covered that very well.

There is another type of value that I haven't seen mentioned: when a 0day is purposely released to the public to let a community get around a manufacturer's restrictions. An example of this is the jailbreaking (rooting of iOS devices) community. There is often talk of not "burning a 0day" on a minor iteration of iOS. Especially as Apple exploits get harder to find, if a 0day were found on 11.3.1, there would be talk of waiting until 11.4, or even 12.0, to release it.

Unlike other attacks, where the risk grows as it's used, a jailbreak essentially guarantees the loss of the 0day, since it is made completely public.

CeePlusPlus