
When trying to visit https://www.ebay.com, I noticed that I get redirected to HTTP immediately. Here's what cURL says about that:

```
$ curl --max-redirs 0 -v -L https://www.ebay.com
* Rebuilt URL to: https://www.ebay.com/
* Adding handle: conn: 0x6c8cc0
* Adding handle: send: 0
* Adding handle: recv: 0
* Curl_addHandleToPipeline: length: 1
* - Conn 0 (0x6c8cc0) send_pipe: 1, recv_pipe: 0
* About to connect() to www.ebay.com port 443 (#0)
*   Trying 66.135.210.61...
* Connected to www.ebay.com (66.135.210.61) port 443 (#0)
* Initializing NSS with certpath: sql:/etc/pki/nssdb
*   CAfile: /etc/pki/tls/certs/ca-bundle.crt
  CApath: none
* SSL connection using SSL_RSA_WITH_RC4_128_MD5
* Server certificate:
*       subject: CN=www.ebay.com,OU=Site Operations,O=eBay Inc.,L=San Jose,ST=California,C=US
*       start date: Jun 06 00:00:00 2013 GMT
*       expire date: Jun 07 23:59:59 2014 GMT
*       common name: www.ebay.com
*       issuer: CN=VeriSign Class 3 Secure Server CA - G3,OU=Terms of use at https://www.verisign.com/rpa (c)10,OU=VeriSign Trust Network,O="VeriSign, Inc.",C=US
> GET / HTTP/1.1
> User-Agent: curl/7.32.0
> Host: www.ebay.com
> Accept: */*
> 
< HTTP/1.1 301 Moved Permanently
< Location: http://www.ebay.com/
* no chunk, no close, no size. Assume close to signal end
< 
* Closing connection 0
* Maximum (0) redirects followed
curl: (47) Maximum (0) redirects followed
```

Why would websites force plaintext HTTP while they support SSL, thus exposing the user's browsing habits to eavesdropping?

d33tah
  • One possible reason is the use of a reverse proxy caching server. For example, [Varnish does not support SSL](https://www.varnish-cache.org/docs/trunk/phk/ssl.html). – bwDraco Mar 12 '14 at 23:24
  • While the content at eBay is not top secret, it is the base of many people's livelihoods. I don't understand such shady security practices, but hey, this is eBay we're talking about.... – David Houde Mar 13 '14 at 10:52
  • I detest eBay's non-use of SSL. It's not just about people being able to see what pages you visit, but (1) people can steal session cookies, and (2) people can alter the pages you see (possibly having arbitrary JS execution, providing misinformation, etc.) – Lekensteyn Mar 13 '14 at 21:59

5 Answers


There are some reasons (not necessarily good reasons, but reasons nonetheless) to prefer HTTP over HTTPS. If these reasons apply, then it makes sense to enforce HTTP usage even when the client appears to want to use HTTPS.

A (usually) bad reason for preferring HTTP is one that is commonly uttered in such debates: SSL is often assumed to be heavy, both for the computational cost of encryption and because of its consequences on caching (though it is possible to cache pages served over HTTPS, the SSL layer prevents some tricks, such as the transparent proxies commonly deployed by ISPs). The computational cost is overrated (it used to be a bottleneck in the times of 3DES and 90 MHz Pentium machines with Fast Ethernet, but things have changed since then). As for caching, it becomes increasingly irrelevant as pages become more dynamic.
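
To put rough numbers on that claim, here is a minimal sketch that times a plain TCP connect against a TCP-plus-TLS handshake; the target host is an arbitrary placeholder, and the results depend mostly on network round trips rather than bulk encryption:

```python
# Minimal sketch: compare plain TCP connect time with TCP + TLS handshake
# time. The host below is an arbitrary placeholder, not eBay's real setup.
import socket
import ssl
import time

HOST = "www.example.com"  # hypothetical target

def tcp_connect_ms() -> float:
    start = time.perf_counter()
    with socket.create_connection((HOST, 80), timeout=5):
        pass
    return (time.perf_counter() - start) * 1000

def tls_connect_ms() -> float:
    context = ssl.create_default_context()
    start = time.perf_counter()
    with socket.create_connection((HOST, 443), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=HOST):
            pass  # the handshake completes inside wrap_socket()
    return (time.perf_counter() - start) * 1000

print(f"plain TCP connect:   {tcp_connect_ms():6.1f} ms")
print(f"TCP + TLS handshake: {tls_connect_ms():6.1f} ms")
```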

We may imagine, though, that eBay wants to encourage widespread ISP-based proxying for all the item pictures that they serve. I can easily conceive that these pictures eat up a substantial part of eBay's network bandwidth. Enforcing plain HTTP maximizes the probability that caching takes place, thus saving money on eBay's side.

A less bad reason for preferring HTTP is to allow easier automatic scanning of data for unwanted content and intrusion detection. SSL is end-to-end, so if such scanning is applied in an HTTPS world, then it must happen at one of the ends, which may be inconvenient in a given big architecture.
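
By way of illustration, a minimal sketch of scanning "at one of the ends": a Python WSGI middleware that inspects request bodies once TLS has been terminated. The blocked patterns are placeholders, not anything a real IDS would use:

```python
# Minimal sketch: after TLS termination, plaintext scanning can run as a
# WSGI middleware in front of the real application. Patterns are placeholders.
import io

BLOCKED_PATTERNS = [b"<script>", b"DROP TABLE"]  # hypothetical signatures

class ScanningMiddleware:
    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        length = int(environ.get("CONTENT_LENGTH") or 0)
        body = environ["wsgi.input"].read(length)
        if any(pattern in body for pattern in BLOCKED_PATTERNS):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"request rejected by content scanner\n"]
        # Rewind the body so the wrapped application can still read it.
        environ["wsgi.input"] = io.BytesIO(body)
        return self.app(environ, start_response)
```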


As for the privacy of your browsing habits, I don't think that eBay gives a fig about it. In fact they are, by construction, quite eager to learn, analyse, profile, and possibly sell your browsing habits (advertisers pay for such information). So it does not seem reasonable to expect eBay to actively protect your privacy, since part of their business model is the exact opposite.

Thomas Pornin
  • SSL does not necessarily prevent the web site's owner from doing content scanning, though it does mean that at least the content scanning device needs the web server's private key just as much as the web server needs it, and software capable of doing the decryption essentially in parallel with the web server. It does rule out a lot of the common automated scanning devices, certainly, but it doesn't make it impossible. – Anti-weakpasswords Mar 13 '14 at 03:02
  • Note, though, that making the private key available to several systems is itself a security risk. Also, even with the private key, SSL decryption is not feasible if a DHE cipher suite is used. – Thomas Pornin Mar 13 '14 at 11:03
  • I don't agree with the part that eBay doesn't care about users' privacy because they themselves analyse and possibly sell it. My thought is that such data is worth money to eBay, and money is to be protected (and data stolen by a competitor is stolen money). eBay might still not be concerned about privacy, but not for that reason. – kutschkem Mar 13 '14 at 14:16
  • @ThomasPornin - an excellent point about DHE ciphersuites protecting from private keys on multiple servers. Would you mind adding a comment with the canonical answer to ECDHE cipher suites also providing or not providing protection from intercepting hardware with the server's private key? Also, agreed - it's definitely a security risk itself to have the private key on more machines than are necessary, as you mentioned, though some people may balance that increased risk against other factors like convenience for additional X and/or IDS/IDP capability. – Anti-weakpasswords Mar 14 '14 at 02:02
  • "SSL is end-to-end, so if such scanning is applied in an HTTPS world, then it must happen at one of the ends, which may be inconvenient" -- I'm not saying this *isn't* how a system might end up with difficulty introducing SSL, but that's *supposed* to be what reverse proxying is for. You put the SSL at the boundary of your data centre and do what you like inside. You get criticised (e.g. Google) for letting the NSA put their boxes in your private network to monitor everything in the clear while claiming they have no access to your database, but that's a proof of concept that it's possible... – Steve Jessop Mar 15 '14 at 10:58

Speaking from personal experience: I managed a website that wanted to send all form data over a secure https connection. However, for a variety of reasons, other pages displayed non-secure content that we could not get to work over https.

This led to big warning signs in Firefox and Chrome (THIS SITE CONTAINS NON-SECURE ELEMENTS, with shouting exclamation marks in the address bar and such), which to the non-initiated looked frightening even though nothing suspicious was happening.

To avoid sending the wrong message we simply redirected traffic over http for pages that did not send any customer data.
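
A minimal sketch of that kind of per-page scheme enforcement, assuming a Flask app; the list of sensitive paths is illustrative, not the actual site's logic:

```python
# Minimal sketch of per-page scheme splitting, assuming a Flask app.
# SECURE_PATHS is illustrative; it was not the actual site's list.
from flask import Flask, redirect, request

app = Flask(__name__)

SECURE_PATHS = {"/login", "/checkout"}  # pages that receive customer data

@app.before_request
def enforce_scheme():
    wants_https = request.path in SECURE_PATHS
    if wants_https and request.scheme == "http":
        return redirect(request.url.replace("http://", "https://", 1), code=301)
    if not wants_https and request.scheme == "https":
        # The questionable part: bounce everything else back to plain HTTP
        # so browsers never show mixed-content warnings.
        return redirect(request.url.replace("https://", "http://", 1), code=301)
```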

eBay seems to be doing the same thing: when data is actually being entered in a form and transmitted, it's always https.

Although to be honest: in our case we lacked the budget to go over all the pages and fix the insecure elements, which really would have been the way to go.

The only other reason I can think of would be performance: strictly speaking, SSL is slower.

In short: I can imagine reasons to redirect traffic over http, but in my case it definitely wasn't best practice.

user3244085
  • "when data is actually being entered in a form and transmitted it's always https" - The problem with this is that it only protects against packet sniffing. A man in the middle can serve you a modified page with the embedded SSL stripped. – Red Alert Mar 13 '14 at 01:14
  • How so? If there are secure elements on a non-secure page, those elements still have to be served over a secure connection. Multiple connections are used to service the request for a single page. In order for a man-in-the-middle to serve you alternative content, they would still have to crack SSL/TLS, right? In other words, the user agent (browser) makes an https request to a domain, and if SSL/TLS handshake does not take place with a server at that domain, using a valid cert, then the browser will throw up an error page, no different from serving all of the elements on the page over https. – Craig Tullis Mar 13 '14 at 08:16
  • The big issue with serving mixed http and https is just that the user might presume the entire page is secure if the main URL is https, but the page contains absolute URLs to http links. The security warning is about disabusing the user of that false sense of security. But it does not mean that the secured elements themselves are any more vulnerable to interception than if every element in the request was secured. – Craig Tullis Mar 13 '14 at 08:19
  • @Craig current browsers don't have a way to tell whether a form is going to submit over HTTPS until after you submit it. Also, JavaScript could be injected to steal form data before it's submitted. Neither of these apply if the form is on an HTTPS page. (The form might still submit over HTTP, but then it's the website creator's fault, and not because of an attack.) – user253751 Mar 13 '14 at 08:32
  • @immibis see what happens when I type at 2am? :-) Fair enough, if the specific context is submitting a form, even if you explicitly submit to an https url, but from an http "page," then the MITM risk is real. Mitigation is complicated and easily done wrong. For the record, I'm actually a fan of full https everywhere. Caching concerns are real, but most content is dynamic these days. Https performance concerns are more related to the initial connection handshake than the data encryption, so my stance is that https performance problems are often an indication of fundamental design issues, anyway. – Craig Tullis Mar 13 '14 at 08:59
  • @Craig I am personally against adding complexity and overhead where complexity and overhead are not required. No matter how little. My connection to `icanhas.cheezburger.com` does not need to be encrypted unless I'm logged in or on the login page. Sometimes we trade efficiency for ease-of-development, but this is inefficient and (slightly) *harder* to develop. – user253751 Mar 13 '14 at 09:03
  • @immibis I really just mean https wherever any security is required, e.g. line of business apps or services that store or potentially store PII, which these days is an awful lot of them. It isn't all that rough to avoid hard-coding absolute URLs into apps, for example, and to take other fairly simple steps to ensure that the http/https decision is only a configuration issue, and not a coding issue. Except in the sense that the sorts of things that cause https performance problems (anything that causes lots of new connections to be made per request, no compression, etc.) also affect http. ;) – Craig Tullis Mar 13 '14 at 09:12
  • You said "full https everywhere", so I took you to be one of those people who wants to ban all non-TLS connections period. Full HTTPS on a site with confidential information is perfectly reasonable. – user253751 Mar 13 '14 at 09:14
  • Once you are back in HTTP you are also in unauthenticated territory, which means the next HTTPS redirect could be to a false site. So: HTTPS (shopping.example.com) -> HTTP (fake site, hijacked/poisoned DNS, etc.) -> HTTPS (attacker's HTTPS site with a valid certificate for shopping.not-example.com, which passes through the actual HTTPS data from shopping.example.com after sniffing in between and de-/re-encrypting). There is no way to make this safe -- likely a few folks here didn't notice the "example.com" and "not-example.com" -- how often do you read the entire URL? How often do your parents? – zxq9 Mar 15 '14 at 08:12

It's hard to give a good answer, as there arguably is no good reason to do this.

If I had to list pros and not cons, I'd say this:

If a site doesn't want users who try to visit it via https to think it's down, and supports SSL for some features (such as login) but doesn't have, or doesn't want to support, the infrastructure to serve all content over SSL, then it might try doing this.

For a long time, it was a commonly believed myth that SSL required lots more hardware than regular HTTP. An old decision may not be revisited.

Widely cached sites will have less load if caching proxies can shoulder some of the load, which is impossible when SSL is active.

Bandwidth-conserving tools like the optional compression in Opera and Chrome don't work with content served over SSL.

  • "I don't care if the NSA sees my funny cat pictures and don't want to spend extra anything to stop the NSA seeing funny cat pictures" is an arguably good reason. For eBay, though, that's not applicable. – user253751 Mar 13 '14 at 08:33
  • It isn't for email accounts or health-related web searches, either, but everyone is still falling over themselves to give Google, MS and Amazon every insight about their lives that they can. Users, ultimately, tend to get the security they deserve; that a willingness to self-educate dictates how much security that winds up being finds a neat parallel in many other areas of life, health and economy. – zxq9 Mar 13 '14 at 22:18
  • +1 For bringing up the third party compression issue. – Muhd Mar 15 '14 at 01:25

Most likely to retain the utility of caching and to reduce the load from the enormous amount of spider crawling required to keep external search indexes up to date.

The bad part isn't so much that a third party can see what page you looked at (the network of analytics utilities installed all over the place that report back to Google already guarantees you're tracked nearly everywhere), but that bouncing between HTTP and HTTPS provides multiple points to misdirect from an unauthenticated HTTP site to a fake HTTPS site with a bogus certificate. The risk isn't tracking, and it's not really exposure of content on the HTTP pages; it's that this sort of back-and-forth bouncing weakens a user's sense of threat awareness.

It is much more difficult for the average user to understand that there is a scary opening every time HTTPS->HTTP->HTTPS happens, especially since for most people this occurs after logging in and authenticating oneself to the server.

"Oh, look, a certificate warning just popped up. Weird. But I've been signed in this whole time (immediately presses 'dismiss/permanently add exception' button)"

A better solution would be to make all public (visible to anonymous users) parts of the site available via both HTTP and HTTPS to keep search engines happy and spiders crawling low-load pages, but never to redirect anyone once they cross over to HTTPS. If they really cared about security, though, they would never automatically redirect from HTTP to HTTPS either -- that sort of automatic redirection is an opportunity to set up a MITM attack every time, and yet even banks do it.
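
A minimal sketch of that policy, assuming a hypothetical Flask app (the path names are inventions): public pages answer on both schemes, nothing is ever redirected from HTTPS back down, and sensitive pages refuse plain HTTP outright rather than "helpfully" auto-redirecting.

```python
# Minimal sketch of a one-way HTTP/HTTPS crossover, assuming Flask.
# SENSITIVE_PATHS is invented for illustration.
from flask import Flask, abort, request

app = Flask(__name__)

SENSITIVE_PATHS = {"/signin", "/account"}  # hypothetical

@app.before_request
def one_way_crossover():
    if request.scheme == "https":
        return  # already secure: never bounce back down to HTTP
    if request.path in SENSITIVE_PATHS:
        # No automatic upgrade redirect: an HTTP response telling the
        # client where to go is exactly what a MITM can tamper with, so
        # the user must reach HTTPS via a link or bookmark instead.
        abort(403)
```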

zxq9
  • How about defining an attribute for tags (especially those served from https: pages) which would indicate what the cryptographic hash value of the identified resource should be? If an https: page could say "Fetch this picture from an http: site and confirm that its hash is xxx, and if it isn't, use https://", that would allow caching proxies without opening up man-in-the-middle risks. – supercat Mar 13 '14 at 23:20
  • @supercat A good idea but with limited application. It is easier to provide universal https and make the crossover one-way than implement a selective content hashing system, rendering this useful only in cases where the site *must* provide a resource outside its control -- anathema in a "secure" setting *and* opens the site itself to exploitation by whoever controls that external resource (providing something insanely large to hash, taxing cycle availability). The basic problem is a universal attitude of "what the user doesn't know won't hurt our bottom line". – zxq9 Mar 15 '14 at 08:05
  • By my understanding, proxy servers cannot cache https:// content. Many servers have large volumes of content which proxy servers should be able to cache, but which client software should be able to verify for veracity. One could guard against malicious provision of oversized files by having the tag include the size as well as the hash; one could guard against fetching a long file and then finding it's out of date by having the http:// request include an indication of what hash value is expected and/or having the response indicate the hash expected for the attached file. – supercat Mar 15 '14 at 22:09
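
A rough sketch of the scheme being discussed, assuming the expected hash and size come from the https: page; the function and parameter names are inventions for illustration:

```python
# Rough sketch: fetch a resource over plain HTTP (so proxies can cache
# it), verify a hash published by the HTTPS page, and fall back to HTTPS
# on any mismatch. All names here are inventions.
import hashlib
import urllib.request

def fetch_verified(http_url, https_url, expected_sha256, expected_size):
    try:
        with urllib.request.urlopen(http_url, timeout=10) as response:
            # Read one extra byte so oversized bodies are detectable,
            # guarding against the resource-exhaustion concern above.
            data = response.read(expected_size + 1)
        if (len(data) == expected_size and
                hashlib.sha256(data).hexdigest() == expected_sha256):
            return data  # the cacheable copy checked out
    except OSError:
        pass  # fetch failed; fall through to the authenticated source
    # Hash mismatch, wrong size, or fetch failure: use HTTPS instead.
    with urllib.request.urlopen(https_url, timeout=10) as response:
        return response.read()
```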

From a user experience perspective, if a user sees SSL and starts thinking about security, they tend to become unnecessarily cautious.

For example, if you have a website with an image of a lock on the home page that says "We are Secure", conversion rates drop. Keeping that mechanism hidden from the user is not always a bad idea, and if the password exchange and authentication are implemented correctly, then it's really not a security issue.

We can also imagine how it might actually take an attacker longer to hack eBay if eBay obscures its security measures, because the site appears weak when in reality it must have some fairly formidable lines of defense.

Appear weak when you are strong, and strong when you are weak

- Sun Tzu, The Art of War

OneChillDude