So I had an interesting issue today which I couldn't find many Google results for, so I'll create one!

Many thanks to sigmavirus24 from #python-requests on Freenode for this - I'm just passing it on.

I use reverse proxying quite a bit, because I've only got one public IP address, but I don't want to run all my web apps from the same host. Reverse proxying means there's one server which gets all requests to that IP on port 80, but it then passes them on to different servers inside my network depending on what the requested URL was. So when you request a page under (like this one), the request is handled by a different machine than when you request a page under.
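The post doesn't say which reverse proxy software is in use, but as a rough illustration, here's what that kind of routing might look like in nginx, with entirely hypothetical hostnames and backend addresses - each server block matches a different requested hostname and hands the request to a different internal machine:

```nginx
# Hypothetical sketch: one public-facing proxy, two internal backends.
server {
    listen 80;
    server_name blog.example.com;
    location / {
        # blog traffic goes to one internal host...
        proxy_pass http://192.168.1.10:8080;
    }
}

server {
    listen 80;
    server_name wiki.example.com;
    location / {
        # ...wiki traffic goes to another
        proxy_pass http://192.168.1.11:8080;
    }
}
```

The proxy picks a backend purely from the hostname the client asked for, which is trivial for plain HTTP - the complication described next only shows up once TLS is involved.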

I also use different SSL certificates for the various subdomains. If you check and, you'll see you get different SSL certs for each, and neither is valid for the other subdomain.

When you're using reverse proxying with this sort of config, the reverse proxy machine has to hold all the certificates and present the correct server certificate when the SSL/TLS connection is established - the establishment of the connection can't be handed off to the backend, for $SECURITY_REASONS. This actually depends on an SSL/TLS extension called SNI (Server Name Indication), which was fairly new stuff six or seven years ago. These days, though, it mostly Just Works with all the browsers and other client-y things you really care about (no, you don't care about IE 6. You really, really don't), and once you figure out the initial configuration you kinda forget about it...until it comes back to bite you in the ass!
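For the curious: with SNI, the client sends the hostname it wants as part of the TLS handshake itself, which is what lets the proxy pick the right certificate before any encrypted data flows. Python's standard library exposes whether its ssl module was built with SNI support, and the `server_hostname` argument is what actually triggers the extension on a connection - a minimal sketch (no real connection is made here):

```python
import ssl
import sys

# ssl.HAS_SNI reports whether the underlying OpenSSL build supports
# the Server Name Indication TLS extension.
print("Python version:", sys.version_info[:3])
print("stdlib ssl module supports SNI:", ssl.HAS_SNI)

# When a client connects, passing server_hostname to wrap_socket()
# is what puts the hostname into the TLS handshake (and also enables
# hostname checking on the returned certificate):
ctx = ssl.create_default_context()
# sock = ctx.wrap_socket(raw_socket, server_hostname="blog.example.com")
```

On any remotely modern Python this prints True; the trouble described below is that old python-requests on old Python 2 couldn't rely on the stdlib for this.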

I was working on our openQA integration stuff, porting it to use my openQA Python client library, which is based on the awesome python-requests. I was trying to test it with , but it kept failing certificate validation: it was clearly getting the cert for , not the one for.

As Mr. Virus24 explained, the problem is pretty simple: python-requests does not always support SNI. It's kind of an optional feature. If you have Python 2.7.9 or Python 3 (possibly it requires some specific Python 3 minor release, but I'm not sure which) SNI support can be relied upon, but for older Python 2 releases, requests will only do SNI if some other Python modules are installed. If you're using pip, you can do pip install requests[security]. I use distro packages, so I went and looked at requests' file and figured that, on Fedora 21, I'd need pyOpenSSL, python-pyasn1, and python-ndg_httpsclient packages installed. I had the first two, but the last was missing. Sure enough, as soon as I installed it, my test script magically started working, with no other changes - requests is now doing SNI.
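If you're on an old Python 2 and want to check whether those optional modules are present before reaching for `pip install requests[security]` or your distro's packages, a quick sketch like this will tell you (the import names below are the top-level modules provided by the pyOpenSSL, pyasn1 and ndg_httpsclient packages):

```python
import importlib

# Top-level import names for the three packages old python-requests
# needs in order to do SNI on pre-2.7.9 Python 2.
SNI_DEPS = ("OpenSSL", "pyasn1", "ndg.httpsclient")

def check_sni_deps():
    """Return a dict mapping each dependency to whether it imports."""
    results = {}
    for mod in SNI_DEPS:
        try:
            importlib.import_module(mod)
            results[mod] = True
        except ImportError:
            results[mod] = False
    return results

if __name__ == "__main__":
    for mod, ok in sorted(check_sni_deps().items()):
        print("%-16s %s" % (mod, "OK" if ok else "MISSING"))
```

In my case this sort of check would have shown ndg.httpsclient as the missing piece.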

So, if you're using python-requests (directly or indirectly) and having certificate issues that look like this (the server sends the cert for the wrong domain), check you have all the right bits installed for requests to do SNI!