
How to Fix SSLError in Python Requests: Causes and Solutions

An SSL error means the TLS handshake failed: your application encountered an SSL certificate it couldn't verify, so the connection was rejected. This issue commonly shows up during web scraping or when integrating with external APIs. In this guide, we'll explain what this error means, its causes, and walk you through the right fix for each.

TL;DR

  • SSL errors in Python requests occur when certificate verification fails during the TLS handshake, mostly due to: expired or self-signed certificate, hostname mismatch, outdated certifi bundle, incomplete certificate chain, or proxy interception.
  • The fix depends on the root cause and which side (server or client) is responsible.
  • If the server-side is broken (self-signed, expired, wrong hostname), specify the specific certificate in your request, or trust the CA that issued it by adding it to your bundle
  • If the client-side is broken (outdated certifi bundle, proxy interception), update the certifi bundle using pip install --upgrade certifi. This is the right fix when the server's certificate is valid, but verification fails.
  • Setting verify=False is a quick fix, but it also exposes you to man-in-the-middle (MITM) attacks.

What is SSLError in Python Requests? 

SSL errors in Python Requests occur when the ssl module (more on this module later) is unable to verify the target server's identity.  

When the requests library initiates an HTTPS connection, it performs a TLS handshake, just like a browser. During this handshake, the library attempts to validate the server's SSL certificate. If that verification fails at any point, or for any reason, you get an SSL error, and the connection is terminated. 

To present a clearer picture, here's a request to a test server with a problematic SSL certificate: self-signed.badssl.com.

import requests
requests.get('https://self-signed.badssl.com/')

And here's a condensed version of the error response:

requests.exceptions.SSLError: HTTPSConnectionPool(host='self-signed.badssl.com', port=443):
Max retries exceeded with url: /
(Caused by SSLError(SSLCertVerificationError(1,
'[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed:
self signed certificate (_ssl.c:1007)')))

This raised exception and immediate termination are by design. Python, by default, doesn't allow communication with unverified servers, ensuring end-to-end data encryption. However, it's still frustrating when this error halts your project without a clear indication of the cause.
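You can see these strict defaults directly in the standard library. A minimal sketch using only the built-in ssl module, showing the settings requests and urllib3 build on:

```python
import ssl

# The default client context enforces the same strict policy requests relies on
ctx = ssl.create_default_context()

print(ctx.check_hostname)   # True: the hostname must match the certificate
print(ctx.verify_mode)      # ssl.CERT_REQUIRED: verification is mandatory
```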

To avoid wasting time and resources on trial-and-error fixes, it's important to understand 3 things:

  • What goes on during the TLS handshake
  • How Python's requests handles certificate verification 
  • What the library checks

What happens during a Python requests TLS handshake?

The TLS handshake begins with the ClientHello message. Your Python process opens a basic TCP connection and sends a message to the target server. This message contains the supported TLS versions, cipher suites, and client random number, among other parameters.

The server then responds with a ServerHello message, which specifies a chosen TLS version and cipher suite, its own random number, and its SSL certificate chain. Requests checks this certificate to validate the server's identity (certificate verification). This is where SSLError originates, so let's pause and explore how Python's certificate verification works. 

How Python requests handles certificate verification and what it checks

Requests is a high-level Python library built on top of urllib3, which in turn wraps the ssl module, Python's built-in TLS wrapper around OpenSSL (a C library and the actual cryptographic engine).

This architecture means that urllib3, via the ssl module, handles most of the certificate verification. That's why you sometimes see "urllib3" in the error response. By default, this module calls certifi.where() to locate its Certificate Authority (CA) bundle, which contains roughly 150 trusted root certificates.
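To see where your interpreter looks for trusted roots, and how many certificates a PEM bundle contains, here's a small stdlib-only sketch (the certifi lines are commented out in case the package isn't installed in your environment):

```python
import ssl

def count_certs(pem_text: str) -> int:
    # Each root certificate in a PEM bundle sits between BEGIN/END markers
    return pem_text.count("-----BEGIN CERTIFICATE-----")

# Paths the ssl module falls back to when no bundle is supplied
print(ssl.get_default_verify_paths())

# With certifi installed (it ships alongside requests):
# import certifi
# with open(certifi.where()) as f:
#     print(count_certs(f.read()))
```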

During the certificate verification stage of the TLS handshake, the server presents its certificate, which typically chains up to a specific root CA. 

urllib3, through the ssl module, then attempts to link this chain to one of the root certificates in the Certifi bundle. For verification to succeed, the server's certificate must form a valid path to one of these trusted CAs.

urllib3 also checks if the hostname matches the certificate and whether the certificate has expired.  

If any of these checks fail, requests raises an exception: requests.exceptions.SSLError. This error originates in OpenSSL and propagates up, with each layer sequentially abstracting it. That's why the traceback logs are always so long. 

Here's the full error from the initial request:

Traceback (most recent call last):
  File "scraper.py", line 4, in <module>
    response = requests.get('https://self-signed.badssl.com/')
  [... urllib3 internal frames omitted for brevity ...]
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED]
certificate verify failed: self signed certificate (_ssl.c:1007)

During handling of the above exception, another exception occurred:

urllib3.exceptions.MaxRetryError:
HTTPSConnectionPool(host='self-signed.badssl.com', port=443):
Max retries exceeded with url: /
(Caused by SSLError(SSLCertVerificationError(1,
'[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed:
self signed certificate (_ssl.c:1007)')))

During handling of the above exception, another exception occurred:

requests.exceptions.SSLError:
HTTPSConnectionPool(host='self-signed.badssl.com', port=443):
Max retries exceeded with url: /
(Caused by SSLError(SSLCertVerificationError(1,
'[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed:
self signed certificate (_ssl.c:1007)')))

You typically read a Python traceback log from the bottom up. That means, in the example above, verification failed at the ssl.SSLCertVerificationError line, and the message after the colon tells you what went wrong. In this case, "self signed certificate." (We'll discuss what this means in the next section).

urllib3 retries the connection, wraps the failure in a MaxRetryError, and requests catches that and re-raises it as its own exception, creating the SSLError chain below:

ssl.SSLCertVerificationError -> urllib3.exceptions.MaxRetryError -> requests.exceptions.SSLError
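Because each root cause leaves a distinct phrase in the final error message, you can route failures programmatically. A rough sketch; the phrase matching below is an assumption based on common OpenSSL messages, not an official API:

```python
def classify_ssl_error(exc: Exception) -> str:
    # Map common OpenSSL verification messages to a likely root cause
    msg = str(exc).lower()
    if "self signed" in msg or "self-signed" in msg:
        return "self-signed certificate"
    if "certificate has expired" in msg:
        return "expired certificate"
    if "hostname mismatch" in msg:
        return "hostname mismatch"
    if "unable to get local issuer" in msg:
        return "incomplete chain or missing root CA"
    return "unknown - inspect the full traceback"

err = Exception("certificate verify failed: self signed certificate (_ssl.c:1007)")
print(classify_ssl_error(err))  # self-signed certificate
```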

Causes of SSLError in Python Requests 

Think of the certificate verification stage as a series of strict security checkpoints; a failure at any checkpoint can be server-side or client-side. Here are the 6 most common root causes of SSL errors in Python requests.

Self-signed certificate

As the name implies, a self-signed certificate is one where the server creates, issues, and signs its own certificate, rather than chaining up to a third-party Certificate Authority. Requests only verifies certificates from well-known CAs (i.e., those in its Certifi bundle). Since a self-signed certificate is only known to the server, Python rejects it. 

A good example of this can be seen in the previous request made to self-signed.badssl.com.

Below is a sample script that catches the SSL error from that site, so we can see what it looks like.

import requests

try:
    response = requests.get('https://self-signed.badssl.com/')
except requests.exceptions.SSLError as e:
    print(f'SSL error: {e}')

Output:

SSL error: certificate verify failed: self-signed certificate

Self-signed certificates are common in internal tools, dev environments, and some scraping targets. While they trigger the SSL error above, they don't necessarily indicate an insecure connection. They're just not trusted or known to Python by default.

Expired certificate

SSL certificates have a limited validity period: publicly trusted certificates are currently capped at roughly 13 months, while private or self-signed certificates can last for years. Once a certificate passes its expiry date without renewal, Python rejects it outright, even if it passes every other check.

As an example, here's a Python script that catches the SSL error from expired.badssl.com, a test server with an expired certificate.

import requests

try:
    response = requests.get('https://expired.badssl.com/')
except requests.exceptions.SSLError as e:
    print(f'SSL error: {e}')

Output:

SSL error: certificate verify failed: certificate has expired
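The stdlib can parse the notAfter timestamp that getpeercert() returns, so you can check expiry yourself before pointing fingers at the server. A small sketch using ssl.cert_time_to_seconds:

```python
import ssl
import time

def cert_expired(not_after: str) -> bool:
    # cert_time_to_seconds parses the 'Jul  1 21:24:46 2026 GMT' format
    # that getpeercert()['notAfter'] uses, returning seconds since the epoch
    return ssl.cert_time_to_seconds(not_after) < time.time()

print(cert_expired("Apr  1 00:00:00 2015 GMT"))  # True: long past
```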

Hostname mismatch

SSL certificates are unique to domain names. When the server presents its certificate during the TLS handshake, it includes two fields: CN (Common Name, which represents the hostname) and SAN (Subject Alternative Names). Python only checks SANs, although older versions may fall back to CNs if SANs are absent. 

If the domain name in your request isn't listed in the SANs field, or there's a mismatch, Python raises a hostname mismatch SSL error. 

Here's a request to wrong.host.badssl.com, a test endpoint with a hostname mismatch.

import requests

try:
    requests.get('https://wrong.host.badssl.com/')
except requests.exceptions.SSLError as e:
    print(f'SSL error: {e}')

Output:

SSL error: certificate verify failed: Hostname mismatch, certificate is not valid for 'wrong.host.badssl.com'.

This error is often seen with www vs. non-www, wildcard certs, or redirected endpoints. 

Always use the hostname associated with the certificate. You can inspect the certificate to get this information using the script below. 

import ssl, socket

def inspect_cert(hostname, port=443):
    context = ssl.create_default_context()
    try:
        with socket.create_connection((hostname, port)) as sock:
            with context.wrap_socket(sock, server_hostname=hostname) as ssock:
                cert = ssock.getpeercert()
                print(f'Subject: {cert.get("subject")}')
                print(f'Issuer: {cert.get("issuer")}')
                print(f'Expires: {cert.get("notAfter")}')
                print(f'SANs: {cert.get("subjectAltName")}')
    except ssl.SSLError as e:
        print(f'SSL failed: {e}')
        # To see the cert even when verification fails:
        context2 = ssl.create_default_context()
        context2.check_hostname = False
        context2.verify_mode = ssl.CERT_NONE
        with socket.create_connection((hostname, port)) as sock:
            with context2.wrap_socket(sock, server_hostname=hostname) as ssock:
                raw_cert = ssock.getpeercert(binary_form=True)
                print('Got cert bytes, use openssl to decode')

inspect_cert('wrong.host.badssl.com')

This script:

  • Inspects the server's SSL certificate
  • Retrieves the subject, issuer, expiration date, and SANs 
  • Retries the connection, but with verification disabled, if verification fails

Outdated Certifi or Python 

The Certifi package ships a snapshot of Mozilla's trusted Certificate Authorities, which is periodically updated. If your Certifi or Python installation is outdated, requests will reject certificates issued under a newly added root CA, because that root isn't present in your local bundle.
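To confirm which Certifi version is actually installed in the environment your script runs in (virtualenvs often differ from the system Python), you can query package metadata from the standard library:

```python
from importlib import metadata

def pkg_version(name: str) -> str:
    # Reports the installed version, or a marker if the package is absent
    try:
        return metadata.version(name)
    except metadata.PackageNotFoundError:
        return "not installed"

print("certifi:", pkg_version("certifi"))
```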

Certificate chain is incomplete

As discussed in a previous section, CAs typically chain up from the actual server (leaf) certificate to the intermediate CA, and then to the root CA. When a server doesn't include the intermediate CA in its TLS configuration, Python Requests rejects the certificate because it cannot complete the chain up to a root CA in its certifi bundle. 

This doesn't necessarily indicate a bad SSL certificate. In fact, in these cases, a browser would fetch the missing intermediate CA and complete the chain. But Python's ssl module doesn't do this by default; rather, it rejects the connection, typically with an "unable to get local issuer certificate" error.

Proxy interception

This is one of the most common causes of SSL errors. It mostly occurs when using enterprise network proxies, especially during web scraping. These proxies operate as transparent man-in-the-middle (MITM) solutions, decrypting and re-encrypting HTTPS traffic. 

They intercept your connection and create a new TLS handshake with the target server. They then generate a new certificate for your request. This certificate isn't signed by a root CA in the certifi bundle, so Python automatically rejects the connection. 

Now that you understand that SSL errors aren't random and that each one reflects an issue in the certificate verification process, you can quickly map the root cause to the right fix.

If you'd like a broader grounding in Python exception handling, check out Python Errors and Exceptions: An Ultimate Guide to Different Types and Solutions

Fix #1: Ignoring SSL certificate verification 

Ignoring SSL certificate verification is the quickest fix for any SSL error, but it also carries real risk. 

Python allows you to disable SSL verification for a request by setting verify=False. Once that is included in a request, urllib3 turns off certificate verification during the TLS handshake. Since the requests library can no longer verify if it's talking to the right server, the connection becomes vulnerable to man-in-the-middle (MITM) attacks: a third party could intercept, read, and modify the response data. 

For a scraper collecting read-only public data from targets you do not authenticate with, this fix can be reasonable. However, if your requests involve credentials, session tokens, or any personal data, the MITM can read your data in plain text. 

Keep in mind that setting the verify=False parameter triggers a urllib3 warning on every request. If your scraper makes hundreds or thousands of requests, the logs get noisy fast. To suppress this warning, call urllib3.disable_warnings() before any request is fired. Here's an example:

import requests
import urllib3
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
response = requests.get('https://self-signed.badssl.com/', verify=False)

Note that this only suppresses the log output, not the risk. 

You can also apply verify=False globally using requests.Session(). This approach lets you set the parameter once and have it apply to every request in that session.

Here's an example that puts everything together:

import requests
import urllib3
# part 1 with no warning suppression
response = requests.get('https://self-signed.badssl.com/', verify=False)
# part 2 with warning suppression and session
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
session = requests.Session()
session.verify = False
r = session.get('https://self-signed.badssl.com/')
print(r.status_code)

This script has two parts: 

  • Part 1 shows the verify=False parameter without suppressing the urllib3 warning
  • Part 2 suppresses the urllib3 warning and sets the parameter globally using requests.Session()

Output:

# part 1
InsecureRequestWarning: Unverified HTTPS request is being made to host 'self-signed.badssl.com'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/latest/advanced-usage.html#tls-warnings
warnings.warn(
# part 2
200

Fix #2: Specifying a custom certificate file

If you're getting the self-signed certificate error, you can fix it by specifying a custom CA file in your request. Once Requests receives this file, it passes it to urllib3, which uses it as the CA bundle in place of the default certifi. 

The certificate file you specify must match the one the server presents. This requires you to manually export the server's certificate details to a local file. While you can get this file using Chrome DevTools or Firefox's certificate viewer, the recommended approach is to use openssl s_client from your terminal.

This method connects to the server and retrieves its exact certificate chain. 

Here's an example script that downloads the certificate chain from self-signed.badssl.com using openssl s_client:

On macOS or Linux, paste this in your terminal. On Windows, use GitBash, WSL, or PowerShell with OpenSSL installed. 

openssl s_client -connect self-signed.badssl.com:443 -showcerts \
</dev/null 2>/dev/null \
| openssl x509 -outform PEM > cert.pem

This script:

  • Opens a TLS connection with self-signed.badssl.com 
  • Retrieves the certificate chain
  • Saves it as a standard PEM cert file

To view what this cert looks like, run the following command in your terminal:

cat cert.pem

Your output should look like this:

-----BEGIN CERTIFICATE-----
MIID0jCCA3mgAwIBAgIQZSBYnvF+tVxmRDPyny5oSjAKBggqhkjOPQQDAjBSMQsw
CQYDVQQGEwJVUzEZMBcGA1UECgwQQ0xPVURGTEFSRSwgSU5DLjEoMCYGA1UEAwwf
. . . . (certificate data) . . . .
-----END CERTIFICATE-----
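If you'd rather stay in Python, the stdlib can export the leaf certificate too. Here's a sketch using ssl.get_server_certificate (which, by default, fetches the certificate without verifying it, so it works against self-signed servers); the helper just sanity-checks the PEM markers:

```python
import ssl

def looks_like_pem(text: str) -> bool:
    # A PEM certificate is framed by these exact markers
    return ("-----BEGIN CERTIFICATE-----" in text
            and "-----END CERTIFICATE-----" in text)

def save_server_cert(host: str, port: int = 443, path: str = "cert.pem") -> str:
    # Fetches the server's leaf certificate as PEM text and saves it
    pem = ssl.get_server_certificate((host, port))
    assert looks_like_pem(pem)
    with open(path, "w") as f:
        f.write(pem)
    return path

# save_server_cert("self-signed.badssl.com")  # requires network access
```

Note that, unlike openssl s_client -showcerts, this retrieves only the leaf certificate, not the full chain.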

Now that you have your custom file, the next step is specifying it in your request. 

The verify parameter we used in the previous section accepts a file path as a string. So by passing the path to your cert file as a string to this parameter, you can instruct Requests to use the custom file in place of the certifi bundle during the TLS handshake.

Here's an example that specifies a custom CA file to connect to a test server that uses a self-signed certificate:

import requests

response = requests.get('https://self-signed.badssl.com/', verify='cert.pem')
print(response.status_code)

The file you pass to verify must match the certificate the server actually presents. Otherwise, you'll get an error.

For CI pipelines and containerized scrapers, hardcoding the file path might be inconvenient. In this case, you can use the REQUESTS_CA_BUNDLE environment variable to configure the path externally without editing your code.

Here's an example:

export REQUESTS_CA_BUNDLE=/path/to/cert.pem
python scraper.py

These commands set the environment variable in your shell; Requests then uses that file for every call in your scraper.

Fix #3: Updating SSL dependencies (Certifi, Python, OpenSSL) 

If the target URL opens directly, without warnings, in a browser, but throws an SSL error in your Python script, the problem is almost certainly from your end (client-side).

In such cases, updating your SSL dependencies is a good place to start. Run the command below to update Certifi:

pip install --upgrade certifi
# confirm the current version
python -c "import certifi; print(certifi.__version__)"

It's recommended to specify Certifi explicitly in your request. This way, you guarantee you get the version you just installed. Here's an example:

import requests
import certifi

response = requests.get(url, verify=certifi.where())

Remember also to check the OpenSSL version. An outdated OpenSSL may not support newer TLS extensions or root CAs. This can cause errors identical to certificate verification issues.

The following command outputs your OpenSSL version.

python -c "import ssl; print(ssl.OPENSSL_VERSION)"

If the OpenSSL version is outdated, a common occurrence in Python 3.7 and below, the only way to upgrade is to update Python itself. On Windows or macOS, you can navigate to python.org and download the latest version or use a tool like pyenv to manage installation.

You can confirm your new installation version using the command below:

python --version

On macOS, Homebrew and pyenv installations don't automatically inherit the system keychain (a built-in trusted certificate store). That means the certificates bundled with macOS aren't visible to Python by default. You often need to install Certifi or run the bundled certificate setup script (Install Certificates.command) to sync your environment.

It's also important to note that upgrading Certifi alone may not be enough if you're using a virtual environment. These environments isolate your Python packages, and another dependency might be pinning Certifi to an old version. So even if you run the Certifi upgrade command, pip won't upgrade past the pinned version.

In such cases, audit the dependency graph. You can use pipdeptree to get the full graph:

pip install pipdeptree
pipdeptree --package certifi

This command returns the certifi version and what library requires it. If it's an old Certifi version, forcing an upgrade could break the dependency. Try upgrading the responsible package. If it's not needed, simply remove the package and upgrade Certifi. 

Fix #4: Verifying and trusting SSL certificates

Do you remember the approach in fix #2? It trusts one specific certificate: the exact details the server presents. The problem here is that TLS certificates have expiry dates. When the server renews its certificate, even if it is renewed by the same CA, the leaf certificate bytes change. Your hardcoded cert.pem will no longer match, and SSLError returns.

The same goes for the other fixes.

Verifying and trusting SSL certificates is a permanent, production-grade fix. It involves appending the server's root CA to Certifi's own bundle file. Any subsequent call to requests.get() will trust both the standard Mozilla CAs and your custom CA, with no changes to the calling code.

Remember the command used in fix #2 to retrieve the server's certificate? It only exported the leaf certificate. To get the root CA, you need to print the full certificate chain and save only the root certificate.

Here's an example:

openssl s_client -connect example.com:443 -showcerts \
</dev/null 2>/dev/null

This command outputs multiple certificate blocks, like the ones below. The last block, whose subject (s:) line names the root CA (here, SSL.com TLS ECC Root CA 2022), is the one you want.

Certificate chain
0 s:CN=example.com
i:C=US, O=CLOUDFLARE, INC., CN=Cloudflare TLS Issuing ECC CA 1
. . . . . .
-----BEGIN CERTIFICATE-----
. . . . . .
-----END CERTIFICATE-----
1 s:C=US, O=CLOUDFLARE, INC., CN=Cloudflare TLS Issuing ECC CA 1
. . . . . .
-----BEGIN CERTIFICATE-----
. . . . . .
-----END CERTIFICATE-----
2 s:C=US, O=SSL Corporation, CN=SSL.com TLS Transit ECC CA R2
. . . . . .
-----BEGIN CERTIFICATE-----
. . . . . .
-----END CERTIFICATE-----
3 s:C=US, O=SSL Corporation, CN=SSL.com TLS ECC Root CA 2022
. . . . . .
-----BEGIN CERTIFICATE-----
. . . . . .
-----END CERTIFICATE-----

The command below saves only the root CA to root-ca.pem:

openssl s_client -connect example.com:443 -showcerts \
</dev/null 2>/dev/null \
| awk '/BEGIN CERTIFICATE/{cert=""} {cert=cert $0 "\n"} /END CERTIFICATE/{last=cert} END{printf "%s", last}' \
> root-ca.pem

To verify that the root CA you saved is correct, run the following command:

openssl x509 -in root-ca.pem -noout -issuer -subject

This command outputs the issuer and subject of the certificate in root-ca.pem.

Output: 

issuer=C=GB, ST=Greater Manchester, L=Salford, O=Comodo CA Limited, CN=AAA Certificate Services
subject=C=US, O=SSL Corporation, CN=SSL.com TLS ECC Root CA 2022

Lastly, the script below appends the root CA to certifi's bundle:

import certifi

bundle_path = certifi.where()
with open(bundle_path, 'ab') as f:
    with open('path/to/root-ca.pem', 'rb') as ca:
        f.write(ca.read())
print('root CA appended')

Now, Requests will trust the appended root CA while also preserving the actual Certifi. However, upgrading your Certifi bundle will remove your appended CA as Certifi's cacert.pem file is replaced.  For a more persistent fix, create a merged CA bundle that combines Certifi's standard CAs with your custom CA, then point Python at it via an environment variable.
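Such a merged bundle can be built with a few lines of file handling. A sketch (the paths in the usage comment are illustrative; in practice, base_bundle would be certifi.where() and extra_ca your exported root):

```python
import shutil

def build_merged_bundle(base_bundle: str, extra_ca: str, out_path: str) -> str:
    # Concatenate the standard CA bundle and a custom root CA into one file
    with open(out_path, "wb") as out:
        for src in (base_bundle, extra_ca):
            with open(src, "rb") as f:
                shutil.copyfileobj(f, out)
            out.write(b"\n")  # keep PEM blocks newline-separated
    return out_path

# Typical usage (paths are illustrative):
# import certifi
# build_merged_bundle(certifi.where(), "root-ca.pem", "merged-ca.pem")
# then: export REQUESTS_CA_BUNDLE=/path/to/merged-ca.pem
```

Because the merged file lives outside Certifi's install directory, upgrading Certifi no longer wipes your custom CA; you just rebuild the merge after each upgrade.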

For web scrapers, SSL errors are often not isolated. When scraping HTTPS targets at scale, especially those with aggressive bot detection, SSL issues can compound with other connection problems. 

To help you avoid wasting time and resources with manual trial-and-error fixes, the Decodo Web Scraping API handles TLS negotiation, certificate trust, and connection retries transparently.

SSL errors killing your scraper?

Decodo's Web Scraping API handles TLS negotiation, certificate issues, and fingerprinting so your code never sees another SSLError.

Choosing the right fix for your situation 

As we've seen in previous sections, SSL errors can stem from different root causes. Identifying these causes is key to choosing the right fix for your situation. This article explains the most common root causes in Python requests and how to identify them. 

Below are some situations and recommended fixes:

  • If you’re scraping a public site with a valid certificate but requests still fail, update your CA bundle (Certifi) first; if the issue persists, check your system/OpenSSL version.
  • If you’re connecting to an internal tool or dev server with a self-signed certificate,
    provide a custom CA bundle via verify= or add the certificate to your system trust store. It's not advisable to use verify=False in production.
  • If you’re behind a corporate proxy or a scraping proxy that performs TLS interception, obtain the proxy’s root CA certificate and configure it via verify= or install it in the OS trust store
  • If the target site’s certificate has expired, verify=False is the only programmatic workaround. However, you must treat the data as untrusted and, if possible, notify the site owner.
  • If you’re running a one-off script with no sensitive data involved, using verify=False (optionally with suppressed warnings) is acceptable for quick, small-scale tasks
  • If you’re building a production scraper that handles authentication, personal data, or anything sensitive, disabling SSL verification exposes you to MITM attacks. Use other fixes or a trusted intermediary.

Conclusion

SSL errors in Python requests occur when the ssl module is unable to verify the server's certificate. This is often triggered by one of several factors, including a self-signed or expired certificate, a hostname mismatch, an incomplete chain, and proxy interception. Always apply fixes according to the root cause. You can identify a cause by reading the traceback logs. Keep in mind that, while verify=False disables SSL verification, which ultimately clears any SSL error, it also exposes you to man-in-the-middle (MITM) attacks.

Data in, problems out

Decodo's Web Scraping API handles proxies, rendering, and anti-bot bypass so you don't have to worry about them.

About the author

Lukas Mikelionis

Senior Account Manager

Lukas is a seasoned enterprise sales professional with extensive experience in the SaaS industry. Throughout his career, he has built strong relationships with Fortune 500 technology companies, developing a deep understanding of complex enterprise needs and strategic account management.


Connect with Lukas via LinkedIn.

All information on Decodo Blog is provided on an "as is" basis and for informational purposes only. We make no representation and disclaim all liability with respect to your use of any information contained on Decodo Blog or any third-party websites that may be linked therein.

Frequently asked questions

Is it safe to use verify=False in a production web scraper?

No. verify=False exposes your connection to man-in-the-middle (MITM) attacks, which allows the attacker to read and modify response data. Never use verify=False in production. This fix is a reasonable last resort, only when your scraper is collecting read-only public data from targets you do not authenticate with.

Why does my SSLError only appear in Python but not in my browser?

It usually means the server's certificate is valid and the issue is on your side: an outdated Certifi bundle, an old Python/OpenSSL, or a proxy intercepting only your script's traffic. Updating Python and Certifi is the place to start.

How do I check which CA bundle Python Requests is actually using?

The certifi.where() function returns the path to the CA bundle file that Requests uses by default, usually a cacert.pem file. Remember to import certifi before calling it.


Mastering Python Requests: A Comprehensive Guide to Using Proxies

When using Python's Requests library, proxies can help with tasks like web scraping, interacting with APIs, or accessing geo-restricted content. Proxies route HTTP requests through different IP addresses, helping you avoid IP bans, maintain anonymity, and bypass restrictions. This guide covers how to set up and use proxies with the Requests library. Let’s get started!

Retry Failed Python Requests in 2026

There’s no reliable Python application that doesn’t have a built-in failed HTTP request handling. You could be fetching API data, scraping websites, or interacting with web services, but unexpected failures like timeouts, connection issues, or server errors can disrupt your workflow at any time. This blog post explores strategies to manage these failures using Python’s requests library, including retry logic, best practices, and techniques like integrating proxies or custom retry mechanisms.

The Best Python HTTP Clients for Web Scraping

Not all Python HTTP clients behave the same way on the wire. The one you choose affects how many requests you can run concurrently, how identifiable your traffic is to anti-bot systems, and how much code you need to manage. This guide breaks down six clients – urllib3, Requests, HTTPX, aiohttp, curl_cffi, and Niquests – covering where each fits and where it falls short.

© 2018-2026 decodo.com (formerly smartproxy.com). All Rights Reserved