Scrapoxy is a powerful proxy aggregator that enables centralized management of multiple proxies within a single interface. It generates a pool of private proxies from multiple proxy services, intelligently routes traffic, and handles proxy rotation for you.
User-friendly interface
Simplify the management of proxy instances with an intuitive user interface, making it easy to set up, configure, and monitor proxies efficiently.
Ban management
Overcome bans by injecting proxy names into HTTP responses, allowing scrapers to notify Scrapoxy to remove problematic proxies from the pool when bans are detected.
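As a rough illustration of this workflow, the sketch below sends a request through Scrapoxy, reads the proxy name from a response header, and flags the instance when a ban-like status code comes back. The endpoint, credentials, the x-scrapoxy-proxyname header, and the status codes treated as bans are all assumptions for this sketch; check the Scrapoxy documentation for the exact header name and removal API in your version.

import requests

# Placeholder Scrapoxy endpoint and credentials (configured later in this guide)
proxy = 'http://USERNAME:PASSWORD@localhost:8888'

response = requests.get('http://example.com', proxies={'http': proxy, 'https': proxy})

# Assumption: Scrapoxy injects the proxy instance name into this response header
proxy_name = response.headers.get('x-scrapoxy-proxyname')

if response.status_code in (403, 429) and proxy_name:
    # Ban detected: remove this instance via Scrapoxy's scraper API
    # (see the Scrapoxy docs for the exact endpoint and payload)
    print(f'Ban detected on proxy {proxy_name}, flag it for removal')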
Traffic monitoring
Monitor traffic, measure incoming and outgoing data, and track key metrics like request counts, active proxy numbers, and requests per proxy.
Why residential proxies?
A residential proxy acts as an intermediary that gives you the IP address of a genuine desktop or mobile device connected to a local network. Thanks to this origin, residential proxies are a perfect match for overcoming geo-restrictions, bypassing CAPTCHAs, managing multiple accounts, and conducting web testing.
Decodo offers top-notch residential proxies with an extensive pool of over 55M IPs across 195+ locations. With an average response time under 0.5 seconds, a 99.68% success rate, and an affordable Pay As You Go entry point, Decodo is an excellent deal for hustlers and fast-growing companies.
Set up Decodo proxies with Scrapoxy
To get started, you'll need 3 things: Docker, Scrapoxy, and proxies. Follow each step below to set everything up.
Getting residential proxies
Log in to your Decodo dashboard, navigate to Residential in the left panel, and select a plan that suits your needs. Then, open the Proxy setup tab and copy your username and password. Save these credentials, as you'll need them later.
Installation
To use Scrapoxy, you'll need to install Docker first. Then, follow these steps:
1. Launch the Docker Desktop application.
2. Open the Settings menu.
3. Go to the General tab.
4. Enable Docker terminal.
5. Click Apply & restart.
6. Launch the Terminal from the bottom-right corner and enter the following command (replace admin, password, secret1, and secret2 with your custom values):
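At the time of writing, Scrapoxy's Docker quick start uses a command along these lines (confirm the exact image name and environment variables against the Scrapoxy documentation):

docker run -d -p 8888:8888 -p 8890:8890 \
  -v ./scrapoxy:/cfg \
  -e AUTH_LOCAL_USERNAME=admin \
  -e AUTH_LOCAL_PASSWORD=password \
  -e BACKEND_JWT_SECRET=secret1 \
  -e FRONTEND_JWT_SECRET=secret2 \
  -e STORAGE_FILE_FILENAME=/cfg/scrapoxy.json \
  fabienvauchelles/scrapoxy

Port 8888 is the proxy endpoint your scraper connects to, and port 8890 serves the web interface.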
7. Navigate to http://localhost:8890 and log in with the username and password you provided in the command above (not your Decodo proxy credentials).
After logging in, you'll be prompted to create a new project. Enter the following information and select options:
1. Name. Create a custom name for your project.
2. Minimum proxies. Set the minimum number of proxies that should be used in this project.
3. Auto Rotate Proxies (ms). If enabled, proxies will rotate randomly within the specified interval.
4. Auto Scale Up. If enabled, all proxies will be started when a request is received.
5. Auto Scale Down. If enabled, all proxies will be stopped when no requests are being sent.
6. Intercept HTTPS requests with MITM. If enabled, Scrapoxy will intercept and modify HTTPS requests and responses.
7. Keep the same proxy with cookie injection. If enabled, Scrapoxy will inject a cookie to maintain the same proxy for a single browser session.
8. Override User-Agent. If enabled, the header will be overridden with the value assigned to a proxy instance. This ensures all requests within the instance have the same User-Agent header.
9. Shuffle TLS Ciphersuite. If enabled, Scrapoxy will assign a random TLS cipher suite to each proxy instance to avoid TLS fingerprinting.
10. Once you're done setting up, click Create.
The project is set up, and you'll need to choose a provider from the Marketplace. Find Decodo in the list and click Create.
In the next step, enter the credentials for Decodo:
Name. Create a custom name for the credentials. This can help differentiate between them if you're using several products.
Product. Select the product you're using from the dropdown. In our example case, select Residential or Mobile.
Username. Enter the username you previously saved from the Decodo dashboard.
Password. Enter the password you previously saved from the Decodo dashboard.
Click Create to save the credentials.
The final step is to create a Connector:
Credentials. Select the credentials you just created from the dropdown list.
Name. Enter a custom name for the Connector.
# Proxies. The maximum number of proxies that the connector can provide and that you intend to use. You can adjust this later.
Proxies Timeout. Set the time to attempt to connect to a proxy server before considering it offline.
Proxies Kick. If enabled, set the maximum duration for a proxy to be offline before being removed from the pool.
Country. Select a specific country to use proxies from or leave it at All.
Session Duration (min). Set how long a session lasts before the proxy rotates to a new IP address.
Click Create.
Once created, you'll see it in the list of Connectors. Enable it by toggling the Start/Stop this connector button.
Now that everything's ready, you can integrate Scrapoxy into your code and see how it works. Here's an example of how to use Python together with the requests library:
import requests

url = 'https://example.com'
proxy = 'http://USERNAME:PASSWORD@localhost:8888'  # Scrapoxy endpoint with project credentials
ca = 'scrapoxy-ca.crt'  # CA certificate downloaded from Scrapoxy's Settings

result = requests.get(url, proxies={'http': proxy, 'https': proxy}, verify=ca)
print(result.text)
Replace the username and password with the credentials from the project Settings in the Scrapoxy interface.
You must also download the CA certificate from Settings and place the file in the same directory as the script or write a full path to it in the ca variable. The endpoint should always be localhost:8888 unless configured otherwise.
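To verify that rotation is working, you can send a handful of requests through Scrapoxy and compare the exit IP addresses. This is a minimal sketch that reuses the placeholder credentials and CA file from above and uses httpbin.org/ip purely as an example IP-echo service:

import requests

proxy = 'http://USERNAME:PASSWORD@localhost:8888'
ca = 'scrapoxy-ca.crt'

# Each request may leave through a different Decodo IP, depending on the
# rotation and session settings configured in Scrapoxy.
for _ in range(3):
    response = requests.get('https://httpbin.org/ip', proxies={'http': proxy, 'https': proxy}, verify=ca)
    print(response.json()['origin'])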
To view and manage your proxies, you can use the Proxies, Coverage, and Metrics tabs in Scrapoxy. These provide information about active proxies, what countries they're located in, upload and download speeds, the number of requests, and their status.
What is Scrapoxy?
Scrapoxy is a powerful proxy aggregator that manages and rotates proxies, helping developers scale web scraping operations while minimizing the risk of bans.
What is proxy scraping?
Proxy scraping involves using proxies to collect data from websites without revealing the scraper's real IP address, allowing for anonymous and distributed web scraping while avoiding blocks or rate limits.