By Oleg Kulyk | Updated 9 months ago | Data

ScrapingAnt Overview


Web scraping and web harvesting are challenging tasks: specialists have to handle JavaScript rendering, headless browser updates and maintenance, and proxy diversity and rotation.

ScrapingAnt is a simple API that does all the above for you:
🛠Latest Chrome render
💻Run Javascript
🕵️‍♀️Thousands of proxies around the world

For extended documentation, please follow:

About API

ScrapingAnt helps you solve complex scraping tasks. Our JavaScript execution feature is available to everyone without any additional effort.
We offer two similar endpoints (POST and GET) for your convenience. They use the same backend under the hood but give you more flexibility when writing code.

Why you should try our service

  1. Our servers rotate several thousand proxies to provide the best web scraping experience.
  2. ScrapingAnt fetches the URL you want to scrape through a Chrome browser, which executes the JavaScript code on the target page. This is very useful if you are scraping a Single Page Application built on a framework like React.js, Angular.js, jQuery, or Vue.
  3. You can pass custom cookies to the webpages you want to crawl.
  4. Both GET and POST endpoints are available to fit all your scraping and crawling needs.

POST and GET differences

Parameters passed to the GET endpoint have to be encoded. If you don't want to encode your request parameters, please consider using the POST endpoint.
Warning: if you use the GET endpoint, always ensure that the parameters are ENCODED! This is extremely important to avoid mixing your API parameters with the HTTP parameters of the URL you want to scrape.
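As a sketch of why this matters, the snippet below builds a GET request URL with Python's standard library. The endpoint path is an assumption for illustration only, not a documented value; use the endpoint and API key from your own dashboard.

```python
from urllib.parse import urlencode

# Hypothetical endpoint for illustration only.
API_ENDPOINT = "https://api.scrapingant.com/v1/general"

# The target page has its own query string ("?id=42&lang=en").
target = "https://example.com/page?id=42&lang=en"

# urlencode percent-encodes the value, so the target's "?", "&", and "="
# cannot be mixed up with the API's own parameters.
query = urlencode({"url": target})
request_url = f"{API_ENDPOINT}?{query}"
print(request_url)
```

Without the encoding step, everything after the target URL's own `&` would be parsed as extra parameters of the API request instead of the page you want to scrape.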

If you need help encoding your URL, here is how to do it:

in Python
in JS
in PHP
in Ruby
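For instance, in Python the standard library's urllib.parse.quote handles the encoding (a minimal sketch; the other languages have equivalents, such as encodeURIComponent in JS):

```python
from urllib.parse import quote

target = "https://example.com/search?q=web scraping&page=2"

# safe="" encodes every reserved character, including "/" and "?".
encoded = quote(target, safe="")
print(encoded)
```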

For the POST endpoint you don't need to encode anything.

Parameters and usage

Only one parameter is required to use our API: url. Just pass this parameter to start scraping. Pretty easy, isn't it? 😉
The optional parameters are:
cookies - String parameter. You can pass custom cookies to the webpages you want to crawl: just pass the cookie string in the cookies parameter. We currently handle only the name and value of custom cookies. To set multiple cookies, separate them with ; Example: cookies = "cookie_name_1=cookie_value_1;cookie_name_2=cookie_value_2"
return_text - Boolean parameter. When enabled, you fetch only the text content of the web resource, without any HTML tags. Disabled by default!
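The optional parameters can be combined with url in a single POST body. A minimal sketch: the parameter names follow the descriptions above, but the exact shape of the real request body is an assumption, so check the endpoint documentation before relying on it.

```python
import json

# Build the documented "name=value;name=value" cookie string.
cookies = {"cookie_name_1": "cookie_value_1", "cookie_name_2": "cookie_value_2"}
cookie_string = ";".join(f"{name}={value}" for name, value in cookies.items())

# Hypothetical POST body; field names follow the parameter list above.
payload = {
    "url": "https://example.com",   # required
    "cookies": cookie_string,       # optional
    "return_text": True,            # optional, disabled by default
}
body = json.dumps(payload)
print(body)
```

Note that nothing here needs percent-encoding: the JSON body keeps the target URL's own query string separate from the API parameters by construction.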

Each request has 30 seconds to return results successfully. If your page has not loaded completely after 30 seconds, we return the partially loaded page. Please keep this maximum timeout in mind when writing your own code.

Do not hesitate to contact us with questions, suggestions, and proposals. We always strive to provide the best customer support for our users.
