This guide will demonstrate how to develop and deploy a custom JavaScript API from scratch. More importantly, you will learn how to sell your API and turn it into a profitable business.
You will better understand the entire process by developing your own Amazon Scraper API. Once you have built and deployed this API, this guide will show you how to upload it to a platform called RapidAPI Hub, explain the pricing tiers, begin selling it, and finally build a business around it.
Developers of all experience levels are welcome to work on this project. There are no prerequisites. If you're a rookie, you'll discover what APIs are, how to use Node and Express to build a backend JavaScript application, and much more.
If you're an experienced developer, you'll appreciate this guide's deployment and business aspects. With that said, allow me to show you the entire procedure.
First and foremost, this guide walks through a real-world example so you can understand the entire process, from idea to deployment. You will create an Amazon Data Scraper API, and this API will parse the HTML response and return the valuable product data in JSON format.
This guide is based on an idea I had. I'd like to encourage you to think of something you've wanted to make for a long time and follow in my footsteps. In this manner, you will have something original that you have produced, and you will be able to upload it to RapidAPI Hub by using the same procedures. Now let's get started.
Before we dive right into the code, let's quickly explain what an API is.
API stands for Application Programming Interface; it allows two applications to communicate with each other. An API is used every time you use an app like Facebook, send an instant message, or check the weather on your phone. But what precisely does an API do?
An app connects to the internet and sends data to a server when you use it on your phone. The server retrieves the data, processes it, and delivers it back to your phone. An API is used by the application to obtain the data it needs and display it to you in a readable style.
Let's use a well-known example to illustrate this better. Consider yourself ordering from a menu while seated at a table in a restaurant. The kitchen is the part of the system that will prepare your order. What's missing is the vital link connecting your order to the kitchen and returning your meal to your table.
This is where the waiter or API comes into play. The waiter is the messenger or API who takes your requests or orders and communicates them to the kitchen and the system. The waiter then returns the response to you. It's the food in this case.
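The analogy maps neatly onto code. Here is a tiny, purely illustrative sketch (the `kitchen` object and `waiter` function are made up for this example, not part of the project you'll build):

```javascript
// The "kitchen" is the system that actually prepares the data
const kitchen = {
  burger: 'one burger, medium rare',
  pasta: 'one plate of pasta',
};

// The "waiter" is the API: it carries the request to the kitchen
// and brings the response back to the table
function waiter(order) {
  return kitchen[order] ?? 'sorry, that is not on the menu';
}

console.log(waiter('burger')); // 'one burger, medium rare'
```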
Leveraging Node.js, the most popular JavaScript runtime, and Express, which is a framework for Node, you will create a server.
You'll build a backend application or, in other words, an API. Then you'll deploy and host that API on RapidAPI Hub, where hundreds of thousands of people will be able to use it.
I have created the Amazon Data Scraper API for this guide. Let me demonstrate what this API does. Amazon Data Scraper is the simplest way to get Amazon product, price, sales rank, and review data in JSON format. Search Results, Product Offers, Product Reviews, and Product Details are our four distinct endpoints for this API.
You can see everything this specific request can do if you scroll down. For example, you can send some information and the product ID and gather all of the details about that specific Amazon product. In this case, the Apple MacBook Air will work just fine.
Let's begin by downloading and installing Node.js. Of course, you can omit this step if Node.js is already set up on your computer. But if you don't, you should download and install it immediately.
Following installation, you can create a new folder on your desktop.
Let's call it something like Amazon Scraper in this instance. You can, of course, change the name or make a completely new project.
Once you've made it, you can drag and drop it into your preferred code editor. This guide uses Visual Studio Code.
Once you're inside VS Code, you can create the package.json file, which is the foundation for any application. Select Terminal from the View menu; this opens VS Code's integrated terminal, where you'll generate the package.json.
You can run the clear command, and then you can write:
```sh
npm init -y
```
This will generate the `package.json`, which looks quite empty initially. It will show the `name`, the `version`, the `description`, and the `main` JavaScript file, which you'll create in just a moment.
To begin, delete the test script and write a start script which will be used to launch our application. It will look like this:
```json
{
  "name": "amazon_scraper",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "start": "node index.js"
  },
  "keywords": [],
  "author": "",
  "license": "ISC"
}
```
Let's install all the necessary dependencies by running:
```sh
npm install express request request-promise nodemon
```
Let's press enter and wait for the packages to install. Once they're installed, you can add a `dev` script that runs `nodemon`. You can say something like:
```json
{
  "name": "amazon_scraper",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "dev": "nodemon index.js",
    "start": "node index.js"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "dependencies": {
    "express": "^4.17.1",
    "nodemon": "^2.0.9",
    "request": "^2.88.2",
    "request-promise": "^4.2.6"
  }
}
```
Now you can use the `dev` command to run our application while developing it.
Once the setup is complete, you can right-click and make a new `index.js` file. This will serve as our API.
As I mentioned, you will use Express to create a backend application.
```js
const express = require('express');
```
For making API requests, you can use:
```js
const express = require('express');
const request = require('request-promise');
```
Now that you have Express, you can initialise our application by calling `express()` as a function:
```js
const express = require('express');
const request = require('request-promise');

const app = express();
```
Just below that, you can also create our `PORT` by writing:
```js
const express = require('express');
const request = require('request-promise');

const app = express();
const PORT = process.env.PORT;
```
Later on, you will get a dynamic port from where you deploy our application. But for now, you can say:
```js
const express = require('express');
const request = require('request-promise');

const app = express();
const PORT = process.env.PORT || 5000;
```
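To see how the `||` fallback behaves, here's a quick standalone check (simulating both a local machine with no PORT set and a host that provides one):

```javascript
// If the hosting platform sets PORT, use it; otherwise fall back to 5000
delete process.env.PORT; // simulate a local machine with no PORT set

const localPort = process.env.PORT || 5000;
console.log(localPort); // 5000

process.env.PORT = '8080'; // simulate a host-provided port
const hostedPort = process.env.PORT || 5000;
console.log(hostedPort); // '8080' (environment values are strings)
```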
To allow our application to parse JSON input, you can write:
```js
const express = require('express');
const request = require('request-promise');

const app = express();
const PORT = process.env.PORT || 5000;

app.use(express.json());
```
Since every Express application needs at least one route, let's create one:
```js
const express = require('express');
const request = require('request-promise');

const app = express();
const PORT = process.env.PORT || 5000;

app.use(express.json());

app.get('/', (req, res) => {

});
```
Now you have to return something. It could be any message to inform us that our server is running.
```js
const express = require('express');
const request = require('request-promise');

const app = express();
const PORT = process.env.PORT || 5000;

app.use(express.json());

app.get('/', (req, res) => {
  res.send('Welcome to Amazon Scraper API.');
});
```
The last thing that we need to start the server is `app.listen()`:
```js
const express = require('express');
const request = require('request-promise');

const app = express();
const PORT = process.env.PORT || 5000;

app.use(express.json());

app.get('/', (req, res) => {
  res.send('Welcome to Amazon Scraper API.');
});

app.listen(PORT, () => console.log(`Server running on port ${PORT}`));
```
This is the bare-bones Express application. Now you can run it by running the following command in the terminal:
```sh
npm run dev
```
This will launch `nodemon` and our application. Before running it, make sure the `request-promise` dependency is installed. You can press `Ctrl/Cmd + C` to stop the running process, install anything that's missing, and then run the command again; you will see our server running on port `5000`.
Let's try it in the browser now to see how it works. Simply open your browser and navigate to localhost:5000.
You've successfully built a Node.js Express backend JavaScript server, and now you'll add the routes that will provide additional logic. Let's get started right away. You will use a service called ScraperAPI to scrape valid data from Amazon's website.
This tool allows you to scrape HTML from any website and convert it into useful JSON data. ScraperAPI is entirely free to use to get started. You get 5,000 free requests, which is plenty for a demo application.
So, without further ado, let's sign up and create an account. You can sign up through Google or GitHub, and once you're in, you'll see something like this.
The most crucial part of this page is the API key. Copy it and return to the code. Here, you will use the API key like this:
```js
const express = require('express');
const request = require('request-promise');

const app = express();
const PORT = process.env.PORT || 5000;

const apiKey = 'd18449318a367e553f85ef10b1035016';

app.use(express.json());

app.get('/', (req, res) => {
  res.send('Welcome to Amazon Scraper API.');
});

app.listen(PORT, () => console.log(`Server running on port ${PORT}`));
```
Now that you have our API key, you can close this terminal, and let's create our Base URL.
```js
const express = require('express');
const request = require('request-promise');

const app = express();
const PORT = process.env.PORT || 5000;

const apiKey = 'd18449318a367e553f85ef10b1035016';
const baseURL = `http://api.scraperapi.com?api_key=${apiKey}&autoparse=true`;

app.use(express.json());

app.get('/', (req, res) => {
  res.send('Welcome to Amazon Scraper API.');
});

app.listen(PORT, () => console.log(`Server running on port ${PORT}`));
```
Now that you have our base URL, you can create our own API routes to retrieve specific products. So the first route will be for fetching product details.
That is, again, going to be an `app.get()`, and the URL will be `app.get('/products/:productId')`. The colon (`:`) before `productId` means that the `productId` will be dynamic. Finally, you can add a callback function. This time it will be an `async` function that receives a request and a response. You can write that as:
```js
app.get('/products/:productId', async (req, res) => {

});
```
You can expand that function and get the `productId` from the request parameters. You can write that as:
```js
app.get('/products/:productId', async (req, res) => {
  const { productId } = req.params;
});
```
Now that you have it, you can open a `try` and `catch` block. Inside that block, you want to get a response from ScraperAPI. You can write the details like this:
```js
app.get('/products/:productId', async (req, res) => {
  const { productId } = req.params;

  try {
    const response = await request(`${baseURL}`);
  } catch (error) {

  }
});
```
After that, you can add some extra parameters that give us information for a specific product. You can say:
```js
app.get('/products/:productId', async (req, res) => {
  const { productId } = req.params;

  try {
    const response = await request(`${baseURL}&url=https://www.amazon.com/dp/${productId}`);
  } catch (error) {

  }
});
```
Finally, you just want to send that response back from our server, so you can write:
```js
const express = require('express');
const request = require('request-promise');

const app = express();
const PORT = process.env.PORT || 5000;

const apiKey = 'd18449318a367e553f85ef10b1035016';
const baseURL = `http://api.scraperapi.com?api_key=${apiKey}&autoparse=true`;

app.use(express.json());

app.get('/', (req, res) => {
  res.send('Welcome to Amazon Scraper API.');
});

// Get product details
app.get('/products/:productId', async (req, res) => {
  const { productId } = req.params;

  try {
    const response = await request(`${baseURL}&url=https://www.amazon.com/dp/${productId}`);
    res.json(response);
  } catch (error) {
    res.json(error);
  }
});

app.listen(PORT, () => console.log(`Server running on port ${PORT}`));
```
Our initial route is finished. Save it, and then check it out in the browser. Now that you are back on localhost:5000, you will enter a specific product ID. Then, what is this product ID exactly, and where can you get it?
Try searching for a MacBook Air on Amazon.com, then open this 2020 Apple MacBook Air as an example.
Let's navigate to our localhost and enter it there like this: localhost:5000/products/B08PV4GYCL. Although the data is displayed, it is mostly illegible.
Let's go back to the code and make it readable. You will use `JSON.parse` to structure our data because it currently arrives as plain text:
```js
const express = require('express');
const request = require('request-promise');

const app = express();
const PORT = process.env.PORT || 5000;

const apiKey = 'd18449318a367e553f85ef10b1035016';
const baseURL = `http://api.scraperapi.com?api_key=${apiKey}&autoparse=true`;

app.use(express.json());

app.get('/', (req, res) => {
  res.send('Welcome to Amazon Scraper API.');
});

// Get product details
app.get('/products/:productId', async (req, res) => {
  const { productId } = req.params;

  try {
    const response = await request(`${baseURL}&url=https://www.amazon.com/dp/${productId}`);
    // Using parse
    res.json(JSON.parse(response));
  } catch (error) {
    res.json(error);
  }
});

app.listen(PORT, () => console.log(`Server running on port ${PORT}`));
```
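To see what `JSON.parse` is doing here in isolation, consider this small standalone sketch (the raw string below is a made-up stand-in for the scraper's text response):

```javascript
// The scraper returns the body as plain text...
const rawBody = '{"name":"2020 Apple MacBook Air","pricing":"$999.00"}';

// ...and JSON.parse turns that text into a real JavaScript object
const product = JSON.parse(rawBody);

console.log(typeof rawBody); // 'string'
console.log(typeof product); // 'object'
console.log(product.name);   // '2020 Apple MacBook Air'
```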
You can also install the Chrome extension JSON Formatter to make the parsed output easier to read in the browser. You'll notice that it looks a lot better now:
Let's examine what you have now. You have the product `name`, `product_information`, the item weight, product dimensions, language, brand, `pricing`, and even `images`. You can expand them, click on a specific image, and get all the images from the Amazon listing. The beautiful thing about this is that other developers can use this data to create a website showcasing these products.
They can utilize the data to build an Amazon clone or something completely new; this is where APIs come into play. One more time, I want to underline that this is simply my own small API that leverages web scraping. It doesn't have to be an API for scraping; you can make one that could be even better than this.

You can take a different course of action. You can use your own database, serve the responses from there, and then use them. Consider the top 50 APIs: airline data, weather data, football, finance, a love calculator, Hearthstone, currency exchange, news, booking, and email validation. You can use your own algorithms to build joke APIs, movie databases, and more.
You can design an API for anything you can think of and then post it to the RapidAPI Hub. You have now finished creating your first endpoint, but what would an API be with only a single route? So far, you only have one that provides product information.

Now let's create two more endpoints: one that will give us the offers for a specific product, and another that will give us detailed reviews.
To create the second route, you can simply copy the first one, which will get product reviews.
The first step is to change the URL. You still need a `productId`, and the response will change slightly. You also need to add `product-reviews` to the URL. You can write it like this:
```js
const express = require('express');
const request = require('request-promise');

const app = express();
const PORT = process.env.PORT || 5000;

const apiKey = 'd18449318a367e553f85ef10b1035016';
const baseURL = `http://api.scraperapi.com?api_key=${apiKey}&autoparse=true`;

app.use(express.json());

app.get('/', (req, res) => {
  res.send('Welcome to Amazon Scraper API.');
});

// Get product details
app.get('/products/:productId', async (req, res) => {
  const { productId } = req.params;

  try {
    const response = await request(`${baseURL}&url=https://www.amazon.com/dp/${productId}`);
    res.json(JSON.parse(response));
  } catch (error) {
    res.json(error);
  }
});

// Get product reviews
app.get('/products/:productId/reviews', async (req, res) => {
  const { productId } = req.params;

  try {
    const response = await request(`${baseURL}&url=https://www.amazon.com/product-reviews/${productId}`);
    res.json(JSON.parse(response));
  } catch (error) {
    res.json(error);
  }
});

app.listen(PORT, () => console.log(`Server running on port ${PORT}`));
```
Let's check it out by navigating to localhost:5000/products/B08PV4GYCL, and this time add `/reviews` to the end, like localhost:5000/products/B08PV4GYCL/reviews, in your URL bar.
It will take time to process because there are so many reviews to parse. Now you have detailed information about the reviews, such as how many `5-star` ratings there are, how many `4-star` ratings there are, and so on. You can also see the top positive reviews, the number of stars, the review's date, and the entire title and review data. You can even sort it by top positive and top critical, and you can see all the other reviews.
Now onto the third endpoint, which is product offers. Again, copy the previous endpoint and then change it accordingly. You can simply write it down like this:
```js
const express = require('express');
const request = require('request-promise');

const app = express();
const PORT = process.env.PORT || 5000;

const apiKey = 'd18449318a367e553f85ef10b1035016';
const baseURL = `http://api.scraperapi.com?api_key=${apiKey}&autoparse=true`;

app.use(express.json());

app.get('/', (req, res) => {
  res.send('Welcome to Amazon Scraper API.');
});

// Get product details
app.get('/products/:productId', async (req, res) => {
  const { productId } = req.params;

  try {
    const response = await request(`${baseURL}&url=https://www.amazon.com/dp/${productId}`);
    res.json(JSON.parse(response));
  } catch (error) {
    res.json(error);
  }
});

// Get product reviews
app.get('/products/:productId/reviews', async (req, res) => {
  const { productId } = req.params;

  try {
    const response = await request(`${baseURL}&url=https://www.amazon.com/product-reviews/${productId}`);
    res.json(JSON.parse(response));
  } catch (error) {
    res.json(error);
  }
});

// Get product offers
app.get('/products/:productId/offers', async (req, res) => {
  const { productId } = req.params;

  try {
    const response = await request(`${baseURL}&url=https://www.amazon.com/gp/offer-listing/${productId}`);
    res.json(JSON.parse(response));
  } catch (error) {
    res.json(error);
  }
});

app.listen(PORT, () => console.log(`Server running on port ${PORT}`));
```
Let's save it and open it in a browser. You only need to change the smallest detail in the URL: replace `/reviews` with `/offers`, like localhost:5000/products/B08PV4GYCL/offers, and you'll have our data. Unfortunately, our MacBook Air has no current offers, so this section is empty, but that's perfectly fine.
Let's get to the most crucial step: implementing the search query. If you go back to Amazon and search for a MacBook Air, you can see `s?k=macbook+air` at the top of the URL, followed by our term. Let's put that into action.
To implement that request, you can copy the previous one and say something like Get Search Results this time. This route will be slightly different because you no longer have products. You'll do something like this:
```js
const express = require('express');
const request = require('request-promise');

const app = express();
const PORT = process.env.PORT || 5000;

const apiKey = 'd18449318a367e553f85ef10b1035016';
const baseURL = `http://api.scraperapi.com?api_key=${apiKey}&autoparse=true`;

app.use(express.json());

app.get('/', (req, res) => {
  res.send('Welcome to Amazon Scraper API.');
});

// Get product details
app.get('/products/:productId', async (req, res) => {
  const { productId } = req.params;

  try {
    const response = await request(`${baseURL}&url=https://www.amazon.com/dp/${productId}`);
    res.json(JSON.parse(response));
  } catch (error) {
    res.json(error);
  }
});

// Get product reviews
app.get('/products/:productId/reviews', async (req, res) => {
  const { productId } = req.params;

  try {
    const response = await request(`${baseURL}&url=https://www.amazon.com/product-reviews/${productId}`);
    res.json(JSON.parse(response));
  } catch (error) {
    res.json(error);
  }
});

// Get product offers
app.get('/products/:productId/offers', async (req, res) => {
  const { productId } = req.params;

  try {
    const response = await request(`${baseURL}&url=https://www.amazon.com/gp/offer-listing/${productId}`);
    res.json(JSON.parse(response));
  } catch (error) {
    res.json(error);
  }
});

// Get search results
app.get('/search/:searchQuery', async (req, res) => {
  const { searchQuery } = req.params;

  try {
    const response = await request(`${baseURL}&url=https://www.amazon.com/s?k=${searchQuery}`);
    res.json(JSON.parse(response));
  } catch (error) {
    res.json(error);
  }
});

app.listen(PORT, () => console.log(`Server running on port ${PORT}`));
```
This is what the request should look like. You can save it and take a look. You can enter the URL for our newly created endpoint, which will be localhost:5000/search/macbook%20air.
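Note that callers need to URL-encode search terms that contain spaces or other special characters; the `%20` above is an encoded space. In JavaScript, `encodeURIComponent` does this for you:

```javascript
// Spaces and special characters must be percent-encoded in URLs
const searchQuery = 'macbook air';
const encoded = encodeURIComponent(searchQuery);

console.log(encoded); // 'macbook%20air'
console.log(`localhost:5000/search/${encoded}`); // 'localhost:5000/search/macbook%20air'
```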
Our data comes back. You even have an array of ads, so if you want to hide them, you can do so. You could use this API on your own website and simply display the results. Superb.

As you can see, you have a wide range of results. First, there's the 2020 Apple MacBook Air. Then there's the 13-inch model, and so on. Either way, you got all the results related to our search query.
Our four endpoints have all been created, and the API is nearly complete. Before you host this application on Heroku, there is one more thing you need to do, or rather, one more question you need to answer.
Do you want to share your ScraperAPI key with everyone who uses our API? Should you share it, or should everyone provide their own ScraperAPI key? I'll take it out in this case because I don't want it to be publicly visible.
Now that this is commented out, you need to change the `baseURL` because it will no longer be static. So you can create something like `const generateScraperUrl = ()`. That will be a function that accepts an API key, like `const generateScraperUrl = (apiKey)`. Based on that key, it will return a string similar to the one you've had so far.

You can copy our old base URL into it and remove the constant. The URL itself will remain unchanged. However, note that the API key is no longer our own API key; instead, you pass it in when you call the function. Where should you call this function, then? The answer is in every single one of these requests. For instance, you can use `generateScraperUrl(api_key)` instead of the `baseURL`.
But at this point, you may be asking where this API key comes from. Every single one of our requests will now include query parameters. Customers on the RapidAPI marketplace can append their key to the request URL, in the form `/products/:productId?api_key=...`, and that API key will then be available inside `req.query`.
So you can get it right inside the handler with `const { api_key } = req.query`. That way, everybody can enter their own `api_key`. You have to copy that line into the three requests below, and you also need to replace `baseURL` with `generateScraperUrl(api_key)` in all of them. You can see the above discussion in the complete code below:
```js
const express = require('express');
const request = require('request-promise');

const app = express();
const PORT = process.env.PORT || 5000;

// const apiKey = 'd18449318a367e553f85ef10b1035016';
const generateScraperUrl = (apiKey) => `http://api.scraperapi.com?api_key=${apiKey}&autoparse=true`;

app.use(express.json());

app.get('/', (req, res) => {
  res.send('Welcome to Amazon Scraper API.');
});

// Get product details
app.get('/products/:productId', async (req, res) => {
  const { productId } = req.params;
  const { api_key } = req.query;

  try {
    const response = await request(`${generateScraperUrl(api_key)}&url=https://www.amazon.com/dp/${productId}`);
    res.json(JSON.parse(response));
  } catch (error) {
    res.json(error);
  }
});

// Get product reviews
app.get('/products/:productId/reviews', async (req, res) => {
  const { productId } = req.params;
  const { api_key } = req.query;

  try {
    const response = await request(`${generateScraperUrl(api_key)}&url=https://www.amazon.com/product-reviews/${productId}`);
    res.json(JSON.parse(response));
  } catch (error) {
    res.json(error);
  }
});

// Get product offers
app.get('/products/:productId/offers', async (req, res) => {
  const { productId } = req.params;
  const { api_key } = req.query;

  try {
    const response = await request(`${generateScraperUrl(api_key)}&url=https://www.amazon.com/gp/offer-listing/${productId}`);
    res.json(JSON.parse(response));
  } catch (error) {
    res.json(error);
  }
});

// Get search results
app.get('/search/:searchQuery', async (req, res) => {
  const { searchQuery } = req.params;
  const { api_key } = req.query;

  try {
    const response = await request(`${generateScraperUrl(api_key)}&url=https://www.amazon.com/s?k=${searchQuery}`);
    res.json(JSON.parse(response));
  } catch (error) {
    res.json(error);
  }
});

app.listen(PORT, () => console.log(`Server running on port ${PORT}`));
```
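To make the query-parameter flow concrete, here's a small sketch using Node's built-in WHATWG `URL` class; it approximates what Express does when it fills `req.params` and `req.query`:

```javascript
// A request URL as a customer would send it, with their own api_key appended
const incoming = new URL('http://localhost:5000/products/B08PV4GYCL?api_key=abc123');

// The path part is matched against the route '/products/:productId'...
console.log(incoming.pathname); // '/products/B08PV4GYCL'

// ...while everything after the '?' ends up in req.query
console.log(incoming.searchParams.get('api_key')); // 'abc123'
```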
With that, our application is ready to be deployed to Heroku. Let's do that.
You can host your application anywhere, such as AWS or Heroku. In my experience, getting apps up and running on Heroku is extremely simple. So that's what you'll do today.
To begin, go to the top and sign up or log in. After that, you'll be on your dashboard.
So, click Create New App and give it a unique name, such as jsm-amazon-scraper. You can also select a region and then click Create app. If you scroll down, you'll see all of the instructions for how you can deploy your API.
First, you must download and install the Heroku CLI. After you click the link, you'll be greeted with all the download links right inside.
Once you have downloaded it, you can open the project directory in your terminal and run `git init`.
I've opened our apps side by side for easier viewing, and I'm going to stop our server by pressing `Ctrl/Cmd + C` and `Y`. Clear the terminal and then run `git init`. Then, run the following command:
```sh
heroku git:remote -a jsm-amazon-scraper
```
Before you run `git add`, open your file explorer and add a file called `.gitignore`.
You can add `node_modules` to the .gitignore file. This way, it won't be pushed to Heroku, which is precisely what you want. After that, run `git add .`, then `git commit -am "make it better"`, and finally `git push heroku master`, which will build and deploy our server to Heroku.
After that, expand your browser, go to overview, and you should see Build Succeeded. Finally, press the Open App button. It will be different for you, as you can see on our own custom domain jsm-amazon-scraper. You'll notice that our API is now live, which means it's no longer only available on our local host but also on the web.
Let us proceed to the [RapidAPI Hub](https://Rapidapi.com/hub?utm_source=RapidAPI.com/guides&utm_medium=DevRel&utm_campaign=DevRel). You've already built and hosted the API; now it's time to put it on the marketplace.
Not only will I show you how to put it on the marketplace, but I'll also show you how to set pricing tiers and finally start selling it. Again, you do not need to use the Amazon scraper. Create something unique. You now understand the entire procedure. I'll show you the final steps, and then after this guide, you can start creating your own API and posting it right here.
So allow me to demonstrate how to do it. You can access your APIs by clicking the top right corner and clicking My APIs. That brought up the dashboard; as you can see, I already had the Amazon data scraper developed.
Now I'm going to make a new one so I can walk you through the entire process. So I'll go to the top left and click Add New API.
You can give your API a name, such as Amazon Data Scraper, and a brief description. You can also choose a category. This would be eCommerce in this case. Finally, we have reached a critical point: how do we want to specify our API?
You can define the API through RapidAPI's UI, an OpenAPI specification, a Postman Collection, a GraphQL schema, or Kafka. You'll just use the built-in UI in this case, so click Add API. You can enter a lengthy description and upload an image if you wish. You can also enter the URL of the website. Finally, if you have any usage terms, you can enter them. In my case, I'll simply hit the save button.
You're done with the first part. Now you have to add a Base URL.
The base URL is the URL of our API, the application you are hosting on Heroku. You can add that as the base URL and click Save.
You also have to specify four endpoints, so let's create them individually, starting with our Amazon Product Details endpoint.
I recommend that you enter a more detailed description here. In my case, I'm just going to copy what I had in the name. If you have any external documentation links, you can paste them here and then specify the endpoint. In our case, that will be `/products/{productId}`.
A great thing about RapidAPI Hub is that the API you wrote will be accessible through the most popular programming languages. As you can see, it has recognized that you have one path parameter, and you can add an example value. In our case, that was the ID of the MacBook Air, but you can put in any Amazon product ID.
Finally, you can save and test the endpoint.
The response will currently be “Error 401: Unauthorized” because you haven't specified the API key. A query parameter is a quick and easy way to specify it. The name, api_key, must be specified here. Choose STRING as the Type, and then you can choose whether to show your key as the example value. That is not something I would recommend, because it will be visible to everyone.
But because this is a demo, I'm going to leave it here, and I'm going to make it required because people need their API key to use this API.
Test the endpoint again, and you will get the product data successfully. You should also set up an example response: copy the response you received, go to Example Responses, select the response code, which will be 200, and add it.
Finally, you can include the product details response that you copied so that people know what to expect before they begin using the API. Not only that, but you can generate a schema so people can guess what each Amazon product will have before making a request.
Finally, name it Apple MacBook Air product details and save the example response; one endpoint is successfully added.
I won't waste your time by creating the endpoints for all the remaining requests here; the process is the same. Before you go ahead and make our API public, let's check some of the other tabs.
There are also tabs for security, global settings, documentation, announcements, and plans and pricing. That is a significant one. For the time being, you'll try to keep things simple by pricing our API based on the number of requests made. In our case, a call to any endpoint is considered one request, so you can scroll down and add your plans here.
For example, for our basic plan, you can click edit and see that people can make around 1,000 monthly requests. You're going to set a hard limit, which means that if a user goes over, it will simply be blocked. You can click save, and you have a free basic plan.
Let's now add a Pro plan. It'll be similar to before: a monthly request quota, in this case 10,000. Then you can even let users go beyond it at a specified rate; for example, each additional request costs $0.01.
Let's add one more plan, this time for heavy users. Set the monthly quota to 50,000 requests, and charge 1 cent for each extra request above 50,000. Let's save it.
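The overage billing above is easy to sanity-check. Here's a hypothetical sketch of the math (RapidAPI handles the real metering; this function is just an illustration, working in cents to avoid floating-point issues):

```javascript
// Charge only for requests beyond the plan's monthly quota
function overageChargeCents(requests, quota, centsPerRequest) {
  const extraRequests = Math.max(0, requests - quota);
  return extraRequests * centsPerRequest;
}

// 60,000 requests on the 50,000-request plan at 1 cent per extra request:
console.log(overageChargeCents(60000, 50000, 1) / 100); // 100 (dollars)

// Staying within the quota costs nothing extra:
console.log(overageChargeCents(40000, 50000, 1)); // 0
```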
Once you've added all your plans, you can click preview, and there you go. This is what the pricing plan is going to look like. People can come here and subscribe to your API, and RapidAPI Hub will handle the rest.
With that said, you are ready to make our API available to the public. Let's change this from private to public right now. And while this is a demo API, if you're creating your own, make sure it's fully documented, functional, and accessible to hundreds of thousands of developers on the marketplace.
Finally, in the top right corner, select View in Marketplace. Our API was up and running in a matter of seconds. It's right there for people to see. It can be accessed via URL, and people can test it before using it.
On your RapidAPI platform, you can see the endpoints you added. Developers can immediately see what they need to do to test the endpoint. You can also view the results, sample responses, the entire body, and the response schema.
Finally, let's test the Search Results endpoint here. You will look for something different, like a keyboard. Type “keyboard” in the `searchQuery` field and then click Test Endpoint. You are greeted with JSON data on all of the most recent keyboards.
You created an Amazon data scraper API, hosted it, and published it to RapidAPI Hub. So, after working through this guide, come up with an idea and build something of your own; then replicate the process by building a backend server, hosting it on Heroku, and publishing it to RapidAPI Hub.