In this video I show you how to make a profitable API and sell it on the RapidAPI Hub.
My name is Ania Kubow and I am a Software Developer (ex-Eurostar) who quit the rat race to focus on my channel and teach others how to code by building retro games and awesome projects. I am also part of the core team at freeCodeCamp. You can find me putting out videos there, as well as on my own channel every Wednesday (and sometimes an extra one during the week too)!
In this guide, you will learn how to build a Climate Change API, publish it on RapidAPI, and monetize it.
The guide is tailored for beginners who want to make money with their coding skills. We will build a Climate Change news API using the Express, Axios, and Cheerio packages. I will also show you how to list your API on the largest API hub, RapidAPI, and monetize it.
For this guide, only a basic understanding of JavaScript is required. However, even if your JavaScript is shaky, follow along anyway, because there won't be a lot of code involved. Before we start, ensure you have Node.js installed on your machine.
API stands for Application Programming Interface. APIs allow technologies to talk to each other and are essential to many of the services we rely on. They grab and shape information and pass it from one technology to another.
As a developer, you could use TikTok's APIs to get a live TikTok feed onto your website, or use an API as a two-way stream to get, post, or delete data from, say, a movie database. To summarize, APIs are everywhere today.
After building our API, we will list it on the RapidAPI Hub. As a developer, when you launch your API on the RapidAPI platform, you can essentially sell access to what you have made. This access comes in different plans you can choose from, allowing you to control how you monetize what you have built.
As it is the largest hub for APIs out there at this moment in time, the footfall will be in our favor, meaning that you could take your API idea from a simple form of passive income to a full-blown startup, depending on how much time you want to dedicate to it.
What are we waiting for? Let's do it.
Since we will be using the RapidAPI platform, let us go ahead and sign up. As I want to create my own API, I will click on "My APIs" and leave it at that for now.
For this guide, I have decided to build an API that would tell us climate change news from various publications worldwide. If you want to go for any other topic, like crypto, that is entirely up to you.
I'm going to start off by creating a blank project using WebStorm. Feel free to use whatever code editor you wish and create an empty directory. So just like this, we can start entirely from scratch.
As a general rule, any project that uses Node.js will need a package.json file. To create the file, run the command npm init in your terminal, making sure you are in the directory we just created. The command will spin up the file for us, and we will be prompted to answer a few questions like the name, version, and description of our package. It will also ask us to choose the entry point, which we will keep as index.js, and leave the rest of the questions blank.
As you can see, a package.json file has been created in the climate-change-api directory with all the keys we were asked to fill out. Within this file, we will also see all our installed packages.
We will install the Cheerio, Express, and Axios packages for this guide. Run the following commands to install them:
npm i cheerio
npm i express
npm i axios
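Once the installs finish, the dependencies section of package.json should list all three packages, roughly like this (the version numbers here are illustrative and will vary on your machine):

```json
"dependencies": {
  "axios": "^1.4.0",
  "cheerio": "^1.0.0",
  "express": "^4.18.2"
}
```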
Within our Climate Change API directory, we will create a new JavaScript file and call it index.js. This will be our server, and we will write our code in it.
We will start the coding by defining the port on which we will open our server. I am going to define a PORT variable and set it to 8000, but this can be whatever you wish.
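In index.js, that is just one line at the top of the file:

```js
// The port our Express server will listen on
const PORT = 8000;
```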
To use our packages in the backend, we will type the standard syntax for each of our three packages.
```js
const express = require('express');
const axios = require('axios');
const cheerio = require('cheerio');
```
To initiate Express, we will call the express function and save the result in an app variable. Then we will make app listen on our port to run the server.
```js
const app = express();

app.listen(PORT, () => console.log(`server running on PORT ${PORT}`));
```
I'll also need to add a start command to the scripts section of the package.json file, using nodemon to listen for any changes made to the index.js file.
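For reference, here is a minimal sketch of what that scripts section can look like, assuming nodemon is available (we install it later with npm i nodemon):

```json
{
  "scripts": {
    "start": "nodemon index.js"
  }
}
```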
To run our backend, I'll type the command npm run start in our terminal.
We will make Express, i.e., app, listen to the homepage path and create a response.
```js
app.get('/', (req, res) => {
  res.json('Welcome to my Climate Change News API');
});
```
Visit http://localhost:8000 in your browser, and you can see it is working perfectly.
Now let us start scraping data from a news source, let's say, The Guardian. We will use Axios and Cheerio for this. First, I'll use Axios to grab all the HTML from the Guardian webpage. Then, using Cheerio, we will look for elements: specifically, all the `<a>` tags containing the word climate. We will grab the text of each of those tags and save it as the title, and grab the href of each tag and save that as the URL.
```js
app.get('/news', (req, res) => {
  const articles = [];

  // Grab all the HTML from the Guardian's climate-crisis page
  axios
    .get('https://www.theguardian.com/environment/climate-crisis')
    .then(response => {
      const html = response.data;
      const $ = cheerio.load(html);

      // Find every <a> tag whose text contains "climate"
      $('a:contains("climate")', html).each(function () {
        const title = $(this).text();
        const url = $(this).attr('href');
        articles.push({ title, url });
      });

      res.json(articles);
    })
    .catch(err => console.log(err));
});
```
Great! We have scraped the Guardian webpage. You can install a JSON Viewer extension in your browser to make the output readable. With the extension, it looks great.
So every time we found an `<a>` tag containing the word climate, we created a title and URL from its text and href, leaving us with an array full of titles and URLs from the Guardian. How cool is that!
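For reference, each item in the response has this shape (the values here are schematic placeholders; the real titles and URLs depend on what is on the page that day):

```json
[
  {
    "title": "<text of the matching a tag>",
    "url": "<href of the matching a tag>"
  }
]
```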
Now that we have successfully scraped one website, let's scrape climate information from multiple sources to make our API more effective. I'll start doing this by creating an array of the newspapers I want to scrape. I will keep three, but you can add as many as you want.
```js
const newspapers = [
  {
    name: 'cityam',
    address: 'https://www.cityam.com/london-must-become-a-world-leader-on-climate-change-action/',
  },
  {
    name: 'thetimes',
    address: 'https://www.thetimes.co.uk/environment/climate-change',
  },
  {
    name: 'guardian',
    address: 'https://www.theguardian.com/environment/climate-crisis',
  },
];
```
Then I'll loop over those newspapers, running essentially the same scraping code for each, as follows:
```js
const articles = [];

newspapers.forEach(newspaper => {
  axios.get(newspaper.address).then(response => {
    const html = response.data;
    const $ = cheerio.load(html);

    $('a:contains("climate")', html).each(function () {
      const title = $(this).text();
      const url = $(this).attr('href');
      articles.push({ title, url, source: newspaper.name });
    });
  });
});

app.get('/', (req, res) => {
  res.json('Welcome to my Climate Change News API');
});

app.get('/news', (req, res) => {
  res.json(articles);
});
```
Perfect! You can see we are now getting data from all three sources. However, the URLs from the Telegraph, which I'm now swapping into our list of newspapers, are incomplete: they are relative links that lack a base, as seen in the page's source code.
This issue can be solved simply by adding a base property to each newspaper and prepending it to the Telegraph's URLs.
```js
const newspapers = [
  {
    name: 'thetimes',
    address: 'https://www.thetimes.co.uk/environment/climate-change',
    base: '',
  },
  {
    name: 'guardian',
    address: 'https://www.theguardian.com/environment/climate-crisis',
    base: '',
  },
  {
    name: 'telegraph',
    address: 'https://www.telegraph.co.uk/climate-change',
    base: 'https://www.telegraph.co.uk',
  },
];

const articles = [];

newspapers.forEach(newspaper => {
  axios.get(newspaper.address).then(response => {
    const html = response.data;
    const $ = cheerio.load(html);

    $('a:contains("climate")', html).each(function () {
      const title = $(this).text();
      const url = $(this).attr('href');
      articles.push({
        title,
        url: newspaper.base + url,
        source: newspaper.name,
      });
    });
  });
});
```
Now I want my API to be able to get data from each news source individually. This can be done by adding a new app.get endpoint that takes the newspaper's name as a route parameter.
```js
const PORT = 8000;
const express = require('express');
const axios = require('axios');
const cheerio = require('cheerio');

const app = express();

const newspapers = [
  {
    name: 'thetimes',
    address: 'https://www.thetimes.co.uk/environment/climate-change',
    base: '',
  },
  {
    name: 'guardian',
    address: 'https://www.theguardian.com/environment/climate-crisis',
    base: '',
  },
  {
    name: 'telegraph',
    address: 'https://www.telegraph.co.uk/climate-change',
    base: 'https://www.telegraph.co.uk',
  },
];

const articles = [];

// Scrape every newspaper up front and collect the results in one array
newspapers.forEach(newspaper => {
  axios.get(newspaper.address).then(response => {
    const html = response.data;
    const $ = cheerio.load(html);

    $('a:contains("climate")', html).each(function () {
      const title = $(this).text();
      const url = $(this).attr('href');
      articles.push({
        title,
        url: newspaper.base + url,
        source: newspaper.name,
      });
    });
  });
});

app.get('/', (req, res) => {
  res.json('Welcome to my Climate Change News API');
});

app.get('/news', (req, res) => {
  res.json(articles);
});

// Articles from a single source, e.g. /news/guardian
app.get('/news/:newspaperId', (req, res) => {
  const newspaperId = req.params.newspaperId;

  // Look up the address and base for the requested newspaper
  const newspaperAddress = newspapers.filter(newspaper => newspaper.name == newspaperId)[0].address;
  const newspaperBase = newspapers.filter(newspaper => newspaper.name == newspaperId)[0].base;

  axios
    .get(newspaperAddress)
    .then(response => {
      const html = response.data;
      const $ = cheerio.load(html);
      const specificArticles = [];

      $('a:contains("climate")', html).each(function () {
        const title = $(this).text();
        const url = $(this).attr('href');
        specificArticles.push({
          title,
          url: newspaperBase + url,
          source: newspaperId,
        });
      });

      res.json(specificArticles);
    })
    .catch(err => console.log(err));
});

app.listen(PORT, () => console.log(`server running on PORT ${PORT}`));
```
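With the server running, http://localhost:8000/news/guardian should now return only the Guardian's articles, http://localhost:8000/news/telegraph only the Telegraph's, and http://localhost:8000/news everything at once.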
To prepare our API for deployment on Heroku, we will need to run the command npm i nodemon. We will also have to change the port setup so the app uses the port Heroku assigns when deployed, and falls back to port 8000 when running locally.
```js
const PORT = process.env.PORT || 8000;
```
After that, we will sign up on Heroku and create a new app.
Once you create it, you will see setup instructions on the Heroku page. Follow all those instructions, running the commands one after another.
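As a rough guide, the commands Heroku shows typically look something like the following; treat this as a sketch, since the exact app name and branch come from your own Heroku page:

```
heroku login
git init
heroku git:remote -a your-app-name
git add .
git commit -m "deploy climate change api"
git push heroku master
```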
Once done, it will deploy our app on Heroku. Copy the URL of your Heroku deployment.
We'll go back to our RapidAPI dashboard, give our API a brief description, and add the Heroku URL we just copied as the base URL.
Next, I will add a couple of endpoints by clicking on "REST Endpoint".
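For example, I'll mirror the routes we built: a GET /news endpoint that returns articles from every source, and a GET /news/{newspaperId} endpoint that returns articles from a single source.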
After that, I'll choose the pricing for my API. Here I have multiple plans to choose from, starting from basic to mega.
Lastly, we'll make our API public, and that's it! Our Climate Change API is live on RapidAPI.
Following the same steps stated in this guide, you can build your own APIs and make money by selling them on the RapidAPI platform. Happy building!