Python API Tutorials

How To Capture & Record Search Results with the Google Search API

Is There a Google Search API?

Google’s dominance of the search market is so complete that people often wonder how to scrape data from Google search results. While scraping violates Google’s terms of use, Google does provide a legitimate alternative for capturing search results. If you find yourself asking, “Is there a Google Search API?” the answer is a resounding YES.

Overview of Google Search API

With the Google Search API, you can programmatically invoke Google Search and capture search results. You may be wondering why you would want to do that. If you work in SEO or a related field, keeping tabs on the top Google results for a particular keyword is part of your day-to-day job, whether you are monitoring Google Search Console, optimizing for the knowledge graph, or analyzing search queries. Monitoring the search listing helps you keep track of the popular links for a keyword and follow changes in the search rankings.

In this blog post, we are going to harness the power of this API using Python. We will write a utility Python script that creates a custom SERP (Search Engine Results Page) log for a given keyword.

How to Get Access to the Google Search API

1. Sign Up for a RapidAPI Account

To begin using the Google Search API, you’ll first need to sign up for a free RapidAPI developer account. With this account, you get a universal API Key to access all APIs hosted in RapidAPI.

RapidAPI is the world’s largest API marketplace, with over 10,000 APIs and a community of over 1,000,000 developers. Again, it is free to sign up for this account.

2. Navigate to the Google Search API Console

You can search for “Google Search API,” or alternatively, you can access the API Console directly.

3. Subscribe to Google Search API

Once you are in the API console, click on the “Pricing” tab to take a look at the subscription tiers available for Google Search API.

Google Search API has a freemium subscription model. Under the BASIC plan, you get 100 API calls per day for free. Subscribe to this plan, and you are all set to explore the API further.

How to use the Google Search API

The Google Search API is quite straightforward. It has two endpoints, each accepting its own variant of input parameters while returning the same search data.

GET get search

The “GET get search” endpoint takes the search string as input and returns the search results as a JSON array.
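
As a rough sketch, you could call this endpoint with the Requests library as shown below. Note that the query parameter name ("q") and the exact endpoint path are assumptions here; check the API console for the real values.

import requests

# Hypothetical sketch of the GET variant. The parameter name "q" and the
# endpoint path are assumptions; consult the API console for the real ones.
headers = {
    'x-rapidapi-host': "<YOUR_RAPIDAPI_HOST>",
    'x-rapidapi-key': "<YOUR_RAPIDAPI_KEY>"
  }

response = requests.get("https://google-search3.p.rapidapi.com/api/v1/search", headers=headers, params={"q": "API Marketplace"})

print(response.json())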

POST post search

The “POST post search” endpoint takes in a JSON object as input and returns the search results in a JSON array. The JSON object contains the search string as well as a number that caps the maximum results.

Testing the Google Search API

Let’s test one of the endpoints to get a glimpse of the search results returned by this API.

Select the “POST post search” endpoint in the API console and pass the JSON object shown below. In this case, we are searching for “API Marketplace,” and the results are limited to 100.
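
{
  "q": "API Marketplace",
  "max_results": 100
}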

Trigger the API, and you should see a long array of results containing the title and link of each search result.
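
The exact field names and structure are determined by the API, but each entry carries at least the title and link that our script will consume. A purely illustrative sketch of the shape:

[
  {
    "title": "Example Result Title",
    "link": "https://example.com/page"
  },
  ...
]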

Building a SERP Log Script using Python

Now it’s time to build a Python script that leverages the Google Search API to collect search engine results page (SERP) listings.

With this script, you can keep tabs on the search rankings for the keywords of your choice. Every time you run the script with a keyword passed as an argument, it generates an HTML file containing the top ten search listings for that keyword. This lets you automate the monitoring of search rankings without manual effort.

Follow the steps below to install the dependencies and build the code for this script.

Prerequisites

The script requires a Python 3 environment and uses two external libraries. Make sure that both libraries are installed within your local Python 3 runtime.

1. Mako Templating Engine

The Mako templating engine is a Python-based templating library for generating HTML pages. The script uses it to render the final SERP listing page.

Install Mako using the pip command.

pip install Mako

2. Requests

Requests is a popular Python library for performing HTTP API calls. This library is used in the script to invoke the Google Search API with your RapidAPI credentials.

Install Requests using the pip command.

pip install requests

The SERP Log Script Code

Now that you are aware of the prerequisites for running the script, let’s write the Python script.

The Python script can be split into four parts.

  1. Imports and Declarations
  2. Trigger Google Search API (using requests)
  3. Generate SERP Log HTML File (using Mako)
  4. User Input Processing and Output Display

1. Imports and Declarations

First, we import the Python modules used in this script, including the Mako and Requests libraries. This is also where we define your RapidAPI credentials.

import sys
import json
import datetime
import requests

from mako.template import Template


RAPIDAPI_HOST = "<YOUR_RAPIDAPI_HOST>"
RAPIDAPI_KEY  = "<YOUR_RAPIDAPI_KEY>"
RAPIDAPI_URL  = "https://google-search3.p.rapidapi.com/api/v1/search"

Note: The placeholders <YOUR_RAPIDAPI_HOST> and <YOUR_RAPIDAPI_KEY> must be replaced with the actual values obtained from the API console while you are logged in to your RapidAPI account.

2. Trigger Google Search API (using requests)

Earlier, we tested the “POST post search” endpoint manually. With the requests module, you can invoke the API programmatically.

We define a separate function that takes in the keyword as an argument, builds the JSON input, and triggers the API endpoint with it.

def trigger_api(search_keyword):

  # Build the JSON payload with the search string and a cap on the results.
  payload = json.dumps({"q": search_keyword, "max_results": 10})

  headers = {
      'x-rapidapi-host': RAPIDAPI_HOST,
      'x-rapidapi-key': RAPIDAPI_KEY,
      'content-type': "application/json",
      'accept': "application/json"
    }

  # Invoke the "POST post search" endpoint.
  response = requests.request("POST", RAPIDAPI_URL, data=payload, headers=headers)

  if response.status_code == 200:
    return json.loads(response.text)
  else:
    return None

The API response is parsed from JSON into a native Python data structure.
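
For a quick sanity check, you can call the function directly and pretty-print whatever comes back. This snippet is only for inspection and is not part of the final script.

# Optional inspection snippet: pretty-print the parsed API response.
parsed = trigger_api("api marketplace")
print(json.dumps(parsed, indent=2) if parsed else "API call failed")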

3. Generate SERP Log HTML File (using Mako)

This step is required to convert the API response into HTML. We define another function that creates a new HTML file with a timestamp and passes the API response to a Mako template file to generate the HTML output.

def generate_serp(search_keyword, api_data):

  search_timestamp = datetime.datetime.now()

  date = search_timestamp.strftime("%m-%d-%Y")
  time = search_timestamp.strftime("%H%M%S")

  # Build a timestamped file name for this SERP log.
  search_filename = "SERP_Log_" + date + "_" + time + ".html"

  serp_template = Template(filename='search_result_template.html', input_encoding='utf-8', output_encoding='utf-8', encoding_errors='replace')

  try:

    # Render the template and write the HTML bytes to the file.
    # The 'with' block closes the file automatically.
    with open(search_filename, 'wb+') as fd:
      fd.write(serp_template.render(Search_String=search_keyword, Date=date, Time=search_timestamp.strftime("%H:%M:%S"), result_list=api_data))

  except IOError:

    print("Error In Opening File For Writing SERP Report")
    raise

  return search_filename

The Mako template file ‘search_result_template.html’ does the work of converting the JSON API response to HTML when rendered inside generate_serp().

<!DOCTYPE html>

<html>

<head>
  <title>Your Search Results for ${Search_String}</title>
</head>
<body>
  <h1>Top 10 Search Results for ${Search_String}</h1>
  <h2>Date : ${Date} , Time : ${Time}</h2>
  <div>
    % for result in result_list[:10]:

    <h3>Search Result #${loop.index+1} : <a href='${result["link"]}' target="_blank">${result["title"]}</a></h3>
    % endfor
  </div>
</body>
</html>

As you can see, the Mako template allows you to access variables in a Pythonic way to generate HTML elements.
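
To see the idea in isolation, here is a minimal, self-contained Mako sketch (not part of the script itself) showing the same substitution and loop syntax:

from mako.template import Template

# Minimal demo: ${...} expressions are substituted and % for blocks are
# executed at render time, just like in the SERP template above.
demo = Template("""<ul>
% for item in items:
  <li>Item #${loop.index + 1}: ${item}</li>
% endfor
</ul>""")

print(demo.render(items=["first", "second"]))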

4. User Input Processing and Output Display

This part is the main routine of the script. It first checks the user input supplied as arguments to the script, then invokes trigger_api() and generate_serp() to produce the SERP result HTML file.

if __name__ == "__main__":

  # Treat all command-line arguments as a single search string.
  search_string = ' '.join(sys.argv[1:])

  if len(search_string) > 2:

    try:
      print("Generating SERP Report For " + search_string)

      data = trigger_api(search_string)

      if data:
        results_file = generate_serp(search_string, data)
        print("SERP Report Generated, File : " + results_file)
      else:
        print("Error in Triggering Google Search API")

    except Exception:
      print("Cannot Generate SERP Report")

  else:
    print("Search String Not Long Enough")

SERP Log Script in Action

Save the Python code as ‘serp_generator.py’ and make sure that this file, along with the Mako template saved as ‘search_result_template.html’, resides in the same directory.

Now, assuming that you have all the prerequisites installed under your Python3 runtime environment, you can invoke the script as follows.

python serp_generator.py api marketplace

We are invoking it to generate the SERP listing for the same keyword “api marketplace” that we tested manually using the endpoint earlier.

Here is what the generated output file contains when opened in a browser tab.

The generated file is created in the same directory, and its name carries the date and time stamp of the exact moment of its creation (for example, a run at 14:30:55 on June 21, 2020 would produce SERP_Log_06-21-2020_143055.html).

Web Search at Scale

You just discovered a way to perform web search at scale!

Whatever your end goal is, the SERP Log script can be run thousands of times to generate many SERP listings for you. This heap of SERP logs becomes a treasure trove of data for collecting search results and finding the latest and most popular websites for a given topic.
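
For instance, a small batch wrapper along the lines sketched below could generate a report for every keyword on a watch list. It assumes the script’s functions are importable from serp_generator.py, which they are because the script guards its main routine with __name__ == "__main__".

# Hypothetical batch wrapper: imports the functions defined in
# serp_generator.py and generates one SERP report per keyword.
from serp_generator import trigger_api, generate_serp

keywords = ["api marketplace", "web search api", "serp monitoring"]

for keyword in keywords:
  data = trigger_api(keyword)
  if data:
    print("SERP Report Generated, File : " + generate_serp(keyword, data))
  else:
    print("Error in Triggering Google Search API for " + keyword)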

So go ahead and immerse yourself in the sea of search result analytics until we are back with yet another interesting demo of an API hosted on RapidAPI.
