By Nishu Jain | Updated 2 months ago

Medium API: Get Posts Using Python

Hey guys!

Today, we’ll fetch the posts (articles or stories) written by a user on the Medium platform (https://medium.com), using Python and the Medium API. It’s pretty straightforward and simple …

So without wasting much time, let’s get to the code!

Prerequisites —

  1. Install Python

  2. Install the requests library (pip install requests)

  3. Register on RapidAPI to get YOUR_APIKEY (it won’t take long, trust me!)

Code in Steps —

Import requests library

import requests

Set headers and base_url

headers = {
	"x-rapidapi-host": "YOUR_APIHOST", # 'medium2.p.rapidapi.com'
	"x-rapidapi-key": "YOUR_APIKEY"
}

base_url = 'https://medium2.p.rapidapi.com'

Set username

username = 'nishu-jain'

Get user_id

response = requests.get(base_url + '/user/id_for/' + username, headers=headers)

json_data = response.json()
user_id = json_data['id']
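A small aside (purely a style alternative, not from the original post): Python f-strings can make these endpoint URLs a bit easier to read than '+' concatenation:

```python
base_url = 'https://medium2.p.rapidapi.com'
username = 'nishu-jain'

# same URL as base_url + '/user/id_for/' + username, built with an f-string
id_url = f'{base_url}/user/id_for/{username}'
```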

Get all the articles (article_ids) written by the author

response = requests.get(base_url + '/user/' + user_id + '/articles', headers=headers)

json_data = response.json()
article_ids = json_data['associated_articles']

Get information related to each article

articles = []
for article_id in article_ids:
	 response = requests.get(base_url + '/article/' + article_id, headers=headers)
	 article = response.json()
	 print('[FETCHING]: ', article['title'])
	 articles.append(article)

Done! You’ve successfully fetched all the posts using Python.

print('Total number of articles: ', len(articles), '\n')
print('Details of the first article: \n')
print(articles[0])
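If you’d like to keep the fetched data around between runs, here’s a minimal sketch using only the standard library (save_articles, load_articles, and the articles.json filename are my own names, not part of the API):

```python
import json

def save_articles(articles, path='articles.json'):
    # dump the list of article dicts as pretty-printed UTF-8 JSON
    with open(path, 'w', encoding='utf-8') as f:
        json.dump(articles, f, ensure_ascii=False, indent=2)

def load_articles(path='articles.json'):
    # read the list back for later analysis
    with open(path, encoding='utf-8') as f:
        return json.load(f)
```

Call save_articles(articles) right after the fetch loop, and load_articles() later to analyze the data without re-fetching.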

Output of the first article’s info —

{ 
 'id': '562c5821b5f0',
 'title': 'About Me :) - Nishu Jain',
 'subtitle': 'Who am I and what do I do ...',
 'author': '1985b61817c3',
 'tags': ['about-me', 'nishu-jain', 'bio', 'software-engineer', 'introduction'],
 'publication_id': '*Self-Published*',
 'published_at': '2021-04-17 17:42:10',
 'last_modified_at': '2021-09-13 06:48:43',
 'claps': 52,
 'voters': 2,
 'word_count': 527,
 'reading_time': 2.3720125786164,
 'topics': ['self'],
 'url': 'https://nishu-jain.medium.com/about-me-nishu-jain-562c5821b5f0',
 'image_url': 'https://miro.medium.com/1*4cUFmh4kDyGvf4y-73tBaA.png',
}
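Since each entry in articles is a plain dict with fields like the ones above, simple aggregations are one-liners; for example (sample data taken from the output above):

```python
# a one-element sample shaped like the API output; in the real script,
# `articles` is the list built by the fetch loop
articles = [{'title': 'About Me :) - Nishu Jain', 'claps': 52, 'word_count': 527}]

total_claps = sum(a['claps'] for a in articles)
total_words = sum(a['word_count'] for a in articles)
titles = [a['title'] for a in articles]
```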

Full Code —

import requests

headers = {
  "x-rapidapi-host": "medium2.p.rapidapi.com",
  "x-rapidapi-key": "YOUR_APIKEY"
}
base_url = 'https://medium2.p.rapidapi.com'

username = 'nishu-jain'
#print('Username: ' + username)

response = requests.get(
                    base_url + '/user/id_for/' + username, 
                    headers = headers
)
json_data = response.json()
user_id = json_data['id'] # string
#print('User ID: ' + user_id)

response = requests.get(
                    base_url + '/user/' + user_id + '/articles',         
                    headers = headers
)
json_data = response.json()
article_ids = json_data['associated_articles'] # list of string
#print('Article IDs: \n' + '\n'.join(article_ids))
#print('Number of article ids fetched: ', len(article_ids))

articles = []
for article_id in article_ids:
   response = requests.get(
                       base_url + '/article/' + article_id, 
                       headers=headers
   )
   article = response.json()
   print('[FETCHING]: ', article['title'])
   articles.append(article)

print('\nTotal number of articles: ', len(articles), '\n')
print('Details of the first article: \n')

from pprint import pprint
pprint(articles[0]) # print first article's info
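One thing the script above skips is HTTP error handling: if the API key is wrong or an ID doesn’t exist, response.json() can fail with a confusing message. A minimal sketch (parse_response is a helper name I made up, not part of the requests API) that fails fast on bad status codes:

```python
def parse_response(response):
    # raise_for_status() raises requests.HTTPError on 4xx/5xx responses,
    # so we never try to parse an error page as JSON
    response.raise_for_status()
    return response.json()
```

Wrap each requests.get(...) call with it, e.g. json_data = parse_response(requests.get(base_url + '/user/id_for/' + username, headers=headers)).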

Help & References —

In case you’re stuck somewhere, read the following articles for more information —
Medium APIs — Documentation
Getting started with Medium REST APIs

How To Retrieve Medium Stories of a User Using APIs?
Easiest method for scraping Medium Articles written by any user, using Medium APIs

Or you can contact me; I’d be happy to help …

nishujain1997.19@gmail.com

Thanks for reading and have a nice day!