One of the coolest things about APIs is seeing what you can build when you smash them together. We built a script that identifies an emotion from an image (with Microsoft’s Emotion API), then recommends a musical playlist based on the emotion identified (using Spotify’s Public API). We’re calling it Mood Music after the team universally vetoed “Spot-motion” (boo!).
Check it out!
You can check out the project on GitHub, but here’s the code:

```javascript
const RapidAPI = require('rapidapi-connect');
const rapid = new RapidAPI('EmotionSpotify', '###RapidAPI project ID###');
let open = require('open');
let imageUrl = process.argv[2];

rapid.call('MicrosoftEmotionAPI', 'getEmotionRecognition', {
    // Your Microsoft Emotion API subscription key (see docs: https://rapidapi.com/microsoft-azure-org-microsoft-cognitive-services/api/microsoft-computer-vision3/docs)
    'subscriptionKey': '###Microsoft Emotion Subscription Key###',
    // This is the URL of the facial image to be interpreted
    'image': imageUrl
}).on('success', (payload) => {
    // The Microsoft Emotion API returns a confidence score for happiness, sadness, surprise,
    // anger, fear, contempt, disgust, and neutral. The emotion detected should be interpreted
    // as the emotion with the highest score, as scores are normalized to sum to one.
    // A simple loop finds the emotion detected.
    let scores = payload[0].scores;
    let strongestEmotion = "";
    let emotionScore = 0;
    for (var key in scores) {
        if (scores[key] > emotionScore) {
            emotionScore = scores[key];
            strongestEmotion = key;
        }
    }

    rapid.call('SpotifyPublicAPI', 'searchPlaylists', {
        // strongestEmotion should now equal the emotion detected in the photo
        'query': strongestEmotion,
        'market': '',
        // Results are limited to 1 for simplicity; we just return the top result
        'limit': '1',
        'offset': ''
    }).on('success', (payload) => {
        // A JSON object is returned containing information about the playlist, including the name, URL, and owner.
        // Grab the playlist's URL and open it in the browser using the npm package "open"
        open(payload.playlists.items[0].external_urls.spotify);
    }).on('error', (payload) => {
        console.log("Spotify Playlist Query Error");
    });
}).on('error', (payload) => {
    console.log("Microsoft Emotion Error");
});
```
Here’s how we built it.
Connect to Microsoft Emotion API
How the Microsoft Emotion API Works
First things first, you’ll want to connect to Microsoft’s Emotion API. This API scans faces in images and video for the following emotions: anger, contempt, disgust, fear, happiness, neutral, sadness, and surprise. Why these sentiments? Supposedly, these emotions are expressed universally across cultures.
When you call the API, it will return a confidence score from 0-1 for each emotion. Here’s what happens when the API analyzes the very happy Richard Hendricks (actor Thomas Middleditch) from Silicon Valley.
As you can see in the image above, the API identified “happiness” as the dominant emotion with a score of 0.99. Note: confidence scores always add up to 1. Even if the emotions are more ambiguous than this smiling picture, the API will assign weighted confidence scores across all eight emotions.
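To make the response shape concrete, here’s a minimal mock of what the API hands back: an array with one entry per detected face, each carrying a `scores` object. The score values below are illustrative, not real API output.

```javascript
// Mock of the Emotion API response: one entry per detected face.
// These scores are made up for illustration; only happiness is dominant.
const payload = [{
  scores: {
    anger: 0.001, contempt: 0.001, disgust: 0.001, fear: 0.001,
    happiness: 0.99, neutral: 0.004, sadness: 0.001, surprise: 0.001
  }
}];

// Because the scores are normalized, they should sum to 1.
const scores = payload[0].scores;
const total = Object.values(scores).reduce((sum, s) => sum + s, 0);
console.log(total.toFixed(2)); // → "1.00"
```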
How to Connect to the Microsoft Emotion API
Now that you understand how the API works, let’s make some calls! Here’s what you’ll need to do.
Step 0: Get the Microsoft Emotion subscriptionKey
To make an API call, you’ll first need to sign up for a Microsoft account and get a `subscriptionKey` by doing the following:
- Go to Microsoft’s Emotion API page
- Create a Microsoft account or log into an existing account
- In the Request New Trials list, choose “Emotion – Preview” to create a new subscription
- In the Key section, choose Key1 or Key2 and press “show” or “copy” to see the `subscriptionKey`. These keys are interchangeable, and either can be used for our purposes
The process is pretty straightforward, but there are two places where one might get tripped up.
- Microsoft has multiple cognitive services APIs, including the Computer Vision API (identifies and tags objects in images) and the Face API (isolates faces from images). Each of these APIs needs a separate API key. For this project, be sure to get the API key for Microsoft Emotion API.
- Microsoft may not generate a `subscriptionKey` until you’ve verified your email. To do this, click the “Send email verification” button if it appears, then click the link in the confirmation email message.
Once you’ve got the `subscriptionKey`, you’re ready to call the API.
Step 1: Connect to Microsoft Emotion API through RapidAPI
While you can call the API manually, we’ve decided to use the RapidAPI tool. RapidAPI lets you test an API call, export the code snippet in your preferred language and call multiple APIs from one abstraction layer.
- Go to RapidAPI’s Microsoft Emotion API package page and select the `getEmotionRecognition` endpoint
- Enter your Microsoft Emotion `subscriptionKey` and your image (through an image link or by uploading it directly)
- Click “Test function” to make a call! The API should return an array of confidence scores on emotion detection
Here’s a quick GIF of how the process works.
You can export the code snippet used to call the API if you sign in. In this tutorial, we’ll export the snippet in Node.js; however, you can also export it in Ruby, PHP, Python, Objective-C, Java, or cURL.
Step 2: Build feedback loop
Next, you’ll want to build a feedback loop that grabs the scores from the response, iterates through all the emotions, and chooses the strongest emotion based on the highest score (closest to 1). Even if your picture is emotionally ambiguous, the script will always return an emotion.
We declared three variables: `scores`, `strongestEmotion`, and `emotionScore`.
- `scores`: Holds the emotion scores pulled from the JSON response.
- `strongestEmotion`: An empty string variable that will house the emotion with the highest confidence score.
- `emotionScore`: Variable used to determine `strongestEmotion` through the for loop.
Here’s what the loop looks like with these variables:
```javascript
}).on('success', (payload) => {
    // The Microsoft Emotion API returns a confidence score for happiness, sadness, surprise,
    // anger, fear, contempt, disgust, and neutral. The emotion detected should be interpreted
    // as the emotion with the highest score, as scores are normalized to sum to one.
    // A simple loop finds the emotion detected.
    let scores = payload[0].scores;
    let strongestEmotion = "";
    let emotionScore = 0;
    for (var key in scores) {
        if (scores[key] > emotionScore) {
            emotionScore = scores[key];
            strongestEmotion = key;
        }
    }
```
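If you prefer a functional style, the same selection can be written with `Object.keys` and `reduce`. This sketch stands in a hard-coded `scores` object for the API payload; the values are illustrative only.

```javascript
// Stand-in for payload[0].scores; these values are made up for illustration.
const scores = { happiness: 0.99, neutral: 0.004, sadness: 0.006 };

// reduce keeps whichever key has the higher score, producing the same
// result as the for...in loop.
const strongestEmotion = Object.keys(scores)
  .reduce((best, key) => scores[key] > scores[best] ? key : best);

console.log(strongestEmotion); // → "happiness"
```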
Next, you’ll use the `strongestEmotion` variable to search Spotify playlists.
Connect to the Spotify Public API
The Spotify Public API lets you search Spotify’s giant musical database. The best part? You don’t need a subscription key! Here’s how to get the code snippet from the Spotify API call and incorporate that code into your API smash.
Step 0: Test the API call in RapidAPI and export the code snippet
This step should feel familiar: it’s the same process we used to call Microsoft’s Emotion API. Head over to the Spotify Public API package. Next, pick the `searchPlaylists` endpoint. Make a test call by typing an emotion into the query parameter (ex. “surprise”) and clicking “Test Function.” The whole process literally takes ten seconds; we timed it.
Bam! You’ve just called the Spotify API. Now, copy the code snippet and add it to your script.
Step 1: Add Spotify Public API call to Script
To put it all together, we’ll fill in the query parameter for the Spotify API call with the `strongestEmotion` variable. The code snippet will look like this:
```javascript
rapid.call('SpotifyPublicAPI', 'searchPlaylists', {
    'query': strongestEmotion,
    'market': '',
    // Results are limited to 1 for simplicity; we just return the top result
    'limit': '1',
    'offset': ''
```
The payload returns lots of information about the playlist that we don’t need for this project. Grab the external URL from the top result with the snippet: `payload.playlists.items[0].external_urls.spotify`
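To see that path in context, here’s a stripped-down mock of the relevant slice of the Spotify response. The real payload contains many more fields (names, owners, images), and the URL below is made up for illustration.

```javascript
// Minimal mock of the searchPlaylists response; only the fields we
// actually read are included, and the playlist URL is hypothetical.
const payload = {
  playlists: {
    items: [
      { external_urls: { spotify: 'https://open.spotify.com/playlist/example' } }
    ]
  }
};

// Drill down to the top result's external Spotify URL.
const playlistUrl = payload.playlists.items[0].external_urls.spotify;
console.log(playlistUrl); // → "https://open.spotify.com/playlist/example"
```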
Bonus tip: we added the npm package “open” so the playlist’s external link opens in your browser automatically. Just run `npm install --save open` in your terminal, or visit this link for more info.
Congrats!
That should do it! You’ve just built an API Smash.
You can reference your code with our project on GitHub. If you want to build another API Smash, check out our Text-A-GIF project using GIPHY and Twilio.