Are you building RESTful APIs correctly? In this video, let's create CRUD endpoints to support a job board API using TypeScript, Node/Express.js, RapidAPI Client, and Xata. This is the first video in this series, where we will set up a brand new Node.js project configured with TypeScript, create basic CRUD endpoints backed by a Xata database, and test these endpoints in the RapidAPI Client VS Code extension.
I publish weekly videos about Web Development! I am constantly learning the latest and greatest in Web Development.
In this guide, we will discover how to construct a RESTful API with Node.js, Express.js, TypeScript, RapidAPI, and Xata for the database.
This is the first section of the guide, and it covers the foundations of building RESTful APIs. Everything will be in Node.js, so you need some knowledge of JavaScript to follow along.
On top of that, we will use TypeScript to get typings for everything we do. Along the way, we will use the assistance of two sponsors to help develop this. One is Xata and the other is RapidAPI. To test all of our API endpoints as we build, we will use the RapidAPI Client extension for Visual Studio Code.
Let's have a quick look at Visual Studio Code. I have the RapidAPI Client extension installed. If you don't already have it, you can search for it in the VS Code extension marketplace and install it. You will then see an icon in the sidebar where we can create all of our individual RESTful API requests.
This is going to be vital for our testing, so make sure it is installed before continuing with this guide. Beyond testing endpoints locally, we will also be able to deploy this to RapidAPI Hub in order to generate revenue from the API. You also have the option to save your requests to RapidAPI Studio by signing up for a free RapidAPI account and then navigating to RapidAPI Studio.
We're also using Xata for our database, as we'll need a location to store the data we're working with in our API.
You can sign up for a free account and begin developing. This database is the ideal companion for JAMstack, serverless, JavaScript, and web development projects in general, and it comes with a large number of other unique features, some of which will be explored in this guide.
These are the two sponsors that we will use to build this project. The primary resource you'll need to be familiar with is the GitHub repository; if you fall behind or wish to compare code, you can access this repository via the link provided. Look that over if you need to, and then we'll get started.
Setting up a project with TypeScript and Node.js will be the first task. There are various templates available for this, but we will primarily follow one of them: we will walk through most of that setup and then adapt it to our needs.
So, let's go ahead and start to do the initial setup.
The initial step is creating a directory and opening VS Code in it by running the following commands in the terminal:
sh
mkdir node.js-api-with-rapidapi-typescript-xata
code -r node.js-api-with-rapidapi-typescript-xata
Now that we have a blank project, we need to initialize it as a JavaScript project, which we may do by writing:
sh
npm init -y
It uses the name of our folder as the name of our project and then populates it with some default values.
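For reference, npm init -y generates a package.json roughly like this (the name comes from the folder, so yours may differ slightly):
json
{
  "name": "nodejs-api-with-rapidapi-typescript-xata",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC"
}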
The next step is to add TypeScript as a dev dependency. To install TypeScript:
sh
npm install typescript --save-dev
TypeScript should now appear as a dev dependency in our package.json
file, and a node_modules
directory will be created.
Installing the ambient Node.js types for TypeScript is the next step. Because we are using TypeScript, we also need the typings for the various Node.js APIs we will use. So:
sh
npm install @types/node --save-dev
By creating a tsconfig.json
file, we can now begin configuring how this project is compiled. This will instruct TypeScript on how to translate all of the code we write into JavaScript.
Now, within the tsconfig.json
file, I will copy the configuration from the project and modify it somewhat. This information is also available in the source code.
Moreover, within the tsconfig.json
file, we will enable sourceMap. We have our output directory, outDir, which indicates where our code goes when it is compiled. Because we need to convert the TypeScript to JavaScript, this will be placed in the dist directory.
Then we have rootDir, our root directory, which is where we will store all of our source code: the src/ directory. We want to target es6, a more recent version of JavaScript that browsers should support, and our lib in this case will be esnext.
The moduleResolution is Node and esModuleInterop is true, while module is CommonJS. This setup enables us to use the new import syntax instead of CommonJS require calls; imports can be written using the ECMAScript Modules (ESM) syntax, which is what I prefer. Consequently, this is the setup I will employ.
json
{"compilerOptions": {"sourceMap": true,"outDir": "dist","rootDir": "src/","allowSyntheticDefaultImports": true,"strict": true,"target": "es6","lib": ["esnext"],"moduleResolution": "Node","esModuleInterop": true,"module": "CommonJS"}}
Now we can create our src/index.ts
file by running the following commands in the terminal:
sh
mkdir src
touch src/index.ts
To open the src/index.ts
file, you can simply run:
sh
code src/index.ts
We can write some code such as:
ts
console.log('Hello world!')
To compile our code, we'll need to run the tsc
command using npx
, the Node package executor. tsc
will read the tsconfig.json
in the current directory and apply that configuration to the TypeScript compiler to generate the compiled JavaScript code.
sh
npx tsc
After running this, we can see a dist
directory containing the output index.js
file and an index.js.map
file.
The first task is to set up cold reloading, explained below:
We will now concentrate on setting up cold (or hot) reloading by leveraging ts-node
. It transpiles all of your TypeScript on the fly, essentially the same compilation we perform for our production builds.
For ts-node
, we point it at a single entry file; ts-node
runs that file and compiles every other TypeScript file referenced from it. So, ts-node
and nodemon
should both be added here; nodemon
is what gives us a live-reloading server.
sh
npm install --save-dev ts-node nodemon
Once we set this up, nodemon will watch for changes to our files and re-run ts-node
automatically. Therefore, we are adding a nodemon.json
configuration for this. It tells nodemon to watch
the src
source directory, where all of our TypeScript and JavaScript files will be located.
We won't ignore
anything, and once nodemon notices a change in those files, it will run ts-node
on the index.ts
file, which triggers a build, and our application reloads.
json
{"watch": ["src"],"ext": ".ts,.js","ignore": [],"exec": "npx ts-node ./src/index.ts"}
Following this, I will add a few configurations to our package.json
file. Then, to run the project, we need to execute nodemon
. Add a script for that purpose.
json
"start:dev": "npx nodemon",
By running npm run start:dev
, npx nodemon
will start our app using npx ts-node ./src/index.ts
, watching for changes to .ts
and .js
files from within /src
.
In this phase, we will write a production build command. To accomplish this, we will essentially delete the build directory and then run tsc
again.
Add a build
script in order to clean and compile the project for production. Install rimraf
, a cross-platform utility that functions similarly to the rm -rf
command (just erases whatever you tell it to).
sh
npm install --save-dev rimraf
And then, add this to your package.json
:
json
"build": "rimraf ./build && tsc",
When we run npm run build
, rimraf
will remove our old build
folder before the TypeScript compiler emits new code to dist
.
The production start script can now be added, and it will serve two purposes. It will execute the build
command, which deletes the build
directory and runs tsc
again, and then it will run the node
command on the index.js
file in the build
directory.
The startup script looks like this:
json
"start": "npm run build && node build/index.js"
Our package.json file now looks like this:
json
{"name": "nodejs-api-with-rapidapi-typescript-xata","version": "1.0.0","description": "","main": "index.js","scripts": {"build": "rimraf ./build && tsc","start:dev": "npx nodemon","start": "npm run build && node build/index.js"},"keywords": [],"author": "","license": "ISC","devDependencies": {"@types/node": "^18.11.9","nodemon": "^2.0.20","rimraf": "^3.0.2","ts-node": "^10.9.1","typescript": "^4.8.4"}}
There are a few other things we want to install. We also want the type definitions for Express, which we can install as follows:
sh
npm install @types/express --save-dev
Let's just make sure that the compiled output goes into a build
directory, not a dist
directory, since our npm scripts reference build.
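Concretely, that means changing the outDir value in the tsconfig.json we created earlier:
json
"outDir": "build",
With that in place, we can actually start to run the compiler: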
sh
npx tsc
Additionally, we can add a .gitignore
file. Inside this .gitignore
file, we want to ignore node_modules
, and we'll also want to ignore the build
directory, so the file simply contains those two entries. After that, you could go ahead and write the following command inside the terminal:
sh
git init
Then, you can run the following command to view all of the items we need to commit:
sh
git status
Then you can run the following command to stage everything:
sh
git add *
Afterward we can run the following command to commit the code:
sh
git commit -m "Initial setup"
After that:
sh
git add .gitignore
git commit -m "Added gitignore"
Now, we can run the following command to run the dev server:
sh
npm run start:dev
You'll see that not much happens yet because we haven't written any real application code. So let's open the src/index.ts
file and begin to construct the fundamentals of an Express server.
Let's immediately begin importing the necessary items: express itself, plus Request
and Response
, the types for the request and response objects that we will reference within our endpoints.
ts
import express, {Express, Request, Response} from "express";
We can also install dotenv
as follows to manage our environment variables:
sh
npm install dotenv
This dotenv
package will allow us to use environment variables locally. We will then create a .env
file in which we may define environment variables.
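As a minimal example, the .env file could start out with just a PORT value, since that is the variable our server code will read shortly:
sh
PORT=3000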
Now that the .env
file has been created, we also need to make sure that it is excluded from version control via our .gitignore
file. So let's go ahead and add it there.
To commit that change, you can run the following commands in the terminal:
sh
git add .
git commit -m "Ignoring environment variable files"
We'll import these items from express, then import dotenv
so that we can reference the environment variables, and then we'll initialize the Express app.
ts
import express, { Express, Request, Response } from 'express';
import dotenv from 'dotenv';

const app: Express = express();
Then, we'll say that our port
comes from the PORT environment variable, defaulting to 3000.
ts
import express, { Express, Request, Response } from 'express';
import dotenv from 'dotenv';

const app: Express = express();
const port = process.env.PORT || 3000;
At this point, we would like to launch the server. We'll pass in port
and a callback function, and within it, we'll log server running at port
:
ts
import express, { Express, Request, Response } from 'express';
import dotenv from 'dotenv';

const app: Express = express();
const port = process.env.PORT || 3000;

app.listen(port, () => {
  console.log(`Server running at port ${port}`);
});
Following that, we need to install the express package by writing the following:
sh
npm install express
Now let's run start:dev
to test whether this actually runs. The server should then be running on port 3000.
sh
npm run start:dev
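If everything is wired up correctly, the terminal should print the message from our listen callback, something like:
sh
Server running at port 3000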
In the following section, we will construct four RESTful endpoints, including a GET
, a POST
, a PUT
, and a DELETE
request. Then we'll proceed with integrating it with Xata to pull data from a real database. So let's proceed and stub out these items.
We will begin with an app.get
call and then write the route that will be referenced. We are creating API endpoints for a jobs
API, so the initial endpoint will be a GET at the root of the jobs resource. Then, we have our callback function, which will be executed when this endpoint is hit. It receives req
and res
, and we want those to be of types Request
and Response
respectively.
Here, we begin to use TypeScript to type the parameters we will work with. To verify things are working, we can call res.json()
and return a message, so that we can hit the endpoint and see a response.
ts
import express, { Express, Request, Response } from 'express';
import dotenv from 'dotenv';

const app: Express = express();
const port = process.env.PORT || 3000;

app.get('/api/jobs', (req: Request, res: Response) => {
  res.json({ msg: 'Hello get jobs' });
});

app.listen(port, () => {
  console.log(`Server running at port ${port}`);
});
To begin, we can navigate to http://localhost:3000/api/jobs to see the results. Now, we'll include this in our RapidAPI Client workflow by clicking the icon on the left side of Visual Studio Code. Through RapidAPI Client, we can generate a request, test it, and store it so that we can rerun it for further testing.
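Whichever way you call it, the response at this point should simply be the JSON object returned by our handler:
json
{
  "msg": "Hello get jobs"
}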
After this, I'll integrate my work here with RapidAPI. This will be a personal project, therefore we'll click open and select Add new project. We will label it "Jobs Node.js API" and select "Yes" to sync it. Now, if you have not previously done so, you must ensure that you are logged in on RapidAPI Studio. You will then be able to follow along here.
If I proceed to my RapidAPI Studio account, you will notice that I have my Jobs Node.js API. Clicking on that, you will find that I do not yet have any environments or groups. Therefore, let's create a request.
This will be a GET request to http://localhost:3000/api/jobs. We could call it Get all jobs. We should now be able to test this. When we receive a response, we can synchronize it back to RapidAPI by pressing the sync button.
Now let's proceed to fill out the remaining endpoints. As previously established, we will write the code for POST, PUT, and DELETE in the same manner as GET. In our PUT and DELETE functions, we shall make reference to an additional parameter called a route parameter.
ts
import express, { Express, Request, Response } from 'express';
import dotenv from 'dotenv';

const app: Express = express();
const port = process.env.PORT || 3000;

app.get('/api/jobs', (req: Request, res: Response) => {
  res.json({ msg: 'Hello get jobs' });
});

app.post('/api/jobs', (req: Request, res: Response) => {
  res.json({ msg: 'Hello post job' });
});

app.put('/api/jobs/:id', (req: Request, res: Response) => {
  res.json({ msg: 'Hello put job' });
});

app.delete('/api/jobs/:id', (req: Request, res: Response) => {
  res.json({ msg: 'Hello delete job' });
});

app.listen(port, () => {
  console.log(`Server running at port ${port}`);
});
So these are the four different endpoints that we're going to continue to work with: GET
, POST
, PUT
, and DELETE
. You can also correlate these to the Create
, Read
, Update
, and Delete
(CRUD) operations.
Now we can create a corresponding request. Let's create a new request with the URL http://localhost:3000/api/jobs via POST. We can also add a description here and name it Create new job. Let's test that so we can see "Hello post job".
Let's do another one. This will be a PUT request, the URL will be http://localhost:3000/api/jobs/1, and we'll name it Update Job. Go ahead and send that. You will see the response "Hello put job".
Then we'll also add a DELETE request and repeat the same process with the URL http://localhost:3000/api/jobs/1.
We can now also re-push these to RapidAPI to show that they are being saved; wherever I use this extension, I can pull from this project and refer to it. This will also help us publish the API to RapidAPI later.
Since the endpoints have been set up, we can begin working with our data within Xata. In order to start using the platform, you will need to register for an account, log in, and then set up your workspace.
A workspace is similar to a grouping of databases, within which you will create a database. For this guide, I have created a workspace called James Q. Quick's Workspace. I will add a database, which we will call jqq-job-board-demo.
We might begin from scratch, import sample data, and so on. By clicking the Table option on the left side of the Xata interface, we will add a table. This table's name will be job, and it will contain a number of properties.
Since it is only for demonstration purposes, we will not store a great deal of information in the job table. We'll add a string column called company, another string column named title, a separate string for the job link, and one more string column for the geography, i.e., where this role is located.
One of the things I will do is simply reference a listing from DevRel Careers (devrelcareers.com). I will only pull a single item from this collection. Let's select a role from Deepgram, a developer advocate for machine learning.
Inside of here, I'm going to write the company name Deepgram, the title ML Developer Advocate, the job link "https://devrelcareers.com/category/developer-marketing", and for geography you can write Anywhere (remote).
This visual editor is quick and simple to use with Xata, and it displays our new record. Next, they provide code snippets and lead you through the getting started process, which is actually quite intriguing.
In this step, I'm going to install the Xata CLI. Let's head to the terminal and run the following command to install it globally:
sh
npm install -g @xata.io/cli
Once the Xata CLI is installed globally, we can initialize the project locally. We do this by passing in the URL for the particular database we are working with. So I'm going to copy and paste the following command into the terminal:
sh
xata init --db https://James-Q-Quick-s-workspace-15kbra.us-east-1.xata.sh/db/jqq-job-board-demo
Additionally, we want it to generate TypeScript code for us. Where should this code be placed? It can go in src/xata.ts
. If you have not previously logged in, it will prompt you to do so. Then, we can choose the default development branch and proceed by selecting "none".
It will be all set and it'll show us a few different commands that we can run. Let's go back and look inside of our code. Inside of .xatarc
, you can see a configuration file that basically just has the URL to the database.
json
{
  "databaseURL": "https://James-Q-Quick-s-workspace-l5kbra.us-east-1.xata.sh/db/jqq-job-board-demo",
  "codegen": {
    "output": "src/xata.ts"
  }
}
Next, they generated the xata.ts
source file. This is the code that generates the real database client, which we will subsequently expose via the function getXataClient
. Hence, whenever we need to reference the client, we can invoke this function.
Furthermore, they have also included types based on our data. The generated code describes this table with all of its columns, and based on that it performs some TypeScript inference so that we obtain types such as Job
and JobRecord
.
ts
import { BaseClientOptions, buildClient, SchemaInference, XataRecord } from '@xata.io/client';

const tables = [
  {
    name: 'job',
    columns: [
      { name: 'jobLink', type: 'string' },
      { name: 'companyName', type: 'string' },
      { name: 'jobTitle', type: 'string' },
      { name: 'geography', type: 'string' },
    ],
  },
] as const;

export type SchemaTables = typeof tables;
export type InferredTypes = SchemaInference<SchemaTables>;

export type Job = InferredTypes['job'];
export type JobRecord = Job & XataRecord;

export type DatabaseSchema = {
  job: JobRecord;
};

const DatabaseClient = buildClient();

const defaultOptions = {
  databaseURL: 'https://James-Q-Quick-s-workspace-l5kbra.us-east-1.xata.sh/db/jqq-job-board',
};

export class XataClient extends DatabaseClient<DatabaseSchema> {
  constructor(options?: BaseClientOptions) {
    super({ ...defaultOptions, ...options }, tables);
  }
}

let instance: XataClient | undefined = undefined;

export const getXataClient = () => {
  if (instance) return instance;
  instance = new XataClient();
  return instance;
};
One thing to keep in mind while you run this: the Xata client requires a fetch implementation, which Node.js only includes natively starting with version 18.
This is essentially a Node version management issue. The simplest approach I've found to switch between different versions of Node is nvm, and I'll use version 18 for this project. Input the following command:
sh
nvm use 18
Inside of this index.ts
file, we can now start to import the things we need for Xata: getXataClient
and the Job
type, imported from the ./xata
file.
ts
import express, { Express, Request, Response } from 'express';
import dotenv from 'dotenv';
import { getXataClient, Job } from './xata';

const app: Express = express();
const port = process.env.PORT || 3000;

app.get('/api/jobs', (req: Request, res: Response) => {
  res.json({ msg: 'Hello get jobs' });
});

app.post('/api/jobs', (req: Request, res: Response) => {
  res.json({ msg: 'Hello post job' });
});

app.put('/api/jobs/:id', (req: Request, res: Response) => {
  res.json({ msg: 'Hello put job' });
});

app.delete('/api/jobs/:id', (req: Request, res: Response) => {
  res.json({ msg: 'Hello delete job' });
});

app.listen(port, () => {
  console.log(`Server running at port ${port}`);
});
The code generator can be run again. It is directly connected to our database, so it will generate all of the necessary code, including the necessary types and other information.
sh
xata codegen
In addition, I will add the following dotenv.config() call to ensure that our environment variables are loaded:
ts
import express, { Express, Request, Response } from 'express';
import dotenv from 'dotenv';
import { getXataClient, Job } from './xata';

dotenv.config();

const app: Express = express();
const port = process.env.PORT || 3000;

app.get('/api/jobs', (req: Request, res: Response) => {
  res.json({ msg: 'Hello get jobs' });
});

app.post('/api/jobs', (req: Request, res: Response) => {
  res.json({ msg: 'Hello post job' });
});

app.put('/api/jobs/:id', (req: Request, res: Response) => {
  res.json({ msg: 'Hello put job' });
});

app.delete('/api/jobs/:id', (req: Request, res: Response) => {
  res.json({ msg: 'Hello delete job' });
});

app.listen(port, () => {
  console.log(`Server running at port ${port}`);
});
Xata has generated an API key for this database, which is included in my .env
file.
You could manually create one of these keys, but they took care of it for us, which is rather convenient, and it will also automatically recognize it.
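For reference, the relevant entry in the .env file looks along these lines (the value below is only a placeholder, not a real key):
sh
XATA_API_KEY=<your-generated-api-key>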
As we get down to our endpoints, or right before we do, we want to get an instance of the XataClient
.
ts
import express, { Express, Request, Response } from 'express';
import dotenv from 'dotenv';
import { getXataClient, Job } from './xata';

dotenv.config();

const app: Express = express();
const port = process.env.PORT || 3000;

const xata = getXataClient();

app.get('/api/jobs', (req: Request, res: Response) => {
  res.json({ msg: 'Hello get jobs' });
});

app.post('/api/jobs', (req: Request, res: Response) => {
  res.json({ msg: 'Hello post job' });
});

app.put('/api/jobs/:id', (req: Request, res: Response) => {
  res.json({ msg: 'Hello put job' });
});

app.delete('/api/jobs/:id', (req: Request, res: Response) => {
  res.json({ msg: 'Hello delete job' });
});

app.listen(port, () => {
  console.log(`Server running at port ${port}`);
});
Let's start inside of our GET endpoint. Here, we'll state that jobs
will come back from xata.db.job.getAll()
. This returns to us a promise. Therefore, we'll mark these functions as async
.
Then, after doing that for each of them, we will await
every response we receive. Before we restart our server, we also need to edit the res.json
call so that it returns the jobs
that we just retrieved.
ts
import express, { Express, Request, Response } from 'express';
import dotenv from 'dotenv';
import { getXataClient, Job } from './xata';

dotenv.config();

const app: Express = express();
const port = process.env.PORT || 3000;

const xata = getXataClient();

app.get('/api/jobs', async (req: Request, res: Response) => {
  const jobs = await xata.db.job.getAll();
  res.json(jobs);
});

app.post('/api/jobs', async (req: Request, res: Response) => {
  res.json({ msg: 'Hello post job' });
});

app.put('/api/jobs/:id', async (req: Request, res: Response) => {
  res.json({ msg: 'Hello put job' });
});

app.delete('/api/jobs/:id', async (req: Request, res: Response) => {
  res.json({ msg: 'Hello delete job' });
});

app.listen(port, () => {
  console.log(`Server running at port ${port}`);
});
Now, let’s start our server with the following command:
sh
npm run start:dev
We can now return to VS Code and use our RapidAPI Client. Call Get all jobs and press the send button to actually receive the list of jobs from Xata.
Let's start working on the other endpoints right away. Remember that we haven't added any TypeScript types for our data yet, and we haven't added any error handling either; we'll return to both.
However, let's move on to our next task, which is the POST endpoint. To retrieve the body of a POST request, we need to configure Express with some middleware. Accordingly, we'll call app.use
, which is how Express registers middleware, and pass in the express.json
function, which enables us to reference a POST's body. We'll then retrieve the body from req.body
, take that job
, create a new job
record from it, and return the createdJob
.
ts
import express, { Express, Request, Response } from 'express';
import dotenv from 'dotenv';
import { getXataClient, Job } from './xata';

dotenv.config();

const app: Express = express();
const port = process.env.PORT || 3000;

const xata = getXataClient();

// Parse JSON request bodies so that req.body is populated
app.use(express.json());

app.get('/api/jobs', async (req: Request, res: Response) => {
  const jobs = await xata.db.job.getAll();
  res.json(jobs);
});

app.post('/api/jobs', async (req: Request, res: Response) => {
  const job = req.body;
  const createdJob = await xata.db.job.create(job);
  res.json(createdJob);
});

app.put('/api/jobs/:id', async (req: Request, res: Response) => {
  res.json({ msg: 'Hello put job' });
});

app.delete('/api/jobs/:id', async (req: Request, res: Response) => {
  res.json({ msg: 'Hello delete job' });
});

app.listen(port, () => {
  console.log(`Server running at port ${port}`);
});
Let's test that out and create a new job in our RapidAPI Client on VS Code. We'll need to make sure we fill out the body, so we'll do raw JSON and then we'll need to match the properties that are listed inside of our xata.ts
file. We can write the details as follows:
json
{"company": "Cool Company","title": "Cool Developer Job","jobLink": "https://devrelcareers.com/jobs/752572-ml-developer-avocate-deepgram","geography": "Anywhere (remote)"}
This is included in the request body that we are sending to our endpoint. It ought to take the data and store it in the database. So let's send this off right away.
As you can see, the newly generated record has an ID property, indicating that the creation process was successful. This is also visible inside the dashboard.
We'll move on to the PUT
endpoint now. In this case, we need a way to both reference the id from the route parameter and get the body. We'll take the body from req.body
, and then take the ID from req.params.id
.
Then, we'll create const updatedJob
and await xata.db.job.update
with this information, passing in the id
and the job
. The id
specifies which job
we are updating, and we will then return the updated job
.
ts
import express, { Express, Request, Response } from 'express';
import dotenv from 'dotenv';
import { getXataClient, Job } from './xata';

dotenv.config();

const app: Express = express();
const port = process.env.PORT || 3000;

const xata = getXataClient();

// Parse JSON request bodies so that req.body is populated
app.use(express.json());

app.get('/api/jobs', async (req: Request, res: Response) => {
  const jobs = await xata.db.job.getAll();
  res.json(jobs);
});

app.post('/api/jobs', async (req: Request, res: Response) => {
  const job = req.body;
  const createdJob = await xata.db.job.create(job);
  res.json(createdJob);
});

app.put('/api/jobs/:id', async (req: Request, res: Response) => {
  const id = req.params.id;
  const job = req.body;
  const updatedJob = await xata.db.job.update(id, job);
  res.json(updatedJob);
});

app.delete('/api/jobs/:id', async (req: Request, res: Response) => {
  res.json({ msg: 'Hello delete job' });
});

app.listen(port, () => {
  console.log(`Server running at port ${port}`);
});
We can also test our PUT request: when we return to the RapidAPI extension and select Update Job, we can target the record we just created. The path will be http://localhost:3000/api/jobs/rec-cdld3vapcimtsqhirpg after I have copied the id
into it.
I'll then simply pass in the information to be updated inside the JSON body. In this instance, we'll update the company field to "Super Cool Company Name" and press the send button.
This can be confirmed inside of Xata. The name "Super Cool Company" is there, and it appears to be successful.
We can head over to our Delete request now. We'll take the id
from the req.params
and then call xata.db.job.delete
. The record's id
will then be passed in, and we will await
our call before returning the deletedRecord
.
ts
import express, { Express, Request, Response } from 'express';
import dotenv from 'dotenv';
import { getXataClient, Job } from './xata';

dotenv.config();

const app: Express = express();
const port = process.env.PORT || 3000;

const xata = getXataClient();

// Parse JSON request bodies so that req.body is populated
app.use(express.json());

app.get('/api/jobs', async (req: Request, res: Response) => {
  const jobs = await xata.db.job.getAll();
  res.json(jobs);
});

app.post('/api/jobs', async (req: Request, res: Response) => {
  const job = req.body;
  const createdJob = await xata.db.job.create(job);
  res.json(createdJob);
});

app.put('/api/jobs/:id', async (req: Request, res: Response) => {
  const id = req.params.id;
  const job = req.body;
  const updatedJob = await xata.db.job.update(id, job);
  res.json(updatedJob);
});

app.delete('/api/jobs/:id', async (req: Request, res: Response) => {
  const id = req.params.id;
  const deletedRecord = await xata.db.job.delete(id);
  res.json(deletedRecord);
});

app.listen(port, () => {
  console.log(`Server running at port ${port}`);
});
Let's test this out right away. Let's paste in the same ID from the record we just worked with, turning our delete request URL into http://localhost:3000/api/jobs/rec-cdld3vapcimtsqhirpg.
Nothing inside the body needs to be passed. When we send it, the record indicates that it was removed, which is exactly what we wanted.
Let's sum this up. We have four different endpoints: GET, POST, PUT, and DELETE. With real data inside of Xata, they are all functioning as expected. We still have more work to do on our API's error handling, and we can also bring in some stronger TypeScript typing. I'll save that for another guide, which will be part two, where we'll work on more TypeScript types and then error handling in our API.