How to Build a Full Stack Next.js, FastAPI, PostgreSQL Boilerplate Tutorial
In this tutorial, I will show you how to set up a basic full stack boilerplate with Next.js, FastAPI, and PostgreSQL. Each framework and library was handpicked for its amazing DX (Developer Experience). This is how I would build a web app today if I were to start from scratch.
This tutorial was developed on a MacBook, so some tools and commands may differ depending on your OS.
First, if you don’t already have Node.js, Python, and PostgreSQL installed, I recommend using version managers (such as nvm for Node and pyenv for Python) to install and manage different language and database versions on your machine.
This tutorial is using:
- Node 16.13.0
- Python 3.9.6
- PostgreSQL 14
The branch for this tutorial can be found here:
https://github.com/travisluong/nfp-boilerplate/tree/tutorial-1-how-to-build-nfp-boilerplate
The complete project can be found here:
https://github.com/travisluong/nfp-boilerplate
Initial Project Setup
Let’s create a directory to put all of our code. In a real project, you may want to separate your front-end and back-end repositories, but for the convenience of this tutorial, we’ll throw everything into one repo.
$ mkdir nfp-boilerplate
$ cd nfp-boilerplate
Next, let’s get the back end running. First, start by creating a directory for the FastAPI application in the project root.
$ mkdir nfp-backend
$ cd nfp-backend
Create and activate the python virtual environment.
$ python -m venv venv
$ . venv/bin/activate
Install FastAPI and the other dependencies.
$ pip install fastapi "uvicorn[standard]" gunicorn psycopg2 sqlalchemy alembic "databases[postgresql]" python-dotenv
Here is a quick description of each package.
- fastapi – web framework
- uvicorn – ASGI server
- gunicorn – WSGI server, commonly used as a process manager for uvicorn workers in production
- psycopg2 – PostgreSQL driver
- sqlalchemy – Python SQL toolkit and object-relational mapper
- databases – asyncio support for databases
- alembic – database migration tool
- python-dotenv – loads environment variables from a .env file
Freeze the requirements.
$ pip freeze > requirements.txt
Database Setup – PostgreSQL
You should have PostgreSQL installed locally. If not, then I’d recommend Postgres.app if you’re on a Mac.
Create the dev database.
$ createdb nfp_boilerplate_dev
Create a user. The -P flag will issue a prompt for the password of the new user.
$ createuser nfp_boilerplate_user -P
Once you have the database and user created, you are pretty much ready to start developing. I’d recommend downloading a database client such as TablePlus, Postico, or pgAdmin. It’ll make life a bit easier, but for this tutorial, I’ll stick with the command line psql client for ease of writing.
Migration Tool Setup – Alembic
In the nfp-backend directory, initialize alembic.
$ alembic init alembic
In alembic.ini, find the line with this text: sqlalchemy.url = driver://user:pass@localhost/dbname. Replace it with:
sqlalchemy.url = postgresql://nfp_boilerplate_user:password@localhost/nfp_boilerplate_dev
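One caveat: if your database password contains characters that are special in URLs (such as @, :, or /), they must be percent-encoded or the connection string will be parsed incorrectly. Here is a small sketch using only Python's standard library; the password shown is hypothetical.

```python
from urllib.parse import quote_plus

# Hypothetical password containing URL-special characters
password = "p@ss:word/1"

url = f"postgresql://nfp_boilerplate_user:{quote_plus(password)}@localhost/nfp_boilerplate_dev"
print(url)
# -> postgresql://nfp_boilerplate_user:p%40ss%3Aword%2F1@localhost/nfp_boilerplate_dev
```

If your password is plain alphanumeric text, you can skip this and paste it into alembic.ini directly.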
Generate the first migration file.
$ alembic revision -m "create notes table"
This should have created a file that looks similar to this: df0d975d6fc2_create_notes_table.py. This is a migration file where we can define the changes that we want to make to our database.
Add the migration code to the upgrade and downgrade methods.
"""create notes table
Revision ID: df0d975d6fc2
Revises:
Create Date: 2021-11-30 23:54:45.835230
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = 'df0d975d6fc2'
down_revision = None
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"notes",
sa.Column("id", sa.Integer, primary_key=True),
sa.Column("text", sa.String),
sa.Column("completed", sa.Boolean)
)
def downgrade():
op.drop_table("notes")
Preview the SQL that will be run by the migration.
$ alembic upgrade head --sql
You should see something like this.
BEGIN;

CREATE TABLE alembic_version (
    version_num VARCHAR(32) NOT NULL,
    CONSTRAINT alembic_version_pkc PRIMARY KEY (version_num)
);

INFO  [alembic.runtime.migration] Running upgrade  -> df0d975d6fc2, create notes table
-- Running upgrade  -> df0d975d6fc2

CREATE TABLE notes (
    id SERIAL NOT NULL,
    text VARCHAR,
    completed BOOLEAN,
    PRIMARY KEY (id)
);

INSERT INTO alembic_version (version_num) VALUES ('df0d975d6fc2') RETURNING alembic_version.version_num;

COMMIT;
We are creating a record in the alembic_version table to keep track of the migration status. And we are creating the notes table which we defined in the migration file. Also, note that it’s all happening within a transaction.
To actually run the migration, run the same command without the --sql flag:
$ alembic upgrade head
This will run all migrations that haven’t been run yet.
Log into the DB using psql.
$ psql nfp_boilerplate_dev
Run the \dt command to show the tables. You should see the newly created notes table along with the alembic_version table.
To exit out of psql, run \q.
To revert one migration, run:
$ alembic downgrade -1
To run one migration, run:
$ alembic upgrade +1
You can change the number to run or revert multiple migrations.
Back End Development – FastAPI
Now that we have the database and migration set up, let’s start on the back-end API development.
Create a main.py file.
from fastapi import FastAPI

app = FastAPI()


@app.get("/")
def read_root():
    return {"Hello": "World"}
When starting a new project, I always like a simple hello world as a sanity check.
Run the backend server.
$ uvicorn main:app --reload
Go to http://localhost:8000 and you should see {"Hello":"World"}.
Create a .gitignore in nfp-boilerplate/nfp-backend.
__pycache__
venv
.env
Create a README.md. Tip: it’s a good practice to have a readme for each project.
# nfp-backend
Setup:
python -m venv venv
. venv/bin/activate
pip install -r requirements.txt
Run the development server:
uvicorn main:app --reload
Change the main.py file to the following:
import os

import databases
import sqlalchemy
from typing import List
from fastapi import FastAPI
from pydantic import BaseModel
from dotenv import load_dotenv
from fastapi.middleware.cors import CORSMiddleware

load_dotenv()

DATABASE_URL = os.getenv("DATABASE_URL")

database = databases.Database(DATABASE_URL)

metadata = sqlalchemy.MetaData()

notes = sqlalchemy.Table(
    "notes",
    metadata,
    sqlalchemy.Column("id", sqlalchemy.Integer, primary_key=True),
    sqlalchemy.Column("text", sqlalchemy.String),
    sqlalchemy.Column("completed", sqlalchemy.Boolean),
)

engine = sqlalchemy.create_engine(DATABASE_URL)
# metadata.create_all(engine)


class NoteIn(BaseModel):
    text: str
    completed: bool


class Note(BaseModel):
    id: int
    text: str
    completed: bool


app = FastAPI()

origins = [
    "http://localhost",
    "http://localhost:8080",
    "http://localhost:3000",
]

app.add_middleware(
    CORSMiddleware,
    allow_origins=origins,
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)


@app.on_event("startup")
async def startup():
    await database.connect()


@app.on_event("shutdown")
async def shutdown():
    await database.disconnect()


@app.get("/notes/", response_model=List[Note])
async def read_notes():
    query = notes.select()
    return await database.fetch_all(query)


@app.post("/notes/", response_model=Note)
async def create_note(note: NoteIn):
    query = notes.insert().values(text=note.text, completed=note.completed)
    last_record_id = await database.execute(query)
    return {**note.dict(), "id": last_record_id}
Note: The above code example can be found in the FastAPI documentation under the async database section. However, we have made a few important tweaks to the original sample code.
We imported os and dotenv. We called load_dotenv to load the environment variables from .env. And we replaced the hardcoded database URL with a call to os.getenv.
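For intuition, load_dotenv essentially reads KEY=VALUE lines from the .env file into the process environment. Below is a toy stdlib sketch of that idea; the real python-dotenv library also handles comments, quoting, interpolation, and more, so this is illustrative only.

```python
import os

def load_env_from_string(content: str) -> None:
    # Toy version of python-dotenv's load_dotenv: parse KEY=VALUE lines
    # and put them into os.environ without overwriting existing values.
    for line in content.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        os.environ.setdefault(key.strip(), value.strip())

load_env_from_string("DATABASE_URL=postgresql://nfp_boilerplate_user:password@localhost/nfp_boilerplate_dev")
print(os.environ["DATABASE_URL"])
```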
metadata.create_all(engine) automatically creates the database tables based on your SQLAlchemy table definitions. Since we’re using alembic to manage the schema, we don’t need this, so we comment it out.
Also, note that we’ve added CORS middleware so that we can call the API from the browser at a different domain than the API.
It’s a best practice not to check secrets, such as database credentials, into your git repository. Let’s extract them into a .env file, which we are already ignoring in our .gitignore.
Create a .env file with the following:
DATABASE_URL=postgresql://nfp_boilerplate_user:password@localhost/nfp_boilerplate_dev
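Since the app now depends on this variable, it can be worth failing fast with a clear message when it is missing. This small helper is optional and not part of the original tutorial code, but you could use it near the top of main.py:

```python
import os

def require_env(name: str) -> str:
    # Fail fast with a clear error instead of a confusing crash later,
    # e.g. when databases.Database(None) blows up at startup.
    value = os.getenv(name)
    if value is None:
        raise RuntimeError(f"{name} is not set. Create a .env file or export it in your shell.")
    return value

# In main.py you would then write:
# DATABASE_URL = require_env("DATABASE_URL")
```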
Start the server by running:
$ uvicorn main:app --reload
Go to http://127.0.0.1:8000/docs. You should see the auto-generated API documentation. This documentation is interactive, which means you can make requests to your API from this interface. Go ahead and try to post a note and get a note.
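You can also exercise the API outside the browser. Here is a sketch of a tiny client using only Python’s standard library; it assumes the uvicorn server from above is still running on localhost:8000, and the helper names are my own, not part of the tutorial code.

```python
import json
import urllib.request

API_URL = "http://localhost:8000"  # assumes the dev server is running here

def make_note_payload(text: str, completed: bool = False) -> bytes:
    # Serialize a note in the shape the NoteIn model expects.
    return json.dumps({"text": text, "completed": completed}).encode("utf-8")

def post_note(text: str) -> dict:
    req = urllib.request.Request(
        f"{API_URL}/notes/",
        data=make_note_payload(text),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def list_notes() -> list:
    with urllib.request.urlopen(f"{API_URL}/notes/") as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    print(post_note("hello from the command line"))
    print(list_notes())
```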
Verify that it has been saved in the database.
$ psql nfp_boilerplate_dev
# select * from notes;
Front End Development – Next.js
Now that we have an API that can get and post, let’s build out a front end to consume this API.
Generate a Next.js application.
$ npx create-next-app@latest
Follow the prompts. We’re going to name it nfp-frontend.
Since we’re putting everything into a single repo, let’s delete the .git repo in the Next.js folder we just generated. You can skip this step if you’re decoupling your front end and back end.
$ cd nfp-frontend
$ rm -rf .git
Run the dev server and make sure it works.
$ npm run dev
If you go to http://localhost:3000, you should see the Next.js welcome page.
Next, let’s install Tailwind CSS.
$ npm install -D tailwindcss@latest postcss@latest autoprefixer@latest
$ npx tailwindcss init -p
Import tailwind in _app.js.
import 'tailwindcss/tailwind.css'
Include the tailwind directives in styles/globals.css.
@tailwind base;
@tailwind components;
@tailwind utilities;
Create a notes.js file in the pages directory.
import Head from 'next/head'
import { useState, useEffect } from 'react';

export default function Notes() {
  const [note, setNote] = useState('');
  const [notes, setNotes] = useState([]);

  useEffect(() => {
    async function fetchNotes() {
      const res = await fetch(`${process.env.NEXT_PUBLIC_API_URL}/notes/`);
      const json = await res.json();
      console.log(json)
      setNotes(json);
    }
    fetchNotes();
  }, [])

  function handleChange(e) {
    setNote(e.target.value);
  }

  async function handleSubmit() {
    const res = await fetch(`${process.env.NEXT_PUBLIC_API_URL}/notes/`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({
        text: note,
        completed: false
      })
    })
    const json = await res.json();
    setNotes([...notes, json])
  }

  return (
    <div>
      <Head>
        <title>Notes</title>
      </Head>
      <div className="container mx-auto p-10 m-10">
        <div className="flex flex-col">
          <h1 className="font-bold mb-3">Notes</h1>
          <textarea value={note} onChange={handleChange} className="border-2"></textarea>
          <div className="mx-auto p-3 m-5">
            <button onClick={handleSubmit} className="bg-green-500 p-3 text-white">Submit</button>
          </div>
          <div>
            <ul>
              {notes && notes.map((note) =>
                <li key={note.id} className="bg-yellow-100 m-3 p-3 border-yellow-200 border-2">{note.text}</li>
              )}
            </ul>
          </div>
        </div>
      </div>
    </div>
  )
}
Here’s a quick summary of the code we just wrote.
- Initialize state for the notes
- API call to fetch notes
- Event handler for textarea changes
- Event handler for submission of text
- JSX to render the form
- Some Tailwind CSS for styling
Note that we’re referencing process.env.NEXT_PUBLIC_API_URL in the fetch calls. We make this an environment variable because it’s configuration that will differ between environments, such as development, staging, and production.
Create a .env.development file.
NEXT_PUBLIC_API_URL=http://localhost:8000
Note: The NEXT_PUBLIC prefix exposes the environment variable in the browser. We need this one to be public since we’re making client-side API calls with it. For server-side-only environment variables, leave the prefix out.
Next.js uses file system-based routing, which means that the routes are mapped to the folder structure within pages. This makes client-side routing extremely easy to work with.
Re-run the dev server with npm run dev and navigate to http://localhost:3000/notes.
Congratulations. You should see the beginnings of a fully functional full-stack application.
Conclusion
Thank you for reading this far. This is my special recipe made with some of the finest and most-loved ingredients out there. I hope this boilerplate will help some people get past some of the bike-shedding and yak-shaving that goes on at the start of a project.
Where to go from here? I recommend checking out the official documentation for Next.js, FastAPI, SQLAlchemy, Alembic, and Tailwind CSS.
In a future post, I will write about my methodology for deploying this stack. Stay tuned for more.