
Ebook:

https://rest-apis-flask.teclado.com/docs/course_intro/

Repository:
https://github.com/tecladocode/rest-apis-flask-python

----------------------------------------------------------------------------------------------------

We want to create a virtual environment because if the app is going to be exported,
all the libs will be there and we don't need to install anything globally.
We are creating a virtual environment using:
python -m venv .env_test
Using Ctrl+Shift+P in VS Code we select the Python venv as the interpreter,
then open another terminal using Ctrl+J.

if activation doesn't work (Windows PowerShell):
Set-ExecutionPolicy Unrestricted -Force
pip install virtualenv
then try to activate it again
https://stackoverflow.com/questions/18713086/virtualenv-wont-activate-on-windows
or run the activation script in the folder: .venv\Scripts\activate
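
For reference, the whole sequence in one place (a sketch, assuming the venv is named .venv to match the activation path above):

python -m venv .venv              # create the virtual environment
.venv\Scripts\activate            # activate it (Windows PowerShell/cmd)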

----------------------------------------------------------------------------------------------------
1. then install the requirements:
pip install flask

2. then create a new Python file called app.py
it's important to name it like so, because Flask looks for app.py by default

3. We also need to create a requirements.txt file, which will hold all the libraries
that the project needs.
then we will create the image and install the needed dependencies through the
requirements file,
for example flask, flask-smorest, and python-dotenv
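
For example, a minimal requirements.txt for this setup (package names only; pinning versions is optional):

flask
flask-smorest
python-dotenv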

4. We also want to create a file called .flaskenv,
which is a file that holds all the variables Flask needs when it
starts.
here we declare variables like:
FLASK_APP=app
FLASK_DEBUG=1 - this restarts the app each time the code is changed

5. Files:
Models: the tables of the db - what the columns should be and what the relations should
be.
Resources = blueprints: the operations on each table.
Schemas: the deserialization and serialization of data.
first create a model (so we will have a base for the db), then create a schema (so
we know how to parse incoming/outgoing data),
then create a resource (so we can have operations on the model), then register the
blueprint on the api - see the sketch after this summary.

model -> schema -> resource -> blueprint link to api
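
A minimal sketch of that chain with flask-smorest and Flask-SQLAlchemy (ItemModel, ItemSchema, and the /item route are hypothetical names for illustration, not taken from the course; cross-file imports are omitted for brevity):

# models/item.py - the model, assuming a shared SQLAlchemy() instance in db.py
from db import db

class ItemModel(db.Model):
    __tablename__ = "items"
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(80), nullable=False)

# schemas.py - the schema
from marshmallow import Schema, fields

class ItemSchema(Schema):
    id = fields.Int(dump_only=True)   # serialized on the way out only
    name = fields.Str(required=True)  # required on incoming data

# resources/item.py - the resource (blueprint)
from flask.views import MethodView
from flask_smorest import Blueprint

blp = Blueprint("items", __name__, description="Operations on items")

@blp.route("/item")
class ItemList(MethodView):
    @blp.response(200, ItemSchema(many=True))
    def get(self):
        return ItemModel.query.all()

# app.py - register the blueprint on the api
from flask_smorest import Api

api = Api(app)
api.register_blueprint(blp)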

Flask-Migrate - this is used when we develop our app and change things in
the db. in order to not break the db
we need to use db migrations.
installation:
pip install flask-migrate
at the app - from flask_migrate import Migrate

flask-migrate is the connection between Alembic and Flask.
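
Hooking it into the app is short (a sketch, assuming app is the Flask app and db is the Flask-SQLAlchemy instance):

from flask_migrate import Migrate

migrate = Migrate(app, db)  # registers the "flask db ..." commands with the app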


to start -> flask db init (to init the migrations folder)
then, delete the db, so Alembic can create the tables and know how to provision
them later.
then -> flask db migrate (to create a migration version)
then we can upgrade or downgrade versions using -> flask db upgrade (to upgrade to the
new db file)
if we want to change things in the db, we make the changes, and then:
flask db migrate -> flask db upgrade

To use local environment variables:
create a file named: .env
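
For example (the URL below is a placeholder, not the course's actual database):

DATABASE_URL=postgresql://user:password@localhost:5432/mydb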

when we switched from SQLite to Postgres we needed to create a migration, because
each database runs different commands

also note that the deployment server doesn't run migrations, so we need to create a
file to run them.
create a file docker-entrypoint.sh
then add:

#!/bin/sh
this shebang line tells the system to run the script with sh
flask db upgrade
so it upgrades the db to the latest migration
and then
exec gunicorn --bind 0.0.0.0:80 "app:create_app()"
to run the app

then at the Dockerfile:

CMD ["/bin/bash", "docker-entrypoint.sh"]
to tell Docker to run the migrations and then start the app
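
A minimal Dockerfile around that entrypoint might look like this (a sketch; the base image tag and paths are assumptions, not the course's exact file):

FROM python:3.11
WORKDIR /app
COPY requirements.txt .
# gunicorn must be listed in requirements.txt for the entrypoint's exec line to work
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["/bin/bash", "docker-entrypoint.sh"]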

we can also create a database for local development, so in the .env file I would
put the url for the local db and won't use SQLite,
and for the production database give another link, but in this project I have used the
same db for both

to send mails:
use Mailgun.

if we want the user endpoints to send emails, we need to add a function there, and
the same in any other endpoint file.
if needed:
1. add email to the db
2. migrate + upgrade (don't forget to set a name at the migration file so we can also
downgrade)
3. add a schema if needed
4. update the endpoint to have that schema and parse it
5. call the email sending function - see the sketch after this list
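
A sketch of such a sending function over Mailgun's HTTP API (the domain, sender name, and environment variable names are placeholders, not the course's exact values):

import os
import requests

def send_simple_message(to, subject, body):
    # post to Mailgun's messages endpoint for our domain
    domain = os.getenv("MAILGUN_DOMAIN")
    return requests.post(
        f"https://api.mailgun.net/v3/{domain}/messages",
        auth=("api", os.getenv("MAILGUN_API_KEY")),
        data={
            "from": f"Your App <mailgun@{domain}>",
            "to": [to],
            "subject": subject,
            "text": body,
        },
    )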

to add a Redis instance, take the Redis url and add it to the .env file

to run the worker:

docker run --name=api_worker -it api_local sh -c "rq worker -u redis_url queue_name"
-u - the Redis url to connect to, followed by the queue name.
we need to run both the worker from Redis, which will handle the email sending,
and the api itself.
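
On the api side the endpoint enqueues the job instead of sending the mail directly (a sketch; the "emails" queue name and REDIS_URL variable are assumptions):

import os
import redis
from rq import Queue

connection = redis.from_url(os.getenv("REDIS_URL"))
queue = Queue("emails", connection=connection)

# the worker started above picks this job up and runs it
queue.enqueue(send_simple_message, "user@example.com", "Welcome", "Thanks for signing up!")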

to deploy everything:
create a settings.py file, and there put the rq configuration settings.
we don't want to pass the Redis name and the url on the command line, so we save them in the file.
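
A sketch of such a settings.py (rq can read it with "rq worker -c settings"; the queue name is an assumption):

import os

REDIS_URL = os.getenv("REDIS_URL")  # connection url the worker should use
QUEUES = ["emails"]                 # queues the worker should listen on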
