Gatekeeper to maintain data manipulation without directly accessing the database


API Gatekeeper

This service transports data from real-time API hits, streams it through Kafka, and loads it into the appropriate table in Google Cloud BigQuery.

Installation

Use git to clone this repository

git clone https://github.com/ghazimuharam/bq-api-gatekeeper

Prerequisite

Make sure you have Python 3.7 installed on your machine:

> python --version
Python 3.7.10

To run the scripts in this repository, you need to install the required libraries from requirements.txt:

pip install -r requirements.txt

Store all your service account JSON files in the ./service directory.

To run the scripts, you need a Kafka server running on your machine; see the Kafka docs for installation instructions.

Create an api-gatekeeper topic on your Kafka server, for example:

kafka-topics.sh --create --topic api-gatekeeper --bootstrap-server localhost:9092

Before running the setup.sh script, you have to specify your Google Cloud application credentials using the command below:

export GOOGLE_APPLICATION_CREDENTIALS="./service/your-credentials.json"

Main

Create the dataset in BigQuery using the command below:

sh setup.sh
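setup.sh itself is not reproduced in this README. Assuming it wraps the bq command-line tool, creating the dataset by hand would look roughly like this (the project and dataset names are placeholders, not taken from the repository):

```shell
# Create a BigQuery dataset with the bq CLI.
# "your-project" and "api_gatekeeper" are placeholders; use whatever
# names setup.sh actually configures.
bq mk --dataset your-project:api_gatekeeper
```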


Run the main application for the API endpoint using the command below:

uvicorn main:app --reload
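main.py is not shown in this README; presumably the endpoint validates each incoming request body before publishing it to the api-gatekeeper topic. A minimal stdlib sketch of that validation step, with hypothetical field names (the real schema lives in the repository):

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical event schema -- illustrative only, not the repo's actual model.
@dataclass
class Event:
    user_id: str
    action: str

def validate_event(raw: bytes) -> dict:
    """Parse and validate an incoming request body before it is
    published to the api-gatekeeper Kafka topic."""
    data = json.loads(raw)
    return asdict(Event(user_id=str(data["user_id"]),
                        action=str(data["action"])))

if __name__ == "__main__":
    body = b'{"user_id": "42", "action": "click"}'
    print(validate_event(body))  # → {'user_id': '42', 'action': 'click'}
```

Rejecting malformed payloads at the endpoint keeps bad records out of the Kafka topic, so the consumer can assume well-formed messages.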


Run the Kafka consumer script using the command below:

python kafka-consumer.py
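The consumer's job is to turn each Kafka message into a BigQuery row. The real kafka-consumer.py defines the actual schema; a stdlib-only sketch of the per-message handling, with an illustrative ingestion timestamp field:

```python
import json
from datetime import datetime, timezone

def message_to_row(value: bytes) -> dict:
    """Convert one Kafka message value into a BigQuery row dict.
    Field names are illustrative; the repository's consumer defines
    the real mapping."""
    row = json.loads(value)
    row["ingested_at"] = datetime.now(timezone.utc).isoformat()
    return row

# In the real consumer, rows produced this way would be streamed to
# BigQuery, e.g. via google.cloud.bigquery Client.insert_rows_json().
if __name__ == "__main__":
    print(message_to_row(b'{"user_id": "42", "action": "click"}'))
```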


License

MIT
