Zifo Skills Tracker is a full-stack web application that allows the user to visualise, query and update the skills of Zifo employees.
Understanding the skills that employees possess helps with resource allocation and organisation of training.
The production instance can be found at https://skills.zifo-tech.com. You will need a Zifo Microsoft account to access it.
- Quickstart
- Environments
- Installation
- Data
- Running the application locally
- Testing
- Running the application in production
If you're using Windows, you will need to install a Linux environment using [WSL 2](https://learn.microsoft.com/en-us/windows/wsl/install).

Once you've configured a Linux environment, install [Docker Compose](https://docs.docker.com/compose/install/).
To launch the dev environment with randomly generated mock data, run the following script:

```shell
./scripts/run-dev-fresh.sh
```
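Before running the script, it can help to confirm the prerequisites are on your `PATH`. The snippet below is just a convenience check, not part of the repository:

```shell
# Report which of the required tools are installed (illustrative helper)
check() { command -v "$1" >/dev/null 2>&1 && echo "$1: found" || echo "$1: MISSING"; }
for tool in docker dvc poetry npm; do
  check "$tool"
done
```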
This application can be launched in two environments: dev and prod.
For production, you will need to configure environment variables by adding the following to a `.env.prod` file in the root directory:

| .env.prod |
|---|
| SESSION_SECRET_KEY |
| NEO4J_URI |
| NEO4J_USER |
| NEO4J_PASSWORD |
| JWT_SECRET_KEY |
| AZURE_CLIENT_ID |
| AZURE_CLIENT_SECRET |
| AZURE_TENANT_ID |
| AZURE_REDIRECT_URI |
| FRONTEND_URL |
| REACT_APP_API_URL |
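For reference, a filled-in `.env.prod` might look like the following. Every value here is a placeholder, not a working credential, and the redirect URI path is an assumption — use the real values you are issued:

```shell
# .env.prod -- placeholder values only
SESSION_SECRET_KEY=9f8e7d6c5b4a39281706f5e4d3c2b1a09f8e7d6c5b4a39281706f5e4d3c2b1a0
JWT_SECRET_KEY=0a1b2c3d4e5f60718293a4b5c6d7e8f90a1b2c3d4e5f60718293a4b5c6d7e8f9
NEO4J_URI=neo4j+s://xxxxxxxx.databases.neo4j.io
NEO4J_USER=neo4j
NEO4J_PASSWORD=changeme
AZURE_CLIENT_ID=00000000-0000-0000-0000-000000000000
AZURE_CLIENT_SECRET=changeme
AZURE_TENANT_ID=00000000-0000-0000-0000-000000000000
AZURE_REDIRECT_URI=https://skills.zifo-tech.com/auth/callback
FRONTEND_URL=https://skills.zifo-tech.com
REACT_APP_API_URL=https://skills.zifo-tech.com/api
```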
The JWT and session secret keys can be generated using `openssl rand -hex 32`.
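For example (assuming `openssl` is installed):

```shell
# Each call emits 32 random bytes encoded as 64 hexadecimal characters
SESSION_SECRET_KEY=$(openssl rand -hex 32)
JWT_SECRET_KEY=$(openssl rand -hex 32)
echo "${#SESSION_SECRET_KEY}"   # prints 64
```

The resulting values can be pasted into `.env.prod`.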
The Azure keys and Neo4j credentials should be requested from Joe Smith.
More information on Azure configuration can be found in the Authentication documentation.
For the dev environment, environment variables have been hardcoded in `config.py`.
To install the Python backend as a developer, you will need to install poetry and then run:

```shell
poetry install
poetry shell
```
Installation of the frontend requires npm and the following commands:

```shell
cd frontend
npm install
```
The data required to launch the dev environment and successfully run the tests is pre-assembled in a CSV and tracked using DVC. This CSV is used to import the data into Neo4j.
To access this data, you will need to configure DVC with AWS credentials that permit access to the `zifo-ds-eu` S3 bucket. Follow the instructions in the DVC documentation to do this using the method most suitable for your setup.
One way to do this is as follows:

```shell
dvc remote modify s3 --local access_key_id <access_key_id>
dvc remote modify s3 --local secret_access_key <secret_access_key>
dvc pull --recursive
```
Alternatively, you can generate the mock data from scratch, as shown in the following section.
NOTE: the mock data is generated randomly so the existing tests will fail if you regenerate the data.
NOTE: if you generate the mock data from scratch, this will overwrite the DVC tracked file if you have it downloaded so it will need to be pulled again.
Data for the prod environment is stored in a Neo4j Aura database in the cloud. To access the prod data, you will need to request the credentials from Joe Smith (as noted above).
It is possible to regenerate the import CSV for the prod environment from the legacy MS Forms responses using the script at `pipeline/survey_data.py`.
You will need docker compose installed to run the application.
You can launch the dev environment in two ways:

- Using pre-assembled mock data: `run-dev.sh`
- Generating mock data from scratch: `run-dev-fresh.sh`

Run the chosen script as follows:

```shell
./scripts/run-dev.sh
```
To run the tests, run the `test-dev.sh` script in the root directory.
When running in production, the Azure AD app requires that this application runs on a secure domain. To do this, you will need to configure a reverse proxy using nginx and use certbot to generate an SSL certificate for the domain.
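As an illustration, the reverse proxy might look like the following nginx server block. The ports, upstream addresses, and the `/api/` path are assumptions, not taken from the repository — adjust them to match the actual compose configuration:

```nginx
# Illustrative reverse proxy: TLS termination plus HTTP-to-HTTPS redirect.
server {
    listen 443 ssl;
    server_name skills.zifo-tech.com;

    # Paths produced by certbot for this domain
    ssl_certificate     /etc/letsencrypt/live/skills.zifo-tech.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/skills.zifo-tech.com/privkey.pem;

    # Frontend (assumed to listen on port 3000)
    location / {
        proxy_pass http://localhost:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    # Backend API (assumed to listen on port 8000 under /api/)
    location /api/ {
        proxy_pass http://localhost:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}

server {
    listen 80;
    server_name skills.zifo-tech.com;
    return 301 https://$host$request_uri;
}
```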
A bash script, taken from a separate repository, automates the process of generating an SSL certificate.
Run this script once to generate the SSL certificate. The certificate should be automatically renewed after 90 days.
Once the SSL certificate has been generated, you can load the environment variables and launch the application.
The `run-prod.sh` script executes all of these steps.