# OKR Tracker
- Demo
- Project requirements
- Clone and install
- Set up new instance
- Make Firestore ready for production
- Create Google Cloud API Gateway
- Build and deploy
- Lint and fix
- Import production data from Cloud Firestore to local Firestore
- Import production data
- Automated Backup with Cloud Functions
- Supported providers
- Common problems
## Demo

If you would like to check out how the application works, you can go to the demo site and sign in with a test user:

- Site: https://origo-okr-tracker.web.app
- User/pass: [email protected] / testuser
## Project requirements

- Node 20.x
- Firebase 10.x
- Firebase tools >9.x
- Firebase Blaze plan (pay as you go)
## Clone and install

Clone the repository and run install:

```sh
npm install && cd ./functions && npm install && cd ..
```

Install the Firebase CLI:

```sh
npm install -g firebase-tools
```
## Set up new instance

Follow this guide to set up a new, clean instance of the OKR Tracker. Please read the whole readme before you start, rather than just following it sequentially: some steps throughout the readme are important when setting up a new instance.

- Create a Google Firebase project
- Initialize the project with the Firebase CLI
- Create a Google service account
  - From the Project Overview, select Service accounts
  - Click Generate new private key
```sh
firebase functions:config:set \
  service_account="<service account private key json-file>" \
  storage.bucket="<your-storage-bucket-name>"
```

Cat the whole service account private key JSON file into the environment key `service_account`:

```sh
firebase functions:config:set service_account="$(cat origo-okr-tracker-private-key.json)"
```

This works in bash and zsh alike. Note: the private key string needs to have actual line breaks as opposed to `\n`, because of an issue with how Firebase stores environment variables.
We use Google Auth to authenticate users, and this needs to be enabled in the Firebase Console. NOTE: this does not apply if you are only running the project locally. We support Google and Microsoft as authentication providers.

- Navigate to your project in the Firebase console
- Press the Authentication button in the side menu
- Select the Sign-in method tab
- Enable Google
Get your Firebase SDK snippet from your Firebase Console:

- Navigate to Project settings
- Under Your apps, find Firebase SDK snippet and press Config
- Copy the following secrets to a `.env.production` file in the root directory. You also need a `.env.local` file to run the app locally.
| Secret | Description |
| --- | --- |
| `VITE_API_KEY` | From SDK snippet |
| `VITE_AUTH_DOMAIN` | From SDK snippet |
| `VITE_DATABASE_URL` | From SDK snippet |
| `VITE_PROJECT_ID` | From SDK snippet |
| `VITE_STORAGE_BUCKET` | From SDK snippet |
| `VITE_MESSAGING_SENDER_ID` | From SDK snippet |
| `VITE_APP_ID` | From SDK snippet |
| `VITE_I18N_LOCALE` | `nb-NO` or `en-US` |
| `VITE_REGION` | `europe-west2` |
| `VITE_LOGIN_PROVIDERS` | Allowed login providers separated by hyphens; only `google` and `email` are implemented. Example: `google-email` |
| `VITE_HOST_URL` | URL that points to the cloud functions set up as API CRUD endpoints |
| `VITE_MICROSOFT_TENANT_ID` | Limits authentication to a certain tenant; otherwise anyone with a Microsoft account could log in |
| `VITE_ORGANIZATION` | Name of the organization |
```sh
firebase use --add
```
The local development environment uses the Firebase Emulator Suite for Firestore and Cloud Functions. No further setup is needed: just run the development script, and everything is set up with a local user through Google auth.

Retrieve the current Firebase environment configuration. This is needed for certain cloud functions to work locally:

```sh
firebase functions:config:get > ./functions/.runtimeconfig.json
```

Start the Firebase emulators, import mock data and run the development server:

```sh
npm run dev
```
## Make Firestore ready for production

If you want to deploy to production or staging, you need to create multiple collections manually. Go to the Firestore Database in the Firebase Cloud Console and create the following collections:
- audit
- departments
- keyResults
- kpis
- objectives
- organizations
- periods
- products
- requestAccess
- slugs
- users
- domainWhitelist (optional)
The collection `users` needs one document for the first user. Create a document and add the following fields:

```json
{
  "id": "<email the user is signing in with>",
  "email": "<email the user is signing in with>",
  "superAdmin": true,
  "widgets": {
    "itemHome": {
      "children": true,
      "missionStatement": true,
      "progression": true,
      "team": true
    },
    "keyResultHome": {
      "details": true,
      "notes": true,
      "weights": true
    },
    "objectiveHome": {
      "details": false,
      "progression": true,
      "weights": true
    }
  }
}
```
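If you prefer seeding the first user from a script instead of the console UI, a minimal sketch could look like this. The helper name `makeFirstUser` is our own invention; only the field layout comes from the document above, and the Admin SDK usage in the comment is a hypothetical example:

```javascript
// Build the Firestore document for the first (super admin) user.
// The shape mirrors the JSON example above.
function makeFirstUser(email) {
  return {
    id: email,
    email,
    superAdmin: true,
    widgets: {
      itemHome: { children: true, missionStatement: true, progression: true, team: true },
      keyResultHome: { details: true, notes: true, weights: true },
      objectiveHome: { details: false, progression: true, weights: true },
    },
  };
}

// Hypothetical usage with the Admin SDK (requires `firebase-admin` and credentials):
//   const admin = require('firebase-admin');
//   admin.initializeApp();
//   const user = makeFirstUser('admin@example.com');
//   await admin.firestore().collection('users').doc(user.id).set(user);
```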
After successfully logging in to the OKR Tracker, navigate to the Admin panel. Here you can create new organisations, departments and products to use as your mock data. On each object you can also create periods, objectives, key results and KPIs.

To export your mock data, run the following command:

```sh
firebase emulators:export ./mock_data
```

To update existing mock data, simply run the export command above and confirm overwriting the existing export.

Firebase now exports the storage emulator as well, even if you don't use it. These new folders are not checked into git because they are empty, and git does not track empty folders. If you have problems running the mock data, you will need to add two folders, `blobs` and `metadata`, to the `mock_data/storage_export` folder.
## Create Google Cloud API Gateway

It is possible to set up open API endpoints for users outside of the OKR Tracker frontend to update the progress of key results and KPIs. To do so, you only need to deploy all the functions as usual and then give the users the Cloud Function URL, but we do not recommend calling the Cloud Functions directly. The better approach is to set up a Google Cloud API Gateway and reroute all calls through it to the right Cloud Function.

We have set up an OpenAPI specification which you can check out here.

You can read more about how to set up an API Gateway here.

The TL;DR is:

- Enable the required services
- Create an API
- Create a new service account with the correct access rights; we use the roles
  - API Gateway Admin
  - Cloud Functions Invoker
- Create an API config
- Create a gateway

After the API Gateway has been set up, we close the gateway with an API key, which means that you need to create an API key through the Google Cloud Console.

If there are any questions regarding this, do not hesitate to get in contact with us (i.e. create an issue) and we will gladly help.
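To illustrate how an external client could go through the gateway, here is a hedged sketch of building such a request. The `/kpi/<id>` path, the `progress` payload field and the helper name are assumptions for illustration only; consult the OpenAPI specification for the actual endpoints. The `x-api-key` header is one way API Gateway accepts API keys:

```javascript
// Build a request for updating KPI progress through the API Gateway.
// Endpoint path and payload shape are illustrative; check the OpenAPI spec.
function buildKpiUpdateRequest(gatewayUrl, apiKey, kpiId, progress) {
  return {
    url: `${gatewayUrl}/kpi/${kpiId}`,
    options: {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'x-api-key': apiKey, // the gateway validates this key
      },
      body: JSON.stringify({ progress }),
    },
  };
}

// Usage (hypothetical gateway URL):
//   const { url, options } = buildKpiUpdateRequest('https://my-gateway.ew.gateway.dev', key, 'abc123', 42);
//   await fetch(url, options);
```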
## Build and deploy

Build and deploy to production:

```sh
npm run deploy
```

To configure automatic deploy to Firebase Hosting on merge to `main` (triggered as part of the `pipeline.yml` workflow), add the following secrets to your GitHub repository:

- `ENV_FILE_PROD`: Contains a dumped copy of the production dotenv file.
- `FIREBASE_PROJECT_ID_PROD`: The Firebase project ID.
- `FIREBASE_SERVICE_ACCOUNT_PROD`: Exported JSON key for a GitHub Actions-specific service account created for deploying to Firebase Hosting.
See the Firebase documentation for steps required to create these secrets, either by using the Firebase CLI or manually.
## Lint and fix

ESLint (including Prettier, configured to be executed as a linter rule) and Stylelint are used for code formatting and linting. See the configuration in the following files:

- `./.eslintrc.js`
- `./.prettierrc.js`
- `./.stylelintrc.js`

```sh
npm run lint            # Run linter
npm run lint:fix        # Fix lint issues
npm run lint:style      # Run style linter
npm run lint:style:fix  # Fix lint issues found in styles
```
## Import production data from Cloud Firestore to local Firestore

Based on this tutorial, with a few differences for our use case.

The newest version of the OKR Tracker uses the Firebase Local Emulator Suite, where you can play with and test your data without being afraid of production changes. It is still in the early stages, which means that auth is still handled by the cloud Firebase and not locally.

When you start up the local Firestore emulator, you can see that Firestore is completely empty because we don't have any production data. This is a great way of working because you can do whatever you want without causing damage, but it is real-life data that you most likely want to test and fix.

We are going to show you how to export your production data to a GCP bucket, or use an existing backed-up bucket, and import it into your local Firestore.

Requirements:

- Firebase CLI
- Google Cloud SDK

How to install Google Cloud SDK and Firebase CLI

Log in to Firebase and Google Cloud:

```sh
firebase login
gcloud auth login
```

See the list of your projects and connect to the one you'd like to export data from:

```sh
firebase projects:list
firebase use <your project id>

gcloud projects list
gcloud config set project <your project id>
```

For the sake of this how-to, we'll be using `okr-tracker-production` (production) for gcloud, and `origo-okr-tracker` (development) for Firebase. The reason is that we use auth from our development Firebase instance, and not from the production instance.
## Import production data

If you don't already have automated backups of your production data, you will need to export the production data to a backup on GCP:

```sh
gcloud firestore export gs://okr-tracker-production.appspot.com/<backup-folder-name>
```

Now copy the new folder to your local machine. We are going to do this from our functions folder:

```sh
cd functions
gsutil -m cp -r gs://okr-tracker-production.appspot.com/<backup-folder-name> .
```

If you already have automated backups of your production data, you don't need to export the production data, only import it. For this application our backup folder is not part of the Firebase storage bucket:

```sh
gsutil -m cp -r gs://okr-tracker-backup/<YYYY-MM-DD> .
```

To import the production data into your local Firebase emulator, you will need a metadata file in the root folder, named `firebase-export-metadata.json`:

```json
{
  "version": "8.6.0",
  "firestore": {
    "version": "1.11.5",
    "path": "functions/<backup-folder-name>",
    "metadata_file": "functions/<backup-folder-name>/<backup-folder-name>.overall_export_metadata"
  }
}
```

Start your local Firebase emulator suite with the imported data. Firebase will read the metadata JSON file automatically:

```sh
firebase emulators:start --import=./
```
## Automated Backup with Cloud Functions

We use cloud functions to back up our database every night and only keep backups of the last 14 days. If a backup is older than 14 days, it gets automatically and permanently deleted from the storage bucket.

Requirements:

- Firebase Blaze plan
- Set IAM permission
- Manually create a storage bucket
- Cloud function

TLDR:

- Navigate to the Google Cloud Console and choose your project
- Navigate to IAM & Admin; your App Engine service account needs the Cloud Datastore Import Export Admin role
- Navigate to Storage; create a storage bucket and give it a rule to delete storage that is >14 days old
- Run the command:

```sh
firebase functions:config:set storage.bucket="<your-storage-bucket-name>"
```
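The 14-day retention rule can also be enforced from code instead of (or in addition to) a bucket lifecycle rule. A minimal sketch of the selection logic, assuming backup folders are named by date (`YYYY-MM-DD`, as in the import example above); the function name is our own:

```javascript
// Given backup folder names (YYYY-MM-DD) and the current time in ms,
// return the folders older than `maxAgeDays` that should be deleted.
function expiredBackups(folderNames, nowMs, maxAgeDays = 14) {
  const msPerDay = 24 * 60 * 60 * 1000;
  return folderNames.filter((name) => {
    const ageDays = (nowMs - new Date(name).getTime()) / msPerDay;
    return ageDays > maxAgeDays;
  });
}

// A scheduled Cloud Function could run this nightly and delete the
// returned folders from the storage bucket with the Cloud Storage API.
```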
This is called automated restore, but we still need to manually trigger a cloud function that does the restore from the Google Cloud Console.

TLDR:

- From your Google Cloud Console, navigate to Pub/Sub
- Create a topic and name it `restore-backup`
- Trigger the topic by publishing a message, and the restore will be triggered

Src/Citation: The cloud function blog
## Supported providers

The OKR Tracker currently supports only three login providers: Microsoft, Google, and email/password. If you are looking for other providers that Firebase supports, we would love for you to open a PR with the needed changes.

For the Microsoft integration, a tenant must be specified in the environment variable `VITE_MICROSOFT_TENANT_ID`.

Anyone with a Google account can log in. To limit the domain, you have to implement this somewhere, e.g. in `set_user.js`:

- e.g. `if (!user.email.toLowerCase().endsWith('oslo.kommune.no')) rejectAccess();`
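A hedged sketch of such a domain check (the helper name is our own, and `rejectAccess` is illustrative; the real hook would live in `set_user.js`):

```javascript
// Return true if the email belongs to the allowed domain (case-insensitive).
function isAllowedDomain(email, allowedDomain) {
  return email.toLowerCase().endsWith(`@${allowedDomain}`);
}

// Illustrative usage in set_user.js:
//   if (!isAllowedDomain(user.email, 'oslo.kommune.no')) rejectAccess();
```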
## Common problems

If there are problems running the project locally, or you get an infinite spinner, inspect the console in the browser, your terminal, or the `firebase-debug.log` file for error messages. Some common messages when firing up the project for the first time:

- "No such file or directory, scandir storage_export/metadata"
  - You need to create two directories under `mock_data/storage_export`: `blobs` and `metadata`
- "It looks like you're trying to access functions.config().service_account but there is no value there"
  - Check that you have set the config key for `service_account` correctly. Read the readme again and see how you need to cat the private key file correctly.
- "Missing permissions required for functions deploy. You must have permission iam.serviceAccounts.ActAs on service account"
  - Open the Google Cloud Console (check that you are in the correct project)
  - Go to IAM & Admin -> Service Accounts
  - Find the service account and click on it
  - Click on the "Permissions" panel, then click Grant Access
  - Add your IAM member email address. For the role, select Service Accounts -> Service Account User
  - Click Save
- "Cannot read property `bucket` of undefined"
  - Set the config key `storage.bucket`. Please read the readme again.