Environments
We operate a few different environments for development, staging and production.
Environments are initially built with CDK (for the underlying AWS setup); services are then deployed from the monorepo via serverless.
Each environment is in its own AWS account.
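As a rough sketch of this split, provisioning and deploying might look like the following. The stack name, AWS profile and service layout here are illustrative assumptions, not taken from the actual repos:

```shell
# Provision the underlying AWS resources with CDK
# ("GoodFitDevStack" and the "goodfit-dev" profile are hypothetical names)
npx cdk deploy GoodFitDevStack --profile goodfit-dev

# Then deploy an individual service from the monorepo to the matching SLS stage
npx serverless deploy --stage dev
```

The `--stage` value corresponds to the "SLS stage" column in the table below.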
| Environment | SLS stage | Usage | URL |
|---|---|---|---|
| Development | dev | Used for ongoing development of branches prior to merge: test risky changes, deploy branches; can be destroyed and recreated easily. It's OK to deploy things by hand here. | https://goodfit-dev.netlify.app/ |
| Staging | stag | Where we do final QA after merge to main but before deploying to production. Automatically deployed from main; no manual changes via the AWS console or API, everything through pipelines. | https://goodfit-stag.netlify.app/ |
| Production | prod | Serves production traffic. No manual deploys or changes "by hand". | https://app.goodfit.io/ |
Note that on dev and staging you can log in with the shared account test@goodfit.io / GoodFit12345$!. You will need your own account on prod; please ask for one if you need it.
Environment creation
Environments must be created in a specific order:
- Create the new environment in the gf-infrastructure repo via CDK and add a new pipeline, then run it to provision the infrastructure.
- Deploy the `bootstrap` service and run `BootstrapRDS`, `BootstrapRedshift` and `MigrateRDS`. If that worked, `TestVPC` and `TestDBSecretsConnections` will both pass.
- Add new config under `/config` in gf-sourcers and run `configSync` in systemTools to find all the config and populate it in AWS SSM Parameter Store.
- Add a new environment pipeline stage and run the deployment pipeline in gf-sourcers to deploy all services. Note that you may need to run this more than once; due to dependency order, it can fail on the first run.
- Run `PopulateTestData` if you want some test data in the environment (optional). Afterwards, leave the system alone for a while (say 24 hours) so the various pipelines can run.
- Configure and run DBT.
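The order above can be sketched as shell commands. The `cdk`/`serverless` invocations below are assumptions about how the pipelines are wired up, and `GoodFitStagStack` is a hypothetical stack name; the function names come from the steps above:

```shell
# 1. Provision infrastructure via CDK in gf-infrastructure (stack name hypothetical)
npx cdk deploy GoodFitStagStack

# 2. Deploy the bootstrap service, then run the bootstrap functions
npx serverless deploy --stage stag
npx serverless invoke --stage stag --function BootstrapRDS
npx serverless invoke --stage stag --function BootstrapRedshift
npx serverless invoke --stage stag --function MigrateRDS

# 3. After adding config under /config in gf-sourcers, sync it into SSM
#    (configSync lives in systemTools)

# 4. Run the gf-sourcers deployment pipeline for the new stage
#    (may need a second run due to dependency order)

# 5. Optionally seed test data, then wait ~24h for pipelines to settle
npx serverless invoke --stage stag --function PopulateTestData
```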
At this point all features should work and you can provision a test client and perform a dataset build.