Enquiry Management Tool
Note
You can read the compiled documentation here. You should also read the Enquiry Management playbook in the DBT Software Development Manual.
The Enquiry Management Tool is a web application designed for the needs of the Investment Services Team based in Belfast, to simplify the management of investment enquiries. It allows for:
Reviewing enquiries
Updating them during their engagement with potential investors
Submitting them to DataHub
The batch import and export of enquiries in the form of a CSV file
The application also periodically ingests new enquiries from Activity Stream, which were submitted through the GREAT Contact the investment team form.
Technical Overview
The Enquiry Management Tool is a Django REST framework web application. It uses:
PostgreSQL database as the persistence layer
Elasticsearch for enquiry search
GDS Components for the UI
Celery for periodic tasks and Activity Stream ingestion
Redis as a backend for both the session and the Celery message queue
Docker Compose for managing service dependencies in development and CI only
Cypress for end to end (e2e) tests
pytest for unit tests
OAuth 2.0 protocol for user authentication
Hawk protocol for inter-service communication authorization
Flake8 as a linter for Python code
Sphinx for documentation
The application also depends on the DataHub API and Activity Stream services.
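As background to the Hawk bullet above: Hawk authorizes each request by HMAC-signing a normalised description of it with a shared secret, so the receiving service can verify both identity and integrity. The stdlib-only sketch below is a simplified illustration of that idea, not a compliant Hawk implementation (a real service would typically use a library such as mohawk):

```python
import base64
import hashlib
import hmac


def hawk_style_mac(key: str, ts: int, nonce: str, method: str, path: str,
                   host: str, port: int) -> str:
    """Compute a Hawk-style request MAC (simplified illustration only).

    The real Hawk protocol normalises the request into a newline-separated
    string and HMACs it; this mirrors that shape but omits payload hashing
    and the `ext` field, so it is NOT interoperable with real Hawk peers.
    """
    normalised = "\n".join([
        "hawk.1.header",      # spec-style preamble
        str(ts),              # timestamp
        nonce,                # random per-request nonce
        method.upper(),       # HTTP method
        path,                 # request path
        host.lower(),         # host
        str(port),            # port
        "",                   # payload hash (omitted here)
        "",                   # ext data (omitted here)
    ]) + "\n"
    digest = hmac.new(key.encode(), normalised.encode(), hashlib.sha256).digest()
    return base64.b64encode(digest).decode()


mac = hawk_style_mac("secret-key", 1700000000, "nonce123",
                     "get", "/enquiries/", "localhost", 8001)
print(mac)
```

The MAC travels in the `Authorization` header alongside the key id, timestamp and nonce, and the receiver recomputes it from the same fields.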
For information about deployment to the dev, staging and production environments, refer to the Enquiry Management section in the DDaT Readme.
Coding Style (linting)
The style of Python code is enforced with Flake8, which is run against new code in a PR. You can set up a pre-commit hook to catch any formatting errors in updated code by running:
$ make setup-flake8-hook
Warning
This creates a new ./env Python virtual environment.
Installation with Docker
This project uses Docker Compose to set up and run all the necessary components.
The docker-compose.yml file provided is meant to be used for running tests and development.
Clone the repository:
$ git clone https://github.com/uktrade/enquiry-mgmt-tool.git
$ cd enquiry-mgmt-tool
Bootstrap the project (install node dependencies and compile CSS from SCSS)
$ sh ./bootstrap.sh
Set up your app/settings/.env file:
$ cp sample_env app/settings/.env
Build and run the necessary containers for the required environment:
Note
When running on an Apple Mac with the M1 (Apple silicon) chipset, you may see the error:
runtime: failed to create new OS thread (have 2 already; errno=22) fatal error: newosproc
In that case, use the RUN wget line for Apple (arm64) in the Dockerfile instead of the amd64 one.
$ docker-compose up --build
You can view the app at http://localhost:8001
The application uses SSO by default. When you access the above link for the first time, you will be redirected to the SSO login page. After authentication, a user will be created in the database.
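The redirect described above is the first leg of the OAuth 2.0 authorization-code flow: the app sends the browser to the SSO authorize endpoint, and SSO later redirects back with a code that is exchanged for an access token. The sketch below shows how such an authorize URL is typically constructed; the endpoint path `/o/authorize/` and all parameter values are assumptions for illustration, not taken from this codebase:

```python
from urllib.parse import urlencode


def build_authorize_url(authbroker_url: str, client_id: str,
                        redirect_uri: str, state: str) -> str:
    """Build the OAuth 2.0 authorization-code request URL (illustrative).

    The app redirects unauthenticated users here; after login, the broker
    redirects back to `redirect_uri` with a one-time code. The `state`
    value protects the round trip against CSRF.
    """
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "state": state,
    }
    return f"{authbroker_url}/o/authorize/?{urlencode(params)}"


url = build_authorize_url("http://host.docker.internal:8080",
                          "client-id",
                          "http://localhost:8001/auth/callback",
                          "xyz")
print(url)
```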
Configuration
The sample_env file contains all the required environment variables for the application. Sample values are provided in this file, but the actual values are to be included in the app/settings/.env file at the appropriate location.
The actual values are stored in the Parameter Store. Ensure you are in the correct AWS account when accessing the Parameter Store: for the dev, uat and staging environments, use the datahub account on the AWS account page. Once in the Parameter Store, you can filter for the dev environment.
Gov PaaS will be deprecated from July 2024 and keys will be moved from Vault to the Parameter Store. Until then, the actual values are also available in the ready-to-trade vault; please use the values corresponding to the dev environment.
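As a rough illustration of what the settings layer does with app/settings/.env, here is a minimal KEY=VALUE parser. The real project almost certainly uses a library (e.g. django-environ or python-dotenv), so treat this purely as a sketch of the file format:

```python
def parse_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines as found in sample_env / .env files.

    Blank lines and `#` comments are ignored; values are kept verbatim
    (no quoting or interpolation, unlike full dotenv libraries).
    """
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env


# Hypothetical excerpt in the sample_env style:
sample = """
# Feature flags
FEATURE_ENFORCE_STAFF_SSO_ENABLED=1
AUTHBROKER_URL=host.docker.internal:8080
"""
print(parse_env(sample))
```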
Single Sign On (SSO)
The app works out of the box with Mock SSO, which is part of the Docker Compose setup. The OAuth 2.0 flow, however, only works locally when you set the AUTHBROKER_URL env var to host.docker.internal:8080.
This is because the Mock SSO service (configured with the AUTHBROKER_URL) must be accessible from outside of docker-compose for the authorization redirect, and also from within docker-compose to make the access token POST request. The problem is that the service can only be accessed from another docker container as http://mock-sso:8080, which is not available outside of docker-compose. The special host.docker.internal host name should be accessible from everywhere. Should it for any reason not work, try docker.for.mac.localhost. The value varies across platforms.
You can disable SSO with the FEATURE_ENFORCE_STAFF_SSO_ENABLED env var:
FEATURE_ENFORCE_STAFF_SSO_ENABLED=1 # on
FEATURE_ENFORCE_STAFF_SSO_ENABLED=0 # off
Or in app/settings/*:
ENFORCE_STAFF_SSO_ENABLED=True # on
ENFORCE_STAFF_SSO_ENABLED=False # off
When SSO is disabled, the app redirects to the Django admin page for login instead.
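Feature-flag env vars such as FEATURE_ENFORCE_STAFF_SSO_ENABLED arrive as strings ("0"/"1") and must be coerced to booleans before being assigned to settings like ENFORCE_STAFF_SSO_ENABLED. A minimal, hypothetical helper showing the usual coercion (the project may do this differently, e.g. via django-environ's `env.bool`):

```python
def env_bool(value, default: bool = False) -> bool:
    """Coerce a feature-flag env var string to a boolean.

    Accepts the common truthy spellings; anything else is False.
    A missing variable (None) falls back to the supplied default.
    """
    if value is None:
        return default
    return str(value).strip().lower() in ("1", "true", "yes", "on")


# "1" switches the flag on, "0" (or any other value) switches it off:
print(env_bool("1"), env_bool("0"))
```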
Consent Service
To disable usage of the Consent Service during development, use the FEATURE_ENFORCE_CONSENT_SERVICE env var. Set your local .env file like this:
FEATURE_ENFORCE_CONSENT_SERVICE=0
OAuth 2.0 Access Token Refreshment
OAuth 2.0 access tokens issued by Staff SSO have an expiration time of 10 hours, so that a token just about outlives a user's working day. In order to always have a valid access token, this app limits the user's session to 9 hours. When the session expires, the user is automatically redirected to /auth/login, which refreshes both the session and the access token and allows the user to use the app uninterrupted for another 9 hours.
The session expiration can be configured with the optional SESSION_COOKIE_AGE environment variable, which defaults to 9 hours.
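Django's SESSION_COOKIE_AGE is expressed in seconds, so the 9-hour default works out to 32400. A sketch of how the setting might be derived, assuming the override is read from the environment as seconds (the helper name is hypothetical):

```python
# 9 hours in seconds -- chosen to expire before the 10-hour SSO token does.
DEFAULT_SESSION_COOKIE_AGE = 9 * 60 * 60


def session_cookie_age(env: dict) -> int:
    """Return the session age in seconds, honouring an optional override.

    `env` stands in for os.environ; SESSION_COOKIE_AGE, if present,
    is expected to be an integer number of seconds.
    """
    return int(env.get("SESSION_COOKIE_AGE", DEFAULT_SESSION_COOKIE_AGE))


print(session_cookie_age({}))  # → 32400
```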
Visual Component Styles
The CSS stylesheets are written in SCSS in the sass/ directory.
All class names should conform to the BEM methodology.
We rely on GDS Components and its GOV.UK Frontend SCSS package to provide the main UI component markup and styles. We should strive to use the components with their default styling and only override the styles when there is a very good reason to. Most developers feel an urge to tweak the styles slightly to their subjective taste; resist this urge at all times!
Tests
In accordance with our testing philosophy, the end to end tests are the ones we rely on. Unit tests are optional and should be used mainly as an aid during development. Keep in mind that unit tests only make sense if they are written before the code under test. Most of the unit tests in this project are legacy code.
Unit tests
The unit tests are written with pytest. You can run all unit tests with:
$ ./test.sh app
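If you do write a unit test, the usual pytest shape is a plain function whose name starts with `test_` containing bare `assert` statements, collected automatically from files named `test_*.py`. The helper under test below is hypothetical, defined inline only to keep the example self-contained:

```python
# test_example.py -- a minimal pytest-style unit test.

def normalise_company_name(name: str) -> str:
    """Collapse runs of whitespace and strip the ends.

    Hypothetical helper, standing in for whatever pure function
    you are test-driving.
    """
    return " ".join(name.split())


def test_normalise_company_name():
    # pytest rewrites bare asserts to give rich failure messages.
    assert normalise_company_name("  Acme   Ltd ") == "Acme Ltd"
    assert normalise_company_name("Acme") == "Acme"
```

Running `./test.sh app` (or `pytest` directly) would collect and execute such functions.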
End to end tests
The end to end tests (e2e) are written in JavaScript with Cypress. You can run them in watch mode with:
$ npm test
Note
npm test expects the application to be listening on localhost:8000
The e2e tests can also be run headless with:
$ npx cypress run
or
$ docker-compose run cypress run --browser chrome
Allowing for Fixture Reset during e2e tests
It is possible to expose a URL method which enables an external testing agent (e.g. Cypress) to reset the database to a known fixture state.
Naturally this endpoint is not exposed by default. To enable it you must:
Run Django with ROOT_URLCONF set to app.testfixtureapi_urls, which includes the "reset" endpoint. This can be achieved by running Django with DJANGO_SETTINGS_MODULE set to either app.settings.djangotest (which is already the case in pytest.ini) or app.settings.e2etest (which is already the case in docker-compose.yml).
Set the environment variable ALLOW_TEST_FIXTURE_SETUP to the explicit exact value allow.
Under these conditions (and only these conditions), when this endpoint receives a POST request, it will reset the application database to the state frozen in the fixture files.
Because this method removes all user data, it will also invalidate any active session which your test client holds. For this reason the method also creates a standard user of your specification, logs them in, and returns the session info in the cookie headers of the response. You must therefore supply this method with JSON which describes a new seed user like this:
{
"username": "user123",
"first_name": "Evelyn",
"last_name": "User",
"email": "evelyn@example.com"
}
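A test client must send exactly these four fields. Below is a small, hypothetical helper that builds and sanity-checks the JSON body before it would be POSTed to the reset endpoint (the actual HTTP call is left out, since the endpoint URL depends on your local setup):

```python
import json

# The four fields the reset endpoint expects for the seed user.
REQUIRED_SEED_USER_FIELDS = {"username", "first_name", "last_name", "email"}


def seed_user_payload(username: str, first_name: str,
                      last_name: str, email: str) -> str:
    """Build the JSON body for the fixture-reset POST request.

    Hypothetical convenience wrapper; it simply guarantees all
    required fields are present before serialising.
    """
    payload = {
        "username": username,
        "first_name": first_name,
        "last_name": last_name,
        "email": email,
    }
    missing = REQUIRED_SEED_USER_FIELDS - payload.keys()
    if missing:
        raise ValueError(f"missing seed user fields: {missing}")
    return json.dumps(payload)


body = seed_user_payload("user123", "Evelyn", "User", "evelyn@example.com")
print(body)
```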
Running locally with Data Hub API
The Enquiry Management Tool application integrates with the Data Hub API. The EMT fetches metadata from the Data Hub API and creates an investment project if an enquiry is successful.
Run the Data Hub API following the instructions in the repository's README
In your .env file in the data-hub-api repository, find the DJANGO_SUPERUSER_EMAIL variable
From the top level of the data-hub-api repository, run the following command using the value of the variable above:
docker exec data-hub_api_1 python manage.py add_access_token DJANGO_SUPERUSER_EMAIL
Copy the token from your terminal and add it as the value of the MOCK_SSO_TOKEN environment variable in the .env file of the enquiry-mgmt-tool repository
Also in the enquiry-mgmt-tool .env file, set the value of the MOCK_SSO_EMAIL_USER_ID and MOCK_SSO_USERNAME environment variables to the same email address you created the token for
Follow the instructions at the top of this file to run the Enquiry Management Tool application
You can check that the integration with Data Hub is working correctly by going to http://localhost:8000/enquiries/1/edit and making sure that a list of names appears in the ‘Client Relationship Manager’ field dropdown
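How the token generated above is presented to the Data Hub API is not spelled out here; a Bearer authorization header is the common convention for OAuth-style access tokens, so a hedged sketch of the request headers might look like this (the header scheme is an assumption, not confirmed by this document):

```python
def datahub_auth_headers(token: str) -> dict:
    """Headers for requests to the Data Hub API.

    Assumes the access token produced by `add_access_token` is sent
    using the standard Bearer scheme -- verify against the Data Hub
    API docs before relying on this.
    """
    return {
        "Authorization": f"Bearer {token}",
        "Accept": "application/json",
    }


print(datahub_auth_headers("abc123"))
```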
Documentation
Documentation is written in reStructuredText (RST) and compiled with Sphinx. The documentation source files live in the doc/ directory.
Always keep the documentation in sync with the code
Try to provide a link to every external source of information; don't let future readers of the codebase waste their time searching for things that could simply be reached through a link.
Always specify all function arguments and return values with :param <name>: and :returns: Sphinx directives, ideally accompanied by :type <name>: and :rtype: to describe the expected types.
When referencing other objects, use the :func:, :class:, :mod:, etc. directives. You can also use them to reference objects from external libraries, e.g. :class:`django.http.HttpRequest`, provided they are properly linked through sphinx.ext.intersphinx (see the next point).
When referencing objects from other libraries, always try to link them through sphinx.ext.intersphinx by adding a record to the intersphinx_mapping dictionary in doc/source/conf.py.
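For illustration, intersphinx_mapping entries typically look like the following. Each value pairs a docs base URL with an inventory location (None means the default objects.inv); the Django inventory URL shown is the commonly used one, but verify both entries against the project's actual Sphinx config:

```python
# Sphinx conf.py (excerpt) -- hedged example of linking external docs
# via sphinx.ext.intersphinx. URLs are common conventions, not taken
# from this repository.
extensions = ["sphinx.ext.intersphinx"]

intersphinx_mapping = {
    # None -> fetch the default objects.inv from the base URL.
    "python": ("https://docs.python.org/3", None),
    # Django publishes its inventory at a non-default location.
    "django": (
        "https://docs.djangoproject.com/en/stable/",
        "https://docs.djangoproject.com/en/stable/_objects/",
    ),
}
```

With this in place, a directive like :class:`django.http.HttpRequest` resolves to a link into the Django docs.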
Compilation to HTML
Note
Each of the documentation-related commands requires you to be in the doc/ directory.
To compile the docs to HTML, you need to have installed both the project dependencies listed in requirements.txt and the docs dependencies listed in doc/requirements.txt. The easiest way to install them is to run the doc/bootstrap.sh script:
$ cd doc/
# Create and activate virtual environment specific for docs compilation
$ python3 -m venv .env
$ . .env/bin/activate
# Install the merged dependencies
$ sh ./bootstrap.sh
You can then compile the HTML with:
$ make html
The compiled HTML will then be in doc/build.
Hosting the compiled documentation
There is a CircleCI workflow defined in .circleci/config.yml which compiles and deploys the documentation to the gh-pages branch of the repository whenever code is pushed to the main branch, i.e. after every PR merge.
The deployed documentation will then be available at
https://uktrade.github.io/enquiry-mgmt-tool.