```mermaid
graph TD;
  subgraph Polaris
    Z[Docker Build]
  end
  Z[Docker Build] --> A(Ingress);
  Z[Docker Build] --> B(Ingress);
  subgraph GitLab
    B[Container Registry] --> H(Images);
  end
  subgraph Server
    A[Docker Pull] --> D(Traefik);
    D[Traefik] --> E(Deployment:Analytics-Engine);
    D[Traefik] --> F(Deployment:Rights-Engine);
    D[Traefik] --> I(Deployment:Dashboard-SDK);
    E(Deployment:Analytics-Engine)---id1[(MongoDB)]
    F(Deployment:Rights-Engine)---id1[(MongoDB)]
    I(Deployment:Dashboard-SDK)---id1[(MongoDB)]
  end
```
## Getting started

Each folder contains one of the components of Polaris.
## Local Deployment
### Log in to the Docker registry

If you are not already logged in, authenticate to the Container Registry with your GitLab username and password. If you have Two-Factor Authentication enabled, use a Personal Access Token instead of a password.

```shell
docker login registry.digitallearning.gmbh
```
### Create Docker network

The containers communicate via a Docker network named `web`, which must be created before the Docker Compose files are run:

```shell
docker network create web
```
### Create private/public keys and .env

Generate the key pair (leave the passphrase empty when prompted) and create the `.env` file from the sample. Please change all passwords in the configuration file:

```shell
cd single-compose
ssh-keygen -b 4096 -f id_rsa
cp .env.sample .env
```
### Start containers

```shell
docker compose up -d
```

Please check whether all services started successfully:

```shell
docker compose ps
```

After that you should be able to visit http://localhost:8004/ and see the rights engine.
### Migrate rights-engine database (only required once after first start)

```shell
docker compose exec -it rights-engine sh -c 'python3 manage.py sqlflush | sed s/TRUNCATE/DROP\ TABLE\ IF\ EXISTS/g | python3 manage.py dbshell && echo DROP\ TABLE\ IF\ EXISTS\ django_migrations\; | python3 manage.py dbshell && python3 manage.py migrate && python3 manage.py loaddata fixtures/initial_db.json'
```
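The `sed` step in this command rewrites the `TRUNCATE` statements emitted by `manage.py sqlflush` into `DROP TABLE IF EXISTS` statements before feeding them back into `dbshell`. A minimal sketch of that transformation on a sample line (the table name is made up):

```shell
# Sample sqlflush-style output piped through the same substitution:
echo 'TRUNCATE "django_session";' | sed 's/TRUNCATE/DROP TABLE IF EXISTS/g'
# Prints: DROP TABLE IF EXISTS "django_session";
```

Dropping the tables instead of truncating them ensures the subsequent `migrate` rebuilds the schema from scratch.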
### Migrate analytics-engine database (only required once after first start)

```shell
docker compose exec -it scheduler sh -c 'scheduler create-db'
```
## Adding analytics engine jobs

Analytics engine jobs are configured via YAML files that are read from the `configuration` directory, which is a volume.

Example configuration file `configuration/h5p_engines.yml`:
```yaml
h5p_statements_count_engine:
  crontab: "*/1 * * * *"
  repo: "https://scheduler:glpat-MsDsrHMH-k3-DzEfNRgk@gitlab.digitallearning.gmbh/polaris/engines/dummy-engine.git"
  analytics_token: "b6a4ec069ef9f688e781161d46c2a85c14a761a4eaf0074099656c7de44a65d9"
```
Example configuration file `configuration/moodle_engines.yml`:

```yaml
moodle_statements_count_engine:
  crontab: "*/1 * * * *"
  repo: "https://scheduler:glpat-MsDsrHMH-k3-DzEfNRgk@gitlab.digitallearning.gmbh/polaris/engines/dummy-engine.git"
  analytics_token: "0482a0f3259c966dfddb38de867573a95995ee5e10898bb71f9ae0e99efe54ef"
```
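Judging from the two examples above, each top-level key names one job; the field meanings sketched below are assumptions drawn from those examples, not documented behavior:

```yaml
# <job_name>:        unique name of the analytics engine job (assumed)
#   crontab:         standard five-field cron schedule ("*/1 * * * *" = every minute)
#   repo:            Git URL of the engine to run, with an access token embedded
#   analytics_token: token the engine presumably uses to authenticate its queries
example_engine:
  crontab: "0 * * * *"   # hourly, as an example
  repo: "https://<user>:<token>@<gitlab-host>/<group>/<engine>.git"
  analytics_token: "<token>"
```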
### Update analytics engine scheduler

```shell
docker compose exec -it scheduler sh -c 'scheduler read-configs'
```
### Create visualization token

```shell
curl -X POST http://localhost:8004/api/v1/provider/visualization-tokens/create --data '{"engines": ["count_h5p_statements", "count_moodle_statements"]}' -H "Content-Type: application/json"
```

This returns a JWT token for the dashboard:

```json
{"token": "[JWT_TOKEN]"}
```
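If you want to capture the token in a shell variable, one way (assuming `python3` is available on the host) is to parse the JSON response; the sample response below is hypothetical — in practice, substitute the `curl` call above:

```shell
# Hypothetical response body standing in for the real curl output:
response='{"token": "header.payload.signature"}'
# Extract the "token" field with python3's json module:
token=$(printf '%s' "$response" | python3 -c 'import sys, json; print(json.load(sys.stdin)["token"])')
echo "$token"
# Prints: header.payload.signature
```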
## Loading JSON statements

It is recommended to import a set of sample statements into the LRS so that the analytics engines can work on this data. Furthermore, the users (user1@polaris.com and user2@polaris.com) can then test the data disclosure and data deletion process.

```shell
docker compose exec -it mongodb_container sh -c 'mongoimport --authenticationDatabase admin --username root --password CHANGE_ME --db lrs --collection statements --drop --file ./lrs_statements_dump.json'
```
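Without the `--jsonArray` flag, `mongoimport` expects one JSON document per line. The dump file itself is not shown here, but one line of such a file might look like the following minimal xAPI statement — the verb and activity values are made up for illustration:

```json
{"actor": {"objectType": "Agent", "mbox": "mailto:user1@polaris.com"}, "verb": {"id": "http://adlnet.gov/expapi/verbs/completed", "display": {"en-US": "completed"}}, "object": {"objectType": "Activity", "id": "http://example.com/activities/sample-course"}}
```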
## Start dashboard

- Clone the repository and `cd dashboard-example`
- Download the latest `@polaris/dashboard-sdk-X.X.X.tgz` from the registry and install it with `npm install @polaris/dashboard-sdk-1.0.2.tgz` (TODO: improve with npm login)
- Update `TOKEN` in `dashboard-example/src/js/app.js`
- Run `npm run dev`
- Visit http://localhost:8005/
## (Optional) Filling the DB with random statements

- Clone the rights-engine repository
- Create a provider config:

  ```shell
  cd rights-engine/tools/xapi-statement-generator
  cp provider_config.json.example provider_config.json
  ```

- Open `provider_config.json` and insert Application Tokens (visible in the Rights Engine UI at http://localhost:8004 if you log in as a provider)
- Run `python generator.py -t http://localhost:8003/xapi/statements -r`
## Update Docker Images

```shell
docker compose pull
```