ftest: compliance suite for software forge federation

Goals

  1. Flexible: it should be easy to define new tests
  2. Transparent: test results should be transparently available
  3. Versioned: the evolution of a forge implementation must be easily observable
  4. Reproducible: test results should be reproducible
  5. Ease of deployment
  6. Ease of use for developers and forge developers
  7. Ease of use for the general public

Development Notes

Create the ftest Docker network before running tests or deploying:

docker network create --attachable -d bridge ftest

cargo test is currently breaking with Exception: Error while creating access token: 400...

As a workaround, run cargo test -j 1 to run the tests serially.

Test Runner

The choice of test runner should not be hard-coded; it should be possible to define new runner interfaces. For now, we are going with Docker.
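
A minimal sketch of what such a pluggable runner interface could look like in Rust; every name here (TestRunner, DockerRunner, RunnerOutput) is an illustrative assumption, not the actual ftest API:

// Sketch of a pluggable runner abstraction; all names are illustrative.
pub struct RunnerOutput {
    pub exit_code: i64,
    pub logs: String, // e.g. output of `docker logs <container>`
}

pub trait TestRunner {
    type Handle;
    type Error;

    /// Start a test-suite container and return a handle to it.
    fn spawn(&self, image: &str, env: &[(String, String)]) -> Result<Self::Handle, Self::Error>;
    /// Block until the container exits and collect its logs.
    fn wait(&self, handle: Self::Handle) -> Result<RunnerOutput, Self::Error>;
}

/// A Docker-backed implementation would be the only runner for now.
pub struct DockerRunner {
    pub network: String, // e.g. the "ftest" network created above
}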

Test suites

A test suite will be made of multiple, independently defined tests, so that the same test implementation can be reused between multiple suites.

Test suites can live anywhere on the internet, but they have to be linked in the control repository to be available on an ftest instance. This lets ftest admins authorize the code that runs on their servers.

Test jobs

Each test will be its own container image. This allows for polyglot tests, which should make it easy to define new tests. Containerization also offers some level of isolation when running untrusted code.

A new container of the test image will be deployed whenever a new compliance job is scheduled. The container will be given a secret associated with the job ID. When the test run is complete, the test suite container should upload its results to the ftest server.
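
To illustrate that hand-off, here is a rough sketch of how a test container might push its results back; the environment variable names and the upload route are assumptions, not the documented ftest protocol (requires the reqwest crate with its blocking feature):

use std::env;

// Hedged sketch: FTEST_SERVER_URL, FTEST_JOB_SECRET and the /api/v1/results
// route are hypothetical names for values that ftest would inject.
fn upload_results(results_json: &str) -> Result<(), Box<dyn std::error::Error>> {
    let server = env::var("FTEST_SERVER_URL")?; // e.g. address of the ftest server
    let secret = env::var("FTEST_JOB_SECRET")?; // secret tied to the job ID

    let client = reqwest::blocking::Client::new();
    client
        .post(format!("{server}/api/v1/results")) // illustrative route
        .bearer_auth(secret)                       // authenticate with the job secret
        .header("Content-Type", "application/json")
        .body(results_json.to_owned())
        .send()?
        .error_for_status()?;
    Ok(())
}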

Test result schema

"tests": [{
"test_id": string,
"success": boolean,
"logs": string, // debug data that the test generates
"raw_logs": string // container logs (ex: docker logs <container-name>
}]
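
For reference, the same schema expressed as Rust structs with serde; the field names follow the schema above, while the struct names themselves are illustrative:

use serde::{Deserialize, Serialize};

// Struct names are illustrative; the fields mirror the schema above.
#[derive(Serialize, Deserialize)]
pub struct TestReport {
    pub tests: Vec<TestResult>,
}

#[derive(Serialize, Deserialize)]
pub struct TestResult {
    pub test_id: String,
    pub success: bool,
    /// Debug data that the test generates.
    pub logs: String,
    /// Container logs (ex: docker logs <container-name>).
    pub raw_logs: String,
}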

Compliance job

A compliance job consists of more than one test suite.

Scheduling new compliance test jobs

The test runner will accept new jobs through a Git repository called the control repository. To schedule a new compliance job (for instance, when a forge instance is updated), forge developers can send a patch to the control repository with a docker-compose definition file to spin up their software and its dependencies, and a job file that enumerates the test suites that must be run against it.
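
Purely as an illustration, the job file could deserialize into something like the following on the ftest side; the format and every field name here are guesses, not the actual job-file schema:

use serde::Deserialize;

// Hypothetical shape of a job file; the real format may differ.
#[derive(Deserialize)]
pub struct ComplianceJob {
    /// Forge implementation under test (illustrative field).
    pub forge: String,
    /// docker-compose file that brings up the forge and its dependencies.
    pub compose_file: String,
    /// Test suites (linked in the control repository) to run against it.
    pub suites: Vec<String>,
}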

Compliance Results

For an implementation to be 100% compliant with a test suite, it will have to successfully pass all the tests defined in the suite. A partial compliance score can be calculated using the same method.
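
Assuming the partial score is simply the fraction of passing tests (an interpretation of "the same method", not a documented formula), and reusing the TestResult sketch above:

/// Fraction of passing tests, in [0.0, 1.0]; 1.0 means fully compliant.
/// Assumes the score is passed tests divided by total tests.
fn compliance_score(results: &[TestResult]) -> f64 {
    if results.is_empty() {
        return 0.0;
    }
    let passed = results.iter().filter(|t| t.success).count();
    passed as f64 / results.len() as f64
}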

The compliance report will include the compliance score, test logs, and the control repository commit that triggered the job. This data will be used to create a versioned report in the form of a static site deployed from a Git repository (the results repository).