
Add compliance tests to docs! #2021

Draft · wants to merge 2 commits into main
Conversation

sneakers-the-rat (Collaborator)

Following #2018

i think it would be really cool for the compliance test results to be in the docs.

This is a ~ draft ~ intended for ~ further input ~ and is just supposed to be a starting point to see what we want!

Here's what I've done:

  • put the summary files from the compliance test output into a non-output directory to signal that they should be versioned (so the docs can see them without needing to run all the tests, but also without committing the entire output directory)
  • do some very crude data munging in conf.py to expose that data in a jinja context
  • add some jinja templating that shows a summary of all tests for each generator, plus an example (for the core compliance tests) of how we could have summary and expanded detail data for each test
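The conf.py munging step could look something like the following sketch: load a summary TSV, group rows by generator, and hand the result to Sphinx's jinja context. The file path, column names, and function name here are assumptions for illustration, not the PR's actual layout.

```python
# Hypothetical sketch of the conf.py data munging described above.
# The TSV path and the "generator" column name are assumptions.
import csv
from collections import defaultdict
from pathlib import Path


def load_compliance_summary(tsv_path: Path) -> dict:
    """Group summary rows by generator so templates can iterate per generator."""
    by_generator = defaultdict(list)
    with tsv_path.open(newline="") as f:
        for row in csv.DictReader(f, delimiter="\t"):
            by_generator[row["generator"]].append(row)
    return dict(by_generator)


# In conf.py, the grouped data could then be merged into the jinja context, e.g.:
# html_context = {"compliance": load_compliance_summary(Path("_data/summary.tsv"))}
```

A template could then loop over `compliance.items()` to render one summary table per generator.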

pretty ugly right now! but just a starting point!

So we end up with something like this for the summary:

[screenshot: per-generator summary view]

and something like this for the core compliance test, where for each schema within a test you get a table of results

[screenshot: per-schema results table]

TODO

  • How much do we want to clean up after testing vs. build into the test data collection system? e.g. the schema names are programmatically useful but not very readable. Do we want to store information like that at the time of data collection, or have some kind of cleaning operation afterwards?
  • What kind of information do we want per test and for an overall summary? Should we group by framework, or by compliance area?
  • How do we represent the relationship to the metamodel? The tests record some relationship to the metamodel in the generated coverage file; we could use the sphinx index directive to show all the tests related to each metamodel term.
  • How do we make it digestible? Replacing strings like "incomplete" and "untested" with icons is a start, but what else should we do?
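For the last item, one possible shape is a small helper that maps result strings to icons before rendering. The result vocabulary ("incomplete", "untested") comes from the TODO above; the other statuses and the icon choices are placeholders.

```python
# Hypothetical mapping from result strings to icons for the rendered tables.
# "passed"/"failed" and the specific icons are assumed, not from the PR.
RESULT_ICONS = {
    "passed": "✅",
    "failed": "❌",
    "incomplete": "🚧",
    "untested": "⬜",
}


def result_icon(result: str) -> str:
    """Return an icon for a known result; fall back to the raw string so
    unknown statuses stay visible rather than silently disappearing."""
    return RESULT_ICONS.get(result.lower(), result)
```

This could be registered as a jinja filter so templates can write `{{ row.result | result_icon }}`.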

@sneakers-the-rat added the documentation and devops labels on Mar 25, 2024

codecov bot commented Mar 25, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 80.69%. Comparing base (534a2db) to head (9e97210).

Additional details and impacted files
@@           Coverage Diff           @@
##             main    #2021   +/-   ##
=======================================
  Coverage   80.69%   80.69%           
=======================================
  Files         104      104           
  Lines       11622    11622           
  Branches     2910     2910           
=======================================
  Hits         9378     9378           
  Misses       1701     1701           
  Partials      543      543           


@cmungall (Member)

This is awesome, v much what I had in mind. Also multiple ways to slice and dice this - e.g. by generator/framework

I was wavering as to whether this should go into the main site or whether it might be better to have its own repo site

@sneakers-the-rat (Collaborator, Author)

> Also multiple ways to slice and dice this - e.g. by generator/framework

I'm still wrapping my head around the format of it, but we could make some helper functions to pull and summarize compliance tests for a given generator (similar to my crude attempt in conf.py). Then the compliance tests section could have them all ordered by test, and each generator could have its own table.
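The per-generator helper idea might be as simple as tallying result statuses over one generator's rows. The function name and the expected data shape (a list of dicts with a "result" key) are assumptions here.

```python
# Rough sketch of a per-generator summarizer for the compliance data.
# Assumes each row is a dict with a "result" key, e.g. {"result": "passed"}.
from collections import Counter


def summarize_generator(rows: list) -> Counter:
    """Count result statuses (passed/failed/untested/...) for one generator."""
    return Counter(row["result"] for row in rows)
```

A template could then show `summarize_generator(rows)["passed"]` out of `len(rows)` per generator.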

> I was wavering as to whether this should go into the main site or whether it might be better to have its own repo site

Up to you! It sorta makes sense to me to be in the main docs since the test results will already be in this repo, but it wouldn't be that hard to make a second site. I think we have plenty of room in the toctree for tests as their own section if we want it that way!

@cmungall (Member)

The main reason I was tending towards a separate repo is that I think we'd want to check in all of the generated artefacts, or at least embed them. And that is gonna be big, and get bigger...

@cmungall (Member)

cmungall commented May 1, 2024

I think as this is only committing a TSV it doesn't prevent us from going the route of a second repo later
