Description
- move repo to the `PEtab-dev` org
- rename repo to `benchmark-problems` (i.e. `PEtab-dev/benchmark-problems`)
- rename the `Benchmark-Models` directory to `problems` (i.e. `PEtab-dev/benchmark-problems/problems`)
- add metadata to each problem in its PEtab YAML file (closes Automatic generation of general information file #25)
  - create a schema for such metadata
- include the objective function value evaluated at the nominal parameters (possibly closes Be careful with comparison of objective function values #17, unless that is about curation itself); optionally also include the chi² value
- include the DOI of the original publication
- copy all relevant metadata from the model to the PEtab YAML, including e.g. the model ID (closes Change `<model>` `metaid` in all models #155)
- create a script to automatically extract interesting problem "features", then produce a "problem" vs. "feature" matrix TSV file for convenience (closes Create table of benchmark models and their model/data peculiarities #18, Write function to generate benchmark_model_characteristics.xlsx automatically #19). Example features (maybe difficult to extract): "splines", "log-scale observables", "parameter-dependent initial conditions". Include all features currently in the README overview table.
- add a check for correctly-formatted simulation tables (closes Update simulatedData files #20)
- update all PEtab files (e.g. PEtab versions) (closes Add PEtab version info to files #1)
- update contribution guide (closes Add contribution guide #32)
- auto-update CITATION.cff to match Zenodo on new "releases" (closes Add CITATION.cff #164)
- consolidate scripts and Python package (Machine readable model property overview #169)
- require (?) PEtab Results with each problem, to ensure "exact"/"easy" reproducibility
- add a "Support" column to the overview table, linking to each model's respective issue, where users can request model-specific support
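A rough sketch of what the per-problem metadata block and its schema check could look like; all field names and values here are illustrative assumptions, not a finalized schema:

```python
# Hypothetical metadata block for a problem's PEtab YAML file.
# Field names and values are proposed/illustrative, not part of the PEtab standard.
REQUIRED_KEYS = {"problem_id", "doi", "n_parameters", "llh_nominal"}

example_metadata = {
    "problem_id": "SomeModel_SomeJournal2020",  # illustrative placeholder
    "doi": "10.0000/placeholder",               # DOI of the original publication
    "n_parameters": 9,
    "llh_nominal": -138.0,  # objective function value at nominal parameters
    "chi2_nominal": 47.0,   # optional chi² value
}

def validate_metadata(metadata: dict) -> list:
    """Return the sorted list of required keys missing from a metadata block."""
    return sorted(REQUIRED_KEYS - metadata.keys())
```

A real schema would likely live as a JSON Schema / YAML file checked in CI rather than a Python set.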
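For the nominal objective value, a stripped-down sketch of the underlying arithmetic (assuming an additive Gaussian noise model; a real implementation would use the PEtab library on the actual measurement/simulation tables):

```python
import math

def chi2(measurements, simulations, sigmas):
    """Sum of squared, noise-normalized residuals (Gaussian noise model)."""
    return sum((m - s) ** 2 / sig ** 2
               for m, s, sig in zip(measurements, simulations, sigmas))

def neg_log_likelihood(measurements, simulations, sigmas):
    """Gaussian negative log-likelihood: 0.5 * chi2 plus normalization terms."""
    n = len(measurements)
    return 0.5 * (chi2(measurements, simulations, sigmas)
                  + n * math.log(2 * math.pi)) \
        + sum(math.log(sig) for sig in sigmas)
```

Storing both values makes #17-style comparisons explicit: two tools can agree on chi² while reporting different likelihoods due to the constant terms.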
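The feature-extraction script could emit the problem-vs-feature matrix roughly like this; the feature detectors below are hypothetical stand-ins for code that would actually inspect the SBML/PEtab files:

```python
import csv
import io

# Hypothetical feature detectors keyed by feature name; real detectors
# would parse each problem's SBML model and PEtab tables.
FEATURES = {
    "splines": lambda p: p.get("has_splines", False),
    "log-scale observables": lambda p: "log" in p.get("observable_scales", ()),
    "parameter-dependent initial conditions": lambda p: p.get("param_init", False),
}

def feature_matrix_tsv(problems: dict) -> str:
    """Build a problem-vs-feature matrix as TSV text (1 = feature present)."""
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="\t", lineterminator="\n")
    writer.writerow(["problem"] + list(FEATURES))
    for name, props in sorted(problems.items()):
        writer.writerow([name] + [int(f(props)) for f in FEATURES.values()])
    return buf.getvalue()
```

Example: `feature_matrix_tsv({"ModelA": {"has_splines": True}})` yields a header row plus one `ModelA` row with 0/1 flags.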
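The simulation-table check could be as simple as comparing column sets, assuming the convention that a simulation table mirrors the measurement table with the `measurement` column replaced by `simulation`:

```python
def check_simulation_table(sim_columns, meas_columns):
    """Compare a simulation table's columns against the expected layout:
    the measurement table's columns with `measurement` -> `simulation`.
    Returns (missing, extra) column lists; both empty means well-formatted."""
    expected = ["simulation" if c == "measurement" else c for c in meas_columns]
    missing = [c for c in expected if c not in sim_columns]
    extra = [c for c in sim_columns if c not in expected]
    return missing, extra
```

This only checks the header; a full check would also validate the values (e.g. numeric simulations, matching condition/observable IDs).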
m-philipps