
Commit

Merge pull request #239 from nf-cmgg/dev
Release v1.9.0
nvnieuwk authored Jan 9, 2025
2 parents b637c64 + 70d78be commit 14fb272
Showing 261 changed files with 12,176 additions and 6,775 deletions.
12 changes: 6 additions & 6 deletions .github/CONTRIBUTING.md
@@ -16,7 +16,7 @@ If you'd like to write some code for nf-cmgg/germline, the standard workflow is
1. Check that there isn't already an issue about your idea in the [nf-cmgg/germline issues](https://github.com/nf-cmgg/germline/issues) to avoid duplicating work. If there isn't one already, please create one so that others know you're working on this
2. [Fork](https://help.github.com/en/github/getting-started-with-github/fork-a-repo) the [nf-cmgg/germline repository](https://github.com/nf-cmgg/germline) to your GitHub account
3. Make the necessary changes / additions within your forked repository following [Pipeline conventions](#pipeline-contribution-conventions)
4. Use `nf-core schema build` and add any new parameters to the pipeline JSON schema (requires [nf-core tools](https://github.com/nf-core/tools) >= 1.10).
4. Use `nf-core pipelines schema build` and add any new parameters to the pipeline JSON schema (requires [nf-core tools](https://github.com/nf-core/tools) >= 1.10).
5. Submit a Pull Request against the `dev` branch and wait for the code to be reviewed and merged
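
For step 4, a rough sketch of the commands involved (assuming nf-core tools >= 1.10 is installed in your environment, e.g. via pip) could be:

```bash
# Install or update nf-core tools (illustrative; any install method works)
pip install --upgrade nf-core

# Interactively add any new parameters from nextflow.config to nextflow_schema.json
nf-core pipelines schema build
```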

If you're not used to this workflow with git, you can start with some [docs from GitHub](https://help.github.com/en/github/collaborating-with-issues-and-pull-requests) or even their [excellent `git` resources](https://try.github.io/).
@@ -37,7 +37,7 @@ There are typically two types of tests that run:
### Lint tests

`nf-core` has a [set of guidelines](https://nf-co.re/developers/guidelines) which all pipelines must adhere to.
To enforce these and ensure that all pipelines stay in sync, we have developed a helper tool which runs checks on the pipeline code. This is in the [nf-core/tools repository](https://github.com/nf-core/tools) and once installed can be run locally with the `nf-core lint <pipeline-directory>` command.
To enforce these and ensure that all pipelines stay in sync, we have developed a helper tool which runs checks on the pipeline code. This is in the [nf-core/tools repository](https://github.com/nf-core/tools) and once installed can be run locally with the `nf-core pipelines lint <pipeline-directory>` command.

If any failures or warnings are encountered, please follow the listed URL for more documentation.
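
As a concrete (illustrative) example, the lint checks can be run locally from the pipeline root like so:

```bash
# Requires nf-core/tools to be installed; "." is the pipeline directory
nf-core pipelines lint .
```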

@@ -68,7 +68,7 @@ If you wish to contribute a new step, please use the following coding standards:
2. Write the process block (see below).
3. Define the output channel if needed (see below).
4. Add any new parameters to `nextflow.config` with a default (see below).
5. Add any new parameters to `nextflow_schema.json` with help text (via the `nf-core schema build` tool).
5. Add any new parameters to `nextflow_schema.json` with help text (via the `nf-core pipelines schema build` tool).
6. Add sanity checks and validation for all relevant parameters.
7. Perform local tests to validate that the new code works as expected.
8. If applicable, add a new test command in `.github/workflow/ci.yml`.
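
As a loose illustration of step 2, a skeletal process block might look like the sketch below; the module name, tool and flags are hypothetical:

```nextflow
// Hypothetical module, e.g. modules/local/mytool/main.nf
process MYTOOL {
    tag "$meta.id"
    label 'process_single'

    input:
    tuple val(meta), path(infile)

    output:
    tuple val(meta), path("*.txt"), emit: txt

    script:
    """
    mytool --threads ${task.cpus} ${infile} > ${meta.id}.txt
    """
}
```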
@@ -79,11 +79,11 @@ If you wish to contribute a new step, please use the following coding standards:

Parameters should be initialised / defined with default values in `nextflow.config` under the `params` scope.

Once there, use `nf-core schema build` to add to `nextflow_schema.json`.
Once there, use `nf-core pipelines schema build` to add to `nextflow_schema.json`.
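
For example (parameter name and value are purely illustrative), a default could be declared like this before running the schema build:

```nextflow
// nextflow.config
params {
    // Hypothetical new option with a sensible default
    min_mapping_quality = 20
}
```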

### Default processes resource requirements

Sensible defaults for process resource requirements (CPUs / memory / time) for a process should be defined in `conf/base.config`. These should generally be specified generically with `withLabel:` selectors so they can be shared across multiple processes/steps of the pipeline. An nf-core standard set of labels that should be followed where possible can be seen in the [nf-core pipeline template](https://github.com/nf-core/tools/blob/master/nf_core/pipeline-template/conf/base.config), which has the default process as a single-core process, and then different levels of multi-core configurations for increasingly large memory requirements defined with standardised labels.
Sensible defaults for process resource requirements (CPUs / memory / time) for a process should be defined in `conf/base.config`. These should generally be specified generically with `withLabel:` selectors so they can be shared across multiple processes/steps of the pipeline. An nf-core standard set of labels that should be followed where possible can be seen in the [nf-core pipeline template](https://github.com/nf-core/tools/blob/main/nf_core/pipeline-template/conf/base.config), which has the default process as a single-core process, and then different levels of multi-core configurations for increasingly large memory requirements defined with standardised labels.

The process resources can be passed on to the tool dynamically within the process with the `${task.cpus}` and `${task.memory}` variables in the `script:` block.
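
A minimal sketch of such a label in `conf/base.config`, with values purely illustrative, might be:

```nextflow
// conf/base.config
process {
    withLabel: process_medium {
        cpus   = 6
        memory = 36.GB
        time   = 8.h
    }
}
```

A process carrying `label 'process_medium'` then receives these values and can forward them to the tool via `${task.cpus}` and `${task.memory}` as described above.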

@@ -96,7 +96,7 @@ Please use the following naming schemes, to make it easy to understand what is g

### Nextflow version bumping

If you are using a new feature from core Nextflow, you may bump the minimum required version of nextflow in the pipeline with: `nf-core bump-version --nextflow . [min-nf-version]`
If you are using a new feature from core Nextflow, you may bump the minimum required version of nextflow in the pipeline with: `nf-core pipelines bump-version --nextflow . [min-nf-version]`
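
For instance (the version number below is only an example):

```bash
# Bump the minimum required Nextflow version declared by the pipeline
nf-core pipelines bump-version --nextflow . 24.04.0
```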

### Images and figures

32 changes: 10 additions & 22 deletions .github/ISSUE_TEMPLATE/bug_report.yml
@@ -9,46 +9,34 @@ body:
description: A clear and concise description of what the bug is.
validations:
required: true

- type: textarea
id: command_used
attributes:
label: Command used and terminal output
description: Steps to reproduce the behaviour. Please paste the command you used
to launch the pipeline and the output from your terminal.
description: Steps to reproduce the behaviour. Please paste the command you used to launch the pipeline and the output from your terminal.
render: console
placeholder: "$ nextflow run ...
placeholder: |
$ nextflow run ...
Some output where something broke
"
- type: textarea
id: files
attributes:
label: Relevant files
description: "Please drag and drop the relevant files here. Create a `.zip` archive
if the extension is not allowed.
Your verbose log file `.nextflow.log` is often useful _(this is a hidden file
in the directory where you launched the pipeline)_ as well as custom Nextflow
configuration files.
description: |
Please drag and drop the relevant files here. Create a `.zip` archive if the extension is not allowed.
Your verbose log file `.nextflow.log` is often useful _(this is a hidden file in the directory where you launched the pipeline)_ as well as custom Nextflow configuration files.
"
- type: textarea
id: system
attributes:
label: System information
description: "* Nextflow version _(eg. 23.04.0)_
description: |
* Nextflow version _(eg. 23.04.0)_
* Hardware _(eg. HPC, Desktop, Cloud)_
* Executor _(eg. slurm, local, awsbatch)_
* Container engine: _(e.g. Docker, Singularity, Conda, Podman, Shifter, Charliecloud,
or Apptainer)_
* Container engine: _(e.g. Docker, Singularity, Conda, Podman, Shifter, Charliecloud, or Apptainer)_
* OS _(eg. CentOS Linux, macOS, Linux Mint)_
* Version of nf-cmgg/germline _(eg. 1.1, 1.5, 1.8.2)_
"
2 changes: 1 addition & 1 deletion .github/PULL_REQUEST_TEMPLATE.md
@@ -16,7 +16,7 @@ Learn more about contributing: [CONTRIBUTING.md](https://github.com/nf-cmgg/germ
- [ ] This comment contains a description of changes (with reason).
- [ ] If you've fixed a bug or added code that should be tested, add tests!
- [ ] If you've added a new tool - have you followed the pipeline conventions in the [contribution docs](https://github.com/nf-cmgg/germline/tree/main/.github/CONTRIBUTING.md)
- [ ] Make sure your code lints (`nf-core lint`).
- [ ] Make sure your code lints (`nf-core pipelines lint`).
- [ ] Ensure the test suite passes (`nextflow run . -profile test,docker --outdir <OUTDIR>`).
- [ ] Check for unexpected warnings in debug mode (`nextflow run . -profile debug,test,docker --outdir <OUTDIR>`).
- [ ] Usage Documentation in `docs/usage.md` is updated.
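
Taken together, a local pre-flight check along the lines of the checklist above might look like this (the output directory is illustrative):

```bash
# Lint the pipeline code
nf-core pipelines lint

# Run the test profile, then repeat in debug mode to surface unexpected warnings
nextflow run . -profile test,docker --outdir results
nextflow run . -profile debug,test,docker --outdir results
```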
29 changes: 22 additions & 7 deletions .github/workflows/build-docs.yml
@@ -13,29 +13,44 @@ jobs:
- uses: actions/checkout@v3
with:
fetch-depth: 0 # fetch all commits/branches

- uses: actions/setup-python@v4
with:
python-version: 3.x
- run: echo "cache_id=$(date --utc '+%V')" >> $GITHUB_ENV
- name: Obtain version from nextflow config

- name: Fetch current date
id: date
run: echo "cache_id=$(date --utc '+%V')" >> $GITHUB_OUTPUT

- name: Read pipeline version from .nf-core.yml
uses: nichmor/[email protected]
id: read_yml
with:
config: ${{ github.workspace }}/.nf-core.yml

- name: Parse version
id: version
run: |
version=$(grep "version" nextflow.config | tail -1 | sed -e s'/[^=]*= //' | cut -d "'" -f 2)
[[ $version == *"dev"* ]] && pipeline_version="dev" || pipeline_version=$version
echo "pipeline_version=$pipeline_version" >> $GITHUB_ENV
[[ ${{ steps.read_yml.outputs['template.version'] }} == *"dev"* ]] && pipeline_version="dev" || pipeline_version=${{ steps.read_yml.outputs['template.version'] }}
echo "version=$pipeline_version" >> $GITHUB_OUTPUT
- name: Setup git user
run: |
git config --global user.name "${{github.actor}}"
git config --global user.email "${{github.actor}}@users.noreply.github.com"
- uses: actions/cache@v3
with:
key: mkdocs-material-${{ env.cache_id }}
key: mkdocs-material-${{ steps.date.outputs.cache_id }}
path: .cache
restore-keys: |
mkdocs-material-
- name: Install dependencies
run: pip install mkdocs-material pymdown-extensions pillow cairosvg mike

- name: Build docs
run: |
[[ ${{ env.pipeline_version }} == "dev" ]] && mike deploy --push ${{ env.pipeline_version }} || mike deploy --push --update-aliases ${{ env.pipeline_version }} latest
[[ ${{ steps.version.outputs.version }} == "dev" ]] && mike deploy --push ${{ steps.version.outputs.version }} || mike deploy --push --update-aliases ${{ steps.version.outputs.version }} latest
- name: Set default docs
run: mike set-default --push latest
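
For reference, the mike calls driven by this workflow reduce to roughly the following (version strings are illustrative; mike manages the versioned MkDocs site on the gh-pages branch):

```bash
# Development builds are deployed under the "dev" version
mike deploy --push dev

# Release builds are deployed under their version and aliased as "latest"
mike deploy --push --update-aliases 1.9.0 latest

# "latest" is served by default
mike set-default --push latest
```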
55 changes: 31 additions & 24 deletions .github/workflows/ci.yml
@@ -1,46 +1,36 @@
name: nf-core CI
# This workflow runs the pipeline with the minimal test dataset to check that it completes without any syntax errors
on:
push:
branches:
- dev
pull_request:
release:
types: [published]
workflow_dispatch:

env:
NXF_ANSI_LOG: false
NFT_MAX_SHARDS: 5
SOURCE_BRANCH: ${{ github.base_ref }}

concurrency:
group: "${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}"
cancel-in-progress: true

jobs:
test_all:
name: Run nf-test with ${{ matrix.test }}-${{ matrix.NXF_VER }}
name: Run ${{ matrix.filter }} tests | shard ${{ matrix.shard }} (${{ matrix.NXF_VER }})
# Only run on push if this is the nf-core dev branch (merged PRs)
if: "${{ github.event_name != 'push' || (github.event_name == 'push' && github.repository == 'nf-cmgg/germline') }}"
runs-on: ubuntu-latest
strategy:
matrix:
NXF_VER:
- "24.04.0"
- "24.10.0"
- "latest-everything"
test:
- "pipeline_default"
- "pipeline_callers"
- "pipeline_variations"
- "pipeline_variations2"
- "pipeline_gvcfs"
- "cram_call_genotype_gatk4"
- "cram_call_vardictjava"
- "cram_prepare_samtools_bedtools"
- "input_split_bedtools"
- "vcf_annotation"
- "vcf_extract_relate_somalier"
- "vcf_ped_rtgtools"
- "vcf_upd_updio"
- "vcf_validate_small_variants"
filter:
- "process"
- "workflow"
- "pipeline"
shard: [1, 2, 3, 4, 5]
steps:
- name: Free some space
run: |
@@ -51,22 +41,39 @@ jobs:
- name: Check out pipeline code
uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b # v4
with:
fetch-depth: 0

- name: Install Nextflow
uses: nf-core/setup-nextflow@v2
with:
version: "${{ matrix.NXF_VER }}"

- name: Disk space cleanup
uses: jlumbroso/free-disk-space@54081f138730dfa15788a46383842cd2f914a1be # v1.3.1
# - name: Disk space cleanup
# uses: jlumbroso/free-disk-space@54081f138730dfa15788a46383842cd2f914a1be # v1.3.1

- name: Install nf-test
run: |
conda install -c bioconda nf-test
- name: Run pipeline with test data
- name: "Run ${{ matrix.filter }} tests (changed) | ${{ matrix.shard }}/${{ env.NFT_MAX_SHARDS }}"
if: ${{ env.SOURCE_BRANCH != 'main' }}
run: |
$CONDA/bin/nf-test test \
--ci \
--changed-since HEAD^ \
--shard ${{ matrix.shard }}/${{ env.NFT_MAX_SHARDS }} \
--filter ${{ matrix.filter }} \
--junitxml=default.xml
- name: "Run ${{ matrix.filter }} tests (all) | ${{ matrix.shard }}/${{ env.NFT_MAX_SHARDS }}"
if: ${{ env.SOURCE_BRANCH == 'main' }}
run: |
$CONDA/bin/nf-test test --tag ${{ matrix.test }} --junitxml=default.xml
$CONDA/bin/nf-test test \
--ci \
--shard ${{ matrix.shard }}/${{ env.NFT_MAX_SHARDS }} \
--filter ${{ matrix.filter }} \
--junitxml=default.xml
- name: Publish Test Report
uses: mikepenz/action-junit-report@v3
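
To reproduce a single matrix entry of this job locally, an equivalent nf-test call would look roughly like the following (shard and filter values are illustrative, and nf-test is assumed to be installed, e.g. via conda as above):

```bash
nf-test test \
  --ci \
  --changed-since HEAD^ \
  --shard 1/5 \
  --filter process \
  --junitxml=default.xml
```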
53 changes: 43 additions & 10 deletions .github/workflows/download_pipeline.yml
@@ -1,4 +1,4 @@
name: Test successful pipeline download with 'nf-core download'
name: Test successful pipeline download with 'nf-core pipelines download'

# Run the workflow when:
# - dispatched manually
@@ -8,7 +8,7 @@ on:
workflow_dispatch:
inputs:
testbranch:
description: "The specific branch you wish to utilize for the test execution of nf-core download."
description: "The specific branch you wish to utilize for the test execution of nf-core pipelines download."
required: true
default: "dev"
pull_request:
@@ -39,9 +39,11 @@ jobs:
with:
python-version: "3.12"
architecture: "x64"
- uses: eWaterCycle/setup-singularity@931d4e31109e875b13309ae1d07c70ca8fbc8537 # v7

- name: Setup Apptainer
uses: eWaterCycle/setup-apptainer@4bb22c52d4f63406c49e94c804632975787312b3 # v2.0.0
with:
singularity-version: 3.8.3
apptainer-version: 1.3.4

- name: Install dependencies
run: |
@@ -54,33 +56,64 @@ jobs:
echo "REPOTITLE_LOWERCASE=$(basename ${GITHUB_REPOSITORY,,})" >> ${GITHUB_ENV}
echo "REPO_BRANCH=${{ github.event.inputs.testbranch || 'dev' }}" >> ${GITHUB_ENV}
- name: Make a cache directory for the container images
run: |
mkdir -p ./singularity_container_images
- name: Download the pipeline
env:
NXF_SINGULARITY_CACHEDIR: ./
NXF_SINGULARITY_CACHEDIR: ./singularity_container_images
run: |
nf-core download ${{ env.REPO_LOWERCASE }} \
nf-core pipelines download ${{ env.REPO_LOWERCASE }} \
--revision ${{ env.REPO_BRANCH }} \
--outdir ./${{ env.REPOTITLE_LOWERCASE }} \
--compress "none" \
--container-system 'singularity' \
--container-library "quay.io" -l "docker.io" -l "ghcr.io" \
--container-library "quay.io" -l "docker.io" -l "community.wave.seqera.io" \
--container-cache-utilisation 'amend' \
--download-configuration
--download-configuration 'yes'
- name: Inspect download
run: tree ./${{ env.REPOTITLE_LOWERCASE }}

- name: Count the downloaded number of container images
id: count_initial
run: |
image_count=$(ls -1 ./singularity_container_images | wc -l | xargs)
echo "Initial container image count: $image_count"
echo "IMAGE_COUNT_INITIAL=$image_count" >> ${GITHUB_ENV}
- name: Run the downloaded pipeline (stub)
id: stub_run_pipeline
continue-on-error: true
env:
NXF_SINGULARITY_CACHEDIR: ./
NXF_SINGULARITY_CACHEDIR: ./singularity_container_images
NXF_SINGULARITY_HOME_MOUNT: true
run: nextflow run ./${{ env.REPOTITLE_LOWERCASE }}/$( sed 's/\W/_/g' <<< ${{ env.REPO_BRANCH }}) -stub -profile test,singularity --outdir ./results
- name: Run the downloaded pipeline (stub run not supported)
id: run_pipeline
if: ${{ job.steps.stub_run_pipeline.status == failure() }}
env:
NXF_SINGULARITY_CACHEDIR: ./
NXF_SINGULARITY_CACHEDIR: ./singularity_container_images
NXF_SINGULARITY_HOME_MOUNT: true
run: nextflow run ./${{ env.REPOTITLE_LOWERCASE }}/$( sed 's/\W/_/g' <<< ${{ env.REPO_BRANCH }}) -profile test,singularity --outdir ./results

- name: Count the downloaded number of container images
id: count_afterwards
run: |
image_count=$(ls -1 ./singularity_container_images | wc -l | xargs)
echo "Post-pipeline run container image count: $image_count"
echo "IMAGE_COUNT_AFTER=$image_count" >> ${GITHUB_ENV}
- name: Compare container image counts
run: |
if [ "${{ env.IMAGE_COUNT_INITIAL }}" -ne "${{ env.IMAGE_COUNT_AFTER }}" ]; then
initial_count=${{ env.IMAGE_COUNT_INITIAL }}
final_count=${{ env.IMAGE_COUNT_AFTER }}
difference=$((final_count - initial_count))
echo "$difference additional container images were downloaded at runtime. The pipeline has no support for offline runs!"
tree ./singularity_container_images
exit 1
else
echo "The pipeline can be downloaded successfully!"
fi
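
Outside CI, the download being exercised here boils down to something like the following (revision and output directory are illustrative):

```bash
export NXF_SINGULARITY_CACHEDIR=./singularity_container_images

nf-core pipelines download nf-cmgg/germline \
  --revision dev \
  --outdir ./germline \
  --compress "none" \
  --container-system 'singularity' \
  --container-cache-utilisation 'amend' \
  --download-configuration 'yes'
```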
