Commit 0179654

Merge pull request #117 from jfy133/dev

MultiQC and Documentation Improvements

2 parents: 056c552 + 23efba7

3 files changed: 45 additions & 23 deletions
conf/multiqc_config.yaml

Lines changed: 22 additions & 22 deletions
```diff
@@ -2,27 +2,27 @@ report_comment: >
     This report has been generated by the <a href="https://github.com/nf-core/eager" target="_blank">nf-core/eager</a>
     analysis pipeline. For information about how to interpret these results, please see the
     <a href="https://github.com/nf-core/eager" target="_blank">documentation</a>.
-report_section_order:
-  nf-core/eager-software-versions:
-      order: -1000
-  fastqc:
-      after: 'nf-core/eager-software-versions'
-      name: 'FastQC (raw)'
+top_modules:
+  - 'fastqc':
+      name: 'FastQC (pre-AdapterRemoval)'
       path_filters:
         - '*_fastqc.zip'
-  fastqc:
-      name: 'FastQC (trimmed)'
-      info: 'This section of the report shows FastQC results after adapter trimming.'
-      target: ''
-      path_filters:
-        - '*combined.prefixed_fastqc.zip'
-  adapterRemoval:
-      after: 'fastqc'
-  Samtools:
-      after: 'adapterRemoval'
-  dedup:
-      after: 'Samtools'
-  qualimap:
-      after: 'dedup'
-  preseq:
-      after: 'qualimap'
+      path_filters_exclude:
+        - '*.combined.prefixed_fastqc.zip'
+  - 'adapterRemoval'
+  - 'fastqc':
+      name: 'FastQC (post-AdapterRemoval)'
+      path_filters:
+        - '*.combined.prefixed_fastqc.zip'
+  - 'samtools'
+  - 'preseq'
+  - 'dedup'
+  - 'qualimap'
+  - 'damageprofiler'
+  - 'gatk'
+
+read_count_multiplier: 1
+read_count_prefix: ''
+read_count_desc: ''
+decimalPoint_format: '.'
+thousandsSep_format: ','
```
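As a sanity check on the new layout, the reorganised `top_modules` block can be parsed to confirm the module order MultiQC will render. This is an illustrative sketch, not part of the commit; it assumes PyYAML is available:

```python
import yaml  # PyYAML (assumed available); not part of this commit

# The reorganised section of conf/multiqc_config.yaml from the diff above.
config = yaml.safe_load("""
top_modules:
  - 'fastqc':
      name: 'FastQC (pre-AdapterRemoval)'
      path_filters:
        - '*_fastqc.zip'
      path_filters_exclude:
        - '*.combined.prefixed_fastqc.zip'
  - 'adapterRemoval'
  - 'fastqc':
      name: 'FastQC (post-AdapterRemoval)'
      path_filters:
        - '*.combined.prefixed_fastqc.zip'
  - 'samtools'
  - 'preseq'
  - 'dedup'
  - 'qualimap'
  - 'damageprofiler'
  - 'gatk'
""")

modules = config["top_modules"]
# Bare module names parse as plain strings; customised modules parse as
# one-key mappings whose value holds the per-module options.
names = [m if isinstance(m, str) else next(iter(m)) for m in modules]
print(names)
```

Note that `fastqc` appears twice on purpose: the `path_filters`/`path_filters_exclude` globs split the raw and post-AdapterRemoval FastQC outputs into two separate report sections.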

conf/shh.config

Lines changed: 2 additions & 0 deletions
```diff
@@ -7,6 +7,7 @@
 
 singularity {
     enabled = true
+    cacheDir = "/projects1/users/$USER/nextflow/nf_cache/singularity/"
 }
 
 /*
@@ -29,6 +30,7 @@ process {
     }
 }
 
+
 params {
     max_memory = 734.GB
     max_cpus = 64
```
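The same per-user caching pattern would apply to Conda environments on this cluster. A hypothetical fragment mirroring the Singularity setting above (the `conda` cache path is assumed, not part of this commit):

```
conda {
    cacheDir = "/projects1/users/$USER/nextflow/nf_cache/conda/"
}
```

Because `$USER` is resolved from the environment, each user gets their own cache directory under the shared project space.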

docs/configuration/adding_your_own.md

Lines changed: 21 additions & 1 deletion
````diff
@@ -27,7 +27,6 @@ process {
     clusterOptions = '-A myproject'
 }
 ```
-
 ## Software Requirements
 To run the pipeline, several software packages are required. How you satisfy these requirements is essentially up to you and depends on your system. If possible, we _highly_ recommend using either Docker or Singularity.
 Please see the [`installation documentation`](../installation.md) for how to run using the below as a one-off. These instructions are about configuring a config file for repeated use.
@@ -84,6 +83,27 @@ To use conda in your own config file, add the following:
 process.conda = "$baseDir/environment.yml"
 ```
 
+## Software Caches
+
+Each new version of a pipeline that is downloaded and run will pull down a new image (Docker/Singularity) or collection (Conda) of all the software required for the pipeline. By default this is placed in the `work/` directory of an EAGER run. When running many pipeline jobs, this can slow the pipeline down (a new environment has to be downloaded and created each time) and take up a lot of hard-disk space (each run keeps its own duplicate of the environment).
+
+You can specify a central location for this using the `cacheDir` parameter [(see the Nextflow documentation)](https://www.nextflow.io/docs/latest/config.html). This can be central for all users, e.g.
+
+```
+singularity {
+  enabled = true
+  cacheDir = '/<path>/<to>/<cache_dir>/'
+}
+```
+
+Or per-user, if you leave users free to choose which version they use:
+
+```
+conda {
+  cacheDir = "/<path>/<to>/$USER/<cache_dir>"
+}
+```
+
 ## Job Resources
 #### Automatic resubmission
 Each step in the pipeline has a default set of requirements for number of CPUs, memory and time. For most of the steps in the pipeline, if the job exits with an error code of `143` (exceeded requested resources) it will automatically resubmit with higher requests (2 x original, then 3 x original). If it still fails after three times then the pipeline is stopped.
````
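The resubmission strategy described in that last paragraph can be sketched as follows. This is a simplified illustration of the retry logic, not the pipeline's actual Nextflow `errorStrategy` implementation; the function and parameter names are hypothetical:

```python
def run_with_resubmission(run_job, base_memory_gb):
    """Retry a job with 2x then 3x the original resource request
    whenever it exits with code 143 (exceeded requested resources)."""
    for multiplier in (1, 2, 3):          # at most three attempts
        request = base_memory_gb * multiplier
        exit_code = run_job(memory_gb=request)
        if exit_code != 143:              # anything else: done (success or hard failure)
            return exit_code, request
    raise RuntimeError("job failed three times; pipeline stops")

# Simulated job that only succeeds once granted at least 10 GB.
def fake_job(memory_gb):
    return 0 if memory_gb >= 10 else 143

code, granted = run_with_resubmission(fake_job, base_memory_gb=4)
print(code, granted)  # succeeds on the third attempt with 3 x 4 = 12 GB
```

In the real pipeline the multiplier is applied per-process to CPUs, memory, and time, with the same three-strikes cut-off.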
