I tried to run a workflow on a cluster using a shared conda installation.
I have created all conda envs required by the workflow.
Full command:

    cwltool --disable-js-validation --skip-schemas --no-container --beta-conda-dependencies --beta-dependencies-directory /path/to/folder/with/_conda/ --rm-tmpdir --tmpdir-prefix=/eTMP/1559223537/ /download_quality_control_single.cwl sra_download_SRR10043607.yml

Running multiple jobs at the same time produces this error:

    INFO [workflow ] start
    INFO [workflow ] starting step fastq_dump
    INFO [step fastq_dump] start
    INFO [job fastq_dump] /export/TMP/1559223537/6jb6l69q$ fastq-dump \
        --gzip \
        --split-files \
        SRR10043607 \
        --defline-seq \
        '@$ac.$si/$ri' \
        --skip-technical
    ERROR Exception while running job: Failed to get file lock for /path/to/folder/with/conda.
    WARNING [job fastq_dump] completed permanentFail
    ERROR [step fastq_dump] Output is missing expected field file:////download_quality_control_single.cwl#download_quality_control/fastq_dump/output
    WARNING [step fastq_dump] completed permanentFail
    INFO [workflow ] completed permanentFail
    {
        "fastq": null,
        "fastqc_html": null,
        "fastqc_zip": null
    }
    WARNING Final process status is permanentFail

How can I make cwltool use conda without locking the conda directory?
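One workaround I am considering (an untested sketch, not something confirmed by the cwltool docs): give each concurrent job its own `--beta-dependencies-directory`, so the file lock each cwltool process takes is on a private directory instead of the shared one. The accession list, per-job paths, and tmpdir prefixes below are placeholders; only the flags already used in the command above are assumed to exist.

```sh
# Sketch: run each job with a private dependencies directory to avoid
# contending for the same file lock. All paths and accessions besides
# SRR10043607 are hypothetical placeholders.
for ACC in SRR10043607 SRR10043608 SRR10043609; do
    DEPS_DIR="/scratch/conda_deps/${ACC}"   # per-job dependencies dir (assumption)
    mkdir -p "${DEPS_DIR}"
    cwltool --disable-js-validation --skip-schemas --no-container \
        --beta-conda-dependencies \
        --beta-dependencies-directory "${DEPS_DIR}" \
        --rm-tmpdir --tmpdir-prefix="/eTMP/${ACC}/" \
        /download_quality_control_single.cwl "sra_download_${ACC}.yml" &
done
wait
```

The obvious downside is that each job would (re)create its own conda environments, trading disk space and setup time for lock contention, which is why I would prefer a way to point every job at the shared installation without the lock.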