Hello,
When running a job with "gcloud alpha genomics pipelines run", my output consists of a couple of different directories:
/mnt/data/output/A
/mnt/data/output/B
Is there any way to copy the directories A and B to my GCS bucket without naming every file?
It fails because the pipeline tries something like: gsutil cp /mnt/data/output/* gs://my_bucket
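(Side note: a plain gsutil cp skips directories unless the recursive flag is given, so copying A and B by hand would need something roughly like the command below. The destination path is a placeholder, and this assumes gsutil is available wherever the copy runs.)

```sh
# -m parallelizes the transfer, -r recurses into the A and B directories.
gsutil -m cp -r /mnt/data/output/* gs://my_bucket/output/
```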
Similar to the samtools example yaml, I have:

outputParameters:
- name: outputPath
  description: Cloud Storage path for where bamtofastq writes
  localCopy:
    path: output/*
    disk: datadisk
And:

gcloud alpha genomics pipelines run \
  --pipeline-file my.yaml \
  --inputs bamfiles.bam \
  --outputs outputPath=gs://cgc_bam_bucket_007/output/
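One workaround I can sketch (the parameter names outputPathA and outputPathB are made up, and this assumes the localCopy glob matches the files sitting directly under each directory) is to declare one output parameter per directory and pass both destinations on the command line:

```yaml
outputParameters:
- name: outputPathA
  description: Cloud Storage destination for files under output/A
  localCopy:
    path: output/A/*
    disk: datadisk
- name: outputPathB
  description: Cloud Storage destination for files under output/B
  localCopy:
    path: output/B/*
    disk: datadisk
```

```sh
gcloud alpha genomics pipelines run \
  --pipeline-file my.yaml \
  --inputs bamfiles.bam \
  --outputs outputPathA=gs://cgc_bam_bucket_007/output/A/,outputPathB=gs://cgc_bam_bucket_007/output/B/
```

That avoids naming individual files, but it only helps if A and B are flat, since the single * does not descend into nested subdirectories.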
I was thinking that in the docker cmd: > block the output directory could be tarred up, so that the output is just a single tarball, but that's not a great solution.
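For what it's worth, a rough sketch of that tarball idea against the existing outputPath parameter looks like this; the image name and the bamtofastq invocation are placeholders:

```yaml
docker:
  imageName: my/bamtofastq-image   # placeholder image
  cmd: >
    <your existing bamtofastq command writing into /mnt/data/output> &&
    tar -czf /mnt/data/output/output.tar.gz -C /mnt/data/output A B &&
    rm -rf /mnt/data/output/A /mnt/data/output/B
```

After the tar step only output.tar.gz is left for the output/* glob to copy, so the delocalization succeeds, at the cost of having to unpack the archive afterwards.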
Please help?
I would recommend that you use dsub. I think it will provide a better experience than the gcloud command line, including support for wildcards and recursive inputs and outputs.
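For reference, a dsub invocation with a recursive output might look roughly like the sketch below; the project, zones, bucket, image, and the BAM/OUTPUT_DIR parameter names are all placeholders, and the exact flags depend on the dsub version and provider:

```sh
dsub \
  --provider google-v2 \
  --project my-project \
  --zones "us-central1-*" \
  --logging gs://my_bucket/logging/ \
  --image my/bamtofastq-image \
  --input BAM=gs://my_bucket/input/bamfiles.bam \
  --output-recursive OUTPUT_DIR=gs://cgc_bam_bucket_007/output/ \
  --command '<your bamtofastq command, writing the A and B directories into ${OUTPUT_DIR}>'
```

With --output-recursive, everything the command writes under ${OUTPUT_DIR} is copied to the gs:// destination when the task finishes, so neither the directory contents nor individual file names have to be listed.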