[DRAFT] Merge master to ah var store again [VS-1178] #8890

Draft: wants to merge 66 commits into base branch ah_var_store
Changes from all commits (66 commits)
d40a485
Funcotator Update for Datasource Release V1.8 (#8512)
jamesemery Oct 11, 2023
423d106
Fixed Funcotator VCF output renderer to correctly preserve B37 contig…
jamesemery Oct 11, 2023
2900e01
Fix for events not in minimal representation (#8567)
meganshand Nov 3, 2023
683eaa8
Ultima.flow annotations.fix (#8442)
dror27 Nov 13, 2023
e6e4dea
Removes unnecessary and buggy validation check (#8580)
ilyasoifer Nov 13, 2023
1dc7ee4
Update our HTSJDK dependency to 4.0.2 (#8584)
droazen Nov 14, 2023
7a08754
Update picard to 3.1.1 (#8585)
lbergelson Nov 15, 2023
fa3dfed
Add option to AnalyzeSaturationMutagenesis to keep disjoint mates (#8…
odcambc Nov 26, 2023
0da6409
New/Updated Flow Based Read tools (#8579)
dror27 Nov 28, 2023
e37b344
GroundTruthScorer doc update (#8597)
dror27 Dec 6, 2023
bf24519
Add a native GATK implementation for 2bit references, and remove the …
droazen Dec 8, 2023
e2c5fab
Update dependencies to address security vulnerabilities, and add a se…
droazen Dec 8, 2023
3b8b5bf
Update http-nio and wire its new settings (#8611)
lbergelson Dec 9, 2023
5839cbd
PrintFileDiagnostics for cram, crai and bai. (#8577)
cmnbroad Dec 9, 2023
2ad4a3e
Allow GenomicsDBImport to connect to az:// files without interference…
lbergelson Dec 9, 2023
e29cbc3
Disable line-by-line codecov comments (#8613)
droazen Dec 11, 2023
0b18579
Support for custom ploidy regions in HaplotypeCaller (#8609)
lbergelson Dec 11, 2023
85d13d4
Update the GATK base image to a newer LTS ubuntu release (#8610)
droazen Dec 12, 2023
75f5104
build_docker_remote: add ability to specify the RELEASE arg to the cl…
droazen Dec 13, 2023
23c8071
Update to htsjdk 4.1.0 (#8620)
lbergelson Dec 13, 2023
70ee553
Fix the Spark version in the GATK jar manifest, and used the right co…
droazen Dec 13, 2023
8317d8b
Update http-nio to 1.1.0 which implements Path.resolve() methods (#8626)
lbergelson Dec 13, 2023
fd873e9
Fix GT header in PostprocessGermlineCNVCalls's --output-genotyped-int…
jmarshall Dec 14, 2023
39cfbba
Output the new image name at the end of a successful cloud docker bui…
droazen Dec 14, 2023
b68fadc
Reduce SVConcordance memory footprint (#8623)
mwalker174 Dec 14, 2023
e796d20
Rewrite complex SV functional annotation in SVAnnotate (#8516)
epiercehoffman Jan 23, 2024
2d50cf8
Improvements to Mutect2's Permutect training data mode (#8663)
davidbenjamin Jan 26, 2024
dd73036
normal artifact lod is now defined without the extra minus sign (#8668)
davidbenjamin Jan 30, 2024
bbc028b
Parameterize the logging frequency for ProgressLogger in GatherVcfsCl…
gbggrant Feb 7, 2024
cfd4d87
Handle CTX_INV subtype in SVAnnotate (#8693)
epiercehoffman Feb 15, 2024
c97faf6
Move to GenomicsDB 1.5.2 which supports M1 macs (#8710)
nalinigans Mar 5, 2024
a2ebb37
Standardize test results directory between normal/docker tests (#8718)
lbergelson Mar 7, 2024
b0463e4
fix no data hom refs (#8715)
ldgauthier Mar 7, 2024
c3599a0
Update the setup_cloud github action (#8651)
lbergelson Mar 8, 2024
9af1be3
Improve failure message in VariantContextTestUtils (#8725)
lbergelson Mar 11, 2024
b81a638
added --inverted-read-filter argument to allow for selecting reads th…
jamesemery Mar 11, 2024
2640404
GatherVcfsCloud is no longer beta (#8680)
lbergelson Mar 11, 2024
8ee86e7
Fix to long deletions that overhang into the assembly window causing …
jamesemery Mar 12, 2024
141529b
dragstr model in Mutect2 WDL (#8716)
davidbenjamin Mar 18, 2024
dcfaa06
Update README to include list of popular software included in docker …
rickymagner Mar 19, 2024
47a97ae
Now we should be excluding the test folder from code coverage calcula…
KevinCLydon Mar 20, 2024
105b63e
Make M2 haplotype and clustered events filters smarter about germline…
davidbenjamin Mar 25, 2024
724b5bc
Funcotator: suppress a log message about b37 contigs when not doing b…
droazen Apr 1, 2024
6739e6d
SNVQ recalibration tool added for flow based reads (#8697)
ilyasoifer Apr 4, 2024
7cdc985
Several GQ0 cleanup changes: (#8741)
ldgauthier Apr 10, 2024
47c4858
Re-commit large files as lfs stubs (#8769)
lbergelson Apr 11, 2024
986cb15
Enable ReblockGVCF to subset AS annotations that aren't "raw" (pipe-d…
ldgauthier Apr 12, 2024
aed8b1b
Gc getpipeupsummaries use mappingqualityreadfilter (#8781)
gokalpcelik Apr 18, 2024
ec39c37
Add malaria spanning deletion exception regression test with fix (#8802)
ldgauthier May 1, 2024
5c32785
Bug fix in flow allele filtering (#8775)
ilyasoifer May 2, 2024
24f93b5
Allow for GT to be a nocall if GQ and PL[0] are zero instead of homre…
nalinigans May 6, 2024
c6daf7d
Reduced docker layers in GATK image from 44 to 16 (#8808)
kevinpalis May 9, 2024
a3bbfc4
VariantFiltration: added arg to write custom mask filter description …
meganshand May 15, 2024
4ed93fe
Bigger Permutect tensors and Permutect test datasets can be annotated…
davidbenjamin May 17, 2024
d4744f7
[BIOIN-1570] Fixed edge case in variant annotation (#8810)
ilyasoifer Jun 3, 2024
0be44f2
Mutect2 germline resource can have split multiallelic format (#8837)
davidbenjamin Jun 3, 2024
2878ce5
Mutect2 WDL and GetSampleName can handle multiple sample names in BAM…
davidbenjamin Jun 4, 2024
2a420e4
Permutect dataset engine outputs contig and read group indices, not n…
davidbenjamin Jun 4, 2024
ab98a5d
Fixed a bug in AlleleFiltering that ignored more than a single sample…
ilyasoifer Jun 13, 2024
d633a57
Fixing bug in ReblockGVCFs when removing annotations (#8870)
meganshand Jun 20, 2024
abef8e1
fix for gnarly when PLs are null (#8878)
meganshand Jun 20, 2024
e170f1b
Merge remote-tracking branch 'origin/master' into vs_1178_merge_maste…
mcovarr Jun 20, 2024
d4b680c
fix compilation
mcovarr Jun 20, 2024
8e097e1
update Dockers
mcovarr Jun 20, 2024
e548bd5
Dockstore
mcovarr Jun 20, 2024
164aa5a
Merge remote-tracking branch 'origin/ah_var_store' into vs_1178_merge…
mcovarr Jun 24, 2024
5 changes: 2 additions & 3 deletions .dockstore.yml
@@ -309,8 +309,7 @@ workflows:
branches:
- master
- ah_var_store
- vs_1396_bq_query_audit
- gg_VS-1395_Have_CreateVATFromVDS_MoreGoodness
- vs_1178_merge_master_to_ah_var_store_again
tags:
- /.*/
- name: GvsIngestTieout
@@ -439,7 +438,7 @@ workflows:
- master
- ah_var_store
- EchoCallset
- vs_1412_pgen_pvars_not_compressed
- vs_1178_merge_master_to_ah_var_store_again
- name: MergePgenHierarchicalWdl
subclass: WDL
primaryDescriptorPath: /scripts/variantstore/wdl/MergePgenHierarchical.wdl
11 changes: 6 additions & 5 deletions .github/actions/upload-gatk-test-results/action.yml
@@ -40,16 +40,17 @@ runs:
name: test-results-${{ inputs.is-docker == 'true' && 'docker-' || '' }}${{ matrix.Java }}-${{ matrix.testType }}
path: build/reports/tests

- name: Upload to codecov
run: bash <(curl -s https://raw.githubusercontent.com/broadinstitute/codecov-bash-uploader/main/codecov-verified.bash)
shell: bash
# Disabling codecov because it is timing out and failing builds that otherwise succeed.
## - name: Upload to codecov
## run: bash <(curl -s https://raw.githubusercontent.com/broadinstitute/codecov-bash-uploader/main/codecov-verified.bash)
## shell: bash

- name: Upload Reports
if: ${{ inputs.only-artifact != 'true' }}
id: uploadreports
run: |
gsutil -m cp -z html -z js -z xml -z css -r build/reports/tests gs:/${{ env.HELLBENDER_TEST_LOGS }}${{ inputs.repo-path }}/;
VIEW_URL=https://storage.googleapis.com${{ env.HELLBENDER_TEST_LOGS }}${{ inputs.repo-path }}/tests/testOnPackagedReleaseJar/index.html
VIEW_URL=https://storage.googleapis.com${{ env.HELLBENDER_TEST_LOGS }}${{ inputs.repo-path }}/tests/test/index.html
echo "See the test report at ${VIEW_URL}";
echo view_url="${VIEW_URL}" >> $GITHUB_OUTPUT

@@ -91,4 +92,4 @@ runs:
run: |
pip install --user PyGithub;
python scripts/github_actions/Reporter.py ${{ steps.uploadreports.outputs.view_url }};
shell: bash
shell: bash
8 changes: 2 additions & 6 deletions .github/workflows/gatk-tests.yml
@@ -116,7 +116,7 @@ jobs:

- name: 'Set up Cloud SDK'
if: needs.check-secrets.outputs.google-credentials == 'true'
uses: google-github-actions/setup-gcloud@v0
uses: google-github-actions/setup-gcloud@v2

- name: pull lfs files
run: git lfs pull
@@ -188,7 +188,7 @@ jobs:

- name: 'Set up Cloud SDK'
if: needs.check-secrets.outputs.google-credentials == 'true'
uses: google-github-actions/setup-gcloud@v0
uses: google-github-actions/setup-gcloud@v2

- name: build test jars
run: ./gradlew clean shadowTestClassJar shadowTestJar
@@ -230,10 +230,6 @@ jobs:
bash --init-file /gatk/gatkenv.rc /root/run_unit_tests.sh;
TEST_EXIT_VALUE=$?;
$( exit ${TEST_EXIT_VALUE} );
sudo chmod -R a+w build/reports/;
mkdir build/reports/tests/test \
&& cp -rp build/reports/tests/testOnPackagedReleaseJar/* build/reports/tests/test \
&& rm -r build/reports/tests/testOnPackagedReleaseJar;

- uses: ./.github/actions/upload-gatk-test-results
if: always()
1 change: 1 addition & 0 deletions .gitignore
@@ -44,3 +44,4 @@ funcotator_tmp

#Test generated dot files
test*.dot
.vscode/
57 changes: 22 additions & 35 deletions Dockerfile
@@ -1,16 +1,18 @@
ARG BASE_DOCKER=broadinstitute/gatk:gatkbase-3.1.0
ARG BASE_DOCKER=broadinstitute/gatk:gatkbase-3.2.0

# stage 1 for constructing the GATK zip
FROM ${BASE_DOCKER} AS gradleBuild
LABEL stage=gatkIntermediateBuildImage
ARG RELEASE=false

RUN ls .

ADD . /gatk
WORKDIR /gatk

# Get an updated gcloud signing key, in case the one in the base image has expired
RUN rm /etc/apt/sources.list.d/google-cloud-sdk.list && \
#Download only resources required for the build, not for testing
RUN ls . && \
rm /etc/apt/sources.list.d/google-cloud-sdk.list && \
apt update &&\
apt-key list && \
curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | apt-key --keyring /usr/share/keyrings/cloud.google.gpg add - && \
@@ -19,16 +21,13 @@ RUN rm /etc/apt/sources.list.d/google-cloud-sdk.list && \
apt-get -y clean && \
apt-get -y autoclean && \
apt-get -y autoremove && \
rm -rf /var/lib/apt/lists/*
RUN git lfs install --force

#Download only resources required for the build, not for testing
RUN git lfs pull --include src/main/resources/large

RUN export GRADLE_OPTS="-Xmx4048m -Dorg.gradle.daemon=false" && /gatk/gradlew clean collectBundleIntoDir shadowTestClassJar shadowTestJar -Drelease=$RELEASE
RUN cp -r $( find /gatk/build -name "*bundle-files-collected" )/ /gatk/unzippedJar/
RUN unzip -o -j $( find /gatk/unzippedJar -name "gatkPython*.zip" ) -d /gatk/unzippedJar/scripts
RUN chmod -R a+rw /gatk/unzippedJar
rm -rf /var/lib/apt/lists/* && \
git lfs install --force && \
git lfs pull --include src/main/resources/large && \
export GRADLE_OPTS="-Xmx4048m -Dorg.gradle.daemon=false" && /gatk/gradlew clean collectBundleIntoDir shadowTestClassJar shadowTestJar -Drelease=$RELEASE && \
cp -r $( find /gatk/build -name "*bundle-files-collected" )/ /gatk/unzippedJar/ && \
unzip -o -j $( find /gatk/unzippedJar -name "gatkPython*.zip" ) -d /gatk/unzippedJar/scripts && \
chmod -R a+rw /gatk/unzippedJar

FROM ${BASE_DOCKER}

@@ -47,17 +46,17 @@ RUN chmod -R a+rw /gatk
COPY --from=gradleBuild /gatk/unzippedJar .

#Setup linked jars that may be needed for running gatk
RUN ln -s $( find /gatk -name "gatk*local.jar" ) gatk.jar
RUN ln -s $( find /gatk -name "gatk*local.jar" ) /root/gatk.jar
RUN ln -s $( find /gatk -name "gatk*spark.jar" ) gatk-spark.jar
RUN ln -s $( find /gatk -name "gatk*local.jar" ) gatk.jar && \
ln -s $( find /gatk -name "gatk*local.jar" ) /root/gatk.jar && \
ln -s $( find /gatk -name "gatk*spark.jar" ) gatk-spark.jar

WORKDIR /root

# Make sure we can see a help message
RUN java -jar gatk.jar -h
RUN mkdir /gatkCloneMountPoint
RUN mkdir /jars
RUN mkdir .gradle
RUN java -jar gatk.jar -h && \
mkdir /gatkCloneMountPoint && \
mkdir /jars && \
mkdir .gradle

WORKDIR /gatk

@@ -80,28 +79,16 @@ RUN echo "source activate gatk" > /root/run_unit_tests.sh && \
echo "ln -s /gatkCloneMountPoint/build/ /gatkCloneMountPoint/scripts/docker/build" >> /root/run_unit_tests.sh && \
echo "cd /gatk/ && /gatkCloneMountPoint/gradlew -Dfile.encoding=UTF-8 -b /gatkCloneMountPoint/dockertest.gradle testOnPackagedReleaseJar jacocoTestReportOnPackagedReleaseJar -a -p /gatkCloneMountPoint" >> /root/run_unit_tests.sh

# TODO: Determine whether we actually need this. For now it seems to be required because the version of libstdc++ on
# TODO: the gatk base docker is out of date (maybe?)
RUN add-apt-repository -y ppa:ubuntu-toolchain-r/test && \
apt-get update && \
apt-get -y upgrade libstdc++6 && \
apt-get -y dist-upgrade

WORKDIR /root
RUN cp -r /root/run_unit_tests.sh /gatk
RUN cp -r gatk.jar /gatk
ENV CLASSPATH /gatk/gatk.jar:$CLASSPATH
RUN cp -r /root/run_unit_tests.sh /gatk && \
cp -r /root/gatk.jar /gatk
ENV CLASSPATH=/gatk/gatk.jar:$CLASSPATH PATH=$CONDA_PATH/envs/gatk/bin:$CONDA_PATH/bin:$PATH

# Start GATK Python environment

WORKDIR /gatk
ENV PATH $CONDA_PATH/envs/gatk/bin:$CONDA_PATH/bin:$PATH
RUN conda env create -n gatk -f /gatk/gatkcondaenv.yml && \
echo "source activate gatk" >> /gatk/gatkenv.rc && \
echo "source /gatk/gatk-completion.sh" >> /gatk/gatkenv.rc && \
conda clean -afy && \
find /opt/miniconda/ -follow -type f -name '*.a' -delete && \
find /opt/miniconda/ -follow -type f -name '*.pyc' -delete && \
rm -rf /root/.cache/pip

CMD ["bash", "--init-file", "/gatk/gatkenv.rc"]
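For reviewers who want to exercise the consolidated Dockerfile locally, here is a minimal sketch. The image tag "gatk-local-test" is an arbitrary illustrative name, not something defined in this PR; BASE_DOCKER and RELEASE are the build args declared at the top of the Dockerfile, shown with the defaults from this diff.

```bash
# Minimal local build of the consolidated Dockerfile above (sketch).
# "gatk-local-test" is an illustrative tag; the --build-arg values mirror the
# defaults declared in the Dockerfile in this diff.
docker build \
    --build-arg BASE_DOCKER=broadinstitute/gatk:gatkbase-3.2.0 \
    --build-arg RELEASE=false \
    -t gatk-local-test .

# Sanity check mirroring the "java -jar gatk.jar -h" step baked into the image;
# /gatk/gatk.jar is the copy created by the final-stage RUN in this diff.
docker run --rm gatk-local-test java -jar /gatk/gatk.jar -h
```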
29 changes: 29 additions & 0 deletions README.md
@@ -19,6 +19,7 @@ releases of the toolkit.
* [Requirements](#requirements)
* [Quick Start Guide](#quickstart)
* [Downloading GATK4](#downloading)
* [Tools Included in Docker Image](#dockerSoftware)
* [Building GATK4](#building)
* [Running GATK4](#running)
* [Passing JVM options to gatk](#jvmoptions)
@@ -115,6 +116,34 @@ You can download and run pre-built versions of GATK4 from the following places:
* You can download a GATK4 docker image from [our dockerhub repository](https://hub.docker.com/r/broadinstitute/gatk/). We also host unstable nightly development builds on [this dockerhub repository](https://hub.docker.com/r/broadinstitute/gatk-nightly/).
* Within the docker image, run gatk commands as usual from the default startup directory (/gatk).

### <a name="dockerSoftware">Tools Included in Docker Image</a>

Our docker image contains the following bioinformatics tools, which can be run by invoking the tool name from the command line:
* bedtools (v2.30.0)
* samtools (1.13)
* bcftools (1.13)
* tabix (1.13+ds)

We also include an installation of Python3 (3.6.10) with the following popular packages included:
* numpy
* scipy
* tensorflow
* pymc3
* keras
* scikit-learn
* matplotlib
* pandas
* biopython
* pyvcf
* pysam

We also include an installation of R (3.6.2) with the following popular packages included:
* data.table
* dplyr
* ggplot2

For more details on system packages, see the GATK [Base Dockerfile](scripts/docker/gatkbase/Dockerfile) and for more details on the Python3/R packages, see the [Conda environment setup file](scripts/gatkcondaenv.yml.template). Versions for the Python3/R packages can be found there.
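As a hedged illustration of the new README section above (the image tag, mount paths, and input file below are placeholders, not part of this PR), the bundled tools and the Python environment can be invoked directly in the container:

```bash
# Run one of the bundled command-line tools inside a released GATK image.
# The tag "4.5.0.0" and the /data paths are illustrative placeholders.
docker run --rm -v /path/to/data:/data broadinstitute/gatk:4.5.0.0 \
    samtools flagstat /data/sample.bam

# The Python packages listed above should be importable from the bundled Python3.
docker run --rm broadinstitute/gatk:4.5.0.0 \
    python3 -c "import numpy, pysam, pandas; print(numpy.__version__)"
```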

## <a name="building">Building GATK4</a>

* **To do a full build of GATK4, first clone the GATK repository using "git clone", then run:**
62 changes: 33 additions & 29 deletions build.gradle
@@ -16,6 +16,7 @@ plugins {
id "com.github.johnrengelman.shadow" version "8.1.1" //used to build the shadow and sparkJars
id "com.github.ben-manes.versions" version "0.12.0" //used for identifying dependencies that need updating
id 'com.palantir.git-version' version '0.5.1' //version helper
id 'org.sonatype.gradle.plugins.scan' version '2.6.1' // scans for security vulnerabilities in our dependencies
}


@@ -56,20 +57,20 @@ repositories {
mavenLocal()
}

final htsjdkVersion = System.getProperty('htsjdk.version','3.0.5')
final picardVersion = System.getProperty('picard.version','3.1.0')
final htsjdkVersion = System.getProperty('htsjdk.version','4.1.0')
final picardVersion = System.getProperty('picard.version','3.1.1')
final barclayVersion = System.getProperty('barclay.version','5.0.0')
final sparkVersion = System.getProperty('spark.version', '3.3.1')
final hadoopVersion = System.getProperty('hadoop.version', '3.3.1')
final disqVersion = System.getProperty('disq.version','0.3.6')
final genomicsdbVersion = System.getProperty('genomicsdb.version','1.5.0')
final bigQueryVersion = System.getProperty('bigQuery.version', '2.31.0')
final bigQueryStorageVersion = System.getProperty('bigQueryStorage.version', '2.41.0')
final guavaVersion = System.getProperty('guava.version', '32.1.2-jre')
final sparkVersion = System.getProperty('spark.version', '3.5.0')
final hadoopVersion = System.getProperty('hadoop.version', '3.3.6')
final disqVersion = System.getProperty('disq.version','0.3.8')
final genomicsdbVersion = System.getProperty('genomicsdb.version','1.5.3')
final bigQueryVersion = System.getProperty('bigQuery.version', '2.35.0')
final bigQueryStorageVersion = System.getProperty('bigQueryStorage.version', '2.47.0')
final guavaVersion = System.getProperty('guava.version', '32.1.3-jre')
final log4j2Version = System.getProperty('log4j2Version', '2.17.1')
final testNGVersion = '7.0.0'

final googleCloudNioDependency = 'com.google.cloud:google-cloud-nio:0.127.0'
final googleCloudNioDependency = 'com.google.cloud:google-cloud-nio:0.127.8'

final baseJarName = 'gatk'
final secondaryBaseJarName = 'hellbender'
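A side note for reviewers: because these versions are read through System.getProperty with a default, they can be overridden at build time without editing build.gradle. A hedged example follows; the property names come from the System.getProperty calls above, and the override values are arbitrary.

```bash
# One-off build/test run with dependency versions overridden via JVM system
# properties (read by the System.getProperty(...) calls in build.gradle).
./gradlew test -Dhtsjdk.version=4.1.0 -Dpicard.version=3.1.1
```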
@@ -267,12 +268,12 @@ dependencies {
// are routed to log4j
implementation 'org.apache.logging.log4j:log4j-jcl:' + log4j2Version

implementation 'org.apache.commons:commons-lang3:3.5'
implementation 'org.apache.commons:commons-math3:3.5'
implementation 'org.apache.commons:commons-lang3:3.14.0'
implementation 'org.apache.commons:commons-math3:3.6.1'
implementation 'org.hipparchus:hipparchus-stat:2.0'
implementation 'org.apache.commons:commons-collections4:4.1'
implementation 'org.apache.commons:commons-vfs2:2.0'
implementation 'org.apache.commons:commons-configuration2:2.4'
implementation 'org.apache.commons:commons-collections4:4.4'
implementation 'org.apache.commons:commons-vfs2:2.9.0'
implementation 'org.apache.commons:commons-configuration2:2.9.0'
constraints {
implementation('org.apache.commons:commons-text') {
version {
@@ -287,7 +288,7 @@
implementation 'commons-io:commons-io:2.5'
implementation 'org.reflections:reflections:0.9.10'

implementation 'it.unimi.dsi:fastutil:7.0.6'
implementation 'it.unimi.dsi:fastutil:7.0.13'

implementation 'org.broadinstitute:hdf5-java-bindings:1.1.0-hdf5_2.11.0'
implementation 'org.broadinstitute:gatk-native-bindings:1.0.0'
@@ -297,24 +298,15 @@
exclude group: 'org.apache.commons'
}

//there is no mllib_2.12.15:3.3.0, so stay use 2.12:3.3.0
implementation ('org.apache.spark:spark-mllib_2.12:3.3.0') {
// TODO: migrate to mllib_2.12.15?
implementation ('org.apache.spark:spark-mllib_2.12:' + sparkVersion) {
// JUL is used by Google Dataflow as the backend logger, so exclude jul-to-slf4j to avoid a loop
exclude module: 'jul-to-slf4j'
exclude module: 'javax.servlet'
exclude module: 'servlet-api'
}
implementation 'com.thoughtworks.paranamer:paranamer:2.8'

implementation 'org.bdgenomics.bdg-formats:bdg-formats:0.5.0'
implementation('org.bdgenomics.adam:adam-core-spark2_2.12:0.28.0') {
exclude group: 'org.slf4j'
exclude group: 'org.apache.hadoop'
exclude group: 'org.scala-lang'
exclude module: 'kryo'
exclude module: 'hadoop-bam'
}

implementation 'org.jgrapht:jgrapht-core:1.1.0'
implementation 'org.jgrapht:jgrapht-io:1.1.0'

@@ -344,10 +336,10 @@ dependencies {
implementation 'org.broadinstitute:gatk-bwamem-jni:1.0.4'
implementation 'org.broadinstitute:gatk-fermilite-jni:1.2.0'

implementation 'org.broadinstitute:http-nio:0.1.0-rc1'
implementation 'org.broadinstitute:http-nio:1.1.0'

// Required for COSMIC Funcotator data source:
implementation 'org.xerial:sqlite-jdbc:3.36.0.3'
implementation 'org.xerial:sqlite-jdbc:3.44.1.0'

// natural sort
implementation('net.grey-panther:natural-comparator:1.1')
@@ -986,6 +978,18 @@ task gatkValidateGeneratedWdl(dependsOn: [gatkWDLGen, shadowJar]) {
}
}

// scan-gradle-plugin security vulnerability scan
ossIndexAudit {
allConfigurations = false // if true includes the dependencies in all resolvable configurations. By default is false, meaning only 'compileClasspath', 'runtimeClasspath', 'releaseCompileClasspath' and 'releaseRuntimeClasspath' are considered
useCache = true // true by default
outputFormat = 'DEFAULT' // Optional, other values are: 'DEPENDENCY_GRAPH' prints dependency graph showing direct/transitive dependencies, 'JSON_CYCLONE_DX_1_4' prints a CycloneDX 1.4 SBOM in JSON format.
showAll = false // if true prints all dependencies. By default is false, meaning only dependencies with vulnerabilities will be printed.
printBanner = true // if true will print ASCII text banner. By default is true.

// ossIndexAudit can be configured to exclude vulnerabilities from matching
// excludeVulnerabilityIds = ['39d74cc8-457a-4e57-89ef-a258420138c5'] // list containing ids of vulnerabilities to be ignored
// excludeCoordinates = ['commons-fileupload:commons-fileupload:1.3'] // list containing coordinate of components which if vulnerable should be ignored
}

/**
*This specifies what artifacts will be built and uploaded when performing a maven upload.