From b561f8ca431acd65ffff5dbacacaa34d7ae92366 Mon Sep 17 00:00:00 2001
From: Karthik Vadla
Date: Tue, 21 Feb 2017 11:59:54 -0800
Subject: [PATCH] Update build.sbt (#29)

* clean up and formatting
* formatting testcases
* upgraded to scala 2.11.8
* update readme
* Clean up .gitignore file, remove tf folder inside test folder
* Rename spark-tf-core to core, and update all references
* Remove core module, add License file and make pom changes
* Renaming namespace, update all files with new namespace
* Fix custom schema, correct pom
* update readme
* update readme
* add sbt build files
* Add conversion from mvn to sbt
* Add conversion from mvn to sbt (#15)
* Add classifier to bring in correct shaded jar and class (#16)
* Add Travis support to sbt branch (#17)
* Add classifier to bring in correct shaded jar and class
* Add travis.yml file
* Refactor travis file
* Refactor travis file
* Update README.md
* Remove central1 dependency in sbt and sudo requirement from travis.yml (#18)
* Add classifier to bring in correct shaded jar and class
* Add travis.yml file
* Refactor travis file
* Refactor travis file
* Update README.md
* Cleanup
* Clean up for sbt
* Add exclude jars to build.sbt and update readme
* SBT working, Cleaned up (#19)
* Add conversion from mvn to sbt
* Clean up for sbt
* Add exclude jars to build.sbt and update readme
* Refactor to use filterNot (#20)
* Add classifier to bring in correct shaded jar and class
* Add travis.yml file
* Refactor travis file
* Refactor travis file
* Update README.md
* Cleanup
* use filterNot
* Fix sbt-spark-package issue (#22)
* Add classifier to bring in correct shaded jar and class
* Add travis.yml file
* Refactor travis file
* Refactor travis file
* Update README.md
* Cleanup
* use filterNot
* Add sbt-spark-package plugin support
* Remove github personal token details
* Update builds.sbt for sbt spark packaging (#23)
* Removed environment variable from path in README example (#14)
* Add conversion from mvn to sbt
* Clean up for sbt
* Add exclude jars to build.sbt and update readme
* SBT Support (#21)
* clean up and formatting
* formatting testcases
* upgraded to scala 2.11.8
* update readme
* Clean up .gitignore file, remove tf folder inside test folder
* Rename spark-tf-core to core, and update all references
* Remove core module, add License file and make pom changes
* Renaming namespace, update all files with new namespace
* Fix custom schema, correct pom
* update readme
* update readme
* add sbt build files
* Add conversion from mvn to sbt (#15)
* Add classifier to bring in correct shaded jar and class
* Add classifier to bring in correct shaded jar and class (#16)
* Add travis.yml file
* Refactor travis file
* Refactor travis file
* Update README.md
* Add Travis support to sbt branch (#17)
* Add classifier to bring in correct shaded jar and class
* Add travis.yml file
* Refactor travis file
* Refactor travis file
* Update README.md
* Cleanup
* Remove central1 dependency in sbt and sudo requirement from travis.yml (#18)
* Add classifier to bring in correct shaded jar and class
* Add travis.yml file
* Refactor travis file
* Refactor travis file
* Update README.md
* Cleanup
* SBT working, Cleaned up (#19)
* Add conversion from mvn to sbt
* Clean up for sbt
* Add exclude jars to build.sbt and update readme
* use filterNot
* Refactor to use filterNot (#20)
* Add classifier to bring in correct shaded jar and class
* Add travis.yml file
* Refactor travis file
* Refactor travis file
* Update README.md
* Cleanup
* use filterNot
* Add sbt-spark-package plugin support
* Remove github personal token details
* Add TensorFlow hadoop jar to Spark package (#24)
* Remove extra license tag
* Update org name in build.sbt
* update build.sbt
---
 build.sbt | 2 --
 1 file changed, 2 deletions(-)

diff --git a/build.sbt b/build.sbt
index 7bddb11..820412d 100644
--- a/build.sbt
+++ b/build.sbt
@@ -91,5 +91,3 @@ credentials += Credentials(Path.userHome / ".ivy2" / ".sbtcredentials") // A fil
 
 test in assembly := {}
 spShade := true
-
-publishTo := Some("Artifactory Realm" at "https://tapanalyticstoolkit/spark-tensorflow-connector")
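For context, the two deleted lines disabled publishing to a private Artifactory repository. A minimal sketch of the sbt pattern this project was using (the repository name and URL below are illustrative placeholders, not this project's real endpoint) looks like:

```scala
// build.sbt -- illustrative sketch only; resolver name and URL are placeholders.

// Direct `sbt publish` at a private Artifactory (or any Maven-style) repository.
publishTo := Some("Artifactory Realm" at "https://artifactory.example.com/libs-release-local")

// Keep the username/password out of the build file; load them from the user's
// home directory instead (the same file this project's build already reads).
credentials += Credentials(Path.userHome / ".ivy2" / ".sbtcredentials")
```

Removing `publishTo` while keeping the `credentials` line is consistent with the PR's switch to the sbt-spark-package plugin, which supplies its own publishing targets.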