Commit: Update build.sbt (#29)
* clean up and formatting

* formatting testcases

* upgraded to scala 2.11.8

* update readme

* Clean up .gitignore file, remove tf folder inside test folder

* Rename spark-tf-core to core, and update all references

* Remove core module, add License file and make pom changes

* Renaming namespace, update all files with new namespace

* Fix custom schema, correct pom

* update readme

* update readme

* add sbt build files

* Add conversion from mvn to sbt

* Add conversion from mvn to sbt (#15)

* Add classifier to bring in correct shaded jar and class (#16)

* Add Travis support to sbt branch (#17)

* Add classifier to bring in correct shaded jar and class

* Add travis.yml file

* Refactor travis file

* Refactor travis file

* Update README.md

* Remove central1 dependency in sbt and sudo requirement from travis.yml (#18)

* Add classifier to bring in correct shaded jar and class

* Add travis.yml file

* Refactor travis file

* Refactor travis file

* Update README.md

* Cleanup

* Clean up for sbt

* Add exclude jars to build.sbt and update readme

* SBT working, Cleaned up (#19)

* Add conversion from mvn to sbt

* Clean up for sbt

* Add exclude jars to build.sbt and update readme

* Refactor to use filterNot (#20)

* Add classifier to bring in correct shaded jar and class

* Add travis.yml file

* Refactor travis file

* Refactor travis file

* Update README.md

* Cleanup

* use filterNot

* Fix sbt-spark-package issue (#22)

* Add classifier to bring in correct shaded jar and class

* Add travis.yml file

* Refactor travis file

* Refactor travis file

* Update README.md

* Cleanup

* use filterNot

* Add sbt-spark-package plugin support

* Remove github personal token details

* Update builds.sbt for sbt spark packaging (#23)

* Removed environment variable from path in README example (#14)

* Add conversion from mvn to sbt

* Clean up for sbt

* Add exclude jars to build.sbt and update readme

* SBT Support (#21)

* clean up and formatting

* formatting testcases

* upgraded to scala 2.11.8

* update readme

* Clean up .gitignore file, remove tf folder inside test folder

* Rename spark-tf-core to core, and update all references

* Remove core module, add License file and make pom changes

* Renaming namespace, update all files with new namespace

* Fix custom schema, correct pom

* update readme

* update readme

* add sbt build files

* Add conversion from mvn to sbt (#15)

* Add classifier to bring in correct shaded jar and class

* Add classifier to bring in correct shaded jar and class (#16)

* Add travis.yml file

* Refactor travis file

* Refactor travis file

* Update README.md

* Add Travis support to sbt branch (#17)

* Add classifier to bring in correct shaded jar and class

* Add travis.yml file

* Refactor travis file

* Refactor travis file

* Update README.md

* Cleanup

* Remove central1 dependency in sbt and sudo requirement from travis.yml (#18)

* Add classifier to bring in correct shaded jar and class

* Add travis.yml file

* Refactor travis file

* Refactor travis file

* Update README.md

* Cleanup

* SBT working, Cleaned up (#19)

* Add conversion from mvn to sbt

* Clean up for sbt

* Add exclude jars to build.sbt and update readme

* use filterNot

* Refactor to use filterNot (#20)

* Add classifier to bring in correct shaded jar and class

* Add travis.yml file

* Refactor travis file

* Refactor travis file

* Update README.md

* Cleanup

* use filterNot

* Add sbt-spark-package plugin support

* Remove github personal token details

* Add TensorFlow hadoop jar to Spark package (#24)

* Remove extra license tag

* Update org name in build.sbt

* update build.sbt
karthikvadla authored Feb 21, 2017
1 parent 0d03e51 commit b561f8c

1 changed file: build.sbt (0 additions, 2 deletions)
```diff
@@ -91,5 +91,3 @@ credentials += Credentials(Path.userHome / ".ivy2" / ".sbtcredentials") // A fil
 test in assembly := {}
 
 spShade := true
-
-publishTo := Some("Artifactory Realm" at "https://tapanalyticstoolkit/spark-tensorflow-connector")
```
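For context, the removed `publishTo` setting sat alongside the packaging settings this commit history refers to: `spShade` from sbt-spark-package, `test in assembly := {}`, and the assembly excludes rewritten with `filterNot` (#20). The sketch below shows how such a build.sbt section might fit together in sbt 0.13-era syntax; the exclude predicate and the commented publish URL are illustrative assumptions, not values taken from this repository.

```scala
// Hypothetical build.sbt excerpt -- a sketch, not this project's actual build.

// Exclude jars that the runtime already provides from the assembly,
// using filterNot over the classpath (the approach mentioned in #20).
// The "spark-" prefix check is an assumption for illustration.
assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  cp.filterNot(_.data.getName.startsWith("spark-"))
}

// Skip tests when building the assembly jar (matches the diff above).
test in assembly := {}

// sbt-spark-package: publish the shaded assembly jar rather than the plain jar.
spShade := true

// Publish destination -- the line this commit removed; URL here is a placeholder.
// publishTo := Some("Artifactory Realm" at "https://example.invalid/artifactory")
```

With `publishTo` removed, publishing falls back to whatever resolver the plugins or credentials configure, which is consistent with this commit touching only those two lines.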
