Update org name in build.sbt (#28)
* clean up and formatting

* formatting testcases

* upgraded to scala 2.11.8

* update readme

* Clean up .gitignore file, remove tf folder inside test folder

* Rename spark-tf-core to core, and update all references

* Remove core module, add License file and make pom changes

* Renaming namespace, update all files with new namespace

* Fix custom schema, correct pom

* update readme

* update readme

* add sbt build files

* Add conversion from mvn to sbt

* Add conversion from mvn to sbt (#15)

* Add classifier to bring in correct shaded jar and class (#16)

* Add Travis support to sbt branch (#17)

* Add classifier to bring in correct shaded jar and class

* Add travis.yml file

* Refactor travis file

* Refactor travis file

* Update README.md

* Remove central1 dependency in sbt and sudo requirement from travis.yml (#18)

* Add classifier to bring in correct shaded jar and class

* Add travis.yml file

* Refactor travis file

* Refactor travis file

* Update README.md

* Cleanup

* Clean up for sbt

* Add exclude jars to build.sbt and update readme

* SBT working, Cleaned up (#19)

* Add conversion from mvn to sbt

* Clean up for sbt

* Add exclude jars to build.sbt and update readme

* Refactor to use filterNot (#20)

* Add classifier to bring in correct shaded jar and class

* Add travis.yml file

* Refactor travis file

* Refactor travis file

* Update README.md

* Cleanup

* use filterNot

* Fix sbt-spark-package issue (#22)

* Add classifier to bring in correct shaded jar and class

* Add travis.yml file

* Refactor travis file

* Refactor travis file

* Update README.md

* Cleanup

* use filterNot

* Add sbt-spark-package plugin support

* Remove github personal token details

* Update builds.sbt for sbt spark packaging (#23)

* Removed environment variable from path in README example (#14)

* Add conversion from mvn to sbt

* Clean up for sbt

* Add exclude jars to build.sbt and update readme

* SBT Support (#21)

* clean up and formatting

* formatting testcases

* upgraded to scala 2.11.8

* update readme

* Clean up .gitignore file, remove tf folder inside test folder

* Rename spark-tf-core to core, and update all references

* Remove core module, add License file and make pom changes

* Renaming namespace, update all files with new namespace

* Fix custom schema, correct pom

* update readme

* update readme

* add sbt build files

* Add conversion from mvn to sbt (#15)

* Add classifier to bring in correct shaded jar and class

* Add classifier to bring in correct shaded jar and class (#16)

* Add travis.yml file

* Refactor travis file

* Refactor travis file

* Update README.md

* Add Travis support to sbt branch (#17)

* Add classifier to bring in correct shaded jar and class

* Add travis.yml file

* Refactor travis file

* Refactor travis file

* Update README.md

* Cleanup

* Remove central1 dependency in sbt and sudo requirement from travis.yml (#18)

* Add classifier to bring in correct shaded jar and class

* Add travis.yml file

* Refactor travis file

* Refactor travis file

* Update README.md

* Cleanup

* SBT working, Cleaned up (#19)

* Add conversion from mvn to sbt

* Clean up for sbt

* Add exclude jars to build.sbt and update readme

* use filterNot

* Refactor to use filterNot (#20)

* Add classifier to bring in correct shaded jar and class

* Add travis.yml file

* Refactor travis file

* Refactor travis file

* Update README.md

* Cleanup

* use filterNot

* Add sbt-spark-package plugin support

* Remove github personal token details

* Add TensorFlow hadoop jar to Spark package (#24)

* Remove extra license tag

* Update org name in build.sbt
karthikvadla authored Feb 21, 2017
1 parent e6173c6 commit 0d03e51
Showing 1 changed file with 3 additions and 1 deletion.
4 changes: 3 additions & 1 deletion — build.sbt

@@ -4,7 +4,7 @@ organization := "org.trustedanalytics"

 scalaVersion in Global := "2.11.8"

-spName := "trustedanalytics/spark-tensorflow-connector"
+spName := "tapanalyticstoolkit/spark-tensorflow-connector"

 sparkVersion := "2.1.0"

@@ -91,3 +91,5 @@ credentials += Credentials(Path.userHome / ".ivy2" / ".sbtcredentials") // A fil
 test in assembly := {}

 spShade := true
+
+publishTo := Some("Artifactory Realm" at "https://tapanalyticstoolkit/spark-tensorflow-connector")
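For context, the settings touched by this commit fit together roughly as sketched below. This is a reconstruction from the diff hunks above, not the full file; it assumes the sbt-spark-package plugin is enabled in project/plugins.sbt, since that plugin supplies the `spName`, `sparkVersion`, and `spShade` keys (plain sbt does not define them).

```scala
// Sketch of the relevant build.sbt settings after this commit.
// Assumes the sbt-spark-package plugin; uses the sbt 0.13-era
// `key in Scope` syntax current at the time (Feb 2017).

organization := "org.trustedanalytics"

scalaVersion in Global := "2.11.8"

// Spark package coordinates — this commit moves the org name
// from "trustedanalytics" to "tapanalyticstoolkit"
spName := "tapanalyticstoolkit/spark-tensorflow-connector"

sparkVersion := "2.1.0"

// Skip running tests when building the assembly jar
test in assembly := {}

// Publish the shaded (assembly) artifact as the Spark package
spShade := true

// Publish target added by this commit
publishTo := Some("Artifactory Realm" at "https://tapanalyticstoolkit/spark-tensorflow-connector")
```

The `credentials += Credentials(...)` line visible in the diff context would supply the login that `publish` uses against that resolver.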
