Model Hosting Examples

Prerequisites

To run these examples, you need Python 3.5.0.

You also need to install the dependencies required to run through the examples. You can do this using pip (requirements.txt) or pipenv.

Navigate to the root directory of this repository and use one of the following methods to install the requirements.

Using requirements.txt

$ pip install -r requirements.txt

Using Pipenv

$ pipenv shell
$ pipenv sync --dev

Examples

This example shows a source package for applying a trivial transformation to two time series and setting up a schedule to run this transformation on incoming data every minute. This example is a good place to get comfortable with how the hosting environment works.

This example is a slightly more complex version of the simple function example. We will still be scheduling a transformation on some time series data, but we will do so using a model we train locally and upload to the hosting environment.

This example shows how you can both train and deploy a model in the hosting environment.

This is a collection of a few examples showing how to use the data specs and the data fetcher found in the cognite-model-hosting library.

Deploying Jupyter notebooks

If your model is simple enough to fit into a Jupyter notebook, we offer a solution for easily deploying a notebook directly to Model Hosting from within the notebook itself. Essentially, this creates a source package from a notebook. This functionality is available through the cognite-model-hosting-notebook package. Here are some examples showing how to use this package.

Deploy a simple model that converts from Fahrenheit to Celsius.
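The conversion this model performs is just the standard temperature formula; a minimal sketch (the function name is illustrative, not the notebook's actual code):

```python
def fahrenheit_to_celsius(deg_f):
    """Convert a temperature from degrees Fahrenheit to degrees Celsius."""
    return (deg_f - 32) * 5.0 / 9.0
```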

Deploy a model that can convert arbitrarily between arrays of Kelvin, Celsius and Fahrenheit.

Train a simple linear regression model locally and deploy the trained model to model hosting.
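Training a linear regression locally can be as simple as a least-squares fit; a minimal sketch using NumPy (the actual example may use a different library, and the names here are illustrative):

```python
import numpy as np

# Toy training data following y = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0

# Fit slope and intercept by least squares (degree-1 polynomial)
slope, intercept = np.polyfit(x, y, 1)

def predict(new_x):
    """Apply the trained linear model to new input."""
    return slope * new_x + intercept
```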

Train a simple linear regression model in Model Hosting and have it deployed afterwards.

Deploy a simple model that finds the average of two time series. Then schedule it to run continuously on two actual time series, outputting the result to a third time series.
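The transformation at the heart of this example is a pointwise average of two series; a minimal sketch (assuming the two input series are aligned on the same timestamps):

```python
def average_series(series_a, series_b):
    """Return the pointwise average of two equally long lists of values."""
    return [(a + b) / 2.0 for a, b in zip(series_a, series_b)]
```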