This repository has been archived by the owner on Mar 4, 2025. It is now read-only.

[create-pull-request] automated change
morenol authored and github-actions[bot] committed Jun 20, 2024
1 parent 00586d7 commit 6ed29d8
Showing 13 changed files with 91 additions and 47 deletions.
2 changes: 1 addition & 1 deletion content/connectors/inbound/http.md
@@ -1,5 +1,5 @@
---
menu: HTTP
title: HTTP
---

{{% inline-embed file="embeds/connectors/inbound/http.md" %}}
2 changes: 1 addition & 1 deletion content/connectors/inbound/kafka.md
@@ -1,5 +1,5 @@
---
menu: Kafka
title: Kafka
---

{{% inline-embed file="embeds/connectors/inbound/kafka.md" %}}
2 changes: 1 addition & 1 deletion content/connectors/inbound/mqtt.md
@@ -1,5 +1,5 @@
---
menu: MQTT
title: MQTT
---

{{% inline-embed file="embeds/connectors/inbound/mqtt.md" %}}
2 changes: 1 addition & 1 deletion content/connectors/outbound/duckdb.md
@@ -1,5 +1,5 @@
---
menu: DuckDB
title: DuckDB
---

{{% inline-embed file="embeds/connectors/outbound/duckdb.md" %}}
2 changes: 1 addition & 1 deletion content/connectors/outbound/graphite.md
@@ -1,5 +1,5 @@
---
menu: Graphite
title: Graphite
---

{{% inline-embed file="embeds/connectors/outbound/graphite.md" %}}
2 changes: 1 addition & 1 deletion content/connectors/outbound/sql.md
@@ -1,5 +1,5 @@
---
menu: SQL
title: SQL
---

{{% inline-embed file="embeds/connectors/outbound/sql.md" %}}
7 changes: 2 additions & 5 deletions embeds/connectors/inbound/kafka.md
@@ -17,7 +17,7 @@ Example:
```yaml
apiVersion: 0.1.0
meta:
version: 0.2.5
version: 0.2.8
name: my-kafka-connector
type: kafka-source
topic: kafka-topic
@@ -28,12 +28,9 @@ kafka:
```
### Usage
To try out the Kafka Source connector locally, you can use the Fluvio CDK tool:
%copy%
```bash
$ cdk deploy -p kafka-source start --config crates/kafka-source/sample-config.yaml
cdk deploy -p kafka-source start --config crates/kafka-source/config-example.yaml
```

## Transformations
5 changes: 2 additions & 3 deletions embeds/connectors/inbound/mqtt.md
@@ -50,7 +50,7 @@ This is an example of connector config file:
# config-example.yaml
apiVersion: 0.1.0
meta:
version: 0.2.5
version: 0.2.7
name: my-mqtt-connector
type: mqtt-source
topic: mqtt-topic
@@ -65,7 +65,6 @@ mqtt:
```

Run connector locally using `cdk` tool (from root directory or any sub-directory):

```bash
cdk deploy start --config config-example.yaml
@@ -104,7 +103,7 @@ The previous example can be extended to add extra transformations to outgoing re
# config-example.yaml
apiVersion: 0.1.0
meta:
version: 0.2.5
version: 0.2.7
name: my-mqtt-connector
type: mqtt-source
topic: mqtt-topic
12 changes: 7 additions & 5 deletions embeds/connectors/outbound/duckdb.md
@@ -41,7 +41,7 @@ To connect to Motherduck server, use prefix: `md`. For example, `md://motherduc
```yaml
apiVersion: 0.1.0
meta:
version: 0.1.0
version: 0.1.3
name: duckdb-connector
type: duckdb-sink
topic: fluvio-topic-source
@@ -99,7 +99,7 @@ Connector configuration file:
apiVersion: 0.1.0
meta:
version: 0.1.0
name: duckdb-connector
name: json-sql-connector
type: duckdb-sink
topic: sql-topic
create-topic: true
@@ -127,14 +127,16 @@ transforms:
```

You can use Fluvio `cdk` tool to deploy the connector:

```bash
fluvio install cdk
```
and then:
```bash
cdk deploy start --config connector-config.yaml
```

To delete the connector run:
```bash
cdk deploy shutdown --name duckdb-connector
cdk deploy shutdown --config connector-config.yaml
```
After you run the connector, you will see records in your database table.
18 changes: 4 additions & 14 deletions embeds/connectors/outbound/graphite.md
@@ -15,7 +15,7 @@ server address is specified on the `addr` field.
# sample-config.yaml
apiVersion: 0.1.0
meta:
version: 0.1.2
version: 0.2.0
name: my-graphite-connector-test-connector
type: graphite-sink
topic: test-graphite-connector-topic
@@ -85,22 +85,15 @@ With the Graphite instance set, we can move into [Setting Up Fluvio with Graphit
In this section we are going to use the CDK to spin up the Graphite Sink Connector
to send metrics from Fluvio Records to the Graphite instance.

Make sure the Connector Development Kit is set up in your system by issuing the following command in your terminal.

%copy%
```bash
cdk
```

> If you don't have the Fluvio CLI installed already, visit the [CLI][2] section

Create a YAML file with the name `weather-monitor-config.yaml` and specify connector settings:

%copy%
```yaml
apiVersion: 0.1.0
meta:
version: 0.1.2
version: 0.2.0
name: weather-monitor-sandiego
type: graphite-sink
topic: weather-ca-sandiego
@@ -112,16 +105,15 @@ graphite:

Deploy the Connector using the CDK


```bash
cdk deploy start --config weather-monitor-config.yaml
```

> Make sure your Graphite instance is running on `localhost:2003`; use the
> `cdk log` subcommand to read logs from the connector instance.

Then produce records as usual:

%copy%
```bash
echo 120 | fluvio produce weather-ca-sandiego
```
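For context, here is an illustrative sketch (not from this page; the metric path and timestamping behavior are assumptions about the sink's configuration) of the single `path value timestamp` line the record above would become on Graphite's plaintext port 2003:

```python
import time

# Hypothetical rendering of the record produced above into Graphite's
# plaintext protocol; the metric path comes from the connector config.
metric_path = "weather.temperature.ca.sandiego"
value = "120"
timestamp = int(time.time())

line = f"{metric_path} {value} {timestamp}\n"
print(line, end="")
```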
@@ -131,12 +123,10 @@ echo 120 | fluvio produce weather-ca-sandiego

Use Graphite's REST API to check on the stored data.

%copy%
```bash
curl -o ./data.json http://localhost:12345/render\?target\=weather.temperature.ca.sandiego\&format\=json\&noNullPoints
```
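The response follows the shape of Graphite's render API: a JSON list of series, each with a `target` and `datapoints` of `[value, timestamp]` pairs. A small sketch of reading it (the inline payload is a made-up stand-in for the downloaded `data.json`):

```python
import json

# Stand-in payload in the shape returned by Graphite's render API
# (a list of series; each datapoint is a [value, timestamp] pair).
payload = '''[
  {"target": "weather.temperature.ca.sandiego",
   "datapoints": [[120.0, 1718800000], [null, 1718800060]]}
]'''

series = json.loads(payload)
for s in series:
    # Skip null readings, which Graphite emits for empty intervals
    points = [(v, t) for v, t in s["datapoints"] if v is not None]
    print(s["target"], points)
```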


[1]: https://infinyon.cloud/login
[2]: https://www.fluvio.io/cli/
[3]: https://github.com/infinyon/graphite-sink-connector/blob/main/CONTRIBUTING.md
6 changes: 3 additions & 3 deletions embeds/connectors/outbound/http.md
@@ -159,15 +159,15 @@ In this case, additional transformation will be performed before records are sen
Read more about [JSON to JSON transformations](https://www.fluvio.io/smartmodules/certified/jolt/).

### Offset Management
Fluvio's Consumer Offset feature allows a connector to store its offset in the Fluvio cluster and resume from it on restart.
To activate it, provide a `consumer` name and set `strategy: auto`.
See the example below:
```yaml
apiVersion: 0.2.0
meta:
version: 0.2.9
name: my-http-sink
type: http-sink
topic:
meta:
name: http-sink-topic
35 changes: 32 additions & 3 deletions embeds/connectors/outbound/kafka.md
@@ -29,7 +29,7 @@ Example without security:
```yaml
apiVersion: 0.1.0
meta:
version: 0.2.7
version: 0.2.9
name: my-kafka-connector
type: kafka-sink
topic: kafka-topic
@@ -44,7 +44,7 @@ Example with security enabled:
```yaml
apiVersion: 0.1.0
meta:
version: 0.2.7
version: 0.2.9
name: my-kafka-connector
type: kafka-sink
topic: kafka-topic
@@ -68,9 +68,38 @@ kafka:
### Usage
To try out the Kafka Sink connector locally, you can use the Fluvio CDK tool:
```bash
cdk deploy -p kafka-sink start --config crates/kafka-sink/config-example.yaml
```

### Offset Management
Fluvio's Consumer Offset feature allows a connector to store its offset in the Fluvio cluster and resume from it on restart.
To activate it, provide a `consumer` name and set `strategy: auto`.
See the example below:
```yaml
apiVersion: 0.2.0
meta:
version: 0.2.9
name: my-kafka-connector
type: kafka-sink
topic:
meta:
name: kafka-sink-topic
consumer:
id: my-kafka-sink
offset:
strategy: auto
kafka:
url: "localhost:9092"
topic: fluvio-topic
create-topic: true
```
After the connector has processed some records, you can check the last stored offset value via:
```bash
$ fluvio consumer list
  CONSUMER       TOPIC             PARTITION  OFFSET  LAST SEEN
  my-kafka-sink  kafka-sink-topic  0          0       3s
```

### Testing with security
43 changes: 35 additions & 8 deletions embeds/connectors/outbound/sql.md
@@ -40,7 +40,7 @@ in the config. If a SmartModule requires configuration, it is passed via `with`
```yaml
apiVersion: 0.1.0
meta:
version: 0.3.3
version: 0.4.2
name: my-sql-connector
type: sql-sink
topic: sql-topic
@@ -62,7 +62,7 @@ The connector can use secrets in order to hide sensitive information.
```yaml
apiVersion: 0.1.0
meta:
version: 0.3.3
version: 0.4.2
name: my-sql-connector
type: sql-sink
topic: sql-topic
@@ -71,6 +71,37 @@ meta:
sql:
url: ${{ secrets.DATABASE_URL }}
```
### Offset Management
Fluvio's Consumer Offset feature allows a connector to store its offset in the Fluvio cluster and resume from it on restart.
To activate it, provide a `consumer` name and set `strategy: auto`.
See the example below:
```yaml
apiVersion: 0.2.0
meta:
version: 0.4.2
name: my-sql-connector
type: sql-sink
topic:
meta:
name: sql-sink-topic
consumer:
id: my-sql-sink
offset:
strategy: auto
secrets:
- name: DATABASE_URL
sql:
url: ${{ secrets.DATABASE_URL }}
```

After the connector has processed some records, you can check the last stored offset value via:
```bash
$ fluvio consumer list
  CONSUMER     TOPIC           PARTITION  OFFSET  LAST SEEN
  my-sql-sink  sql-sink-topic  0          0       3s
```

## Insert Usage Example
Let's look at an example of the connector with one transformation named [infinyon/json-sql](https://github.com/infinyon/fluvio-connectors/blob/main/smartmodules/json-sql/README.md). The transformation takes
records in JSON format and creates an SQL insert operation for the `topic_message` table. The value from `device.device_id`
@@ -95,7 +126,7 @@ Connector configuration file:
# connector-config.yaml
apiVersion: 0.1.0
meta:
version: 0.3.3
version: 0.4.2
name: json-sql-connector
type: sql-sink
topic: sql-topic
@@ -124,16 +155,12 @@ transforms:
```
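To make the mapping concrete, here is a hedged sketch of how a record could be turned into an insert against `topic_message` (the record shape and column names are assumptions for illustration; the real work is done by the json-sql SmartModule inside the connector):

```python
import json

# Hypothetical incoming record; `device.device_id` feeds a column,
# mirroring the transformation described above.
record = json.loads('{"device": {"device_id": 42}, "payload": "hello"}')

table = "topic_message"
columns = {
    "device_id": record["device"]["device_id"],
    "record": json.dumps(record),
}

# Parameterized insert: placeholders keep values out of the SQL string.
placeholders = ", ".join(["?"] * len(columns))
sql = f'INSERT INTO {table} ({", ".join(columns)}) VALUES ({placeholders})'
print(sql)
```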

You can use Fluvio `cdk` tool to deploy the connector:

```bash
cdk deploy start --config connector-config.yaml
```

To delete the connector run:

```bash
cdk deploy shutdown --name json-sql-connector
```
After you run the connector, you will see records in your database table.

@@ -155,7 +182,7 @@ Connector configuration file for upsert (assuming `device_id` is a unique column
# connector-config.yaml
apiVersion: 0.1.0
meta:
version: 0.3.3
version: 0.4.2
name: json-sql-connector
type: sql-sink
topic: sql-topic
