This repository has been archived by the owner on Dec 15, 2022. It is now read-only.
For runners that must execute on a remote server, inside the runner's own environment, we need remote execution support.
This is the case for Beam runners like Spark and Flink, and possibly Apex.
Carte plugins/services needed:

- start a Beam transformation
- accept statistics from an executing Beam pipeline for a transformation
- get the status of an executing Beam transformation
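The three services above could be exposed as Carte web endpoints. A minimal sketch of such a registry, assuming hypothetical paths in Carte's usual `/kettle/...` style (the class name, method names, and paths are illustrative assumptions, not actual Carte APIs):

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Hypothetical registry of the three Beam-related Carte services. */
public class BeamCarteServices {
    // Paths are illustrative assumptions, not real Carte endpoints.
    public static final String EXECUTE_BEAM = "/kettle/executeBeam/";
    public static final String BEAM_METRICS = "/kettle/beamMetrics/";
    public static final String BEAM_STATUS  = "/kettle/beamStatus/";

    /** Maps each needed service to its hypothetical endpoint path. */
    public static Map<String, String> services() {
        Map<String, String> map = new LinkedHashMap<>();
        map.put("start a Beam transformation", EXECUTE_BEAM);
        map.put("accept statistics from an executing pipeline", BEAM_METRICS);
        map.put("get the status of an executing transformation", BEAM_STATUS);
        return map;
    }
}
```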
This would work as follows:

1. A client (Spoon, a job, ...) executes the transformation using the Beam Runtime Configuration.
2. This contacts Carte on the master and passes the transformation over.
3. The Beam Carte plugin executes the transformation using the runner's main class (e.g. SparkMain), passing along the details of the Carte server.
4. During execution, the pipeline metrics are sent periodically to the Carte server on the master.
5. During execution, the client (Spoon, a job, ...) periodically gets the status and metrics of the Beam transformation from the Carte server.
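The steps above can be sketched as a small in-memory simulation. The class and method names here (`CarteMaster`, `SparkMainStub`, `executeBeam`) are illustrative assumptions standing in for the real Carte server, Beam plugin, and runner main class, not actual Kettle/Beam APIs:

```java
import java.util.ArrayList;
import java.util.List;

/** In-memory sketch of the proposed remote-execution flow. */
public class BeamRemoteFlow {

    /** Stands in for the Carte master: tracks status, collects metrics. */
    static class CarteMaster {
        String status = "IDLE";
        final List<String> metrics = new ArrayList<>();

        // Step 2: the client passes the transformation over to Carte.
        void executeBeam(String transformationXml) {
            status = "RUNNING";
            // Step 3: the Beam Carte plugin launches the runner's main
            // class, handing it the details of this Carte server.
            new SparkMainStub(this).run(transformationXml);
            status = "FINISHED";
        }
    }

    /** Stands in for SparkMain: runs the pipeline, pushes metrics back. */
    static class SparkMainStub {
        final CarteMaster carte;
        SparkMainStub(CarteMaster carte) { this.carte = carte; }

        void run(String transformationXml) {
            // Step 4: metrics are periodically sent back to the master.
            for (int update = 1; update <= 3; update++) {
                carte.metrics.add("metrics update " + update);
            }
        }
    }

    public static void main(String[] args) {
        CarteMaster master = new CarteMaster();
        // Step 1: the client executes the transformation via the Beam
        // Runtime Configuration, which contacts Carte on the master.
        master.executeBeam("<transformation/>");
        // Step 5: the client polls status and metrics from Carte.
        System.out.println(master.status + ", "
            + master.metrics.size() + " metric updates");
    }
}
```

In a real implementation the pipeline would of course run asynchronously, so the client's polling in step 5 would observe intermediate statuses rather than only the final one.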