This repository has been archived by the owner on Dec 15, 2022. It is now read-only.

Create Beam Carte plugins for Spark/Flink/... #25

Open
mattcasters opened this issue Jan 30, 2019 · 0 comments

@mattcasters (Owner) commented:

Runners that execute on a remote server, using that runner's own environment, need remote execution support. This is the case for Beam runners such as Spark and Flink, and possibly Apex.

Carte plugins/services needed:

  • start a Beam transformation
  • accept statistics for an executing Beam Pipeline for a transformation
  • get the status of an executing Beam transformation
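The three services above could be sketched as a small Java contract with an in-memory implementation. This is a hypothetical illustration only: the names (`BeamCarteService`, `PipelineStatus`, `InMemoryBeamCarteService`) are not existing Kettle or Carte APIs, and a real Carte plugin would expose these as servlet endpoints rather than direct method calls.

```java
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical contract for the three Carte services; names are illustrative.
interface BeamCarteService {
    String startTransformation(String transXml);                  // start a Beam transformation
    void acceptStatistics(String id, Map<String, Long> metrics);  // accept pipeline statistics
    PipelineStatus getStatus(String id);                          // get execution status
}

// Minimal status record (assumed shape, not a Kettle class).
record PipelineStatus(String id, String state, Map<String, Long> metrics) {}

// Trivial in-memory implementation, just to show the contract in action.
class InMemoryBeamCarteService implements BeamCarteService {
    private final Map<String, PipelineStatus> statuses = new ConcurrentHashMap<>();

    public String startTransformation(String transXml) {
        String id = UUID.randomUUID().toString();
        statuses.put(id, new PipelineStatus(id, "RUNNING", Map.of()));
        return id;
    }

    public void acceptStatistics(String id, Map<String, Long> metrics) {
        statuses.computeIfPresent(id, (k, s) -> new PipelineStatus(id, s.state(), metrics));
    }

    public PipelineStatus getStatus(String id) {
        return statuses.get(id);
    }
}

public class BeamCarteSketch {
    public static void main(String[] args) {
        BeamCarteService carte = new InMemoryBeamCarteService();
        String id = carte.startTransformation("<transformation/>");
        carte.acceptStatistics(id, Map.of("rowsRead", 1000L));
        System.out.println(carte.getStatus(id).state());
        System.out.println(carte.getStatus(id).metrics().get("rowsRead"));
    }
}
```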

It would work like this:

  • A client (Spoon, Job, ...) would execute the Transformation using the Beam Runtime Configuration.
  • This would contact Carte on the master and pass the transformation over.
  • The Beam Carte plugin would execute the transformation using the main class (SparkMain), passing it the details of the Carte server.
  • During execution the pipeline metrics are sent periodically to the Carte server on the master.
  • During execution the client (Spoon, Job, ...) would periodically get the status and metrics of the Beam transformation from the Carte server.
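The client-side polling described in the last step could look roughly like the loop below. Everything here is an assumption for illustration: `StatusSource` stands in for whatever call the client would make against the Carte master, and the state names and polling interval are made up.

```java
import java.util.concurrent.TimeUnit;

public class BeamStatusPoller {
    // Stand-in for the Carte master's status service; a real client would
    // make an HTTP call here instead. Hypothetical, for illustration only.
    interface StatusSource {
        String getStatus(String id);
    }

    // Poll the status source until the pipeline reaches a terminal state,
    // or give up after maxPolls attempts.
    public static String pollUntilDone(StatusSource carte, String id,
                                       long intervalMs, int maxPolls) throws InterruptedException {
        for (int i = 0; i < maxPolls; i++) {
            String state = carte.getStatus(id);
            if ("FINISHED".equals(state) || "STOPPED".equals(state)) {
                return state;
            }
            TimeUnit.MILLISECONDS.sleep(intervalMs);
        }
        return "TIMEOUT";
    }

    public static void main(String[] args) throws InterruptedException {
        // Fake status source that reports RUNNING twice, then FINISHED.
        int[] calls = {0};
        StatusSource fake = id -> (++calls[0] < 3) ? "RUNNING" : "FINISHED";
        System.out.println(pollUntilDone(fake, "pipeline-1", 10, 10));
    }
}
```

The same loop would run in Spoon or in a Job entry; only the `StatusSource` implementation would change.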
@mattcasters mattcasters added the enhancement New feature or request label Jan 30, 2019
@mattcasters mattcasters self-assigned this Jan 30, 2019
mattcasters added a commit that referenced this issue Feb 24, 2019