Re: Go SDK: How are re-starts handled?
There can only be one running pipeline in Dataflow with a given job name, so if you attempt to submit another job with the same name, you'll get back the identifier of the currently executing pipeline rather than a duplicate job.
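Given that behavior, one way a restarted launcher can stay idempotent is to always derive the same job name from the pipeline's logical identity, so a resubmission after a restart maps onto the already-running job. A minimal sketch (the `stableJobName` helper is hypothetical, not part of the Beam SDK; Dataflow job names must be lowercase letters, digits, and hyphens):

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// stableJobName derives a deterministic, Dataflow-legal job name from a
// logical pipeline ID, so every launcher restart submits the same name
// and Dataflow returns the existing job instead of starting a new one.
// (Hypothetical helper, not a Beam API.)
func stableJobName(pipelineID string) string {
	name := strings.ToLower(pipelineID)
	// Replace anything outside [a-z0-9-] with a hyphen.
	name = regexp.MustCompile(`[^a-z0-9-]+`).ReplaceAllString(name, "-")
	name = strings.Trim(name, "-")
	if name == "" {
		name = "pipeline"
	}
	return name
}

func main() {
	// Two launches of the same logical pipeline compute the same name,
	// so the second submission would join the first job.
	fmt.Println(stableJobName("Orders_ETL v2"))
	fmt.Println(stableJobName("Orders_ETL v2") == stableJobName("orders_etl V2"))
}
```

You'd pass this name via the `--job_name` pipeline option when submitting; after a k8s restart the launcher recomputes the identical name and Dataflow hands back the running job's ID.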
If I have a k8s process launching Dataflow pipelines, what happens when that process is restarted? Can Apache Beam detect an already-running pipeline and join it accordingly, or will the pipeline be duplicated?
Thanks in advance.