Thanks, but those pages relate to authentication within Google Cloud Platform services; I need to authenticate the job against Sheets. Since the required scope is https://www.googleapis.com/auth/drive, is there a way to pass it during the deployment phase of a Dataflow job?
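For reference, this is the kind of scoped credential I can build locally with the google-auth library (a minimal sketch; the scope list is the only non-standard part). The open question is how to give the Dataflow job an equivalent scope:

import google.auth

# Scopes needed to query a Sheets-backed BigQuery table:
# BigQuery itself plus Drive, since the table data lives in a Sheet.
SCOPES = [
    'https://www.googleapis.com/auth/bigquery',
    'https://www.googleapis.com/auth/drive',
]

# Application default credentials carrying the extra Drive scope.
credentials, project = google.auth.default(scopes=SCOPES)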
From: Chamikara Jayalath <chamikara@xxxxxxxxxx>
Sent: Tuesday, June 5, 2018 19:26
Subject: Re: Read from a Google Sheet based BigQuery table - Python SDK
See the following regarding authenticating Dataflow jobs.
I'm not sure about information specific to Sheets, but it seems there's some info in the following.
On Tue, Jun 5, 2018 at 10:16 AM Leonardo Biagioli <lbiagioli@xxxxxxxxxxx> wrote:
Thank you for taking the time to answer!
Is there a way to properly authenticate a Beam job on the Dataflow runner? I need to specify the required scope to read from Sheets, but where can I set that parameter?
On Jun 5, 2018 at 18:28, Chamikara Jayalath <chamikara@xxxxxxxxxx> wrote:
I don't think BQ federated tables support export jobs, so reading directly from such tables likely will not work. But reading via a query should work if your job is authenticated properly (I haven't tested this).
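Untested, but a query-based read would look roughly like this with the Python SDK's BigQuerySource (the project/dataset/table names are placeholders):

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Sketch: read the Sheets-backed federated table through a query
# (legacy SQL bracket syntax) rather than a direct table read,
# which would require an export job.
with beam.Pipeline(options=PipelineOptions()) as p:
    rows = (
        p
        | 'ReadSheetTable' >> beam.io.Read(beam.io.BigQuerySource(
            query='SELECT * FROM [my-project:my_dataset.sheet_table]'))
    )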
On Tue, Jun 5, 2018, 5:56 AM Leonardo Biagioli <lbiagioli@xxxxxxxxxxx> wrote:
I just wanted to ask whether there is a way to read from a Sheets-based BigQuery table from a Beam pipeline running on Dataflow…
I usually specify the additional scopes during authentication when running plain Python code to do the same (roughly the pattern sketched below), but I wasn't able to find a reference to anything similar for Beam.
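For concreteness, the local pattern I mean is something like this (a minimal sketch; key.json and the table reference are placeholders):

from google.oauth2 import service_account
from google.cloud import bigquery

# BigQuery plus Drive, since the table is backed by a Google Sheet.
SCOPES = [
    'https://www.googleapis.com/auth/bigquery',
    'https://www.googleapis.com/auth/drive',
]

# Service-account credentials carrying the extra Drive scope.
credentials = service_account.Credentials.from_service_account_file(
    'key.json', scopes=SCOPES)

client = bigquery.Client(credentials=credentials,
                         project=credentials.project_id)
for row in client.query('SELECT * FROM `my-project.my_dataset.sheet_table`'):
    print(row)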
Could you please help?
Thank you very much!