Questions tagged [google-cloud-composer]
Google Cloud Composer is a fully managed workflow orchestration service, built on Apache Airflow, that empowers you to author, schedule, and monitor pipelines that span across clouds and on-premises data centers.
639
questions
1
vote
1 answer
25 views
How to trigger a task externally in a Cloud Composer DAG
I want to have a data pipeline that essentially looks like this,
where multiple tasks are triggered by corresponding Pub/Sub messages, process data from those messages' input, and the last task is ...
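A minimal sketch of one common pattern for this (not the asker's code): a PubSubPullSensor per subscription gates its processing task, and all branches fan in to a final task. The project and subscription names are placeholders, and the imports assume the Airflow 2 google provider package.

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.sensors.pubsub import PubSubPullSensor
import pendulum

with DAG(
    dag_id="pubsub_fan_in",
    start_date=pendulum.datetime(2021, 1, 1, tz="UTC"),
    schedule_interval=None,  # runs are triggered, not scheduled
) as dag:

    def process(name, **context):
        # The sensor pushes pulled messages to XCom; read and process them here.
        print(f"processing messages for {name}")

    final = PythonOperator(task_id="final", python_callable=process, op_args=["final"])

    for name in ("a", "b", "c"):
        wait = PubSubPullSensor(
            task_id=f"wait_{name}",
            project_id="my-project",          # placeholder
            subscription=f"task-{name}-sub",  # placeholder
            ack_messages=True,
        )
        run = PythonOperator(task_id=f"run_{name}", python_callable=process, op_args=[name])
        wait >> run >> final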
0
votes
0 answers
28 views
How can I run specific task(s) from an Airflow DAG
Current state of the Airflow DAG:
ml_processors = [a, b, c, d, e]
abc_task >> ml_processors (all ML models from a to e run in parallel after abc_task completes successfully)
ml_processors >>...
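One hedged way to run only selected tasks (a sketch, not the asker's setup): branch on dag_run.conf so a triggered run executes just the processors named in its payload; all names below are placeholders.

from airflow import DAG
from airflow.operators.python import BranchPythonOperator, PythonOperator
import pendulum

PROCESSORS = ["a", "b", "c", "d", "e"]

def pick(**context):
    # Default to every processor when no conf is supplied.
    wanted = (context["dag_run"].conf or {}).get("processors", PROCESSORS)
    return [f"ml_{p}" for p in wanted]  # tasks not returned are skipped

with DAG(
    dag_id="selective_ml",
    start_date=pendulum.datetime(2021, 1, 1, tz="UTC"),
    schedule_interval=None,
) as dag:
    branch = BranchPythonOperator(task_id="pick_processors", python_callable=pick)
    for p in PROCESSORS:
        branch >> PythonOperator(
            task_id=f"ml_{p}",
            python_callable=lambda p=p: print(f"running model {p}"),
        )

Triggering with airflow dags trigger selective_ml --conf '{"processors": ["a", "c"]}' then runs only models a and c; airflow tasks test is another option for one-off single-task runs.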
0
votes
0 answers
14 views
How do I print out the environment variables which I passed to the Airflow KubernetesPodOperator?
In GCP Cloud Composer (Airflow), I am trying to print some environment variables, but they don't show up in the logs.
The printenv command shows other environment variables, but not TEST.
import datetime
...
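For reference, a hedged sketch of how environment variables are usually passed so that printenv sees them: the env_vars argument is rendered into the pod's container spec (the module path assumes the cncf.kubernetes provider).

from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import KubernetesPodOperator

print_env = KubernetesPodOperator(
    task_id="print_env",
    name="print-env",
    namespace="default",
    image="ubuntu:20.04",
    cmds=["bash", "-cx"],
    arguments=["printenv TEST"],
    env_vars={"TEST": "hello"},  # set inside the pod, not on the Airflow worker
)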
1
vote
1 answer
41 views
Airflow DAG in Google Cloud Composer “seems to be missing”, seemingly because of a call to Google Cloud Storage
Previous Issues
This issue has been reported before here, here, and here; however, I suspect that in this case it may be caused by a call to Google Cloud Storage.
Premise/Problem
The following code is placed in ...
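A hedged illustration of the usual culprit: a Google Cloud Storage call at module top level runs on every DAG parse and can exceed the parse timeout, which makes the DAG report as missing. Deferring the call into a task callable avoids this; the bucket name is a placeholder.

from airflow import DAG
from airflow.operators.python import PythonOperator
import pendulum

def list_blobs():
    from google.cloud import storage  # imported lazily, at task run time
    client = storage.Client()
    for blob in client.list_blobs("my-bucket"):  # placeholder bucket
        print(blob.name)

with DAG(
    dag_id="gcs_listing",
    start_date=pendulum.datetime(2021, 1, 1, tz="UTC"),
    schedule_interval="@daily",
) as dag:
    PythonOperator(task_id="list_blobs", python_callable=list_blobs)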
0
votes
1 answer
16 views
How to set up Google Cloud Composer to launch pods on a GKE Autopilot cluster
I would like to use a Google Cloud Composer cluster to launch Kubernetes pods from its DAGs onto a separate GKE Autopilot cluster instead of onto Cloud Composer's own GKE cluster.
I have ...
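A sketch under assumptions: the google provider's GKEStartPodOperator fetches credentials for the target cluster itself, so the pod lands on the Autopilot cluster rather than Composer's own. Project, location, and cluster name are placeholders, and Composer's service account needs permissions (e.g. roles/container.developer) on that cluster.

from airflow.providers.google.cloud.operators.kubernetes_engine import GKEStartPodOperator

run_on_autopilot = GKEStartPodOperator(
    task_id="run_on_autopilot",
    project_id="my-project",           # placeholder
    location="us-central1",            # Autopilot clusters are regional
    cluster_name="autopilot-cluster",  # placeholder
    namespace="default",
    name="hello-pod",
    image="ubuntu:20.04",
    cmds=["echo", "hello from autopilot"],
)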
0
votes
0 answers
14 views
Airflow with mysql_to_gcs Negsignal.SIGKILL
I'm using Airflow with Composer (GCP) to extract data from Cloud SQL to GCS, and then from GCS to BigQuery; I have some tables between 100 MB and 10 GB. My DAG has two tasks to do what I mentioned before....
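Negsignal.SIGKILL on a transfer task usually means the worker was OOM-killed. A hedged sketch of one mitigation: lowering approx_max_file_size_bytes makes MySQLToGCSOperator split the export into smaller files instead of buffering large ones; the query and bucket are placeholders.

from airflow.providers.google.cloud.transfers.mysql_to_gcs import MySQLToGCSOperator

extract = MySQLToGCSOperator(
    task_id="mysql_to_gcs",
    sql="SELECT * FROM my_table",            # placeholder query
    bucket="my-bucket",                      # placeholder bucket
    filename="exports/my_table_{}.json",     # {} is filled in per file chunk
    export_format="json",
    approx_max_file_size_bytes=100_000_000,  # ~100 MB chunks instead of the default
)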
0
votes
0 answers
23 views
How to install an extra Linux package in GCP Composer?
I have written a DAG which uses the mongoexport command in a BashOperator. By default the mongoexport package is not installed in Composer. I would need to install it using the command below:
sudo apt install mongo-...
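Composer's managed workers don't allow sudo apt installs. A common workaround, sketched here under the assumption that the stock mongo image ships mongoexport, is to run the command in a container via KubernetesPodOperator instead of a BashOperator:

from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import KubernetesPodOperator

export = KubernetesPodOperator(
    task_id="mongoexport",
    name="mongoexport",
    namespace="default",
    image="mongo:4.4",  # the image provides the mongoexport binary
    cmds=["mongoexport"],
    arguments=[
        "--uri", "mongodb://my-host/my-db",  # placeholder connection string
        "--collection", "my_collection",     # placeholder
        "--out", "/tmp/out.json",
    ],
)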
0
votes
0 answers
24 views
Is my Airflow web server in the same private VPC as the GKE cluster created by Google Cloud Composer?
Google Cloud Composer creates a GKE cluster and a web server managed by App Engine.
"Each Cloud Composer environment has a web server that runs the Airflow web interface. The web server is separate from ...
3
votes
2 answers
50 views
GCP Cloud Composer: get_client_id.py error with required arguments
I have a question about GCP Cloud Composer.
To verify the function that triggers a DAG (workflow), I would like to get the client ID by referring to the Python code in the following article:
https://...
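The script in question takes positional arguments (project, region, and environment name), so this error usually means they weren't supplied on the command line. The core of the documented technique can also be run standalone; a hedged sketch, with the Airflow web URL as a placeholder:

import urllib.parse

import requests

def get_client_id(airflow_web_url: str) -> str:
    # An unauthenticated request is redirected to the IAP sign-in page,
    # whose URL carries the OAuth client_id as a query parameter.
    redirect = requests.get(airflow_web_url, allow_redirects=False)
    location = urllib.parse.urlparse(redirect.headers["location"])
    return urllib.parse.parse_qs(location.query)["client_id"][0]

# print(get_client_id("https://<your-tenant>.appspot.com"))  # placeholder URL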
0
votes
1 answer
54 views
Airflow - Generating tasks dynamically from BigQuery, but tasks are run repeatedly before the previous one finishes
Context
I'm trying to build an ingestion pipeline on Google Cloud Platform using Composer, Dataproc, and BigQuery. I have a table in BigQuery which contains records of each data source and its relevant file....
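A hedged guess at the usual fix for runs piling up: with catchup disabled and max_active_runs=1, a new scheduled run waits for the previous one, so dynamically generated tasks aren't re-run concurrently. The DAG body here is a placeholder.

from airflow import DAG
from airflow.operators.python import PythonOperator
import pendulum

with DAG(
    dag_id="bq_driven_ingest",
    start_date=pendulum.datetime(2021, 1, 1, tz="UTC"),
    schedule_interval="@hourly",
    catchup=False,      # don't backfill missed intervals
    max_active_runs=1,  # a new run waits until the previous one finishes
) as dag:
    PythonOperator(
        task_id="generated_tasks_go_here",  # stand-in for tasks built from the BigQuery table
        python_callable=lambda: None,
    )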
0
votes
0 answers
13 views
I want to verify that Cloud Composer can perform the following DAGs using private communication
I want to verify that Cloud Composer can perform the following DAGs using private communication.
-- Launch Compute Engine from Cloud Composer
-- Launch and connect to Cloud SQL from Cloud Composer
-- Launch ...
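A sketch of the first item only, assuming the google provider package: starting a Compute Engine VM goes through the operator below, and in a Private IP Composer environment the API call travels over Private Google Access rather than the public internet. Project, zone, and instance name are placeholders.

from airflow.providers.google.cloud.operators.compute import ComputeEngineStartInstanceOperator

start_vm = ComputeEngineStartInstanceOperator(
    task_id="start_vm",
    project_id="my-project",    # placeholder
    zone="us-central1-a",       # placeholder
    resource_id="my-instance",  # placeholder instance name
)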
0
votes
0 answers
27 views
Cloud Composer using SendGrid; email API request yields 401 Forbidden error
The email operator is always returning 401 Forbidden.
Version:
composer-1.14.2-airflow-1.10.12
The env variables are created in advance:
SENDGRID_MAIL_FROM=x@x.com
SENDGRID_API_KEY="SG.****"
...
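For context, a hedged sketch of the setup being debugged: with SENDGRID_MAIL_FROM and SENDGRID_API_KEY set as Composer environment variables, the stock EmailOperator sends through SendGrid, and a 401 typically points at a key that isn't visible to the workers or lacks the Mail Send permission. The recipient is a placeholder; the import path matches Airflow 1.10 (Airflow 2 uses airflow.operators.email).

from airflow.operators.email_operator import EmailOperator

notify = EmailOperator(
    task_id="notify",
    to="someone@example.com",  # placeholder recipient
    subject="Composer SendGrid test",
    html_content="Hello from Cloud Composer",
)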
1
vote
1 answer
62 views
How to run a Dataflow job with Cloud Composer
I know Apache Beam and I am able to create a pipeline using it. I also know which operator in Cloud Composer to use to run a Dataflow job; I just want to know how to convert plain Apache Beam code into ...
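A minimal sketch of the usual answer: plain Beam code needs no conversion; the operator stages the pipeline file and submits it with DataflowRunner options. Paths, project, and region are placeholders, and the operator name assumes the Airflow 2 google provider.

from airflow.providers.google.cloud.operators.dataflow import DataflowCreatePythonJobOperator

run_beam = DataflowCreatePythonJobOperator(
    task_id="run_beam",
    py_file="gs://my-bucket/pipelines/my_pipeline.py",  # placeholder path to plain Beam code
    job_name="my-beam-job",
    options={
        "project": "my-project",                # placeholder
        "region": "us-central1",                # placeholder
        "temp_location": "gs://my-bucket/tmp",  # placeholder
    },
)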
1
vote
0 answers
16 views
Passing a JSON file as a string in an environment variable for Composer Airflow from a Terraform script
I am creating a Composer environment from Terraform, where I want to pass a JSON document as an input variable.
Terraform code:
software_config {
  env_variables {
    AIRFLOW_VAR_MYJSON = "{'__comment1__': 'This the global ...
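A hedged note on reading this back: an AIRFLOW_VAR_MYJSON environment variable surfaces as the Airflow Variable "myjson", but deserialize_json only succeeds if the value is valid JSON, i.e. double-quoted keys and strings rather than the single quotes shown above.

from airflow.models import Variable

# Works only once the Terraform value uses JSON double quotes, e.g.
# AIRFLOW_VAR_MYJSON = "{\"__comment1__\": \"...\"}"
myjson = Variable.get("myjson", deserialize_json=True)
print(myjson["__comment1__"])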
1
vote
1 answer
35 views
Airflow Connection build credentials
I have written a Python script that works fine from the command line (with the 'my-analytics.json' file stored in the same folder as the script). Now I am moving this script to Airflow (Cloud Composer) ...
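One common pattern, sketched under assumptions (the variable name and scope are placeholders): store the keyfile's JSON content in an Airflow Variable or Secret Manager instead of a file next to the DAG, and build credentials from it at run time.

import json

from airflow.models import Variable
from google.oauth2 import service_account

# Assumes the contents of my-analytics.json were saved in an Airflow
# Variable named "my_analytics_keyfile" (placeholder name).
info = json.loads(Variable.get("my_analytics_keyfile"))
credentials = service_account.Credentials.from_service_account_info(
    info,
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],  # placeholder scope
)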