At the end of each task run the logs are pushed to S3; the example below will also stream the logs to STDOUT while the task is running. git-sync is used for the initial sync of the DAGs into the temporary worker pod. The pod template is usually the same Airflow container image with some extra packages added, depending on what the DAGs need to do.

The Airflow environment is kept in a ConfigMap (placeholder values such as `MY_PSQL` and `MY_PASS` should be replaced with your own):

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: airflow-env
  labels:
    app: airflow
data:
  TZ: "Etc/UTC"
  POSTGRES_HOST: "MY_PSQL"
  POSTGRES_PORT: "5432"
  POSTGRES_DB: "airflow"
  POSTGRES_USER: "airflow"
  POSTGRES_PASSWORD: "MY_PASS"
  REDIS_HOST: "MY_REDIS_HOST"
  REDIS_PORT: "6379"
  REDIS_PASSWORD: ""
  FLOWER_PORT: "5555"
  AIRFLOW__CORE__EXECUTOR: "KubernetesExecutor"
  FERNET_KEY: "oniqx7yno09xmpe9umpqxR390U-0="
  AIRFLOW__CORE__FERNET_KEY: "oniqx7yno09xmpe9umpqxR390U-0="
  DO_WAIT_INITDB: "true"
  AIRFLOW__CORE__SQL_ALCHEMY_CONN: "MY_SQL_ALCHEMY_CONN"
  AIRFLOW__CELERY__RESULT_BACKEND: "MY_RESULT_BACKEND"
  AIRFLOW__CORE__DONOT_PICKLE: "false"
  AIRFLOW__CELERY__FLOWER_URL_PREFIX: ""
  AIRFLOW__CELERY__WORKER_CONCURRENCY: "10"
  AIRFLOW__CORE__DAGS_FOLDER: "/usr/local/airflow/dags"
  AIRFLOW__WEBSERVER__BASE_URL: "MY_BASE_URL"
  AIRFLOW__CORE__ENABLE_XCOM_PICKLING: "false"
  AIRFLOW__KUBERNETES__POD_TEMPLATE_FILE: "/repo-sync/dags-repo/kubernetes/pod_templates/pod.yaml"
  AIRFLOW__KUBERNETES__NAMESPACE: "default"
  AIRFLOW__KUBERNETES__DELETE_WORKER_PODS: "True"
  AIRFLOW__KUBERNETES__DELETE_WORKER_PODS_ON_FAILURE: "True"
  AIRFLOW__CORE__STORE_SERIALIZED_DAGS: "False"
  AIRFLOW__CORE__MIN_SERIALIZED_DAG_UPDATE_INTERVAL: "30"
  AIRFLOW__CORE__STORE_DAG_CODE: "False"
  AIRFLOW__WEBSERVER__AUTHENTICATE: "True"
  AIRFLOW__WEBSERVER__AUTH_BACKEND: "airflow.contrib.auth.backends.google_auth"
  AIRFLOW__GOOGLE__CLIENT_ID: "MY_CLIENT_ID"
  AIRFLOW__GOOGLE__CLIENT_SECRET: "MY_SECRET"
  AIRFLOW__GOOGLE__OAUTH_CALLBACK_ROUTE: "/oauth2callback"
  AIRFLOW__GOOGLE__DOMAIN: "MY_DOMAIN"
```
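The ConfigMap above does not include the remote-logging keys that push the task logs to S3. A minimal sketch of what they would look like is below, assuming an Airflow connection named `s3_logs` has been created for the bucket; the connection ID and bucket path are placeholders, not values from the original setup:

```yaml
  # Hypothetical additions to the airflow-env ConfigMap for S3 task logs.
  # "s3_logs" is an assumed Airflow connection ID; the bucket path is a placeholder.
  AIRFLOW__CORE__REMOTE_LOGGING: "True"
  AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER: "s3://MY_BUCKET/airflow/logs"
  AIRFLOW__CORE__REMOTE_LOG_CONN_ID: "s3_logs"
```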
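The pod template file referenced by `AIRFLOW__KUBERNETES__POD_TEMPLATE_FILE` is not shown in the post. Below is a minimal sketch of what such a template could look like, with a git-sync init container doing the one-time DAG sync into the worker pod; the image tags, repo URL, and volume names are assumptions for illustration:

```yaml
# Hypothetical pod_templates/pod.yaml: a minimal worker pod where a git-sync
# init container clones the DAG repo once before the task container starts.
apiVersion: v1
kind: Pod
metadata:
  name: airflow-worker
spec:
  initContainers:
    - name: git-sync
      image: k8s.gcr.io/git-sync/git-sync:v3.3.0   # assumed image tag
      env:
        - name: GIT_SYNC_REPO
          value: "https://github.com/MY_ORG/dags-repo.git"  # placeholder repo URL
        - name: GIT_SYNC_BRANCH
          value: "master"
        - name: GIT_SYNC_ROOT
          value: "/repo-sync"
        - name: GIT_SYNC_DEST
          value: "dags-repo"
        - name: GIT_SYNC_ONE_TIME
          value: "true"        # sync once and exit, suitable for an init container
      volumeMounts:
        - name: repo-sync
          mountPath: /repo-sync
  containers:
    # KubernetesExecutor expects the task container to be named "base".
    - name: base
      image: MY_REGISTRY/airflow:latest   # the same Airflow image plus extra packages
      envFrom:
        - configMapRef:
            name: airflow-env
      volumeMounts:
        - name: repo-sync
          mountPath: /repo-sync
  volumes:
    - name: repo-sync
      emptyDir: {}
```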