1. Go to Cloud Shell.
2. Authenticate yourself in the shell (for example, with gcloud auth login).
3. Activate the virtual environment with the command --> source venv/bin/activate
4. Run the 4th command given in the Dataflow Python file in the 2nd hands-on task, and add this flag to that command:
--template-location='gs://<your-bucket-name>/composer/data.csv'
(Setting this flag makes the Dataflow runner stage the pipeline as a template at that GCS path instead of running it; see the sketch below.)
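The exact 4th command is in the hands-on file, so it is not repeated here. As a rough illustration only, this sketch shows the Beam Python equivalent of that flag; the project, region, and bucket values are placeholders, not the real ones:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Placeholder pipeline -- the real one lives in the hands-on Dataflow file.
    options = PipelineOptions(
        runner="DataflowRunner",
        project="your-project-id",                     # placeholder
        region="us-central1",                          # placeholder
        temp_location="gs://<your-bucket-name>/temp",  # placeholder
        # With template_location set, the runner stages a template at this
        # path instead of executing the pipeline:
        template_location="gs://<your-bucket-name>/composer/data.csv",
    )

    with beam.Pipeline(options=options) as p:
        p | beam.Create(["hello"]) | beam.Map(print)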

5. Run the command: gsutil cp gs://maniratnam_test/dummy_dag_test.py ./enter_anyname.py
6. Open enter_anyname.py in the editor or in nano (for example, nano enter_anyname.py).
7. In nano, find the comment which says # The id you will see in the DAG airflow page. Below it, change the id inside the double quotes to the format "empid_yourname_DAG".
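As a rough sketch only (the actual dummy_dag_test.py will look different; the id inside the double quotes is the part you change), the relevant section may look something like:

    from datetime import datetime
    from airflow import DAG

    # The id you will see in the DAG airflow page
    with DAG(
        dag_id="empid_yourname_DAG",   # change this to your own empid and name
        start_date=datetime(2024, 1, 1),
        schedule_interval=None,
    ) as dag:
        ...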

8. Then in the same nano file, find the task that looks like t8 = DataFlowTemplatedJob( ... ), and change the template argument below task_id to template="gs://<your-bucket-name>/composer/data.csv" (see the sketch below).
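For reference, a minimal sketch of such a task, assuming the current Google provider operator DataflowTemplatedJobStartOperator (the operator name in the hands-on file may be an older spelling of it); the task id, job name, and region below are placeholders:

    from datetime import datetime
    from airflow import DAG
    from airflow.providers.google.cloud.operators.dataflow import (
        DataflowTemplatedJobStartOperator,
    )

    with DAG(
        dag_id="empid_yourname_DAG",
        start_date=datetime(2024, 1, 1),
        schedule_interval=None,
    ) as dag:
        t8 = DataflowTemplatedJobStartOperator(
            task_id="run_templated_job",    # placeholder task id
            # point this at the template you staged in step 4:
            template="gs://<your-bucket-name>/composer/data.csv",
            job_name="empid-yourname-job",  # placeholder
            location="us-central1",         # placeholder region
        )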

9. Save the file (in nano: Ctrl+O to write out, then Ctrl+X to exit).

10. Run the 3rd command given by Mani in Teams, which goes something like gsutil cp ./yourfilename.py ..... (presumably copying your edited DAG file into the Composer environment's DAGs bucket).

11. Go to Cloud Composer, open the Airflow link shown in front of oct-batch2 under "Airflow webserver", and you will then find the DAG with your name there.
