Cloud Computing

This lesson shows how to produce your own selection of data starting from CMS open data. We will use “POET”, the software you became familiar with during the workshop, together with a simple example that runs some heavy-duty tasks on a public cloud provider. We only scratch the surface of what these tools can do, but the lesson lays a solid foundation for running a realistic physics analysis on GCP with Kubernetes.

If you run into problems with any of these steps, please reach out to the organizers through the Mattermost channel for the workshop.

Prerequisites

For this lesson we will be running Argo Workflows on a Kubernetes cluster in GCP, which you should be familiar with if you went through the cloud pre-exercises.
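
Before the first episode, it is worth confirming that the setup from the pre-exercises still works. Below is a minimal sketch of such a check, assuming you already created a GKE cluster and installed Argo Workflows in the argo namespace; the cluster name and zone are placeholders:

```bash
# Fetch credentials for your cluster (name and zone are placeholders)
gcloud container clusters get-credentials <your-cluster-name> --zone=<your-zone>

# The cluster nodes should be listed and in Ready state
kubectl get nodes

# The Argo Workflows pods should be running (assuming the "argo" namespace)
kubectl get pods -n argo

# The argo CLI should report its version
argo version
```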

Schedule

         Setup                                Download files required for the lesson
00:00    1. Introduction                      How to get to your cluster?
                                              How to get ready to submit the workflow?
00:10    2. Demo: Run a Full Analysis Flow    How do I follow the progress of a workflow?
                                              What are the different steps in the example workflow?
01:15    3. Cloud challenges                  How to adapt the workflow to my needs?
                                              How to get my own code in the processing step?
                                              How to change the resource requests for a workflow step?
02:25    4. Next steps                        What are the factors for efficient use of cloud resources?
02:45    Finish

The actual schedule may vary slightly depending on the topics and exercises chosen by the instructor.
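
As a taste of what the demo episode covers, the sketch below shows how one might submit a workflow and follow its progress with the argo CLI. It assumes Argo Workflows is installed in the argo namespace, and the manifest name argo-poet.yaml is a placeholder for the workflow file used in the lesson:

```bash
# Submit the workflow manifest (argo-poet.yaml is a placeholder name)
argo submit -n argo argo-poet.yaml

# List workflows and their overall status
argo list -n argo

# Inspect the individual steps of the most recent workflow
argo get -n argo @latest

# Stream logs from the running steps
argo logs -n argo @latest --follow
```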