Argo Workflows is a powerful workflow orchestration engine built for Kubernetes. But like most systems built on Kubernetes, you usually use YAML to define resources. In Argo Workflows, these Workflow resources include complex DAGs (Directed Acyclic Graphs), with dependencies and parameters passed between tasks and artifacts uploaded to storage. If you try to express this imperative workflow logic in YAML, you're in for a rough ride: a single Workflow can eventually span thousands of lines, leaving you lost in a sea of indentation and key-value pairs whenever you want to make a change.

Enter Hera, the Python SDK for Argo Workflows. Hera treats Python functions as the unit of work, each running in its own pod, so orchestrating them becomes as effortless as calling the functions to build a DAG (see the sketch at the end of this abstract). With Hera, you get reusability and unit testing for free, something YAML can't offer, and you no longer need separate languages for your business logic and your orchestration logic, keeping everything under one roof.

In this session, you will:

* Understand the fundamentals of Argo Workflows and workflow orchestration
* Learn how Hera brings Kubernetes power to Python developers
* Explore best practices for writing Workflows in Hera
* Walk through a data science scenario workflow

If you're tired of wrestling with YAML but want to run your Python jobs on Kubernetes, this session is for you!
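
To give a flavor of what "calling functions to build a DAG" looks like, here is a minimal sketch in the style of Hera's documented diamond-DAG example; the function name `say` and the workflow name are illustrative, and submitting the workflow would additionally require a configured Argo server connection:

```python
from hera.workflows import DAG, Workflow, script


# Each @script function becomes a template that runs in its own pod.
@script()
def say(message: str):
    print(message)


# Calling the function inside a DAG context creates a task;
# the >> operator declares dependencies between tasks.
with Workflow(generate_name="dag-diamond-", entrypoint="diamond") as w:
    with DAG(name="diamond"):
        A = say(name="A", arguments={"message": "A"})
        B = say(name="B", arguments={"message": "B"})
        C = say(name="C", arguments={"message": "C"})
        D = say(name="D", arguments={"message": "D"})
        A >> [B, C] >> D  # B and C run after A; D runs after both

# Render the equivalent Argo Workflow YAML locally.
print(w.to_yaml())
```

The same Python that defines the DAG can be imported, reused, and unit tested like any other module, which is the core advantage over hand-written YAML.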