Google today launched Cloud Composer, a managed Apache Airflow service, in beta. The tool gives teams a consistent way to create and manage workflows.

Analysts and engineers use workflows to automate manual processes, saving time and reducing the possibility of errors. These workflows, and the mechanisms that run them, are often critical pieces of infrastructure; they range from ad-hoc scripts to full-featured frameworks, and managing them can be time-intensive and error-prone.

The Google Cloud team wants to solve this problem with a single managed solution at the platform level. Cloud Composer and Airflow currently support BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Datastore, Cloud Storage, and Cloud Pub/Sub. Pricing for Cloud Composer is consumption-based, so you pay for what you use, as measured by vCPU/hour, GB/month, and GB transferred/month — there are multiple pricing units because Cloud Composer uses several GCP products as building blocks.

Here is Google’s justification for Cloud Composer:

When creating this workflow, did the author use standard tools and save time by reusing previously developed code from other workflows? Do other people on the team or in the organization know this workflow exists and how it works? Is it easy for everyone to understand the state of this workflow and to investigate any problems when they occur? Will workflow authors all easily or immediately know the APIs needed to create rich workflows? Without a common workflow language and system, the answer to these questions is most frequently “no.”

Considering these workflows can be mission-critical, we…