Now available in Labs

Your data engineering agent, built for Airflow.

An Airflow expert directly in your terminal, Otto knows your environment, learns your conventions, and gets smarter every session.

curl -sSL install.astronomer.io | sudo bash -s

Accelerate every part of your workflow.

Build a Dag, investigate a failure, or plan your next upgrade. Otto has the context to act, not just advise. Your environment, your patterns, and the Airflow expertise to know what to do with it.

~/pipelines/analytics · understand-data
otto> what tables contain customer data?
─ thinking ──────────────────────────────────────
[memory] check whether warehouse discovery has been initialized
[skill] load the data-analysis workflow before answering
[lineage] trace source systems so suggestions have context
[profile] add table-level shape + quality details
 
read  .astro/memory/MEMORY.md
skill warehouse-init
skill analyzing-data
skill tracing-upstream-lineage
skill profiling-tables
 
Customer-related tables
analytics.dim_customers — attributes, lifecycle, region
analytics.fct_orders — order revenue · customer_id
analytics.dim_products — catalog + categories
marts.customer_revenue_daily — pre-aggregated
 
Revenue by product · last quarter
Auto Insurance $4.2M +8.4%
Roadside Assistance $1.6M +3.1%
Extended Warranty $1.1M −2.7%
Gap Coverage $780K +11.2%
 
otto> drill into auto insurance?
Understand your data

Real operational knowledge. Your team's conventions.

Every pipeline execution, failure, and correction generates signal. Otto captures it, learns from it, and feeds it back, so every session picks up where the last one left off.

Open source · Public knowledge

Airflow & provider docs.

Available to any agent

Airflow docs, provider documentation, community best practices. Available to any agent via the astronomer/agents repo on GitHub.

Proprietary · Astronomer

Compatibility knowledge base.

Built by Astronomer

Version matrices, deprecation maps, upgrade paths, failure signatures. Built from eight years of running Airflow at enterprise scale.

Your team · Otto Memory

Conventions & operational history.

Private to your org

Your conventions, your connection configs, your corrections, and operational history. Accumulated automatically. Compounds every session.
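As an illustration only, a memory file like the conventions.yaml shown in the IDE mockup might capture team conventions along these lines. The actual schema is not documented on this page; every field name and value below is hypothetical:

```yaml
# Hypothetical sketch of an Otto Memory conventions file.
# The real format is not documented here; keys and values are illustrative.
dag_conventions:
  owner_tag: data-platform
  naming_pattern: "etl__<source>__<target>"   # matches Dags like etl__orders__warehouse
  default_retries: 2
corrections:
  - note: "Reporting Dags should use the MEDIUM Snowflake warehouse"
    source: "PR #1482 review"
```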

Available where you work.

Access Otto in your terminal or directly in Astro today, with more ways to work with Otto coming soon.

Astro CLI

In your terminal

One command in the Astro CLI, astro otto, gives you pre-configured access to skills, tools, and proprietary context, all bundled. No API keys, no separate setup.

~/pipelines · zsh
$ astro otto
Otto v0.4 · authenticated as liv@ford.com · deployment: prod-analytics
 
otto> catch me up on overnight runs
▸ 42 dags completed · 2 warnings · 1 failure
fail  etl__orders__warehouse · Snowflake timeout · third occurrence
slow  daily_marketing_rollup · 3.2× normal runtime
 
otto> open etl__orders__warehouse?
Astro IDE

In Astro

Otto is available directly in Astro: author Dags, plan upgrades, and investigate failures without leaving your browser. No local setup required.

astro.io/w/ford-analytics/ide · etl__snowflake__reporting.py · providers.py
dags/
daily_marketing_rollup.py
etl__snowflake__reporting.py
etl__orders__warehouse.py
hourly_metrics.py
memory/
conventions.yaml
retries.yaml
you>  Why is etl__orders__warehouse running 3× slower this week?
otto> [read_runs] [scan_logs] Found it: the Snowflake warehouse size dropped from MEDIUM to XSMALL last Tuesday. That change isn't in the repo; someone edited it in the console. Want me to open a PR to pin it?
you>  yes, and tag the platform team
otto> [open_pr] PR #1482 opened · reviewers: @platform-infra
Coming Soon

On your desktop

Everything a data engineer needs in one place. Local Airflow that starts in seconds, Otto always available, and direct deployment to Astro. No container setup. No context switching.

Coming Soon

In any AI tool

Bring Otto’s Airflow expertise into Claude Code, Cursor, or any AI tool your team already uses via MCP. Access the same proprietary context, wherever you work.

Coming Soon

Migrate to Airflow faster with Otto.

Migration projects stall because workflow-to-workflow translation is slow, manual, and requires deep expertise on both sides. Otto accelerates that work, whether you're consolidating from schedulers like Control-M or Autosys or migrating across Airflow environments.

Migration support is coming soon. If your team has an active migration project, we'd love to talk.

The future of data engineering starts here.

Stop drowning in operational work. Otto is how your team delivers trusted, timely data, without the overhead. Curious what that looks like for your environment? Let's talk.

Talk to Us

FAQs

How can I access Otto?

Otto is available in Labs to all Astro customers and can be accessed today in two places:

  • Astro CLI: run astro otto in your terminal. Otto launches pre-authenticated with your Astro credentials. No API keys, no separate setup.
  • Astro IDE: Otto is available directly in Astro for Dag authoring, debugging, and upgrades in your browser.

More interfaces are on the way, including a desktop app and access via MCP inside tools like Claude Code, Cursor, and more. See how to get started with Otto.

Not an Astro customer yet? Start a free trial to access Otto.

How is Otto priced?

Otto is free to use during Labs as part of your existing Astro plan with usage limits. If you hit a limit, you can easily request more and we'll work with you from there.

Which AI models does Otto support?

Otto supports models from OpenAI, Anthropic, and Google. You can choose your model and switch mid-session. For more details, see the documentation.