Creates a new PyAirbyte data ingestion job: an Airflow DAG that uses PyAirbyte to read from the specified source and write to Iceberg tables.
The access token received from the authorization server in the OAuth 2.0 flow.
Request model for creating an Airbyte ingestion job.
Name of the ingestion job.
PyAirbyte source identifier (e.g., 'source-github').
Source configuration key-value pairs (including secrets).
Target Iceberg schema where tables will be created.
Write mode: append, overwrite, or merge.
Optional list of streams to sync. If empty, all streams are synced.
Comma-separated merge key columns. If omitted for merge mode, Airbyte primary keys are used.
Cron expression for scheduled runs.
DataHub domain to assign to ingested tables.
DataHub tags to assign to ingested tables.
Owner user ID.
Optional per-stream column transformations (cast, rename, encrypt). Use the 'stream' field on each transformation to target a specific stream.
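A sketch of a request body covering the fields described above. The field names and the transformation shape are illustrative assumptions inferred from these descriptions, not confirmed by the actual schema:

```python
# Hypothetical request payload for creating an Airbyte ingestion job.
# All key names below are assumptions; check the real request model.
request_body = {
    "name": "github-ingest",
    "source": "source-github",                 # PyAirbyte source identifier
    "source_config": {                         # key-value pairs, including secrets
        "repositories": ["airbytehq/airbyte"],
        "credentials": {"personal_access_token": "***"},
    },
    "destination_schema": "raw_github",        # target Iceberg schema
    "write_mode": "merge",                     # one of: append, overwrite, merge
    "streams": ["issues", "pull_requests"],    # empty list would sync all streams
    "merge_keys": "id",                        # comma-separated; omit to fall back
                                               # to Airbyte primary keys
    "schedule": "0 2 * * *",                   # cron expression for scheduled runs
    "datahub_domain": "engineering",           # DataHub domain for ingested tables
    "datahub_tags": ["github", "raw"],         # DataHub tags for ingested tables
    "owner": "user-123",                       # owner user ID
    "transformations": [
        # 'stream' targets a specific stream; type is cast, rename, or encrypt
        {"stream": "issues", "type": "rename",
         "column": "body", "new_name": "description"},
    ],
}
```

Note that `merge_keys` is only meaningful when `write_mode` is `merge`, and each transformation entry uses its `stream` field to scope the change to one stream.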