Pass parameters to SQL query when using PostgresOperator in Airflow
The PostgresOperator allows us to use a SQL file as the query. However, when we do that, the standard way of passing template parameters no longer works.
For example, if I have the following SQL query:
SELECT column_a, column_b FROM table_name WHERE column_a = {{ some_value }}
Airflow will not automatically pass the some_value variable as the parameter.
In this situation, I have to use the params attribute of the PostgresOperator:
# In Airflow 1.10, the import is: from airflow.operators.postgres_operator import PostgresOperator
from airflow.providers.postgres.operators.postgres import PostgresOperator

PostgresOperator(
    task_id='postgres_task',
    postgres_conn_id='postgress_sophi',
    params={"some_value": "'the value I want'"},
    sql='query.sql',
    dag=dag)
Note that I had to wrap the value in single quotes! In Postgres, double quotes are used to reference columns, tables, or functions, while string literals go inside single quotes.
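To make the difference concrete, in the following query the double quotes mark column_a and table_name as identifiers, while the single quotes mark a string literal:

SELECT "column_a" FROM "table_name" WHERE "column_a" = 'the value I want'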
In the query.sql file, I have to change the template variable to params.some_value:
SELECT column_a, column_b FROM table_name WHERE column_a = {{ params.some_value }}
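With the params value above, Jinja renders the file into this query before Airflow sends it to Postgres:

SELECT column_a, column_b FROM table_name WHERE column_a = 'the value I want'

Because params substitution is plain text templating rather than SQL escaping, an alternative worth considering is the parameters argument of the PostgresOperator, which hands the values to the database driver as bind parameters. A minimal sketch, assuming the same connection and query file as above:

PostgresOperator(
    task_id='postgres_task',
    postgres_conn_id='postgress_sophi',
    # parameters are passed to the database driver, which quotes and escapes the values itself
    parameters={"some_value": "the value I want"},
    sql='query.sql',
    dag=dag)

In query.sql, the Jinja variable is then replaced with a psycopg2-style named placeholder, and no manual quoting is needed:

SELECT column_a, column_b FROM table_name WHERE column_a = %(some_value)s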