How to define an AWS Athena view using Airflow
In this brief tutorial, I will show how to define an AWS Athena view using Airflow. We will need two things:
- AWSAthenaOperator
- the SQL query that defines the view
Let’s start with the query. To define the view, we have to run a CREATE VIEW statement. However, to keep the task rerunnable in Airflow, the statement should succeed even if the view already exists, so I suggest using CREATE OR REPLACE VIEW. Here is an example SQL query that creates a view:
create_view_sql = 'CREATE OR REPLACE VIEW the_view_name AS SELECT t1.* FROM some_table t1;'
Of course, such a view makes no sense, but it is good enough to use in a tutorial.
Now, we have to import the AWSAthenaOperator:
from airflow.contrib.operators.aws_athena_operator import AWSAthenaOperator
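As a side note, if you use a newer Airflow release where the contrib modules have been removed, the equivalent operator ships with the Amazon provider package. A minimal sketch of that import, assuming the apache-airflow-providers-amazon package is installed (the class is named AthenaOperator there):

# only needed on newer Airflow versions that no longer ship airflow.contrib
from airflow.providers.amazon.aws.operators.athena import AthenaOperator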
After that, we can create a new instance of the operator and add it to a DAG. Note that I have to specify the AWS connection id, which refers to a connection configured in Airflow, as well as the database in which I want to create the view. In addition to that, I have to pass the S3 location where Athena should store the query results, because Athena (being a Presto-based service) writes the results of every query to files:
create_view = AWSAthenaOperator(
    task_id='create_the_view',
    query=create_view_sql,
    aws_conn_id='aws_connection_id',  # connection configured in Airflow
    database='athena_database',  # Athena database that will contain the view
    output_location='the_output_location_of_athena_queries',  # an s3:// path for query results
    dag=dag
)
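For completeness, here is a minimal sketch of the surrounding DAG definition that the snippet above assumes. The DAG id, schedule, start date, and the S3 results path are placeholders I made up for illustration; replace them with values from your own setup:

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.aws_athena_operator import AWSAthenaOperator

create_view_sql = 'CREATE OR REPLACE VIEW the_view_name AS SELECT t1.* FROM some_table t1;'

# placeholder DAG configuration
dag = DAG(
    dag_id='create_athena_view',
    schedule_interval='@daily',
    start_date=datetime(2020, 1, 1)
)

create_view = AWSAthenaOperator(
    task_id='create_the_view',
    query=create_view_sql,
    aws_conn_id='aws_connection_id',
    database='athena_database',
    output_location='s3://your-bucket/athena-query-results/',  # assumed example path
    dag=dag
)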