I have a Python script that runs daily (triggered by Airflow), hits the `search_issues` endpoint, returns all issues that have been `last_updated` since the previous run, and loads them into Snowflake. Until the last few months this worked seamlessly. Nothing in the script has changed, and the Airflow task still runs and completes successfully with no errors, but when I look in Snowflake no new issues have been loaded.
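To give a rough idea of what the script does, here's a stripped-down sketch (the server, credentials, JQL window, and table/column names are placeholders, and the real job loads more fields than this):

```python
# Simplified sketch of the daily job; server, credentials, and table
# names below are placeholders, not the real configuration.
from jira import JIRA
import snowflake.connector

jira = JIRA(
    server="https://example.atlassian.net",
    basic_auth=("user@example.com", "API_TOKEN"),
)

# Pull everything updated since the previous run (here: the last day).
issues = jira.search_issues("updated >= -1d", maxResults=False)

conn = snowflake.connector.connect(
    account="...", user="...", password="...",
    warehouse="...", database="...", schema="...",
)
cur = conn.cursor()
for issue in issues:
    # Upsert each issue so reruns are idempotent.
    cur.execute(
        "MERGE INTO jira_issues t "
        "USING (SELECT %s AS issue_key, %s AS updated_at) s "
        "ON t.issue_key = s.issue_key "
        "WHEN MATCHED THEN UPDATE SET t.updated_at = s.updated_at "
        "WHEN NOT MATCHED THEN INSERT (issue_key, updated_at) "
        "VALUES (s.issue_key, s.updated_at)",
        (issue.key, str(issue.fields.updated)),
    )
conn.commit()
```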
When I run this exact same script locally in a Jupyter notebook, it runs successfully and updates the table in my dev environment with new/updated issues. Where it gets weird: if I then return to Airflow, without changing a single thing in the script or deploying anything new, and rerun the task, it updates production.
It doesn't happen every day, either. Some days the Airflow task runs and everything updates as intended; other times it goes days without anything being updated.
Does anyone have any idea why this could be happening?
@Shannon, welcome to the Atlassian community!
I know there was a caching bug in Airflow in the past that sounds very similar to what you're running into, but I thought it had been fixed. Did you ever get this resolved?