Re: SqlAlchemy Pool config parameters to minimize connectivity issue impact
We also faced this issue a month back on Airflow 1.9 with the Celery
Executor. Unfortunately, we could not find the root cause immediately. This
can occur due to too many open connections. As an immediate fix we ended up
closing all open connections and restarting the MySQL instance. But I
think what Kevin mentioned could be the cause of our problem too: we happened
to run the airflow run command many times last week before we started to see the issue.
My doubt is: when is the connection opened by an airflow run command actually
closed? Does it follow the SQL_ALCHEMY_POOL_RECYCLE (1 hour) setting and get
killed every hour? Because I think we saw some very old connections running too.
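[Editor's note on the question above: SQLAlchemy's pool_recycle does not kill connections on a timer. A connection older than the recycle interval is closed and replaced only at the moment it is checked out of the pool again, so a connection that is held open, or never returned to a pool at all, can easily outlive a 1-hour pool_recycle setting. A minimal stdlib-only sketch of that recycle-on-checkout behaviour; TinyPool and the tiny recycle interval are illustrative, not Airflow or SQLAlchemy code:]

```python
import time


class PooledConnection:
    """Stand-in for a DB-API connection; only tracks its age."""

    def __init__(self, recycle_after):
        self.created_at = time.monotonic()
        self.recycle_after = recycle_after

    def stale(self):
        return time.monotonic() - self.created_at > self.recycle_after


class TinyPool:
    """Toy model of recycle-on-checkout pooling.

    Nothing closes connections on a background timer: a connection older
    than ``recycle_after`` is discarded and replaced only when a caller
    checks it out again. A connection that is never returned to the pool
    is never recycled at all.
    """

    def __init__(self, recycle_after):
        self.recycle_after = recycle_after
        self.idle = []       # connections waiting to be reused
        self.recycled = 0    # how many stale connections were replaced

    def checkout(self):
        while self.idle:
            conn = self.idle.pop()
            if conn.stale():
                self.recycled += 1   # "close" the stale connection ...
                continue             # ... and keep looking for a fresh one
            return conn
        return PooledConnection(self.recycle_after)

    def checkin(self, conn):
        self.idle.append(conn)
```

This is why very old connections can still show up in SHOW PROCESSLIST even with pool_recycle set: recycling only ever happens lazily, at checkout time.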
On Fri, Sep 28, 2018 at 4:32 PM Kevin Yang <yrqls21@xxxxxxxxx> wrote:
> Hi Raman,
> Would you elaborate a bit more on exactly which connectivity issues you
> were facing, and which version of Airflow you are on? We previously had
> a connectivity issue when the number of connections was too large, and
> we fixed it with this PR
> Kevin Y
> On Tue, Sep 25, 2018 at 11:57 PM ramandumcs@xxxxxxxxx <
> > Hi All,
> > We are observing that DAG tasks sometimes fail because of connectivity
> > issues with the MySQL server.
> > Are there any recommended settings for the MySQL pool-related parameters,
> > such as
> > sql_alchemy_pool_size = 5
> > sql_alchemy_pool_recycle = 3600
> > to minimise the impact of these connectivity issues?
> > Thanks,
> > Raman Gupta
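[Editor's note: a hedged sketch of how the pool settings discussed in this thread would appear in airflow.cfg. The values are illustrative only and should be tuned against the server's wait_timeout (see SHOW VARIABLES LIKE 'wait_timeout' on MySQL):]

```
[core]
# Number of connections kept in the SQLAlchemy pool; illustrative value.
sql_alchemy_pool_size = 5

# Recycle pooled connections older than this many seconds. Keeping it
# below MySQL's wait_timeout avoids the server dropping the connection
# first; note the recycle happens lazily at checkout, not on a timer.
sql_alchemy_pool_recycle = 1800
```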