I don't see an easy workaround, unfortunately. Basically, until this is fixed, BigQuery columns of type TIMESTAMP will not work for pre-1900 values, so you'll either have to change the values or the type of the column.
Thanks,
Cham

On Wed, Apr 25, 2018 at 9:51 AM Yatish Gupta <ygupta@xxxxxxxxxx> wrote:

> Thanks! In the meantime, is there a workaround for this?
>
> Thanks!
> Yatish
>
> On Wed, Apr 25, 2018 at 12:20 AM Chamikara Jayalath <chamikara@xxxxxxxxxx> wrote:
>
>> Thanks for reporting. This seems to be due to a known Python bug:
>> https://bugs.python.org/issue1777412
>>
>> It seems the above bug has not been fixed for the Python 2.x line. I created
>> https://issues.apache.org/jira/browse/BEAM-4171 to track the Beam issue and
>> possibly update the Beam SDK to not use 'strftime'.
>>
>> Thanks,
>> Cham
>>
>> On Tue, Apr 24, 2018 at 2:33 PM Yatish Gupta <ygupta@xxxxxxxxxx> wrote:
>>
>>> Hey,
>>>
>>> I am getting an exception when running Beam on BigQuery rows that contain
>>> timestamps earlier than 1900. I get a similar error when using the DirectRunner.
>>>
>>>   File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/nativebigqueryavroio.py", line 83, in _fix_field_value
>>>     return dt.strftime('%Y-%m-%d %H:%M:%S.%f UTC')
>>> ValueError: year=1851 is before 1900; the datetime strftime() methods require year >= 1900
>>>
>>> How do I read these rows without changing my schema?
>>>
>>> Thanks!
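[Editor's note: for context, the fix tracked in BEAM-4171 is to stop relying on strftime(), which on Python 2 rejects years before 1900. The sketch below illustrates the idea: the same 'YYYY-MM-DD HH:MM:SS.ffffff UTC' string the SDK produces can be built with plain string formatting, which works for any year. The helper name format_timestamp is hypothetical, not part of the Beam SDK.]

```python
import datetime

def format_timestamp(dt):
    """Format a datetime as 'YYYY-MM-DD HH:MM:SS.ffffff UTC' without strftime.

    Hypothetical helper (not part of the Beam SDK); it mirrors the format
    string used in nativebigqueryavroio.py but avoids strftime(), whose
    Python 2 implementation raises ValueError for years before 1900.
    """
    return '%04d-%02d-%02d %02d:%02d:%02d.%06d UTC' % (
        dt.year, dt.month, dt.day,
        dt.hour, dt.minute, dt.second, dt.microsecond)

# A pre-1900 value that makes dt.strftime(...) fail on Python 2:
print(format_timestamp(datetime.datetime(1851, 3, 2, 10, 30, 0, 123456)))
# 1851-03-02 10:30:00.123456 UTC
```

Until the SDK itself is patched, the practical workarounds remain the ones discussed above: change the values, or change the column type (for example, reading the column as STRING).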