python - Pandas DataFrame.to_sql randomly and silently fails without error message - Stack Overflow


I am trying to write several pandas dataframes (all of the same format) to a Postgres db, but some of them are randomly not written to the db. In those cases to_sql fails silently, returning -1 instead of raising an error.

I don't use any schema, which should rule that issue out as a possible cause, and I am not using SQL Server either. What really puzzles me is that some of these dataframes are written to the db and some are not.

code:

from sqlalchemy import create_engine, DateTime
import psycopg
# One engine, reused for every dataframe write.
engine = create_engine('postgresql+psycopg://plantwatch:[email protected]/plantwatch')
# Append to the existing 'power' table, typing the timestamp column explicitly.
df.to_sql('power', con=engine, if_exists='append', index=False,
          dtype={'produced_at': DateTime})
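
Since the writes fail without raising, one option is to wrap each write in an explicit transaction and check the row count that to_sql reports. Below is a minimal sketch, not the original code: write_frame is a hypothetical helper, the connection string uses placeholder credentials, and the row-count check assumes pandas >= 1.4, where to_sql returns the rowcount reported by the driver (None, or -1 when the driver reports nothing useful):

import pandas as pd
from sqlalchemy import create_engine, DateTime

# Placeholder connection string; substitute real credentials.
engine = create_engine('postgresql+psycopg://user:pass@localhost/plantwatch')

def write_frame(df: pd.DataFrame) -> None:
    """Append df to 'power', raising if the write did not land."""
    # engine.begin() commits on success and rolls back on error,
    # re-raising the exception instead of swallowing it.
    with engine.begin() as conn:
        written = df.to_sql('power', con=conn, if_exists='append',
                            index=False, dtype={'produced_at': DateTime})
    # Treat a rowcount mismatch as a failed write instead of ignoring it.
    if written is not None and written >= 0 and written != len(df):
        raise RuntimeError(f'wrote {written} of {len(df)} rows')

# Usage, with the example frame from below:
df = pd.DataFrame({
    'produced_at': pd.to_datetime(['2015-01-01 00:00:00', '2015-01-01 01:00:00']),
    'id': ['someid', 'someid'],
    'value': [1, 2],
})
write_frame(df)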

Example df (one id per dataframe is written to the db) and expected db content:

produced_at             id          value
2015-01-01 00:00:00     someid      1
2015-01-01 01:00:00     someid      2
2015-01-01 00:00:00     someid2     1
2015-01-01 01:00:00     someid2     2

actual db content:

produced_at             id          value
2015-01-01 00:00:00     someid      1
2015-01-01 01:00:00     someid      2

A hacky workaround would be to dump every dataframe to a .csv file and import each one into Postgres by hand, but there has to be a better way.
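
A lighter check than round-tripping through .csv files is to read a count back after each append and compare it against the frame just written. A minimal sketch against the 'power' table from the example above (rows_for is a hypothetical helper; the credentials are placeholders):

from sqlalchemy import create_engine, text

engine = create_engine('postgresql+psycopg://user:pass@localhost/plantwatch')

def rows_for(frame_id: str) -> int:
    # Count how many rows for this id actually landed in the table.
    with engine.connect() as conn:
        return conn.execute(
            text('SELECT count(*) FROM power WHERE id = :id'),
            {'id': frame_id},
        ).scalar_one()

# e.g. after appending the two-row 'someid' frame:
assert rows_for('someid') == 2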
