I'm using a Python script with the MySQLdb library to upload a large amount of data to a local MySQL database. When I used a sample of the data, everything uploaded fine. Now I am using all of my data (about 300,000 lines split across 12,000 text files) and I receive the following error:
OperationalError: (2003, "Can't connect to MySQL server on 'localhost' (10055)")
At the point where it crashes, it has uploaded 17,231 rows. I'm using Python 2.7, Windows 7 64-bit, and MySQL 5.1.53 on a desktop machine, installed via WAMP, and I view the data with phpMyAdmin. An example of the function I am using to upload the data is as follows:
# upload data
def updateDB(db, table, values):
    # opens (and closes) a fresh connection on every call
    db = MySQLdb.connect(host='localhost', user='root', passwd='', db=db)
    cursor = db.cursor()
    print str(values)
    cursor.execute(makeSQLHeader(table, values), values)
    db.commit()
    db.close()
This works fine for the test data, and the text file structure is identical across all of my data. What would be a good strategy for addressing this problem?
Best Answer
My first thought is that the load is taking a long time and you're being disconnected by a timeout. There are some ways to auto-reconnect in Python, found here. The relevant timeout variables in MySQL are wait_timeout and interactive_timeout.
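If a timeout is the culprit, those two server variables can be raised in the WAMP installation's my.ini. This is only a sketch — the 28800-second values below are MySQL's defaults, shown for illustration, not a recommendation:

```ini
[mysqld]
# Seconds an idle non-interactive connection may sit
# before the server drops it (default 28800 = 8 hours)
wait_timeout = 28800
# Same, for interactive clients such as the mysql shell
interactive_timeout = 28800
```

After changing my.ini, restart the MySQL service for the new values to take effect.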
My second idea, and from the discussion in the comments it looks like the right one: you're probably exhausting connections by opening a new one per file. Windows error 10055 is the Winsock error WSAENOBUFS ("no buffer space available"), which is consistent with opening and closing thousands of sockets in quick succession.
Try opening a single connection and re-running the load (when you do this, you might hit the timeout instead, depending on how long it takes).
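A minimal sketch of that single-connection pattern. The connection object and the SQL-building helper are passed in as parameters here (make_sql_header stands in for the makeSQLHeader helper from the question, whose body wasn't shown); in the real script the connection would come from one MySQLdb.connect call at startup:

```python
def upload_all(connection, table, rows, make_sql_header):
    """Insert every row over one shared connection, committing once,
    instead of connecting and disconnecting per file."""
    cursor = connection.cursor()
    try:
        for values in rows:
            # make_sql_header builds the parameterized INSERT statement,
            # as makeSQLHeader does in the original script
            cursor.execute(make_sql_header(table, values), values)
        connection.commit()
    finally:
        cursor.close()
        connection.close()
```

With this shape, updateDB's per-call connect/close disappears and the whole load uses one socket, so the per-file connection churn that triggers error 10055 goes away.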