Handling Large Amounts of Web Service Requests and Data Storage

database · scalability · web-services

I have a desktop application that sends a POST request to my website every 5 minutes. This POST request can contain a variable amount of data about the PC sending it, but typically it contains about 20-30 variables of information.

I'm storing all of this information in a MySQL table in a format like RequestID | Field | Value. I want to keep all of it so I can run trend reports over 24 hours, 48 hours, weeks, months, years, etc.
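That one-row-per-variable layout (an entity-attribute-value style table) might look like the sketch below. Table and column names are hypothetical, and SQLite stands in for MySQL just to keep the example self-contained:

```python
import sqlite3

# In-memory SQLite stand-in for the MySQL table; the table and column
# names (requests, request_data) are illustrative, not from the question.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE requests (
    request_id INTEGER PRIMARY KEY,
    pc_id      TEXT NOT NULL,
    received   TEXT NOT NULL          -- timestamp of the POST
);
CREATE TABLE request_data (
    request_id INTEGER REFERENCES requests(request_id),
    field      TEXT NOT NULL,
    value      TEXT NOT NULL
);
CREATE INDEX idx_data_field ON request_data(field);
""")

# One 5-minute POST becomes one requests row plus one row per variable.
conn.execute("INSERT INTO requests VALUES (1, 'pc-42', '2024-01-01 00:00')")
conn.executemany(
    "INSERT INTO request_data VALUES (1, ?, ?)",
    [("cpu_pct", "17"), ("mem_free_mb", "2048"), ("disk_free_gb", "120")],
)
print(conn.execute("SELECT COUNT(*) FROM request_data").fetchone()[0])  # → 3
```

With 20-30 variables per POST, every request fans out into 20-30 `request_data` rows, which is where the row counts below come from.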

This could amount to something like 10,000 rows of data every day. That's over 3.5 million rows of data a year.

I can imagine 20-30 people using my website and the database storing 100 million rows of data a year.

What is the best way to approach this kind of 'notification' service and scale it?

Best Answer

Only send changes.

Since you're running trend reports, you don't have to send every 5 minutes. Stash the readings on the desktop and send them in batches when the user isn't busy; just keep the local cache from growing too large.
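"Only send changes" can be as simple as diffing the current readings against the last payload that was actually sent and posting only the fields that differ. A minimal sketch (the field names and the shape of the payload are assumptions, not from the question):

```python
# Compare the latest readings against the last payload actually sent
# and build a POST body containing only the fields that changed.

def diff_payload(last_sent: dict, current: dict) -> dict:
    """Return only the fields whose values differ from the last send."""
    return {k: v for k, v in current.items() if last_sent.get(k) != v}

last_sent = {"cpu_pct": 17, "mem_free_mb": 2048, "disk_free_gb": 120}
current = {"cpu_pct": 35, "mem_free_mb": 2048, "disk_free_gb": 119}

changes = diff_payload(last_sent, current)
print(changes)  # only cpu_pct and disk_free_gb changed
```

For mostly-stable variables this cuts the stored row count sharply, since unchanged fields generate no rows at all.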

Aggregate the data into daily averages, then weekly or monthly ones. You can offload the old detail rows to an archive (flat files or a separate data store) in case you later want to bring them back for a new trend analysis or formula that needs the full detail.
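The roll-up step can be sketched like this: group the raw 5-minute readings by day and field, then store one average per group before archiving the detail rows. The tuple layout and field names here are illustrative:

```python
from collections import defaultdict
from statistics import mean

# Roll raw readings up into one daily average per field, so the detail
# rows can be archived while trend reports keep working on the summary.

def daily_averages(rows):
    """rows: iterable of (date, field, numeric_value) tuples."""
    buckets = defaultdict(list)
    for date, field, value in rows:
        buckets[(date, field)].append(value)
    return {key: mean(values) for key, values in buckets.items()}

rows = [
    ("2024-01-01", "cpu_pct", 10.0),
    ("2024-01-01", "cpu_pct", 30.0),
    ("2024-01-02", "cpu_pct", 20.0),
]
print(daily_averages(rows))
```

A day of 5-minute readings (288 rows per field) collapses into a single summary row, which is what keeps the table from growing into the hundreds of millions of rows.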
