Blogger – How to Programmatically Back Up a Blogger Site


If you are logged in to your Blogger account, you can easily download a dump of your website by clicking on the Settings → Other → Export Blog link. (Source.)

How can I do this automatically each day on Linux, from a cron job?

Best Answer

First you need to log in with curl; then you can make a second call to the download link.

I would follow this article to accomplish this. The trick is to find out which URL you need to call to export your blog.

What I did:

  1. Use Chrome.
  2. Open the developer tools (F12 or Ctrl+Shift+J).
  3. Select the Network tab.
  4. Find the "export" button/link and click it.

You will see the download happen, and a new record appear in the Network tab. Mine is http://draft.blogger.com/feeds/7135654868651822450/archive (if you are not authenticated you will get a 404).
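The long number in that URL is the blog's numeric feed ID, which the script below needs. Assuming you have copied your own archive URL out of the Network tab (the ID here is the example one above, not yours), a quick way to pull the ID out of it:

```shell
# Extract the numeric feed ID from a copied archive URL.
URL=http://draft.blogger.com/feeds/7135654868651822450/archive
FEED=$(echo "$URL" | sed 's|.*/feeds/\([0-9]*\)/archive|\1|')
echo "$FEED"    # prints 7135654868651822450
```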


(update by Mark Harrison) Based on the information above, here is a script that will download the XML dump of the blog. (end update)

FEED=12312312312312313          # your blog's numeric feed ID
GMAIL=yourname@gmail.com

# Prompt for the Google password without echoing it.
echo -n "google passwd: "
stty -echo
read GPASS
stty echo
echo

# Log in via ClientLogin and pull the Auth token out of the response.
AUTH=$(curl 2>/dev/null https://www.google.com/accounts/ClientLogin \
    -d Email="$GMAIL" \
    -d Passwd="$GPASS" \
    -d accountType=GOOGLE \
    -d source=blogix \
    -d service=blogger \
  | grep '^Auth=' | cut -c 6-)

# Fetch the archive feed with the token and save it as blog.xml.
curl >blog.xml \
    -H "Authorization: GoogleLogin auth=$AUTH" \
    "http://www.blogger.com/feeds/$FEED/archive"