
[SCRIPT] Automated FTP Backup/Restore of stats

Discussion in 'Tomato Firmware' started by psychowood, Mar 5, 2007.

  1. psychowood

    psychowood LI Guru Member

    Hi all! :)
    Two weeks ago I stumbled on a bricked WRT54GL v1.1 and got it for a very cheap price, thinking of trying to revive it with a JTAG cable, but in the end that wasn't needed: I managed to TFTP it by shorting the Intel flash pins :biggrin:

    After trying DD-WRT for a day I realized it was really too much (and too slow :rolleyes: ) for my needs, and flashed Tomato. Beautiful, in one word. :agree:

    Since it seems that CIFS isn't really reliable (don't know why, but it rarely mounts the folder, and CIFS stats backup is even less usable, imho), I wanted to organize a mindless and automated FTP backup/restore procedure.

    I found a site that had the script somewhat ready, with some minor typos, so I tried to correct and enhance it.

    Two script files are needed, init.sh and backup.sh. Both of them should be in "/jffs/rstatsbackup", and "/jffs/rstatsbackup/init.sh" should be called from the WAN Up script section of Tomato.

    Files:

    init.sh
    Code:
    LOCATION="url" #complete URL where the backups are stored, WITHOUT trailing slash, like "http://mysite.com/tomato"
    
    #Check if 1st execution since booted
    test -e /tmp/var/lib/misc/rstatsbackup.inited
    if [ $? == 1 ] ; then
    
    #Restore backup and init cron
    led amber off
    led white on
    
    cru d rstatsbackup;                 # remove previously existing cron entry for this script 
    service rstats stop;                 # kill rstats
    cd /tmp/var/lib/misc;               # directory where rstats keeps its data
    rm rstats-*                            # remove previously existing rstats data
    
    wget $LOCATION/rstats-history.gz
    if [ $? != 0 ] ; then
        led amber on
        led white off
        return 1
    fi
    
    wget $LOCATION/rstats-speed.gz
    if [ $? != 0 ] ; then
        led amber on
        led white off
        return 1
    fi
    
    wget $LOCATION/rstats-stime
    if [ $? != 0 ] ; then
        led amber on
        led white off
        return 1
    fi
    
    wget $LOCATION/rstats-source
    if [ $? != 0 ] ; then
        led amber on
        led white off
        return 1
    fi
    
    # If restore was ok (i.e. if the restore files were found) then add cron job and restart rstats
    
    # crontab: run the following script every 15 minutes...
    cru a rstatsbackup "0,15,30,45 * * * * /jffs/rstatsbackup/backup.sh";                    
    
    service rstats start;                    # restart rstats.
    echo 1 > /tmp/var/lib/misc/rstatsbackup.inited
    led white off
    
    fi
    
    backup.sh
    Code:
    USER="ftpusername"
    PASSWORD="ftppassword"
    SITE="ftpsite" #url WITHOUT ftp://, like "mysite.com" or "IPADDRESS"
    FOLDER="ftpfolder" #WITHOUT ending slash, like "/tomato"
    
    ps | grep -q rstats
    if [ $? == 0 ] ; then
    
    #backup rstats
    led amber on
    led white on
    
    ftpput  -u $USER -p $PASSWORD $SITE $FOLDER/rstats-history.gz /tmp/var/lib/misc/rstats-history.gz
    ftpput  -u $USER -p $PASSWORD $SITE $FOLDER/rstats-speed.gz /tmp/var/lib/misc/rstats-speed.gz
    ftpput  -u $USER -p $PASSWORD $SITE $FOLDER/rstats-stime /tmp/var/lib/misc/rstats-stime
    ftpput  -u $USER -p $PASSWORD $SITE $FOLDER/rstats-source /tmp/var/lib/misc/rstats-source
    
    led white off
    led amber off
    fi
    
    
    On init, the white led is on.
    On a backup, both amber and white leds are on.
    If the backup site can't be reached on init, rstats isn't restarted (so no stats) and the amber led stays on.

    NOTE: If you don't want your history gone after the 1st execution of the script, run backup.sh once BEFORE rebooting and BEFORE running init.sh
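
    For example, one way to do that first run (a sketch, assuming the paths above and a telnet session on the router):

    Code:
    # run once on the router, before the first reboot with init.sh in place
    cd /jffs/rstatsbackup
    chmod 755 init.sh backup.sh    # make sure both scripts are executable
    ./backup.sh                    # pushes the current rstats-* files to your FTP server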

    For the rest, the comments should speak for themselves; if an explanation is needed, just ask. :)

    And sorry for my bad english :redface:
     
  2. der_Kief

    der_Kief Super Moderator Staff Member Member

    Hi psychowood,
    thanks for that nice guide to automated backup/restore of BW data to/from a different location, beyond what Tomato offers on its own (I was playing around with backing up BW data via ftpput a while ago). Will try it when I have a little more time for it :wink:

    der_Kief
     
  3. psychowood

    psychowood LI Guru Member

    No problem, der_Kief :)
    Btw, the script is far from perfect (it doesn't check whether the backup completed correctly; if it can't restore, it simply locks up instead of retrying later and must be restarted manually; you need a web server alongside an FTP server if you are backing up on the LAN; and so on...), so if you have suggestions feel free to share them, of course.
    I have tested it over the last few days and did not find any bugs, but you know, they are always hidden :)
     
  4. jazzkantine

    jazzkantine LI Guru Member

    I was searching for a solution to back up the bandwidth stats. Thanks for your scripting. I'm using Tomato version 1.04.0944 on a Buffalo WHR-G54S. I configured the scripts and placed them on the router. I rebooted to see if it was working and it did not: the bandwidth history was gone. Do you see a problem in running the scripts on a Buffalo?
     
  5. der_Kief

    der_Kief Super Moderator Staff Member Member

    Hi jazzkantine,

    I still haven't had time to test this, but I think it has nothing to do with the brand of router. Maybe your adjustments are wrong! Recheck them. Can you describe the steps you took?

    der_Kief
     
  6. psychowood

    psychowood LI Guru Member

    As der_Kief said, it shouldn't depend on hardware, since Tomato is the same everywhere (except for some details like LEDs, SpeedBooster, and similar) :)

    Did you make a stats backup before restarting the router? When the WAN comes up, the router first downloads the rstats data from the web address you put in the script, so if you didn't run backup.sh once before restarting, it won't find anything online and the history is cleared (and perhaps, depending on the web server, rstats isn't even restarted). :redface:

    [EDIT]Added a note on the 1st post about this question.[/EDIT]
     
  7. jazzkantine

    jazzkantine LI Guru Member

    First I should confess that I'm completely new to this whole Linux command line thing. Before I turned the router off and on I executed the following line: "sh /jffs/rstatsbackup/backup.sh". I checked the FTP directory and everything looked just fine.. strange thing.
     
  8. psychowood

    psychowood LI Guru Member

    Yep, that's strange, I just tried clearing the stats and power cycling the router, and it restored everything correctly...

    Do you have some particular settings under Administration -> Bandwidth Monitoring? Mine are Enabled, Save to RAM and Save Every Hour.

    What are the sizes of the rstats* files you have on the FTP server?
     
  9. jazzkantine

    jazzkantine LI Guru Member

    same here..
    I made a screenshot of the FTP folder (screenshot attached).

    BTW: I'm using htaccess to restrict HTTP access, so my first line in init.sh looks like this: LOCATION="http://USER:PW@DOMAIN.TLD/FOLDER"
     
  10. psychowood

    psychowood LI Guru Member

    My history.gz and speed.gz are a bit bigger (twice the size), and one thing that is bothering me is that cron is set up to back up at minutes 0, 15, 30 and 45, but your uploads are timestamped 04.33 :confused: (slow FTP, perhaps?).

    Since some days have passed, you should have accumulated some traffic in the daily history by now, right?

    Try to telnet into the router and (after logging in)

    -"cru d rstatsbackup;" to remove the automated execution of backup.sh (we don't want it running while we are working, do we? :) )

    -"cd /tmp/var/lib/misc" and "ls -l rstats*" (without "", of course :biggrin:) and read the output (filesizes and timestamps)

    -"/jffs/rstatsbackup/backup.sh", check the console output for errors, and check via FTP whether the file sizes are the same as above and whether the timestamps match the moment you ran backup.sh

    -"/jffs/rstatsbackup/init.sh", as before check for errors and see if the size of the rstats* files is the same as before but with a different time (beware that there is still a possibility that your history will be erased again, so you may want to save the rstats* files somewhere else first, in jffs or in a CIFS mount, for example)
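
    Pulled together, the whole session would look roughly like this:

    Code:
    cru d rstatsbackup                    # stop the automated backup while testing
    cd /tmp/var/lib/misc
    ls -l rstats*                         # note file sizes and timestamps
    /jffs/rstatsbackup/backup.sh          # then compare sizes/timestamps on the FTP server
    /jffs/rstatsbackup/init.sh            # then check the rstats* files and times again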

    Never tried Tomato's WGET with a restricted access url, but should work anyway...
     
  11. psychowood

    psychowood LI Guru Member

    I made a small modification to the script, replacing "killall rstats" and "rstats" with "service rstats stop" and "service rstats start" in init.sh, and making backup.sh run the backup only if rstats is running. I'm not sure how useful it is, but it should avoid overwriting remote backups with blank stats files in some cases.
     
  12. der_Kief

    der_Kief Super Moderator Staff Member Member

    Hi,

    I have a question about the init script file. You say that it should be put in the WAN Up section, so it is executed each time the WAN comes up / is restarted. I connect to the internet via PPPoE and have a forced disconnect every 24 hours. So I think for me it makes no sense to put init.sh in the WAN Up section. Isn't it better to put this in the Init section? That way it's executed every time the router restarts/reboots !? Any suggestion?

    der_Kief
     
  13. psychowood

    psychowood LI Guru Member

    Hi der_Kief,
    I totally forgot about people who have forced disconnects, my shame! :redface:
    Added an if statement that checks for the presence of a file named rstatsbackup.inited in the RAM filesystem and runs the init script only if that file does not exist. If it doesn't exist, it is created after init.sh runs.
    This way, it will be run only after a reboot.

    Also corrected two typos and looked for errors twice; it should be OK now :)

    PS. I prefer the WAN Up section to the Init section, because this way it doesn't have to check for an existing connection before running.
     
  14. der_Kief

    der_Kief Super Moderator Staff Member Member

    Hi psychowood,

    thanks for the fast reply and for modifying the scripts. I will try that as soon as possible.

    der_Kief

    BTW I have put that in the FAQ :wink:
     
  15. psychowood

    psychowood LI Guru Member

    Thanks, but perhaps someone other than me should try it ;)
     
  16. Bob_C

    Bob_C Guest

    Great scripts, just what I've been searching for. Thanks for all your effort - much appreciated.

    I want to automatically retry the restore if it fails so I've changed the init.sh script as follows.

    1. After the line which deletes the cron job for the backup (cru d rstatsbackup) I've inserted these 3 lines. So if the script exits before all files are restored a cron job will retry the restore within 5 minutes.
    # Add a cron job to rerun this script within 5 minutes (delete the cron job first - it might already exist)
    cru d rstatsrestore
    cru a rstatsrestore "0,5,10,15,20,25,30,35,40,45,50,55 * * * * /jffs/rstatsbackup/init.sh"

    2. Before the line which adds the cron job for the backup (cru a rstatsbackup ....) I've added these 2 lines to delete the retry cron job.
    # Restore successful so delete the cron job which reruns this script
    cru d rstatsrestore
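
    Putting both changes together, the top of init.sh ends up looking roughly like this (just a sketch of where the new lines sit; the wget section in between is unchanged):

    Code:
    cru d rstatsbackup;                 # remove previously existing cron entry for this script
    # Add a cron job to rerun this script within 5 minutes (delete it first - it might already exist)
    cru d rstatsrestore
    cru a rstatsrestore "0,5,10,15,20,25,30,35,40,45,50,55 * * * * /jffs/rstatsbackup/init.sh"
    service rstats stop;
    # ... wget section unchanged ...
    # Restore successful so delete the cron job which reruns this script
    cru d rstatsrestore
    cru a rstatsbackup "0,15,30,45 * * * * /jffs/rstatsbackup/backup.sh";
    service rstats start;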

    Do you think this will work? Unfortunately I haven't been able to test it yet because when I try to create the /jffs/rstatsbackup directory I get an error message telling me that jffs is read only. Any idea what I'm doing wrong? And do you know if I will need to do anything special on a Windoze XP box to use it as the backup location?

    Please reply in simple language as I'm new to this router-hacking lark!

    BTW, I renamed the script from init.sh to restore.sh as it seems a more appropriate name.

    Regards
    Bob
     
  17. psychowood

    psychowood LI Guru Member

    Thanks Bob, you're welcome :)

    Nice one, that's exactly how I'd have done it; I don't see why it shouldn't work. Alternatively, it should be possible to put the whole init script in a while loop that checks for rstatsbackup.inited and sleeps 5 minutes on each repetition: this way you would not be using cron. There would be more changes to make, though.
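
    Something like this, as a rough sketch (untested; it would probably have to be started in the background with & so it doesn't block the WAN Up script):

    Code:
    # keep retrying the restore every 5 minutes until it succeeds, without cron
    while [ ! -e /tmp/var/lib/misc/rstatsbackup.inited ] ; do
        # ...the restore part of init.sh would go here...
        [ -e /tmp/var/lib/misc/rstatsbackup.inited ] || sleep 300    # wait 5 minutes and retry
    done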

    Nope, sorry.. never had that kind of problem, I just enabled jffs, formatted the partition, and got it running. Have you tried rebooting the router after enabling it?
    If you want to avoid jffs (or you simply can't get it working :) ) you could put every file on the Win box and wget (or ftpget) them in the WAN Up script. Of course, remember to change the file paths in the scripts first :rolleyes:

    I'd say you need both an FTP server and a web server. On a Win box I'd probably prefer using an FTP server alone, and I'd replace every wget call with an ftpget call in the script.

    Lol, I'm not a router geek either, I just know a bit of Linux scripting :)
    And I hope my English is understandable :redface:

    That was its first name; later I changed it because it has expanded a bit (in the beginning most of the script was in the WAN Up section). The final version will probably have three files: init, restore & backup, so everyone will be happy :biggrin:
     
  18. soganta

    soganta Network Guru Member

    cru a rstatsrestore "*/5 * * * * /jffs/rstatsbackup/init.sh"

    would do the same but shorter :)
     
  19. psychowood

    psychowood LI Guru Member

    Thanks for the advice, soganta :)

    I corrected (another :/ ) typo in init.sh:

    in the first lines,
    "if [ $? != 1 ] ; then"
    should have been
    "if [ $? == 1 ] ; then".
     
  20. derlbear

    derlbear Guest

    OK...... This script seems perfect for me. I tested DD-WRT with BWlog, but the backup thing didn't work. Now I have installed Tomato because of the great bandwidth monitor.

    I don't know how the backup thing works. Sometimes the router (WRT54GL) restarts and I don't know why. Result: the BW counter starts again at 0 kb.

    I tried to back up the BW files with Tomato (JFFS2), but that doesn't really work. When I restart the router, it doesn't load the last backup it made.

    So... How do I use your script? I want to save the backup files on my web server and update them every 60 minutes. I don't know how to install backup.sh, etc. Could you please explain it to someone who hasn't done something like this before?

    Thank you very much.
    derlbear
     
  21. mehmehmeh

    mehmehmeh LI Guru Member

    I am attempting to get this script working on my Buffalo WHR-G54S, but as soon as I change the "save history location" for the stats from RAM to the custom path /tmp/var/lib/misc/, my bandwidth graph stops working.

    That problem aside, I did get the backup script to FTP the files from my router to my computer.

    So here is my rough guide, but I never got this working right, maybe it's my hardware? (The other Buffalo user in this thread seemed to have trouble and then never posted whether they ever got it working.)

    programs used: (from a windows machine)
    winscp
    putty
    filezilla server - if you want to backup to lan rather than outside ftp

    1. using a web browser - log into your router
    2. go to administration > admin access
    3. enable SSH, allow password login, click save at the bottom of the page
    4. administration > jffs2
    enable jffs2, click on format/erase, wait then click save at the bottom of the page
    5. administration > bandwidth monitoring
    save history location change to custom path "/tmp/var/lib/misc/", save frequency every hour (or whatever you want)
    (this is where my graphs stop working, but it is the only way that makes sense to me since they need to be there for the ftp script to work)

    6. using winscp, switch protocol in the login window to SCP, enter your router's address and userid/password (I needed to use "root" as my username here; I normally type "admin" out of habit, which seems to be fine for the web UI but not for SCP or SSH)
    7. navigate to the top level folder by clicking the ".." folder until you can't go up anymore
    8. switch to jffs folder
    9. create new folder "rstatsbackup"
    10. edit scripts from page 1 of the thread and save them to your computer.
    11. copy scripts from your computer to jffs/rstatsbackup using winscp
    12. you may need to wait a while until the router saves its backup. you can check by navigating to "/tmp/var/lib/misc/" to see if there are four files that start with "rstats-"
    13. log into router using putty, enter at the command line "/jffs/rstatsbackup/backup.sh" then hit enter to create the initial backup of the bandwidth files.

    This is as far as I got using this script. It did back up the files properly to my computer, but I noticed that my bandwidth graphs were not working at the time. So I never tried the restore function, but I'm sure it works. I went back and changed the save history location back to RAM and everything seems fine.

    last step would be to add the restore script to the wan up scripts in the web interface for the router, go to
    administration>scripts>WAN Up tab
    enter the line "/jffs/rstatsbackup/init.sh" and click save.

    If anyone can see any glaring errors in the above steps, or has had any luck getting this to work on a Buffalo router, please let me know.
     
  22. psychowood

    psychowood LI Guru Member

    Hi derlbear,
    Sorry if I didn't reply before, I missed your post, probably the reply notification email got lost :indifferent:

    I'll write a little installation guide asap. :)

    EDIT: Didn't notice mehmehmeh post :doh:
     
  23. psychowood

    psychowood LI Guru Member

    Hi mehmehmeh,

    I have not changed the default location. Probably the problem lies in the fact that saving the history location to a different path is not reliable (I couldn't make CIFS saving work either), or in the fact that the rstats files are already saved in /tmp/var/lib/misc/ by rstats itself, I don't really know :)

    Thanks for the hard work putting up the guide :)
    I didn't answer before simply because I didn't notice the new post :/

    Anyway, your step by step is quite complete, but if you agree, I'd change that a bit to make it more simple ;) :

    1. Edit scripts from page 1 of the thread and save them to your computer.

    2. Using a web browser - log into your router

    3. Go to administration > admin access

    4. Enable (and start) the Telnet Daemon

    5. Administration > jffs2
    Enable jffs2, click on format/erase, wait then click save at the bottom of the page

    6. Upload the script files to your FTP server (the one you'll use to store the stats backup) using an FTP client of your choice, like FileZilla (or Internet Explorer :biggrin: )

    7. Telnet into the router (open a command prompt, then run "telnet routerIP" with your router IP, of course :) ) and log in

    8. Via telnet, run (in order, of course):
    • mkdir /jffs/rstatsbackup/
    • cd /jffs/rstatsbackup/
    • ftpget -u USER -p PASSWORD FTPserver backup.sh backup.sh
    • ftpget -u USER -p PASSWORD FTPserver init.sh init.sh
    • chmod 755 *
    • ./backup.sh
    • exit

    9. Browse on the router, go to administration>scripts>WAN Up tab
    and enter the line "/jffs/rstatsbackup/init.sh", then click save.

    Everything should be fine, and don't forget to delete the script files from your FTP server, since they contain your FTP password :)

    If you can help me "debugging" this guide (I triple checked, but that's not enough ^^ ), I'll edit the 1st post.
     
  24. mehmehmeh

    mehmehmeh LI Guru Member

    Thanks for the quick reply and simplified instructions. I wrote out every step since I had been having issues this morning getting it to work and I wasn't sure if I was missing a small detail..

    The reason I changed the backup save location: when I first ran the backup.sh script on the router through an SSH terminal, it showed four errors for the ftpput lines because it could not find the four rstats files. After changing the backup location the script could find the files, but the graphs were broken, so I wasn't sure what was going on.

    Is the init.sh file somehow involved in this? I had not put it on the router yet since I wanted to be sure the router sent a backup to my computer first. It's a router at my office so I'll give it a try tomorrow.
     
  25. psychowood

    psychowood LI Guru Member

    I don't really know what it could be..
    If your configuration isn't too complex, you could try clearing the NVRAM and reconfiguring it manually (sometimes it helps), or you could change the save location to a different /tmp directory (like /tmp/rstats) and see if the monitoring keeps breaking.

    Btw, I don't know if this is the case, but I've read that for some settings it's a good idea to reboot the router after saving, even if it doesn't ask for a reboot (for example, I need to reboot for CIFS settings, otherwise they're not applied), so save and reboot :)

    Nope, init.sh is only used to restore the stats and to set up the cron job.
     
  26. mehmehmeh

    mehmehmeh LI Guru Member

    Got the script working today, and I'm stumped as to what I did differently from the first time I tried. But as long as it works I'm happy. Thanks for your help, psychowood.
     
  27. HarshReality

    HarshReality Network Guru Member

    nevermind I'm a twit
     
  28. psychowood

    psychowood LI Guru Member

    I haven't upgraded yet to 1.07 so I can't confirm that behaviour, but it seems very strange to me to have read only jffs (it isn't in the changelog either), since the only point of having jffs is to write files in it.... :rolleyes:
     
  29. HarshReality

    HarshReality Network Guru Member

    I neglected to format the thing prior to the attempt. My only issue now is a syntax error at line 20 of backup.sh: "fi" unexpected (expecting "then")

    ps | grep -q rstats
    if [ $? == 0 ] ; then

    #backup rstats
    led amber on
    led white on

    ftpput -u $USER -p $PASSWORD $SITE $FOLDER/rstats-history.gz /tmp/var/lib/misc/rstats-history.gz
    ftpput -u $USER -p $PASSWORD $SITE $FOLDER/rstats-speed.gz /tmp/var/lib/misc/rstats-speed.gz
    ftpput -u $USER -p $PASSWORD $SITE $FOLDER/rstats-stime /tmp/var/lib/misc/rstats-stime
    ftpput -u $USER -p $PASSWORD $SITE $FOLDER/rstats-source /tmp/var/lib/misc/rstats-source

    led white off
    led amber off
    fi

    **Variables omitted
     
  30. psychowood

    psychowood LI Guru Member

    Try to put a newline after "fi"; the syntax is correct and working here...
     
  31. HarshReality

    HarshReality Network Guru Member

    then it throws a syntax error at line 22: unexpected (expecting "then")

    It's ok, just another stab that failed for me ;)
     
  32. HarshReality

    HarshReality Network Guru Member

    might be me using Notepad.. don't suppose you could mail me your current scripts (minus passwords) at hreality@gmail.com
     
  33. psychowood

    psychowood LI Guru Member

    It could be notepad... anyway, you've got mail :)
     
  34. HarshReality

    HarshReality Network Guru Member

    hmm... same result (sort of) ./backup.sh: 24: Syntax Error: end of file unexpected (expecting "then")
     
  35. psychowood

    psychowood LI Guru Member

    Perhaps there is something different in v1.07.. I'll give it a try over the weekend.
     
  36. HarshReality

    HarshReality Network Guru Member

    ok, either way I appreciate the effort on your part ;)
     
  37. mraneri

    mraneri LI Guru Member

    I'm still relatively new to Tomato, but loving it completely. Not interested in JFFS at all, I was able to implement this kind of thing strictly with a Startup script. This works with 1.07, the only version I've tested it on. The other contributors to this thread really deserve all the credit. Changes I made were all relatively minor. See below..

    I put this in the WAN UP script. Change the FTP connection info, and customize the cron times as you wish and it should work fine. Careful while editing the echo statements. Screwups will break the generated script files.

    Code:
    USER="username"
    PASS="password"
    PORT=21
    SERVER="servernameorip"
    RPATH="Remote Path/rstats.tar"
    LPATH="/tmp/rstats.tar"
    
    if [ ! -s /tmp/backup-rstats ] ; then
        echo -e "#!/bin/sh\nkillall -1 rstats\nsleep 1\ntar c /tmp/var/lib/misc/rstats-* > \"$LPATH\"\nftpput -u \"$USER\" -p \"$PASS\" -P $PORT $SERVER \"$RPATH\" \"$LPATH\"\nrm \"$LPATH\"" > /tmp/backup-rstats
        chmod 777 /tmp/backup-rstats
        echo -e "#!/bin/sh\nservice rstats stop\nftpget -u \"$USER\" -p \"$PASS\" -P $PORT $SERVER \"$LPATH\" \"$RPATH\"" > /tmp/restore-rstats
        echo -e "if [ \$? != 0 ] ; then\n  logger RStats Restore Failed... will retry\n  cru a rstats \"*/5 * * * * /tmp/restore-rstats\"\n  return 1\nfi\ntar x -f \"$LPATH\" -C /\nrm \"$LPATH\"\nservice rstats start\ncru a rstats \"1 */12 * * * /tmp/backup-rstats\"" >> /tmp/restore-rstats
        echo -e "logger RStats Data Restored\nrm /tmp/restore-rstats" >> /tmp/restore-rstats
        chmod 777 /tmp/restore-rstats
        /tmp/restore-rstats
    fi
    
    In my situation, I'm using FTP to back up and restore. If an FTP server is running, I don't see much reason not to log in via FTP to do the restore as well. If you telnet in, you will see the FTP connection information ends up hardcoded in the two scripts that get executed.

    One fundamental change I made is that I'm tar'ing (archiving into one file) the 4 rstats files, so it's only one file to upload or download. A little easier to manage, and only one error to check for if the file can't be downloaded. Of course, to eliminate JFFS, the scripts are echoed to files.
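
    (Just to illustrate the echo trick in isolation, not part of the backup itself: writing a multi-line string with \n redirected into a file gives you a script you can chmod and run, roughly like this.)

    Code:
    # sketch only: generate a tiny script in RAM, make it executable, run it, clean up
    echo -e "#!/bin/sh\nlogger hello from a generated script" > /tmp/example
    chmod 777 /tmp/example
    /tmp/example
    rm /tmp/example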

    These are the two scripts created (Note, the scripts are created without the comments):
    backup-rstats (created by the above startup script)
    Code:
    #!/bin/sh
    killall -1 rstats       {This makes rstats update the files before we FTP them}
    sleep 1
    tar c /tmp/var/lib/misc/rstats-* > "/tmp/rstats.tar"   {Archive all rstats-* files (the 4 significant files) into one file}
    ftpput -u "username" -p "password" -P 21 servernameorip "Remote Path/rstats.tar" "/tmp/rstats.tar"
    rm "/tmp/rstats.tar"     {remove the archive after ftping it out}
    Note, there's no error checking here.. If it fails, what are you gonna do?

    restore-rstats (created by the above startup script)
    Code:
    #!/bin/sh
    service rstats stop  {stop rstats - will remain stopped until we successfully get the previous data}
    ftpget -u "username" -p "password" -P 21 servernameorip "/tmp/rstats.tar" "Remote Path/rstats.tar"
    if [ $? != 0 ] ; then
      logger RStats Restore Failed... will retry
      cru a rstats "*/5 * * * * /tmp/restore-rstats"  {Setup CRON to retry at 5 minute intervals (forever) until we get the file - rstats remains disabled, and not tracking statistics}
      return 1
    fi
    tar x -f "/tmp/rstats.tar" -C /  {file fetched successfully - untar it to the right place}
    rm "/tmp/rstats.tar"    {remove the tar file, dont need it anymore}
    service rstats start    {start rstats - will pick up the new data}
    cru a rstats "1 */12 * * * /tmp/backup-rstats"  {set or modify the cron job to backup data... This does 12 hour intervals...  Setup the CRON however you want}
    logger RStats Data Restored
    rm /tmp/restore-rstats  {remove the restore script..  No need to ever restore again until the next reboot.  You'll see why this is helpful later.} 
    Wanting to be able to make sure the data is saved before any intentional reboot, I put this in the Shutdown script... (optional)
    Code:
    if [ ! -s /tmp/restore-rstats ] ; then  {make sure any previous restore was successful, or we will overwrite the backup with nothing...}
    /tmp/backup-rstats
    fi
    This works very well for me. The shutdown script works any time the router is intentionally rebooted. So (assuming the FTP server is up) I only risk losing 12 hours of data in the event of a power failure. All with no JFFS.

    Just search the "echo" statements above for the "cru" sections and change the settings to your liking. (You may want to reduce the 12 hour backup interval... I keep it at 12, because I don't want the harddrive spinning up every hour on an otherwise, relatively un-busy server.)
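
    For example, assuming the standard five-field cron format that cru uses, the backup schedule line could be any of these (remember that inside the echo string the inner quotes are escaped as \"):

    Code:
    cru a rstats "1 */12 * * * /tmp/backup-rstats"    # as posted: minute 1, every 12 hours
    cru a rstats "1 * * * * /tmp/backup-rstats"       # every hour, at minute 1
    cru a rstats "*/30 * * * * /tmp/backup-rstats"    # every 30 minutes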

    Questions/further improvements welcome.

    - Mike
     
  38. HarshReality

    HarshReality Network Guru Member

    How about some details on getting the thing running?
     
  39. mraneri

    mraneri LI Guru Member

    If you're referring to my scripts, from two posts above, try the following:

    1. Upgrade to 1.07, if desired.
    2. Go to the Administration -> Bandwidth Monitoring page and manually back up your bandwidth monitoring stats to your local PC. (This does not have to be to the FTP server.)
    3. Go here (replace 192.168.1.1 with your router's IP if it's different).
      http://192.168.1.1/admin-scripts.asp
    4. Click to the "Wan UP" script tab.
    5. Copy and Paste the entire contents of the FIRST "CODE" box into the script box.
    6. Edit the script with your FTP login information and remote path (RPATH). You should not change the local path (LPATH).
    7. create a dummy file on your FTP server with the name/path in RPATH, as you specified above. (i.e. "/Remote Path/rstats.tar" - This file can be 0 bytes)
    8. Click to the "Shutdown" script tab. Copy and paste the entire contents of the LAST "CODE" box into the script box. Click "Save."
    9. Reboot the Router.
    10. Check the router log after the reboot and make sure you see something in there that says: "RStats Data Restored". This verifies that it picked up the dummy file you created 3 steps earlier. If you see "RStats Restore Failed... will retry" wait 5 minutes and check again. If it repeats, double check that the FTP server is working, the login info is correct, and the path to the dummy file is valid. Update the scripts, reboot and check again.
    11. Check that rstats is actually running by looking at your 24 hour bandwidth charts.
    12. Go back to the Administration -> Bandwidth Monitoring page and manually restore your bandwidth monitoring stats from your local PC. This should be the last time you have to do that.
    13. Reboot the router again. (Use the webpage to reboot the router... Don't just kill the power). This will cause the backup to occur before the reboot.
    14. Your bandwidth stats should restore automatically this time.

    There are lots of instructions, but it should go quickly. It should be very easy. If it doesn't work, most likely, double check step 6. Make sure this is all correct. If you're having trouble, telnet into the router, and try an ftpput manually and make sure it works.
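
    For that manual test, something along these lines should do (a sketch; /testfile is just a scratch name, and the credentials are the same placeholders as in the script):

    Code:
    # on the router, via telnet
    echo test > /tmp/testfile
    ftpput -u "username" -p "password" -P 21 servernameorip "/testfile" /tmp/testfile
    ftpget -u "username" -p "password" -P 21 servernameorip /tmp/testfile.back "/testfile"
    ls -l /tmp/testfile*      # both files should exist and have the same size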

    A few explanations of a few steps above, if you're interested:
    2 and 12. If you don't do this, you will lose all the stats you have to this point.
    7. The scripts require a successful download of backup data before starting up rstats. If you don't create this dummy file, the script will keep trying to restore a backup which doesn't exist. The script does not start backing up if data has not been restored. (Consider a case where the router is rebooted, but the FTP server is offline at the time. The restore doesn't occur. The FTP server comes up. A backup occurs, overwriting the previous data, and all the history is then gone.)
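
    One quick way to create that dummy file, if you don't want to do it on the server side, is to push an empty file from the router itself (a sketch, reusing the script's placeholders):

    Code:
    touch /tmp/rstats.tar
    ftpput -u "username" -p "password" -P 21 servernameorip "Remote Path/rstats.tar" /tmp/rstats.tar
    rm /tmp/rstats.tar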
     
  40. kwmcglon

    kwmcglon LI Guru Member

    great stuff

    I'm going to try this tonight. Now, you can't get more newbie than me, so I'll report back with my results; I'll go with this method here. Anyone have a great site to learn more about scripting? I have a few scripts that I need to get going for my network. Thanks for the information guys

    :thumbup:
     
  41. Overflow-ar

    Overflow-ar LI Guru Member

    I've tried it and it works like a charm! :thumbup:. Just add a sleep
    PHP:
    /bin/sleep 10
    at the beginning of the init script and it will work fine
     
  42. kwmcglon

    kwmcglon LI Guru Member

    Well, no luck so far, but my network has a slow connection (512kb down / 128kb up) and it's over satellite with 900-2500ms lag. SUCKS, I know, but with that in mind, do I need to increase the sleep time to 30 maybe? I plan on a thread about tips to handle multiple connections on this network, don't ask.... I inherited it hehe
     
  43. srouquette

    srouquette Network Guru Member

    (mraneri is everywhere :) )
    I'm using your latest script, and it works great,
    but I lost my stats today, and I don't know how.
    The tar on my FTP jumped from 24k to 41k with no further information; I don't know if it's related to the problem.
    My FTP server was up for the latest backup, but it seems to have failed, because there was nothing left after the reboot I did an hour later.
     
  44. mraneri

    mraneri LI Guru Member

    Hmm.. I have to think about scenarios where this could happen.
     
  45. srouquette

    srouquette Network Guru Member

    The logs on my FTP server show me that it stored the stats 3 times (at 12:01, 12:06 and 12:17) without restoring them.
    I updated my router because I had some problems with OpenDNS, but I don't remember if I rebooted.
    Then I made some changes to the ad block script, and I had to reboot the router to test it. Stats were uploaded then retrieved, so that worked.
    My FTP server allows every operation (delete, append, create, etc...).
    I hope this will help you.
     
  46. mraneri

    mraneri LI Guru Member

    After a reboot, the script will continuously attempt to retrieve the stats, until the stats are restored. If the stats are never restored, the backups don't start. This is to make sure that if the stats were not restorable, you don't wipe out the stats on the FTP server.

    I can't really think of what could go wrong. Only thing is if the untar didn't work or something. Don't see why that would happen...

    Did you do anything you know of that would have wiped your stats? (I'm not sure what would have.) Are you running JFFS and did you put any of these scripts into your JFFS partition? These scripts expect to be created in RAM. Actually, their presence (or lack thereof) is the indicator as to whether you just came out of a reboot.
     
  47. srouquette

    srouquette Network Guru Member

    I don't run JFFS.
    do you have an idea why the tar was bigger after the second upload ? (25k to 50k)
    here is my log :
    Code:
    FileZilla Server version 0.9.23 beta
    Copyright 2001-2006 by Tim Kosse (tim.kosse@filezilla-project.org)
    Connecting to server...
    Connected, waiting for authentication
    Logged on
    (000001) 18/08/2007 10:46:08 - (not logged in) (192.168.1.1)> Connected, sending welcome message...
    (000001) 18/08/2007 10:46:08 - (not logged in) (192.168.1.1)> USER tomato
    (000001) 18/08/2007 10:46:08 - (not logged in) (192.168.1.1)> 331 Password required for tomato
    (000001) 18/08/2007 10:46:08 - (not logged in) (192.168.1.1)> PASS ******
    (000001) 18/08/2007 10:46:08 - tomato (192.168.1.1)> 230 Logged on
    (000001) 18/08/2007 10:46:08 - tomato (192.168.1.1)> TYPE I
    (000001) 18/08/2007 10:46:08 - tomato (192.168.1.1)> 200 Type set to I
    (000001) 18/08/2007 10:46:08 - tomato (192.168.1.1)> PASV
    (000001) 18/08/2007 10:46:08 - tomato (192.168.1.1)> 227 Entering Passive Mode (192,168,1,10,19,136)
    (000001) 18/08/2007 10:46:08 - tomato (192.168.1.1)> ALLO 20992
    (000001) 18/08/2007 10:46:08 - tomato (192.168.1.1)> 202 No storage allocation neccessary.
    (000001) 18/08/2007 10:46:08 - tomato (192.168.1.1)> STOR /rstats.tar
    (000001) 18/08/2007 10:46:08 - tomato (192.168.1.1)> 150 Connection accepted
    (000001) 18/08/2007 10:46:08 - tomato (192.168.1.1)> 226 Transfer OK
    (000001) 18/08/2007 10:46:08 - tomato (192.168.1.1)> QUIT
    (000001) 18/08/2007 10:46:08 - tomato (192.168.1.1)> 221 Goodbye
    (000001) 18/08/2007 10:46:08 - tomato (192.168.1.1)> disconnected.
    (000002) 18/08/2007 10:46:44 - (not logged in) (192.168.1.1)> Connected, sending welcome message...
    (000002) 18/08/2007 10:46:44 - (not logged in) (192.168.1.1)> USER tomato
    (000002) 18/08/2007 10:46:44 - (not logged in) (192.168.1.1)> 331 Password required for tomato
    (000002) 18/08/2007 10:46:44 - (not logged in) (192.168.1.1)> PASS ******
    (000002) 18/08/2007 10:46:44 - tomato (192.168.1.1)> 230 Logged on
    (000002) 18/08/2007 10:46:44 - tomato (192.168.1.1)> TYPE I
    (000002) 18/08/2007 10:46:44 - tomato (192.168.1.1)> 200 Type set to I
    (000002) 18/08/2007 10:46:44 - tomato (192.168.1.1)> PASV
    (000002) 18/08/2007 10:46:44 - tomato (192.168.1.1)> 227 Entering Passive Mode (192,168,1,10,19,137)
    (000002) 18/08/2007 10:46:44 - tomato (192.168.1.1)> SIZE /rstats.tar
    [b](000002) 18/08/2007 10:46:44 - tomato (192.168.1.1)> 213 20992
    (000002) 18/08/2007 10:46:44 - tomato (192.168.1.1)> RETR /rstats.tar[/b]
    (000002) 18/08/2007 10:46:44 - tomato (192.168.1.1)> 150 Connection accepted
    (000002) 18/08/2007 10:46:44 - tomato (192.168.1.1)> 226 Transfer OK
    (000002) 18/08/2007 10:46:44 - tomato (192.168.1.1)> QUIT
    (000002) 18/08/2007 10:46:44 - tomato (192.168.1.1)> 221 Goodbye
    (000002) 18/08/2007 10:46:44 - tomato (192.168.1.1)> disconnected.
    (000003) 18/08/2007 12:01:13 - (not logged in) (192.168.1.1)> Connected, sending welcome message...
    (000003) 18/08/2007 12:01:13 - (not logged in) (192.168.1.1)> USER tomato
    (000003) 18/08/2007 12:01:13 - (not logged in) (192.168.1.1)> 331 Password required for tomato
    (000003) 18/08/2007 12:01:13 - (not logged in) (192.168.1.1)> PASS ******
    (000003) 18/08/2007 12:01:13 - tomato (192.168.1.1)> 230 Logged on
    (000003) 18/08/2007 12:01:13 - tomato (192.168.1.1)> TYPE I
    (000003) 18/08/2007 12:01:13 - tomato (192.168.1.1)> 200 Type set to I
    (000003) 18/08/2007 12:01:13 - tomato (192.168.1.1)> PASV
    (000003) 18/08/2007 12:01:13 - tomato (192.168.1.1)> 227 Entering Passive Mode (192,168,1,10,19,138)
    [b](000003) 18/08/2007 12:01:13 - tomato (192.168.1.1)> ALLO 25088
    (000003) 18/08/2007 12:01:13 - tomato (192.168.1.1)> 202 No storage allocation neccessary.
    (000003) 18/08/2007 12:01:13 - tomato (192.168.1.1)> STOR /rstats.tar[/b]
    (000003) 18/08/2007 12:01:13 - tomato (192.168.1.1)> 150 Connection accepted
    (000003) 18/08/2007 12:01:13 - tomato (192.168.1.1)> 226 Transfer OK
    (000003) 18/08/2007 12:01:13 - tomato (192.168.1.1)> QUIT
    (000003) 18/08/2007 12:01:13 - tomato (192.168.1.1)> 221 Goodbye
    (000003) 18/08/2007 12:01:13 - tomato (192.168.1.1)> disconnected.
    (000004) 18/08/2007 12:06:27 - (not logged in) (192.168.1.1)> Connected, sending welcome message...
    (000004) 18/08/2007 12:06:27 - (not logged in) (192.168.1.1)> USER tomato
    (000004) 18/08/2007 12:06:27 - (not logged in) (192.168.1.1)> 331 Password required for tomato
    (000004) 18/08/2007 12:06:27 - (not logged in) (192.168.1.1)> PASS ******
    (000004) 18/08/2007 12:06:27 - tomato (192.168.1.1)> 230 Logged on
    (000004) 18/08/2007 12:06:27 - tomato (192.168.1.1)> TYPE I
    (000004) 18/08/2007 12:06:27 - tomato (192.168.1.1)> 200 Type set to I
    (000004) 18/08/2007 12:06:27 - tomato (192.168.1.1)> PASV
    (000004) 18/08/2007 12:06:27 - tomato (192.168.1.1)> 227 Entering Passive Mode (192,168,1,10,19,139)
    [b](000004) 18/08/2007 12:06:27 - tomato (192.168.1.1)> ALLO 50176
    (000004) 18/08/2007 12:06:27 - tomato (192.168.1.1)> 202 No storage allocation neccessary.
    (000004) 18/08/2007 12:06:27 - tomato (192.168.1.1)> STOR /rstats.tar[/b]
    (000004) 18/08/2007 12:06:27 - tomato (192.168.1.1)> 150 Connection accepted
    (000004) 18/08/2007 12:06:27 - tomato (192.168.1.1)> 226 Transfer OK
    (000004) 18/08/2007 12:06:27 - tomato (192.168.1.1)> QUIT
    (000004) 18/08/2007 12:06:27 - tomato (192.168.1.1)> 221 Goodbye
    (000004) 18/08/2007 12:06:27 - tomato (192.168.1.1)> disconnected.
    (000005) 18/08/2007 12:17:44 - (not logged in) (192.168.1.1)> Connected, sending welcome message...
    (000005) 18/08/2007 12:17:44 - (not logged in) (192.168.1.1)> USER tomato
    (000005) 18/08/2007 12:17:44 - (not logged in) (192.168.1.1)> 331 Password required for tomato
    (000005) 18/08/2007 12:17:44 - (not logged in) (192.168.1.1)> PASS ******
    (000005) 18/08/2007 12:17:44 - tomato (192.168.1.1)> 230 Logged on
    (000005) 18/08/2007 12:17:44 - tomato (192.168.1.1)> TYPE I
    (000005) 18/08/2007 12:17:44 - tomato (192.168.1.1)> 200 Type set to I
    (000005) 18/08/2007 12:17:44 - tomato (192.168.1.1)> PASV
    (000005) 18/08/2007 12:17:44 - tomato (192.168.1.1)> 227 Entering Passive Mode (192,168,1,10,19,140)
    [b](000005) 18/08/2007 12:17:44 - tomato (192.168.1.1)> ALLO 41984
    (000005) 18/08/2007 12:17:44 - tomato (192.168.1.1)> 202 No storage allocation neccessary.
    (000005) 18/08/2007 12:17:44 - tomato (192.168.1.1)> STOR /rstats.tar[/b]
    (000005) 18/08/2007 12:17:44 - tomato (192.168.1.1)> 150 Connection accepted
    (000005) 18/08/2007 12:17:44 - tomato (192.168.1.1)> 226 Transfer OK
    (000005) 18/08/2007 12:17:44 - tomato (192.168.1.1)> QUIT
    (000005) 18/08/2007 12:17:44 - tomato (192.168.1.1)> 221 Goodbye
    (000005) 18/08/2007 12:17:44 - tomato (192.168.1.1)> disconnected.
    
    In bold, you can see the size and the command: RETR when the router tries to get the tar, and STOR when the router sends the tar to the FTP server.
    At the end, the router sends the tar 3 times without restoring the stats.
     
  48. mraneri

    mraneri LI Guru Member

    It's hard to tell. Why was the upload occurring every 40 seconds initially (at 10:46:08 and 10:46:44)?

    Also, the restore only occurs when the router is rebooted. Exactly what time did you reboot?

    I imagine you don't have the tar files anymore (they were probably overwritten)?
     
  49. srouquette

    srouquette Network Guru Member

    I've just received my router, so I'm doing some testing :)
    There is only one upload at 10:46:08 (command STOR), and 10:46:44 is a download (command RETR): the router was restoring the stats.
    The automatic backup occurred at 12:01:13 with the STOR command.
    But the following backups are strange: at 12:06:27 the router uploaded the stats again without retrieving them first, like you said (but maybe that's normal, because the router wasn't rebooted; maybe the DNS update reset the WAN, and the WAN Up script ran again).
    But the size is twice as big as the upload at 12:01; maybe the tar was appended to the end of the previous one.
    I don't have the old tar, but I have another one which has a strange size.
     

    Attached Files:

  50. srouquette

    srouquette Network Guru Member

    I updated the backup script a little bit:
    Code:
    USER="username"
    PASS="password"
    PORT=21
    SERVER="192.168.1.10"
    RPATH="/rstats.tgz"
    RPATH_DATE="/rstats\`date +%Y.%m.%d-%H.%M.%S\`.tgz"
    LPATH="/tmp/rstats.tgz"
    
    if [ ! -s /tmp/backup-rstats ] ; then
        echo -e "#!/bin/sh\nkillall -1 rstats\nsleep 1\ntar -czf \"$LPATH\" /tmp/var/lib/misc/rstats-*\nftpput -u \"$USER\" -p \"$PASS\" -P $PORT $SERVER \"$RPATH\" \"$LPATH\"\nftpput -u \"$USER\" -p \"$PASS\" -P $PORT $SERVER \"$RPATH_DATE\" \"$LPATH\"\nrm \"$LPATH\"" > /tmp/backup-rstats
        chmod 777 /tmp/backup-rstats
        echo -e "#!/bin/sh\nservice rstats stop\nftpget -u \"$USER\" -p \"$PASS\" -P $PORT $SERVER \"$LPATH\" \"$RPATH\"" > /tmp/restore-rstats
        echo -e "if [ \$? != 0 ] ; then\n  logger RStats Restore Failed... will retry\n  cru a rstats \"*/5 * * * * /tmp/restore-rstats\"\n  return 1\nfi\ntar -xzf \"$LPATH\" -C /\nrm \"$LPATH\"\nservice rstats start\ncru a rstats \"1 */12 * * * /tmp/backup-rstats\"" >> /tmp/restore-rstats
        echo -e "logger RStats Data Restored\nrm /tmp/restore-rstats" >> /tmp/restore-rstats
        chmod 777 /tmp/restore-rstats
        /tmp/restore-rstats
    fi
    
    Now the script uploads the same tar twice, but the second one has the date in its name.
    If one backup fails, you can still restore your stats from a dated backup.
    I also added the -z flag to the tar command.
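
    If you ever need to fall back to one of the dated copies, a manual restore from telnet could look roughly like this (the dated filename here is just an example; server and credentials are the same placeholders as above):

    Code:
    service rstats stop
    ftpget -u "username" -p "password" -P 21 192.168.1.10 /tmp/rstats.tgz "/rstats2007.08.18-12.00.00.tgz"
    tar -xzf /tmp/rstats.tgz -C /
    rm /tmp/rstats.tgz
    service rstats start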
     
  51. pixelfreak

    pixelfreak LI Guru Member

    It seems that you are experts regarding rstats on a Tomato machine.

    The problem is that my rstats doesn't work at all in the web interface.

    Only the Real Time and Last 24 Hours stats are working; the others only display this: http://img405.imageshack.us/img405/5233/rstats1tt1.jpg

    For a few weeks everything worked correctly. I upgraded to version 1.09 but still have the same problem. I hope that you have some ideas about that problem. I tried everything: changed the save location to RAM, NVRAM, CIFS .... tried all configurations, but none of them solved my problem.
     
  52. mraneri

    mraneri LI Guru Member

    Can't really help you. Personally, I'm not an expert in rstats. Others have just pointed out which files needed to be saved/restored.

    My bandwidth monitoring configuration is set up to save to RAM once a week (the script does an explicit save whenever it runs, so once a week is not important), and everything else works fine...

    Sorry I can't help.
     
  53. pixelfreak

    pixelfreak LI Guru Member

    anyone else ? ;(
     
  54. liquidzyklon

    liquidzyklon LI Guru Member

    @pixelfreak, did you try clearing out the bandwidth files and starting new ones? Or, if worse comes to worst, you might need to clear your NVRAM and re-configure your router.

    I am using the scripts from page 4 and page 5 and everything works as it should. I still have a few more "tests" to try to make sure that the script is really fool-proof. Right now, I have the bandwidth configuration set to save to RAM every 2 hours from the webGUI. The script I have also backs up the files every 2 hours (as shown in the attachment). One thing I notice is that the log backups are increasing by 2KB every backup, and this is a concern to me. Wouldn't this growth make the files almost impossible to load into RAM after, say, 6 months? Has anyone been using the bandwidth tracker for a longer period of time (i.e. 6+ months) and ended up with large files?
     

    Attached Files:

  55. psychowood

    psychowood LI Guru Member

    My traffic has been logged since March 2007, but my files aren't that much bigger:

    Code:
    # ls -l /tmp/var/lib/misc/
    -rw-r--r--    1 root     root          110 Oct 15 18:25 dnsmasq.leases
    -rw-rw-rw-    1 root     root          992 Oct 15 18:48 rstats-history.gz
    -rw-------    1 root     root            0 Oct 13 15:46 rstats-source
    -rw-rw-rw-    1 root     root        27792 Oct 15 18:48 rstats-speed.gz
    -rw-------    1 root     root            4 Oct 15 18:48 rstats-stime
    -rw-------    1 root     root            2 Oct 13 15:46 rstatsbackup.inited
    -rw-rw-rw-    1 root     root            4 Oct 15 18:25 wantime
    
     
  56. mraneri

    mraneri LI Guru Member

    I only have two pages of posts, not 5 or 6 or more. You can refer to specific posts by number... The post number is shown in the post, just above the poster's join date. THIS post, for example, is #56 in the thread...
     
  57. liquidzyklon

    liquidzyklon LI Guru Member

    @psychowood, you are right. After running the logs for longer than a day, I see that the file has stabilized. The reason is that rstats-speed.gz is what logs the last 24 hours so it was growing in size. I think it will stabilize once it tracks 24 hours of traffic time. The rstats-history.gz file will grow very slowly because it just tracks bandwidth usage.

    @mraneri, I followed your instructions from post 39 and used the script from post 50 (modified to back up twice, once as just rstats.tgz and once as rstatsDATE-TIME.tgz). Now I don't have to worry about the size of the files, but I just did another "test" and the backup failed because the rstats.tgz was corrupted.

    Basically my test was to unplug the router and unplug the FTP computer to simulate a power outage in the middle of the night. I plugged in the router and not the FTP computer, waited 2 min, and checked the logs, which showed that RStats failed to restore. Then I plugged in the FTP computer and saw "RStats Data Restored" perfectly. But when I check Bandwidth --> Daily, I see nothing (I should have about 2.5 days of bandwidth tracked). I go to the FTP folder and check the files. I open the rstats.tgz with WinRAR and it shows that it was corrupted. ARRGGHH. At noon, the file was fine. At 2PM, the file is missing rstats-history.gz. From 4PM until now, all the archives are corrupted. So can you please provide some suggestions as to why the TAR would be corrupted?
     
  58. mraneri

    mraneri LI Guru Member

    No idea why the tar.gz is getting corrupted, unless the FTP server is doing something funny? Any possibility the transfer is occurring in ascii instead of binary? I'm not sure how to specify ftpput should use ascii mode. I assume it's always binary.

    the .tar.gz file should be openable in WinRAR. My files are..

    Which server software are you using? (Not that I'm familiar with many, anyway, but maybe there's a problem there.)

    - Mike
     
  59. mraneri

    mraneri LI Guru Member

    Actually, I just noticed that srouquette added the gzip option to the tar command line. This is really unnecessary, as the files that are being tar-ed are already compressed, so you wouldn't theoretically save any significant file size. I don't have any explanation why that would be causing a problem, but it is a difference.

    Not sure what else to add.
     
  60. liquidzyklon

    liquidzyklon LI Guru Member

    I am using Xlight FTP server because it's free and light. I don't know what settings the ftpput is using but it works fine for some files and not so fine for others.

    Yep srouquette changed the script from *.tar to *.tgz. So basically last night, I re-edited his script to *.tar the files (so that it does not need to compress the files) and see how it works. After a few days of using this, I'll try to open up each *.tar file and see if any are corrupted.

    Does it matter when the Bandwidth Monitor backs up its stats (I'm referring to the GUI: Admin-->BWM-->Backup Option)? I have the BWM set to back up every 2 hours. I also have the script back up every 2 hours. Could this be clashing, i.e. both trying to back up around the same time, which messed up the script?

    Thanks for your input.
     
  61. mraneri

    mraneri LI Guru Member

    I doubt that's the problem, but it's definitely unnecessary.. I set mine in the GUI to once a week because the SCRIPT instructs rstats to update the files immediately before the backup. (the "killall -1 rstats" sends a signal to rstats to refresh the files...) Also, save history location is RAM, but you already know that. Create new file/reset data is also unchecked.

    You may want to change it, but I'm not confident it's a significant factor.

    - Mike
     
  62. liquidzyklon

    liquidzyklon LI Guru Member

    I just changed it to 1 week through the webGUI.

    Another problem: I have two backup logs today where one *.tar file is missing rstats-history.gz while the other one has it. I notice that your script sleeps for 1 second after signalling rstats, to give it time to back up. To me, 1 second should be long enough for the router to back up the settings so that it can tar the files. I am going to try increasing the delay to 2 seconds and see what happens. Does anyone else notice this problem? How come this has not occurred when I reboot the router (b/c a reboot would cause the router to back up the settings and restore them from the FTP server)?

    Here's my WAN Up script:
    Code:
    USER="name"
    PASS="pw"
    PORT=21
    SERVER="192.168.123.50"
    RPATH="/rstats.tar"
    RPATH_DATE="/rstats\`date +%Y.%m.%d-%H.%M.%S\`.tar"
    LPATH="/tmp/rstats.tar"
    
    /bin/sleep 10
    
    if [ ! -s /tmp/backup-rstats ] ; then
        echo -e "#!/bin/sh\nkillall -1 rstats\nsleep 2\ntar -cf \"$LPATH\" /tmp/var/lib/misc/rstats-*\nftpput -u \"$USER\" -p \"$PASS\" -P $PORT $SERVER \"$RPATH\" \"$LPATH\"\nftpput -u \"$USER\" -p \"$PASS\" -P $PORT $SERVER \"$RPATH_DATE\" \"$LPATH\"\nrm \"$LPATH\"" > /tmp/backup-rstats
        chmod 777 /tmp/backup-rstats
        echo -e "#!/bin/sh\nservice rstats stop\nftpget -u \"$USER\" -p \"$PASS\" -P $PORT $SERVER \"$LPATH\" \"$RPATH\"" > /tmp/restore-rstats
        echo -e "if [ \$? != 0 ] ; then\n  logger RStats Restore Failed... will retry in 5 minutes\nled amber off\n  cru a rstats \"*/5 * * * * /tmp/restore-rstats\"\n  return 1\nfi\nled amber on\ntar -xf \"$LPATH\" -C /\nrm \"$LPATH\"\nservice rstats start\ncru a rstats \"1 */1 * * * /tmp/backup-rstats\"" >> /tmp/restore-rstats
        echo -e "logger RStats Data Restored\nrm /tmp/restore-rstats" >> /tmp/restore-rstats
        chmod 777 /tmp/restore-rstats
        /tmp/restore-rstats
    fi
    
     
  63. SirAdam2k

    SirAdam2k LI Guru Member

    I have got a problem with my WANUP Script I guess.

    Everything worked fine for me until the 19th of June. On that day my ISP changed my connection to VDSL (got a new modem and everything). But I don't think it has anything to do with it (at least I hope so).

    The problem is that my router (WRT54GL, Tomato 1.19) just uploads 0 byte files and the amber LED is still on.

    Here is the WANUP script

    PHP:
    USER="username"
    PASS="pw"
    PORT=21
    SERVER="IP"
    RPATH="/rstats.tar"
    RPATH_DATE="/rstats\`date +%Y.%m.%d-%H.%M.%S\`.tar"
    LPATH="/tmp/rstats.tar"

    /bin/sleep 10

    if [ ! -s /tmp/backup-rstats ] ; then
        echo -e "#!/bin/sh\nkillall -1 rstats\nsleep 2\ntar -cf \"$LPATH\" /tmp/var/lib/misc/rstats-*\nftpput -u \"$USER\" -p \"$PASS\" -P $PORT $SERVER \"$RPATH\" \"$LPATH\"\nftpput -u \"$USER\" -p \"$PASS\" -P $PORT $SERVER \"$RPATH_DATE\" \"$LPATH\"\nrm \"$LPATH\"" > /tmp/backup-rstats
        chmod 777 /tmp/backup-rstats
        echo -e "#!/bin/sh\nservice rstats stop\nftpget -u \"$USER\" -p \"$PASS\" -P $PORT $SERVER \"$LPATH\" \"$RPATH\"" > /tmp/restore-rstats
        echo -e "if [ \$? != 0 ] ; then\n  logger RStats Restore Failed... will retry in 5 minutes\nled amber off\n  cru a rstats \"*/5 * * * * /tmp/restore-rstats\"\n  return 1\nfi\nled amber on\ntar -xf \"$LPATH\" -C /\nrm \"$LPATH\"\nservice rstats start\ncru a rstats \"1 */1 * * * /tmp/backup-rstats\"" >> /tmp/restore-rstats
        echo -e "logger RStats Data Restored\nrm /tmp/restore-rstats" >> /tmp/restore-rstats
        chmod 777 /tmp/restore-rstats
        /tmp/restore-rstats
    fi
    I already cleared the NVRAM and setup everything again twice. I used the same script for 2 to 3 months now without any trouble.

    The router can download the empty rstats file; I know that because it doesn't give me an error, it says "RStats Data Restored", but nothing is in there because the file is empty :D.

    Hope someone can help me.

    Cheers

    Michael

    PS: I hope my English is not too bad :)
     
  64. test100

    test100 LI Guru Member

    I can't believe I never saw this thread; I was planning to try something like scp over SSH to solve this backup problem myself. Heh. I'll try it later; right now I can't log in remotely.
     
  65. Morac

    Morac Network Guru Member

    I found that with my version of the script above, after upgrading to 1.22, the backups were getting corrupted. It turns out that the 1 second sleep is no longer long enough for rstats to dump the data and gzip it up, so my backups ended up containing half-dumped data. I increased the sleep time to 3 seconds and now everything works again. I have about 9 months' worth of history, which may also have something to do with it.
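
    For reference, the generated /tmp/backup-rstats from the post #37 version would then look something like this (same placeholders; only the sleep value changes):

    Code:
    #!/bin/sh
    killall -1 rstats      # tell rstats to dump its data files
    sleep 3                # 1 second was no longer enough here; 3 seconds works
    tar c /tmp/var/lib/misc/rstats-* > "/tmp/rstats.tar"
    ftpput -u "username" -p "password" -P 21 servernameorip "Remote Path/rstats.tar" "/tmp/rstats.tar"
    rm "/tmp/rstats.tar"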
     
  66. toobs

    toobs Addicted to LI Member

    Hi,

    I'm new to Tomato and I am using 1.23.

    I am having problems in getting the script and ftp to work.

    In my ftp, I created "Remote Path/rstats.tar"

    I edited the script with my username, password, ftp address.

    For RPATH, I have:

    RPATH=/Remote Path/rstats.tar

    For LPATH, I have:

    LPATH="/tmp/rstats.tar"

    I have "rstats\nsleep 3" set to 3.

    Under Bandwidth Monitoring, I'm not sure what to change it to or what to configure.

    I am still using Ram.

    Help!

    Thanks.
     
  67. mraneri

    mraneri LI Guru Member

    Spaces in Paths, while possible in linux, can create all sorts of problems. Change "Remote Path" to "RemotePath" or "Remote_Path" and try again. I didn't look at the details, but I can almost guarantee the space in the path is going to cause a problem unless you quote the heck out of the script, which I don't recommend bothering with. Make the change to the directory name and write back if it still doesn't work.

    - Mike
     
  68. toobs

    toobs Addicted to LI Member

    Thank you for your reply. I made the changes, but how do I know if the log is ftp'ing or not?

    In your instructions, you mentioned that we need to set Bandwidth monitoring to manual.

    <<Go to the Administration -> Bandwidth Montoring page and manually backup your bandwidth monitoring stats to your local PC. (This does not have to be to the FTP server.)>>

    This is what I don't understand. How do you manually back up, either to the FTP server or locally?

    Here is my log:

    Apr 24 13:57:53 ? user.notice root: RStats Data Restored
    Apr 24 13:58:09 ? cron.err crond[100]: time disparity of 20676778 minutes detected
    Apr 24 14:00:01 ? syslog.info root: -- MARK --
     
  69. mraneri

    mraneri LI Guru Member

    To check whether the upload is happening, log into your FTP server and check the time stamp on the rstats file.
    You don't need to change the Bandwidth Monitoring settings. The script saves the bandwidth to the rstats file immediately before FTP'ing it, so it doesn't matter whether you have the router's UI set to save it or not.
    For the one-time manual backup: on that page, click the Backup button, take the file that you save to your desktop, and manually put it on the FTP server, where the script will download it.
     
  70. toobs

    toobs Addicted to LI Member

    Thank you so much. It is working!

    How often does the script FTP the log?

    Can you change how often it FTPs?
     
  71. lanmtl

    lanmtl Addicted to LI Member

    I can't get this working, it's driving me mad!!
    Following mraneri's method I get stuck at stage 10.
    No matter what I do, or how much I check the username, password, and FTP address/directories, I can't get the rstats to restore successfully.
     
  72. mraneri

    mraneri LI Guru Member

    What does the log say? Anything useful? When you log in to the FTP server with an FTP program, can you see/download the file to your PC?

    I assume you checked the path, username, password, etc?
     
  73. lanmtl

    lanmtl Addicted to LI Member

    It just says rstats could not restore blablabla will try again.
    I can see and download the file from the FTP without a problem; it's 0 bytes though, as the script never managed to write anything to it.

    Path, username and password are all valid.
     
  74. mraneri

    mraneri LI Guru Member

    Telnet into your router,
    then type the following two lines:
    cd /tmp
    cat restore-rstats

    I assume you see the script show up.
    Now type:
    /tmp/restore-rstats

    Do you get any errors? Check the log again. I assume it still says it will retry?
     
  75. lanmtl

    lanmtl Addicted to LI Member

    Thanks for your help, I found what the problem was.
    It seems there is a maximum length limit on $RPATH. I used a directory with 40-ish a-z 0-9 random characters to make it somewhat more "secure". When I changed it to /tomato/, it worked fine.
    I don't know what the hard limit on $RPATH is, but at least I can testify it's under 40 characters.
    Moving on to the next step!
     
  76. lanmtl

    lanmtl Addicted to LI Member

    One more question though: how do I change the timing for saving to FTP? I'd like to change it to every hour instead of every 12 hours.
     
  77. FlashSWT

    FlashSWT LI Guru Member

    Awesome thread, thanks for the scripts mraneri! I do have a quick question: I'd like to use the updated script with the additional dated backups, but there was some discussion of unnecessary flags or something and I got lost trying to compare the two.

    Was there ever any agreement on adding the dated option while maintaining the original flags? Can someone post that code or clarify?

    Thanks.
     
  78. mraneri

    mraneri LI Guru Member

    I have never investigated others' use of dated backups. I still use the script I posted way back in this thread and as a result still have history back through November 2008. Maybe others can comment, or you can try your own experimentation. Sorry I can't offer any additional assistance.

    - Mike
     
  79. FlashSWT

    FlashSWT LI Guru Member

    Mike, I used your scripts from earlier in the thread and they seem to be working just fine. I've rebooted a couple times and the stats get restored like they should.

    I used the date code from srouquette's example and that part looks to be working as well.

    I just switched it around to more closely match your original example. The main difference (besides specifying the new DATE path) is in the first echo line, which has a second ftpput command.

    Code:
    USER="username"
    PASS="password"
    PORT=21
    SERVER="ip"
    RPATH="/rstats.tar"
    RPATH_DATE="/rstats\`date +%Y.%m.%d-%H.%M.%S\`.tar"
    LPATH="/tmp/rstats.tar"
    
    if [ ! -s /tmp/backup-rstats ] ; then
        echo -e "#!/bin/sh\nkillall -1 rstats\nsleep 2\ntar c /tmp/var/lib/misc/rstats-* > \"$LPATH\"\nftpput -u \"$USER\" -p \"$PASS\" -P $PORT $SERVER \"$RPATH\" \"$LPATH\"\nftpput -u \"$USER\" -p \"$PASS\" -P $PORT $SERVER \"$RPATH_DATE\" \"$LPATH\"\nrm \"$LPATH\"" > /tmp/backup-rstats
        chmod 777 /tmp/backup-rstats
        echo -e "#!/bin/sh\nservice rstats stop\nftpget -u \"$USER\" -p \"$PASS\" -P $PORT $SERVER \"$LPATH\" \"$RPATH\"" > /tmp/restore-rstats
        echo -e "if [ \$? != 0 ] ; then\n  logger RStats Restore Failed... will retry\n  cru a rstats \"*/5 * * * * /tmp/restore-rstats\"\n  return 1\nfi\ntar x -f \"$LPATH\" -C /\nrm \"$LPATH\"\nservice rstats start\ncru a rstats \"1 */1 * * * /tmp/backup-rstats\"" >> /tmp/restore-rstats
        echo -e "logger RStats Data Restored\nrm /tmp/restore-rstats" >> /tmp/restore-rstats
        chmod 777 /tmp/restore-rstats
        /tmp/restore-rstats
    fi
    Seems to work just fine. It still restores from the single rstats.tar file, the dated ones are just extra backups.
     
  80. mraneri

    mraneri LI Guru Member

    Great. Glad it's all working for you the way you want. Thanks for contributing what's working for you. (It's what makes this forum great.)
     
  81. FlashSWT

    FlashSWT LI Guru Member

    No problem, I really like this solution for backing up the bandwidth stats (the main reason I found Tomato in the first place). I only wish I had found this thread sooner so my stats would go back farther than they do.
     
  82. alienrex

    alienrex Addicted to LI Member

    Does anyone use an external (public) FTP, and has anyone implemented MD5 checking for the files?
     
  83. psychowood

    psychowood LI Guru Member

    I'm using an external FTP, but I'm not using MD5 because on Tomato I only have the ftpget and ftpput commands, and I cannot retrieve the MD5 from the remote server. The only way I see is transferring the files back and comparing them...
     
  84. alienrex

    alienrex Addicted to LI Member

    Do you use any other kind of protection on the remote FTP?

    For example, if it's the same server as your www hosting, do you protect it via .htaccess or something similar?
     
  85. psychowood

    psychowood LI Guru Member

    Just plain FTP auth; I'm using a non-www-accessible folder in a hosting space.

    They are just stats files, after all :)
     
  86. alienrex

    alienrex Addicted to LI Member

    Besides stats, is it possible to back up something else?
    I was thinking of archiving logs or even the configuration (of course backup only, not restore).


    P.S.: What backup interval is suitable? Imho every 5 minutes is a bit too much.
     
  87. psychowood

    psychowood LI Guru Member

    Yes you can, you just have to save the data to a file and upload it somewhere...

    For example, if you want to back up your settings you can run "nvram backup backupfilename" and then upload backupfilename.
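
    As a rough sketch of how that could be bolted onto the backup.sh from earlier in this thread (it reuses the same USER/PASSWORD/SITE/FOLDER variables; the config filename is just an example, and it assumes the nvram backup command behaves as described above on your build):

    Code:
    # dump the current settings to a temporary file (the filename is only an example)
    nvram backup /tmp/tomato-config.cfg
    # upload it next to the rstats files, reusing backup.sh's FTP variables
    ftpput -u $USER -p $PASSWORD $SITE $FOLDER/tomato-config.cfg /tmp/tomato-config.cfg
    # remove the temporary copy so it doesn't sit in RAM
    rm /tmp/tomato-config.cfg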
     
  88. FlashSWT

    FlashSWT LI Guru Member

    I'm using the same location setup as psychowood and currently backing up at 2 hour intervals. Seems to be working great.
     
  89. Morac

    Morac Network Guru Member

    I'm using the backup script that uploads the stats twice, with the second copy having the date in the filename in case of a failure. It works great, with only one issue.

    I have a tendency to completely forget that it's running so I'll log into my server and find several hundred (or thousand) backup files which are completely unnecessary.

    Is there a way to set up the script to purge the older backup files periodically? I don't really need anything older than 30 days.


    P.S. - For people backing up every hour, there's no reason to specify */1 in the hour field of the cru expression since * will work just as well.
     
  90. psychowood

    psychowood LI Guru Member

    Not automagically.
    You have to write a script to do so (I'm rather sure you can find plenty with Google), but you'll also need the required programs (ncftp, for example) available on your router shell, which is not always the case.

    A solution that doesn't need external programs would be to generate the filename of the second backup from the day of the month alone (rstats-history.01.tgz, ..., rstats-history.31.tgz): the only drawback I see is that you have to rely on the server timestamp to identify the latest backup. You could also upload a text file containing the latest filename, but that would not necessarily help you in case of a failure.
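
    A minimal sketch of that day-of-month rotation, assuming the dated-backup script posted earlier in this thread (USER, PASS, PORT, SERVER and LPATH are the variables defined there):

    Code:
    # 01..31: the name cycles every month, so at most 31 dated copies ever pile up on the server
    DAY=`date +%d`
    RPATH_DATE="/rstats.$DAY.tar"
    ftpput -u "$USER" -p "$PASS" -P $PORT $SERVER "$RPATH_DATE" "$LPATH"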
     
  91. Morac

    Morac Network Guru Member

    That's actually a good idea. The server I'm using timestamps files so figuring out the latest isn't an issue.

    In the few years I've been using this script, I've only had one failure, and that was when the backup data got so large that a one-second sleep wasn't long enough between generating the data and tar-ing it. I increased the sleep time to 3 seconds, which should keep me going for a good while longer (especially since I have since wiped the data).
     
  92. sm69th2

    sm69th2 Networkin' Nut Member

    Script not working...my fault I am sure...help :)

    I am trying to install this script, and I am having some issues. Probably because I am a Windows guy :).

    If anyone can help, it would be greatly appreciated. I have been using CIFS for backing up my bandwidth stats, but my PC isn't always on, so I want to use an FTP solution.

    I am working in Notepad++. I have it set to "Encode in ANSI". When I save the file, I save it as a "Unix Script File (*.sh, *.bsh)", giving it the name "c:\test.sh".

    Then I FTP it to my webserver with FlashFXP (in ASCII Text mode).

    I then telnet to my router and issue the following command:
    wget http://mydomain.com/backup.sh

    The file is transferred to the router without a problem. Next I try to run the script using:

    sh backup.sh

    I get the following error:
    backup.sh: line 21: syntax error: unexpected end of file (expecting "then")

    So... what am I doing wrong? Please bear in mind I am a Windows guy, so if the correct way is complicated, please provide as many details as possible.

    Here is a copy of the script I am trying to run:

    USER="XXXX"
    PASSWORD="XXXX"
    SITE="555.555.55.555" #url WITHOUT ftp://, like "mysite.com" or "IPADDRESS"
    FOLDER="/BandwidthStats" #WITHOUT ending slash, like "/tomato"

    ps | grep -q rstats
    if [ $? == 0 ] ; then

    #backup rstats
    led amber on
    led white on

    ftpput -u $USER -p $PASSWORD $SITE $FOLDER/rstats-history.gz /tmp/var/lib/misc/rstats-history.gz
    ftpput -u $USER -p $PASSWORD $SITE $FOLDER/rstats-speed.gz /tmp/var/lib/misc/rstats-speed.gz
    ftpput -u $USER -p $PASSWORD $SITE $FOLDER/rstats-stime /tmp/var/lib/misc/rstats-stime
    ftpput -u $USER -p $PASSWORD $SITE $FOLDER/rstats-source /tmp/var/lib/misc/rstats-source

    led white off
    led amber off
    fi
     
  93. psychowood

    psychowood LI Guru Member

    You probably just need to convert the line endings to UNIX format (Notepad++ can do this from its EOL conversion option).
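
    If you'd rather fix the copy that is already on the router instead of re-uploading it, something like this should also work (just a sketch, assuming your busybox build includes tr and that the script is in the current directory):

    Code:
    # strip the Windows carriage returns, then put the cleaned file back in place
    tr -d '\r' < backup.sh > backup-unix.sh
    mv backup-unix.sh backup.sh
    chmod +x backup.sh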
     
  94. sm69th2

    sm69th2 Networkin' Nut Member

    Well, that fixed that problem. Now I have a new one... in the init.sh file, this section always fails:

    wget $LOCATION/rstats-source
    if [ $? != 0 ] ; then
    led amber on
    led white off
    return 1
    fi

    I am running it from a telnet prompt and I get this:

    root@?:/jffs/statsbackup# sh init.sh
    led <diag> <on/off> [...]
    led <diag> <on/off> [...]
    .
    Done.
    Connecting to mydomain.com <555.555.55.555:80>
    Connecting to mydomain.com <555.555.55.555:80>
    Connecting to mydomain.com <555.555.55.555:80>
    wget: server returned error: HTTP/1.1 404 Not Found
    led <diag> <on/off> [...]
    led <diag> <on/off> [...]

    But if I FTP to my server...that file IS there:

    rstats-source (size 36)

    I have noticed that I can't browse to that file... I think it's because it has no extension. Not sure if my web server likes that.

    Any idea what I can do? It does back up the files now... if I can get it to reload them, then I am all set.
     
  95. Mojonba

    Mojonba Network Guru Member

    I have used someone else's copy of the script, but it is doing the backup hourly. I prefer to do it daily. What do I need to change to adjust the frequency? Thx

    Code:
    cru a rstats \"1 */X
    */X, is X in hours?
     
  96. sm69th

    sm69th Guest

    This would run it at midnight daily:

    cru a rstatsbackup "0 0 * * * /jffs/rstatsbackup/backup.sh";

    The first 0 is the minute (0 to 59).
    The second 0 is the hour (0 to 23).
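
    A few more examples of the same cru syntax, in case somebody wants a different interval (these reuse the rstatsbackup job name and script path from earlier in this thread; adapt them to your own setup, and keep only one active schedule):

    Code:
    cru d rstatsbackup                                              # drop any existing entry first
    cru a rstatsbackup "0 * * * * /jffs/rstatsbackup/backup.sh"     # every hour, on the hour
    #cru a rstatsbackup "0 */12 * * * /jffs/rstatsbackup/backup.sh" # or: every 12 hours
    #cru a rstatsbackup "30 3 * * 0 /jffs/rstatsbackup/backup.sh"   # or: 3:30 AM every Sunday
    cru l                                                           # list the current jobs to check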
     
  97. Mojonba

    Mojonba Network Guru Member

    Thanks. This Wikipedia link helped too.
     
  98. Zaheer Abbas

    Zaheer Abbas Serious Server Member

    Thanks for a good script.

    But this script only saves the stats of the "Bandwidth Monitor"; it doesn't save the stats of "IP Traffic".

    Please help me, I want to also save the IP Traffic stats to the FTP server.

    Thanks.
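
    As a heavily hedged sketch (nothing below is confirmed in this thread): if your build's IP Traffic monitor stores its data as cstats-* files next to the rstats-* ones, the same backup.sh pattern could be extended like this:

    Code:
    # assumption: IP Traffic data lives in cstats-* files alongside the rstats-* ones
    # check first with: ls /tmp/var/lib/misc
    for f in /tmp/var/lib/misc/cstats-*; do
        [ -e "$f" ] || continue                                     # skip if no cstats files exist
        ftpput -u $USER -p $PASSWORD $SITE $FOLDER/`basename $f` $f
    done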
     
  99. Zaheer Abbas

    Zaheer Abbas Serious Server Member

    Looking for your help.

    Thanks in advance.
     
  100. Zaheer Abbas

    Zaheer Abbas Serious Server Member
