
Hourly Bandwidth Log

Discussion in 'Tomato Firmware' started by sumo9, Dec 14, 2008.

  1. sumo9

    sumo9 Guest

    I have set up the CIFS client to save the log at a network share.

    Is it possible to save a detailed WAN bandwidth log on an hourly basis, so that I can, for example, go back and see what the usage was at 2am three months ago?
     
  2. sumo1

    sumo1 Addicted to LI Member

  3. chchia

    chchia LI Guru Member

    Did you notice the "Last 24 Hours" selection?
     
  4. sumo1

    sumo1 Addicted to LI Member

    Yes, it would be good if the bandwidth log were recorded hourly, instead of only covering the last 24 hours. I'm sure there must be a script to read the hourly WAN usage and record it to the CIFS share.
    If only I knew the command, it could then be put into cron.
     
  5. tievolu

    tievolu Network Guru Member

    This script will get you the WAN bandwidth used in the last X minutes:

    Code:
    #!/bin/sh
    #
    # getbwstat.sh
    #
    # Get WAN bandwidth used in the last X minutes.
    # e.g. bandwidth used in the last 2 hours:
    #
    #     > ./getbwstat.sh 120
    #     292.49 MB down, 388.48 MB up
    
    if [[ "$#" -ne 1 || "$1" -lt 0 || "$1" -gt 1440 ]]
    then
    	echo "Usage: getbwstat.sh [number of minutes (0-1440)]"
    	exit 1
    fi
    
    killall -USR1 rstats
    sleep 2
    
    read_next_rx_tx=0
    while read line
    do
    	case $line in
    		*vlan1*) read_next_rx_tx=1 ;;
    	esac
    
    	if [[ $read_next_rx_tx = 1 ]]
    	then
    		case $line in
    			*rx:*) rx_data=$line ;;
    			*tx:*) tx_data=$line ; read_next_rx_tx=0 ;;
    		esac
    	fi
    done < /var/spool/rstats-speed.js
    rm /var/spool/rstats-speed.js
    
    number_of_values=`expr $1 / 2`
    
    total_bytes_rx=`echo $rx_data | awk '
      {
        values_string = substr($0, index($0, "[") + 1, index($0, "]") - 6);
        split(values_string, values_array, ",");
        for (i = (721 - number_of_values); i <= 720; i++) {
          total += values_array[i];
        }
        printf("%0.f", total);
      }' number_of_values=$number_of_values total=0`
    
    total_bytes_tx=`echo $tx_data | awk '
      {
        values_string = substr($0, index($0, "[") + 1, index($0, "]") - 6);
        split(values_string, values_array, ",");
        for (i = (721 - number_of_values); i <= 720; i++) {
          total += values_array[i];
        }
        printf("%0.f", total);
      }' number_of_values=$number_of_values total=0`
      
    total_megabytes_rx=`echo "" | awk '{ printf("%.2f", bytes/1048576) }' bytes=$total_bytes_rx`
    total_megabytes_tx=`echo "" | awk '{ printf("%.2f", bytes/1048576) }' bytes=$total_bytes_tx`
    
    echo "$total_megabytes_rx MB down, $total_megabytes_tx MB up"
    
    For example, the following would give you the bandwidth used in the last 2 hours:

    Code:
    > ./getbwstat.sh 120
    292.49 MB down, 388.48 MB up
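
    As an aside, the MB conversion at the end of the script can be generalized to pick a sensible unit automatically. The helper below is a hypothetical sketch (`bytes_to_human` is not part of Tomato or the script above); it uses only awk, so it should run on a BusyBox shell:

    ```shell
    # Hypothetical helper: pretty-print a byte count in B/KB/MB/GB.
    # Divides by 1024 until the value drops below 1024, then prints it
    # with two decimal places and the matching unit.
    bytes_to_human() {
        echo "$1" | awk '{
            split("B KB MB GB", units, " ");
            v = $1; i = 1;
            while (v >= 1024 && i < 4) { v /= 1024; i++ }
            printf("%.2f %s", v, units[i]);
        }'
    }

    bytes_to_human 306700000   # -> 292.49 MB
    ```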
    
    You could then run the script in a cron job to record the hourly stats. Something like this running every hour would probably be enough:

    Code:
    echo "`date` - `/jffs/getbwstat.sh 60`" >>/cifs1/hourlystats.txt
    
    This would give you timestamped stats in hourlystats.txt. Obviously you'd need to sort out the filenames and paths for your particular environment.
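
    On Tomato you can install the cron job with the `cru` helper rather than editing a crontab by hand (assuming your build includes `cru`, which stock Tomato does; the job name "hourlystats" and the paths here are just examples for your setup):

    ```shell
    # cru syntax: cru a <id> "<cron schedule> <command>"
    # This runs at minute 0 of every hour and appends a timestamped line.
    cru a hourlystats '0 * * * * echo "`date` - `/jffs/getbwstat.sh 60`" >> /cifs1/hourlystats.txt'
    ```

    To have the job survive a reboot, add that `cru` line to one of the startup scripts under Administration -> Scripts in the web GUI.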

    Hope this helps
     
  6. sumo1

    sumo1 Addicted to LI Member

    Thank you very much, tievolu, the script works very well. It looks very complicated; I appreciate the work.
     
