Weekly DC Stats scripts
Posted: Thu Jun 23, 2022 8:03 pm
Here are the scripts which I am currently using to create the Weekly DC Stats postings over at anandtech.com. The principle is that you start the main script in a terminal, copy and paste the output into the forum's posting editor box, and finish it off by adding some commentary of your own. (The boilerplate text and the weblinks section are already included in the script output; you don't have to copy them from anywhere else.)
Scripts? Main script? — Yep, there are three scripts involved. (Or four, since we currently have Bonus Stats.) Originally, there was just one script which fetched all of the stats data from FreeDC. But because FreeDC stopped collecting stats for non-BOINC projects in July 2020, I eventually added two more scripts:
The second script fetches Folding@home stats from several web pages at folding.extremeoverclocking.com. EOC's "last 7 days" data are somewhat hard to get, so I am using the "24hr Avg" figures instead and multiplying them by 7, which is equivalent apart from very small rounding errors.
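For illustration, here is that conversion in isolation. The figure is made up; the comma-stripping mirrors how EOC formats its numbers:

```shell
# hypothetical "24hr Avg" value in EOC's comma-formatted style
avg='1,234,567'
# strip the commas, then multiply by 7 to approximate the weekly production
weekly=$(( ${avg//,/} * 7 ))
echo "${weekly}"    # prints 8641969
```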
The third script fetches stats from a few web pages at distributed.net. I am currently using it for the projects OGR-28 and RC5-72. Unfortunately, weekly stats are not available at distributed.net at all, so I am using local cache files holding all-time stats from the previous Sunday. Let's say you start this script on Sunday, August 28. The script will download up-to-date all-time stats of all current TeAm members, and of up to 1000 teams, and store them in files called OGR-28/28AUG2022.txt, OGR-28/28AUG2022_teams.txt, RC5-72/28AUG2022.txt, and RC5-72/28AUG2022_teams.txt. The data in there consist of name and credit, separated by tabs and newlines. Then it will look for files called OGR-28/21AUG2022.txt and so on. If such files are there, it will match the names in the respective pairs of files, subtract the previous Sunday's credit from the current credit, and log the result to the terminal in a format which looks like FreeDC's weekly stats. (The user name matching is a bit more complicated than it seems at first glance, because names are not unique. I could extract unique user IDs out of the HTML, but I implemented an easier heuristic method instead.) The script will also delete any {OGR-28,RC5-72}/{day}{month}{year}{,_teams}.txt files which don't belong to the current day or to last Sunday.
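The subtraction step can be sketched with standard tools. This is only an illustration with made-up sample data in the documented "name&lt;TAB&gt;credit" cache format; the actual script matches the names with its own heuristic loop instead of relying on join(1):

```shell
cd "$(mktemp -d)"                                   # scratch directory for the demo
printf 'alice\t100\nbob\t250\n' > 21AUG2022.txt     # previous Sunday (made-up data)
printf 'alice\t160\nbob\t250\n' > 28AUG2022.txt     # today (made-up data)
# pair up the names, then subtract last Sunday's credit from today's
join -t "$(printf '\t')" <(sort 21AUG2022.txt) <(sort 28AUG2022.txt) |
  awk -F'\t' '{print $3 - $2 "\t" $1}'              # prints: 60 alice / 0 bob
```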
To get OGR-28 stats, the script needs to be invoked with the following parameters:
./dnet_weekly_stats_dump.sh OGR-28 28
(28 is distributed.net's numerical ID of the project called 'OGR-28'.)
To get RC5-72 stats, the script needs to be invoked like this:
./dnet_weekly_stats_dump.sh RC5-72 8
or simply
./dnet_weekly_stats_dump.sh
(8 is distributed.net's numerical ID of the project called 'RC5-72'.)
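The cache-file names mentioned above follow GNU date's %d%^b%Y format (day of month, uppercased abbreviated month name, four-digit year), which is what the dnet script uses internally; for example:

```shell
# %d%^b%Y = day, uppercased abbreviated month, year (GNU date; %^ is a GNU extension)
LC_ALL=C date -d '2022-08-28' +%d%^b%Y          # prints 28AUG2022
LC_ALL=C date -d 'last Sunday' +%d%^b%Y         # previous Sunday's file name stem
```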
Notes:
- After you download these scripts, don't forget to set their executable flag.
- The normal mode of operation is that you call only the main script. It will call the other two scripts and insert their output into the overall output.
- The main script expects the other two scripts to be named "fah_weekly_stats_dump.sh" and "dnet_weekly_stats_dump.sh", and that these two scripts are located in the same directory as the main script.
- Each of the scripts requires the text web browser "links". Most distributions should have this packaged, but it is unlikely to be installed by default. I am relying on "links" to a) perform the web download, b) convert HTML to plaintext.
- I spoke above about cache files like OGR-28/28AUG2022.txt. Actually, the dnet stats script expects these data files to reside at the paths "${HOME}/Distributed_Computing/Weekly_Stats/OGR-28" and "${HOME}/Distributed_Computing/Weekly_Stats/RC5-72". You need to create these exact directories first in order to run that script. If you prefer a different path, edit the top of dnet_weekly_stats_dump.sh accordingly.
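Put together, the one-time setup from the notes above looks like this (assuming the default paths; run it in the directory holding the three downloaded scripts):

```shell
# make the three scripts executable (harmless to re-run if already done)
chmod +x weekly_stats_dump.sh fah_weekly_stats_dump.sh dnet_weekly_stats_dump.sh 2>/dev/null || true
# create the cache directories which dnet_weekly_stats_dump.sh expects
mkdir -p "${HOME}/Distributed_Computing/Weekly_Stats"/{OGR-28,RC5-72}
```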
weekly_stats_dump.sh (the main script)
Code:
#!/bin/bash
projects=(
'alb&teamid=12'
'ami&teamid=174'
'ast&teamid=183'
'beef&teamid=27'
'tac&teamid=33'
'csg&teamid=115'
'bcpdn&teamid=4'
'col&teamid=126'
'cos&teamid=73'
'den&teamid=192'
'dhe&teamid=52'
'dpad&teamid=Team+Anandtech'
'eah&teamid=12'
'fah&teamid=198'
'gaia&teamid=97'
'ger&teamid=58'
'goo&teamid=2628'
'gooc&teamid=29'
'goof&teamid=189'
'ps3&teamid=175'
'ibe2&teamid=62'
'ithc&teamid=189'
'ith&teamid=179'
'kry&teamid=17'
'lts&teamid=54'
'lhc&teamid=4'
'lhcc&teamid=86'
'loda&teamid=28'
# 'maj&teamid=22'
'mil&teamid=77'
'min&teamid=141'
'mine&teamid=206'
'mlc&teamid=187'
'moo&teamid=173'
'nub&teamid=139'
'nfs&teamid=125'
'num&teamid=54'
'odl&teamid=32'
'odlk25&teamid=5'
# 'ogr28&teamid=AnandTech+10635'
'pgrid&teamid=132'
'pgfn&teamid=30'
'quc&teamid=180'
'rad&teamid=173'
'rak&teamid=1758'
'ralph&teamid=60'
'rnma&teamid=8'
'rc572&teamid=AnandTech+10635'
'rna&teamid=126'
'rah&teamid=79'
'sah&teamid=30191'
'sid&teamid=52'
'srb&teamid=37'
'spt&teamid=10'
'pad&teamid=3183'
'gne&teamid=114'
'uni&teamid=176'
'vdw&teamid=195'
'vgtu&teamid=67'
'bwcg&teamid=16700'
'wep&teamid=8'
'wup&teamid=125'
'yaf&teamid=27'
'yoy&teamid=116'
)
bar () {
printf '\n=========================================================\n\n'
}
mangle() {
sed -e '# set the project name in bold
2{s/\(^ \)\(.*\)\( overall position\)/[B]\2[\/B]\3/}
t
# un'link'ify e-mail addresses
s/@[-0-z]\+[.]/&/
T
s/[.]/[I].[\/I]/g'
}
bar
echo "Weekly DC Stats - $(date +%d%^b%Y)"
bar
cat <<'EOF'
[SIZE=2]**********************************************
In the event we have any non-crunching AnandTech readers who happen to wander into this thread: Distributed Computing is where you allow your computing device (smartphones/tablets included) to work on things like medical research, mathematical stuff, sifting through telescope data to further the field of Astronomy, and many other 'citizen science' projects. It allows networked computers to band together to act as a supercomputer. And you should join us. Thanks go, as always, to the folks responsible for Free-DC, who make this possible by keeping score for us.
**********************************************[/SIZE]
EOF
for proj in "${projects[@]}"
do
case "${proj}" in
'fah&teamid=198')
stats=$($(realpath $(dirname $0))/fah_weekly_stats_dump.sh);;
'ogr28&teamid=AnandTech+10635')
stats=$($(realpath $(dirname $0))/dnet_weekly_stats_dump.sh 'OGR-28' '28');;
'rc572&teamid=AnandTech+10635')
stats=$($(realpath $(dirname $0))/dnet_weekly_stats_dump.sh 'RC5-72' '8');;
*)
stats=$(links -dump "https://stats.free-dc.org/spacehead.php?page=team&proj=${proj}")
sleep 0.4;;
esac
grep -q 'TeAm total for the week - 0' <<< ${stats} && (($(wc -l <<< ${stats}) < 7)) && continue
mangle <<< ${stats}
echo
done
cat <<'EOF'
[SPOILER=stats links]
[URL='https://stats.free-dc.org/teambycpid/TeAm+AnandTech']TeAm AnandTech at Free-DC[/URL]
[URL='https://www.boincstats.com/stats/-1/team/detail/8/projectList']TeAm AnandTech at BOINCstats[/URL]
[URL='https://folding.extremeoverclocking.com/team_summary.php?t=198']TeAm AnandTech Folding@home stats at EOC[/URL]
[URL='https://stats.distributed.net/team/tmsummary.php?project_id=8&team=10635']TeAm AnandTech RC5-72 stats at distributed.net[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=alb&teamid=12']Albert[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=ami&teamid=174']Amicable Numbers[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=ast&teamid=183']Asteroids[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=beef&teamid=27']Beef[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=tac&teamid=33']BOINC@TACC[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=csg&teamid=115']Citizen Science Grid[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=bcpdn&teamid=4']Climate Prediction[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=col&teamid=126']Collatz Conjecture[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=cos&teamid=73']Cosmology[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=den&teamid=192']DENIS[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=dhe&teamid=52']DHEP[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=dpad&teamid=Team+Anandtech']DPAD[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=eah&teamid=12']Einstein[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=fah&teamid=198']Folding@Home (stale)[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=gaia&teamid=97']Gaia[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=ger&teamid=58']Gerasim[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=goo&teamid=2628']Goofyxgrid[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=gooc&teamid=29']Goofyxgrid CPU[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=goof&teamid=189']goofyxGrid@Home NCI[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=ps3&teamid=175']GPU Grid[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=ibe2&teamid=62']IberCivis2[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=ithc&teamid=189']iThena.Computational[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=ith&teamid=179']iThena.Measurements[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=kry&teamid=17']Kryptos@Home[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=lts&teamid=54']Latin Squares (aka ODLK1)[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=lhc&teamid=4']LHC[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=lhcc&teamid=86']LHC-dev[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=loda&teamid=28']LODA[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=maj&teamid=22']Majestic12 (stale)[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=mil&teamid=77']Milkyway[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=min&teamid=141']MindModeling[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=mine&teamid=206']Minecraft[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=mlc&teamid=187']MLC[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=moo&teamid=173']Moo! Wrapper[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=nub&teamid=139']NanoHub[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=nfs&teamid=125']NFS[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=num&teamid=54']Number Fields[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=odl&teamid=32']ODLK[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=odlk25&teamid=5']ODLK2025[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=pgrid&teamid=132']PrimeGrid[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=pgfn&teamid=30']Private GFN Server[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=quc&teamid=180']QuChemPedIA[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=rad&teamid=173']Radioactive@Home[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=rak&teamid=1758']Rakesearch[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=ralph&teamid=60']Ralph[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=rnma&teamid=8']Ramanujan Machine[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=rc572&teamid=AnandTech+10635']RC5/72 (stale)[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=rna&teamid=126']RNA World[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=rah&teamid=79']Rosetta[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=sah&teamid=30191']SETI[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=sid&teamid=52']SiDock[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=srb&teamid=37']SRBase[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=spt&teamid=10']Symmetric Prime Tuples[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=pad&teamid=3183']T.Brada[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=gne&teamid=114']TN-Grid[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=uni&teamid=176']Universe[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=vdw&teamid=195']Van Der Waerden Numbers[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=vgtu&teamid=67']VGTU[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=bwcg&teamid=16700']WCG[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=wep&teamid=8']WEP-M2[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=wup&teamid=125']WUProp[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=yaf&teamid=27']YAFU[/URL]
[URL='https://stats.free-dc.org/spacehead.php?page=team&proj=yoy&teamid=116']Yoyo[/URL]
[/SPOILER]
EOF
bar
fah_weekly_stats_dump.sh
Code:
#!/bin/bash
commarize () {
a=$(printf "%12s" ${1})
b=$(
((${1} > 999999999)) && printf "${a::-9},"
((${1} > 999999)) && printf "${a: -9:3},"
((${1} > 999)) && printf "${a: -6:3},"
printf "${a: -3}"
)
printf "${b// /}"
}
delay_next_fetch () {
sleep 0.4 # don't hammer the web server
}
fetch () {
links -http.fake-user-agent 'Mozilla/5.0 (X11; Linux x86_64; rv:102.0) Gecko/20100101 Firefox/102.0' \
-width 200 -dump "$1"
}
echo
page=$(fetch 'https://folding.extremeoverclocking.com/team_summary.php?t=198' | cut -c27-)
line=($(sed -nE '/Overall +24hr +7days +24hr Avg/{n;p;}' <<<"${page}"))
echo " Folding@Home overall position - ${line[0]}"
echo " TeAm total for the week - ${line[-6]}"
delay_next_fetch
page=$(fetch 'https://folding.extremeoverclocking.com/team_list.php?srt=3' | cut -c27-)
line=($(grep ' Team AnandTech ' <<<"${page}"))
echo " TeAm rank for weekly production - ${line[0]}"
echo
echo ' __Credit/week _ UserName'
r=1
for p in {1..100} # will usually break out after p=1 or 2 already
do
delay_next_fetch
page=$(fetch "https://folding.extremeoverclocking.com/user_list.php?srt=1&t=198&p=${p}" | cut -c27-)
while read line
do
l=(${line})
[ "${l[0]}" != "${r}" ] && continue
avg="${l[-6]}"
[ "${avg}" = '0' ] && break
rank="${r}_______"
score=$((${avg//,/}*7))
((score > 99999999)) && w=18 || w=13
score="$(commarize ${score})____________"
printf " ${rank::8}${score::${w}}${l[2]}\n"
((r++))
done <<<"${page}"
[ "${avg}" = '0' ] && break
done
dnet_weekly_stats_dump.sh
Code:
#!/bin/bash
project_name="${1:-RC5-72}"
project_id="${2:-8}"
team_name="${3:-AnandTech 10635}"
team_id="${4:-10635}"
cd "${HOME}/Distributed_Computing/Weekly_Stats/${project_name}" || exit 1
LC_ALL=C
today="$(date +%d%^b%Y)"
last_sunday="$(date -d 'last Sunday' +%d%^b%Y)"
today_members_file="${today}.txt"
last_sunday_members_file="${last_sunday}.txt"
today_teams_file="${today}_teams.txt"
last_sunday_teams_file="${last_sunday}_teams.txt"
# delete superfluous data files
for file in [0-3][0-9][JFMASOND][AEPUCO][NBRYLGPTVC]20[2-9][0-9]{,_teams}.txt
do
case "${file}" in
"${today}"*) ;;
"${last_sunday}"*) ;;
*) rm "${file}" 2>/dev/null;;
esac
done
commarize () {
((${1}<0)) && { s='-'; ((a = 0-${1})); } || { s=''; a=${1}; }
b=$(printf "%12s" ${a})
c=$(
((${a} > 999999999)) && printf "${b::-9},"
((${a} > 999999)) && printf "${b: -9:3},"
((${a} > 999)) && printf "${b: -6:3},"
printf "${b: -3}"
)
printf -- "${s}${c// /}"
}
# download at most this many team records, and at most this many user records per team
download_limit=1000
if [ ! -f "${today_members_file}" ]
then
# download and cache today's data
for ((i=0,p=1; p<download_limit; p+=100))
do
((i+1 < p)) && break
page=$(links -width 200 -dump "https://stats.distributed.net/team/tmember.php?project_id=${project_id}&team=${team_id}&low=${p}")
while read line
do
rank=$(cut -d'|' -f2 <<<"${line}" | cut -d'(' -f1 | sed 's/^[ ]*//;s/[ ]*$//')
[ "${rank// /}" != $((i+1)) ] && continue
n1[i]=$(cut -d'|' -f3 <<<"${line}" | sed 's/^[ ]*//;s/[ ]*$//')
b1[i]=$(cut -d'|' -f9 <<<"${line}" | sed 's/^[ ]*//;s/[ ]*$//')
printf -- "%s\t%s\n" "${n1[i]}" "${b1[i]}" >> "${today_members_file}"
((i++))
done <<< "${page}"
done
else
# read today's already cached data
i=0
while read line
do
n1[i]=$(cut -f1 <<< "${line}")
b1[i]=$(cut -f2 <<< "${line}")
((i++))
done < "${today_members_file}"
fi
if [ -f "${last_sunday_members_file}" ]
then
# read last Sunday's cached data
i=0
while read line
do
n0[i]=$(cut -f1 <<< "${line}")
b0[i]=$(cut -f2 <<< "${line}")
n0_bak[i]="${n0[i]}"
((i++))
done < "${last_sunday_members_file}"
fi
if [ ! -f "${today_teams_file}" ]
then
# download and cache today's data
for ((i=0,p=1; p<download_limit; p+=100))
do
((i+1 < p)) && break
page=$(links -width 200 -dump "https://stats.distributed.net/team/tlist.php?project_id=${project_id}&low=${p}")
while read line
do
rank=$(cut -d'|' -f2 <<<"${line}" | cut -d'(' -f1 | sed 's/^[ ]*//;s/[ ]*$//')
[ "${rank// /}" != $((i+1)) ] && continue
tn1[i]=$(cut -d'|' -f3 <<<"${line}" | sed 's/^[ ]*//;s/[ ]*$//')
tb1[i]=$(cut -d'|' -f8 <<<"${line}" | sed 's/^[ ]*//;s/[ ]*$//')
printf -- "%s\t%s\n" "${tn1[i]}" "${tb1[i]}" >> "${today_teams_file}"
((i++))
done <<< "${page}"
done
else
# read today's already cached data
i=0
while read line
do
tn1[i]=$(cut -f1 <<< "${line}")
tb1[i]=$(cut -f2 <<< "${line}")
((i++))
done < "${today_teams_file}"
fi
if [ -f "${last_sunday_teams_file}" ]
then
# read last Sunday's cached data
i=0
while read line
do
tn0[i]=$(cut -f1 <<< "${line}")
tb0[i]=$(cut -f2 <<< "${line}")
((i++))
done < "${last_sunday_teams_file}"
fi
# bail if we don't have all the data
[ -f "${today_members_file}" -a -f "${last_sunday_members_file}" -a \
-f "${today_teams_file}" -a -f "${last_sunday_teams_file}" ] || exit
for ((i=0; i<${#tn1[*]}; i++))
do
[ "${tn1[i]}" = "${team_name}" ] && break
done
for ((j=0; j<${#tn0[*]}; j++))
do
[ "${tn0[j]}" = "${team_name}" ] && break
done
team_total_for_the_week=$(commarize $((${tb1[i]//,/} - ${tb0[j]//,/})))
echo
echo " ${project_name} overall position - $((i+1))"
echo " TeAm total for the week - ${team_total_for_the_week}"
teams_chart=$(
for ((i=0; i<${#tn1[*]}; i++))
do
for ((j=0; j<${#tn0[*]}; j++))
do
if [ "${tn1[i]}" = "${tn0[j]}" ]
then
b=$((${tb1[i]//,/} - ${tb0[j]//,/}))
if ((b))
then
printf -- "%s\t%s\n" $(commarize $b) "${tn1[i]}"
fi
tn0[j]=' ' # don't re-use this record, in case of duplicate names
break
fi
done
done
)
teams_chart=$(sort -nr <<< "${teams_chart}")
i=1
while read line
do
tn=$(cut -f2 <<< "${line}")
[ "${tn}" = "${team_name}" ] && break
((i++))
done <<< "${teams_chart}"
echo -n " TeAm rank for weekly production - $i"
# Since only at most ${download_limit} teams are checked, there could be more teams with
# higher weekly production than ours if we made less than the total credit
# of the lowest-ranking team.
((${team_total_for_the_week//,/} < ${tb1[-1]//,/})) && echo ' (estimated)' || echo
echo
echo ' __Credit/week _ UserName'
members_chart=$(
# for each of today's members, look up a matching name from last Sunday
for ((i=0; i<${#n1[*]}; i++))
do
for ((j=0; j<${#n0[*]}; j++))
do
if [ "${n1[i]}" = "${n0[j]}" ]
then
b=$((${b1[i]//,/} - ${b0[j]//,/}))
if ((b))
then
blocks="$(commarize $b)____________"
echo "${blocks::13}${n1[i]}"
fi
n0[j]=' ' # don't re-use this record, in case of duplicate names
break
fi
done
# n1[i] not found in n0[*] -> either a new member, or changed their name
if ((j==${#n0[*]}))
then
blocks="${b1[i]}____________"
echo "${blocks::13}${n1[i]}"
fi
done
# check if any of last Sunday's members dropped out (either changed their name, or switched teams)
for ((i=0; i<${#n0_bak[*]}; i++))
do
for ((j=0; j<${#n1[*]}; j++))
do
if [ "${n0_bak[i]}" = "${n1[j]}" ]
then
n1[j]=' ' # don't re-use this record, in case of duplicate names
break
fi
done
# n0_bak[i] not found in n1[*] -> log negative credit
if ((j==${#n1[*]}))
then
blocks="-${b0[i]}___________"
echo "${blocks::13}${n0_bak[i]}"
fi
done
)
single_digit_credit_records=$(grep -c '[1-9]____________' <<< "${members_chart}")
members_chart=$(grep -v '[1-9]____________' <<< "${members_chart}" | sort -nr)
i=1
while read line
do
rank="${i}_______"
echo " ${rank::8}${line}"
((i++))
done <<< "${members_chart}"
((single_digit_credit_records == 1)) && echo ' and a single-digit credit record'
((single_digit_credit_records > 1)) && echo " and ${single_digit_credit_records} single-digit credit records"
# debug: print weekly production of all teams
if false
then
echo
i=1
while read line
do
printf " %d\t%s\n" $i "${line}"
((i++))
done <<< "${teams_chart}"
fi