Update the database


  • --updatedb
  • --config the configuration files to use
  • --path for an absolute path to update
  • --tailpath for scanning a specific subdirectory (the tailpath is not allowed to start with a /)
  • --rescan Ignores file modification date and re-inserts the file.
  • --noclean Do not remove file records from the database
  • --recreate Drops and recreates database tables
  • --dumpheader Shows NetCDF header
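A typical invocation combining several of these options might look like the following. The binary name and the config and subdirectory paths are illustrative assumptions, not taken from this page:

```shell
# Hypothetical example: re-scan one subdirectory, ignoring file modification
# dates, without removing existing records from the database.
adagucserver --updatedb \
  --config ~/adagucserver/data/config/adaguc.xml \
  --tailpath mydata/2024 \
  --rescan --noclean
```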

Get layers - scans a file and returns the possible layers for that file


  • --getlayers
  • --file
  • --inspiredatasetcsw
  • --datasetpath
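For example, listing the layers of a single NetCDF file could look like this (the binary name and file paths are assumptions for illustration):

```shell
# Hypothetical example: report the possible layers in one NetCDF file.
adagucserver --getlayers \
  --file /data/mydata.nc \
  --datasetpath /data
```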


Create tiles

  • adagucserver --createtiles --config ~/adagucserver/data/config/adaguc.pik.xml

Output report to a file


  • --report=filename writes the report to the file given by filename
  • --report writes the report to the file named in the environment variable ADAGUC_CHECKER_FILE; if ADAGUC_CHECKER_FILE is not set, the report is written to the default location ./checker_report.txt
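The fallback behaviour of the bare --report flag can be sketched in shell. This only mimics the path-resolution logic described above; it is not the checker itself:

```shell
# Resolve the report path as described: use ADAGUC_CHECKER_FILE when set,
# otherwise fall back to the default location.
reportfile="${ADAGUC_CHECKER_FILE:-./checker_report.txt}"
echo "report will be written to: $reportfile"
```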


Aggregates multiple files over the time dimension into one big file. This also works on files without unlimited dimensions.
Command-line arguments are <input dir> <output file>; optionally, a third argument can be a comma-separated list of variable names to add the time dimension to.


aggregate_time /data/swe_L3A_daily/ /tmp/ swe

Automatically delete files from database and filesystem


postgrescredentials="dbname=SEVIR_OPER_R___VOLE____L2 host=localhost" 
# Get the right tablenames from the database, based on the directory.
tablenames=$(psql -t "${postgrescredentials}" -c "select tablename from pathfiltertablelookup where path = 'P_${datadir}' and dimension='time';")

for tablename in $tablenames; do
  # Date threshold: files older than this will be removed.
  timeolder=$(date --date="${limitdaysold} days ago" +%Y-%m-%d)T00:00:00Z
  # Get the list of files more than $limitdaysold days old.
  filelist=$(psql -t "${postgrescredentials}" -c "select path from ${tablename} where time < '${timeolder}' order by time asc;")
  for file in $filelist; do
    echo "RM file $file"
    rm "$file"
    psql -t "${postgrescredentials}" -c "delete from ${tablename} where path = '${file}';"
  done
done

Check how many seconds a webservice is behind the current date

datasetdate=`curl -s "" | sed -E 's/xmlns:/aaa/g' | sed -E 's/xlink:/aaa/g' | sed -E 's/xsi:/aaa/g' | sed -E 's/xmlns/aaa/g' | xmllint --shell --xpath '(/WMS_Capabilities/Capability/Layer/Layer)[1]/Dimension/@default' - | awk -F\" '{ print $2 }' | xargs date +%s -d` && currentdate=`date +%s` && echo `expr $datasetdate - $currentdate`

Or continuously:

while true; do datasetdate=`curl -s "" | sed -E 's/xmlns:/aaa/g' | sed -E 's/xlink:/aaa/g' | sed -E 's/xsi:/aaa/g' | sed -E 's/xmlns/aaa/g' | xmllint --shell --xpath '(/WMS_Capabilities/Capability/Layer/Layer)[1]/Dimension/@default' - | awk -F\" '{ print $2 }' | xargs date +%s -d` && currentdate=`date +%s` && echo `expr $datasetdate - $currentdate`; sleep 2; done
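Stripped of the XML parsing, the core of the computation is plain epoch arithmetic. The dataset date below is a hard-coded example value, not one fetched from a live service; normally it would be the Dimension default extracted above:

```shell
# Compute how many seconds a dataset time lags behind the current time.
datasetdate="2000-01-01T00:00:00Z"          # example value only
datasetepoch=$(date -d "$datasetdate" +%s)  # GNU date: parse to epoch seconds
currentepoch=$(date +%s)
echo "$(( datasetepoch - currentepoch )) seconds"
```

As in the one-liner above, the result is negative when the dataset lags behind the current time.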