Is there a way to use check_files to check whether a file's age is correct when a directory contains several files?

For example: the directory "/backups/somefiles" holds 7 files, written daily at 22:50; after 7 days each file is overwritten.

If I use check_files with an age filter like this:

check_files -a path="/backups/somefiles" pattern="*.gz" filter="age > 23h" crit="count > 0" empty-state=0 empty-syntax="no old files found"

I get:

critical - 0/6 files

which is correct but not useful. It would be better to have a parameter like "check-only-latest-file".
The directory looks like this:
-rw------- 1 root root 29809 Apr 12 22:50 0.sql.tar.gz
-rw------- 1 root root 29809 Apr 13 22:50 1.sql.tar.gz
-rw------- 1 root root 29809 Apr 14 22:50 2.sql.tar.gz
-rw------- 1 root root 29807 Apr 15 22:50 3.sql.tar.gz
-rw------- 1 root root 29809 Apr 16 22:50 4.sql.tar.gz
-rw------- 1 root root 29809 Apr 10 22:50 5.sql.tar.gz
-rw------- 1 root root 29809 Apr 11 22:50 6.sql.tar.gz
So only "4.sql.tar.gz" (the newest file) is of interest here.
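Until such an option exists, one workaround is a small external-script check that looks only at the newest matching file. This is a sketch, not part of check_files itself; it assumes GNU find, stat, and date, and the function name check_newest_gz and its arguments are illustrative:

```shell
#!/bin/sh
# check_newest_gz DIR MAX_AGE_HOURS
# Emits a Nagios-style status line and returns 0 (OK), 2 (CRITICAL), or 3 (UNKNOWN),
# judging only the newest *.gz file in DIR.
check_newest_gz() {
    dir=$1
    max_hours=$2

    # Newest *.gz by modification time: print "epoch path", sort numerically,
    # take the last line, strip the timestamp.
    newest=$(find "$dir" -maxdepth 1 -name '*.gz' -printf '%T@ %p\n' \
               | sort -n | tail -n 1 | cut -d' ' -f2-)

    if [ -z "$newest" ]; then
        echo "UNKNOWN - no *.gz files found in $dir"
        return 3
    fi

    age=$(( $(date +%s) - $(stat -c %Y "$newest") ))
    if [ "$age" -gt $(( max_hours * 3600 )) ]; then
        echo "CRITICAL - $newest is $(( age / 3600 ))h old (max ${max_hours}h)"
        return 2
    fi

    echo "OK - $newest is $(( age / 3600 ))h old"
    return 0
}
```

Called as `check_newest_gz /backups/somefiles 23`, it ignores the six older backups and only alerts when the most recent one is older than 23 hours.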
Thanks for any help!
Joe