[wplug] Re: Shell: 'find' trick

Brandon Kuczenski brandon at 301south.net
Mon Oct 17 19:59:22 EDT 2005


On Mon, 17 Oct 2005, Chester R. Hosey wrote:

> Christian Holtje wrote:
>> Brandon Kuczenski wrote:
>>
>>
>>> Can anyone think of a way to use 'find' to select only the newest file
>>> in a directory tree that starts with a given name?
>>
>> From the example, this sounds like it would work:
>>
>> # warning: this was written off the cuff....but any bugs should be harmless
>> for i in $(ls -1 | perl -p -e 's/^(.*-fleem)-\d\d\d\d-\d\d-\d\d$/$1/;' |
>>            sort | uniq); do
>>     ls -1 $i-* | sort | tail -n 1
>> done
>>
>> That's assuming that you only want to go by filename and ignore the
>> filesystem information (mtime and ctime).
>>
>> Ciao!
>
> Bzzt. This doesn't cover the directory tree, although if anyone gives
> bonus points for using Perl in a context where it really isn't needed, I
> suppose you'd get a few there.
>
> If you can sort the filenames in lexical order to come up with the
> newest/oldest, something as simple as the following will work:
>
> find . -name 'ocean-*' -printf '%f\n' | sort -n | head -1
>
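
That gets the oldest name; for the newest it's the same idea with the
direction flipped:

    find . -name 'ocean-*' -printf '%f\n' | sort -r | head -1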

The following seems to do what I want:

for target in $list_of_targets ; do
   # %f = basename (which carries the datestamp), %p = full path;
   # sorting on the basename puts the newest datestamp first
   find /backup/local -name "`hostname -s`-$target-*" -printf '%f %p\n' \
     | sort -r | head -n 1 | awk '{ print $NF }'
done

gives me a full path to the most recent(ly datestamped) file for each 
target.

FYI, this is for intermittently copying the latest locally-stored backups
to a backup server.  Some backups are created nightly, others weekly, and
some only intermittently, but this ensures that I always, and only, store
the most recent one remotely.
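
The copy itself is just that loop feeding the file to, say, scp; roughly
like this (the host "backupserver" and remote path are made up for the
example):

for target in $list_of_targets ; do
   newest=$(find /backup/local -name "`hostname -s`-$target-*" -printf '%f %p\n' \
            | sort -r | head -n 1 | awk '{ print $NF }')
   # skip targets that don't have a backup yet
   test -n "$newest" && scp "$newest" backupserver:/backup/remote/
done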

I know there are probably better ways to do this (with, say, rsync),
but... well, hey, this is the way it's being done at present.
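
For what it's worth, rsync would slot straight in for the scp above; its
--partial flag keeps an interrupted transfer around so the next run can
pick it up.  Same made-up host:

    rsync -av --partial "$newest" backupserver:/backup/remote/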

-Brandon


