Wednesday, July 23, 2008

Grovelling isn't so bad

Been working on a couple of little utility scripts that I think are ready for public viewing. I'm interested in any tips on writing better code, or other ways to do what I'm doing that are more efficient.

The first script clears out old log files, which we need so that when we grovel through for errors, only the files of interest get scanned.
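A minimal sketch of what that cleanup script could look like. The function name `remove_old_logs` and the age-by-mtime cutoff are my own assumptions, not the author's actual code:

```python
import os
import time

def remove_old_logs(log_dir, max_age_days=7):
    """Delete .log files older than max_age_days, so later error
    scans only have to grovel through recent files.

    Hypothetical sketch: assumes "old" means file mtime older than
    the cutoff, and that logs end in .log.
    """
    cutoff = time.time() - max_age_days * 24 * 60 * 60
    removed = []
    for name in os.listdir(log_dir):
        path = os.path.join(log_dir, name)
        if (name.endswith(".log")
                and os.path.isfile(path)
                and os.path.getmtime(path) < cutoff):
            os.remove(path)
            removed.append(name)
    return removed
```

Returning the list of removed names makes the script easy to log or dry-run.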

Once the old log files are cleared out, the second script scans through for TEST-UNEXPECTED-FAIL. It looks through each directory passed in on the command line and prints all the failure lines to a .errors file for that directory - so the darwin log errors end up in a darwin_timestamp.errors file. The script also keeps counters for TEST-PASS, TEST-KNOWN-FAIL, and TEST-PASS(EXPECTED RANDOM), then prints the total tests run along with these counters on the last line of the .errors file.
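The per-log scan could be sketched like this. The markers are the ones named in the post; the function name `grovel` and the exact summary-line format are my own guesses:

```python
# Check the more specific marker first, so a
# TEST-PASS(EXPECTED RANDOM) line isn't counted as a plain TEST-PASS.
COUNTED = ("TEST-PASS(EXPECTED RANDOM)", "TEST-KNOWN-FAIL", "TEST-PASS")

def grovel(log_path, errors_path):
    """Copy TEST-UNEXPECTED-FAIL lines from log_path into errors_path,
    then append the total tests run and the per-marker counts as the
    last line.  Hypothetical sketch of the scan described in the post.
    """
    counts = dict.fromkeys(COUNTED, 0)
    failures = []
    with open(log_path) as log:
        for line in log:
            if "TEST-UNEXPECTED-FAIL" in line:
                failures.append(line)
                continue
            for marker in COUNTED:
                if marker in line:
                    counts[marker] += 1
                    break
    total = len(failures) + sum(counts.values())
    with open(errors_path, "w") as out:
        out.writelines(failures)
        summary = " ".join("%s: %d" % (m, counts[m]) for m in COUNTED)
        out.write("Total tests run: %d  %s\n" % (total, summary))
    return counts
```

A driver would call this once per command-line directory, building the `darwin_timestamp.errors`-style name from the directory being scanned.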

Next steps:

  • Add gathering up all the .errors files into a tarball

  • Set up a weekly cron job that will run these scripts and email the tarball

  • Create a database and insert results

  • Web interface for aforementioned db that will allow for searching
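
The first of those next steps is small enough to sketch now. Assuming the .errors files all land in one directory, Python's tarfile module can do the gathering; `bundle_errors` is a hypothetical name:

```python
import glob
import os
import tarfile

def bundle_errors(directory, tarball_path):
    """Gather every .errors file in directory into one gzipped
    tarball, ready for a weekly cron job to email out.

    Sketch only: assumes a flat directory of *.errors files.
    """
    with tarfile.open(tarball_path, "w:gz") as tar:
        for path in sorted(glob.glob(os.path.join(directory, "*.errors"))):
            # arcname keeps the archive flat instead of embedding full paths.
            tar.add(path, arcname=os.path.basename(path))
    return tarball_path
```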

Even though these are pretty simple utility scripts, I'm excited because they will make my life a little easier and because it's the first Python I've written from scratch...oh, and it's not a school assignment :)
