Wednesday, July 23, 2008

Grovelling isn't so bad

Been working on a couple of little utility scripts that I think are ready for public viewing. I'm interested in any tips on writing better code, or other ways to do what I'm doing that are more efficient.

The first one is cleanup.py, which we use to quickly get rid of old log files so that when we grovel through for errors, only the files of interest get scanned.
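The core of it is just walking a directory and tossing anything past a certain age. A minimal sketch of the idea - the .log extension and the seven-day threshold here are my placeholders, not necessarily what the real script uses:

import os
import sys
import time

# Sketch only: remove .log files older than MAX_AGE_DAYS from each
# directory given on the command line. The extension and threshold
# are placeholders.
MAX_AGE_DAYS = 7

def cleanup(directory):
    cutoff = time.time() - MAX_AGE_DAYS * 24 * 60 * 60
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if name.endswith('.log') and os.path.isfile(path):
            if os.path.getmtime(path) < cutoff:
                os.remove(path)

if __name__ == '__main__':
    for d in sys.argv[1:]:
        cleanup(d)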

Once you've got the old log files cleared out, you can use grovel.py to scan through for TEST-UNEXPECTED-FAIL. This script looks through each directory passed in on the command line and prints all the failure lines to a .errors file for that directory - so the darwin log errors end up in a darwin_timestamp.errors file. The script also keeps a counter of TEST-PASS, TEST-KNOWN-FAIL, and TEST-PASS(EXPECTED RANDOM), and then prints the total tests run as well as these counters on the last line of the .errors file.
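In skeleton form it's something like this - a hedged sketch, since the output naming and counter handling are assumptions based on the description above:

import os
import sys
import time

# Order matters: TEST-PASS(EXPECTED RANDOM) has to be checked before
# TEST-PASS, since the latter is a substring of the former.
COUNTED = ['TEST-PASS(EXPECTED RANDOM)', 'TEST-KNOWN-FAIL', 'TEST-PASS']

def grovel(directory):
    counts = dict((key, 0) for key in COUNTED)
    total = 0
    stamp = time.strftime('%Y%m%d%H%M%S')
    # e.g. darwin_20080723120000.errors - the naming is an assumption
    out = open('%s_%s.errors' % (os.path.basename(directory.rstrip('/')), stamp), 'w')
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if not os.path.isfile(path):
            continue
        for line in open(path):
            if 'TEST-UNEXPECTED-FAIL' in line:
                out.write(line)
                total += 1
                continue
            for key in COUNTED:
                if key in line:
                    counts[key] += 1
                    total += 1
                    break
    out.write('Total tests run: %d ' % total)
    for key in COUNTED:
        out.write('%s: %d ' % (key, counts[key]))
    out.write('\n')
    out.close()

if __name__ == '__main__':
    for d in sys.argv[1:]:
        grovel(d)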

Next steps:

  • Add gathering up all the .errors files into a tarball (rough sketch after this list)

  • Set up a weekly cron job that will run these scripts and email the tarball

  • Create a database and insert results

  • Web interface for aforementioned db that will allow for searching
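
For that first step, Python's tarfile module should make quick work of it. A rough sketch - errors.tar.gz is just a placeholder name:

import glob
import tarfile

# Bundle every .errors file in the current directory into one tarball.
# The output name is a placeholder.
tar = tarfile.open('errors.tar.gz', 'w:gz')
for path in glob.glob('*.errors'):
    tar.add(path)
tar.close()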



Even though these are pretty simple utility scripts, I'm excited because they will make my life a little easier and also because it's the first Python I've written from scratch...oh, and it's not a school assignment :)

Monday, July 21, 2008

Discussing Data

Some general thoughts on the discussion of data, inspired by Mitchell's blog post.

When I first started using the internet with some regularity, about 13 years ago, I was suspicious about entering any personal information whatsoever. This was before identity theft was a common occurrence, before I had any money to worry about losing - I don't think I even had a credit card yet. Some of the fears were based on run-of-the-mill rebellion against "The Man", but some of it was just a reaction to something new.

For many years, whenever prompted for personal information, I would look for a way around having to enter it. If I couldn't get under it or over it, I would make stuff up...or leave. Creating false accounts gets tiring, because then you have to remember all your lies. Firefox wasn't around yet to help me keep track of all my phony accounts. I sure do appreciate the password manager and extensions like BugMeNot.

Skipping forward to the present, I still look for a way out of having to enter any identifying data wherever possible. Something that continually annoys me is being required to choose between male and female on a form when I am making a purchase. This should NOT be required to buy a sticker, test beta software, or sign up for a social networking site. I'd like to see the end of generalized marketing based on gender; marketers can find new ways of triangulating, say, what cat owners are doing differently than dog owners.

Back to the data...

Even though I hate the thought of anyone assuming they know me because of a few hastily checked radio buttons, I also want the freedom to go about my business on the internet as easily as I do in real life - with my driver's license and a bank card. I have proof of who I am and I have money - what more do you want? I should now get to do whatever it is I'm looking to do with as few clicks as possible.

So if the future web browser allows me to safely keep all the important stuff handy, to know that I am who I say I am, and let me skip the 3 page sign up process, this is a Good Thing.

How can we get to that kind of level without talking about data and all the good/bad/lukewarm associations we have with it?

I tell people all the time that they should be using Firefox because it is the safest. People care about safety, and this is what they need to hear. If Firefox starts to work with data, I trust that we will do so in the best interest of the people who came to us for safety. I'm excited to talk about data and what we can do with it.

My hope is that data collection will become less of a top-down "Tell me this information or you can't access {fabulous service name here}" and instead will become the equivalent of the clerk at Best Buy asking you for your postal code and being able to say "No, I don't want to give that information to you, but I will still buy Rock Band from you".




Thursday, July 17, 2008

Set the VNC Password for Mac's Remote Desktop in Terminal

I was stuck trying to access one of our xserve machines that had just been moved from the QA network to the Build network. I could connect via ssh, and Justin could ping it, but attempting to connect with VNC wasn't working. It wouldn't accept the usual passwords. Justin seemed to think it was possible to change the VNC password through the command line, so I googled it and read a post from two years ago.

Something I've learned from reading "how-to" blogs is that you should always read the comments first. That's where the most up-to-date information will be, if there is any. The person who wrote the post used a strange template structure that made his idea hard to read and understand. Anyone who didn't read the comments wouldn't know that kickstart now takes plain text passwords.

The long and short? If you want to change the VNC password, do this:

sudo /System/Library/CoreServices/RemoteManagement/ARDAgent.app/Contents/Resources/kickstart -configure -clientopts -setvnclegacy -vnclegacy yes -setvncpw -vncpw [newpassword]


Apparently you can enable VNC access and set the VNC password via the kickstart command. It isn't terribly well documented, but since it now accepts plain text passwords, I think that's a step in the right direction.
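For the record, kickstart can apparently turn on the service itself as well. Something like the following should enable Remote Management and restart the agent, though it's worth checking kickstart -help on your OS version before trusting me:

sudo /System/Library/CoreServices/RemoteManagement/ARDAgent.app/Contents/Resources/kickstart -activate -configure -access -on -restart -agent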

Wednesday, July 2, 2008

Chasing rainbows is easier

I was so thrilled to discover Splunk that I installed it on one of the buildbot masters - qm-rhel02 - without realizing that Splunk quickly eats up disk space and hogs memory. Yesterday afternoon some Talos boxes started to go down because of this, and once I stopped the Splunk server everything started to right itself.

Lessons learned:

  • Do not play with the buildbot master.

  • Do not look directly at the buildbot master.

  • Do not taunt the buildbot master.


So today's tasks include getting access to the samba share that was set up, creating a cron job that will rsync the buildbot master logs to said share and then finding a safe place to set up Splunk again.
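The cron job part will probably boil down to a single crontab line, something like this - the paths and schedule here are placeholders, not the real setup:

# Sundays at 3am; both paths are placeholders
0 3 * * 0 rsync -az /builds/buildbot/master/logs/ /mnt/samba-share/buildbot-logs/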

We really need a way to look at data from the buildbot master over a long period of time - otherwise filing bugs on these intermittent failures is just a shot in the dark. Take yesterday, for example. qm-win2k3-pgo01 was being "unreliable" and had the same errors in reftest for two consecutive builds. I filed a bug, and the response was "grab me a copy of the offending objdir so we can poke at it". Wouldn't you know, the very next build did not have the same error output - this time it had mochitest issues that were seemingly unrelated. This morning I checked again: since then it's had a compile failure, an exception (the most hideous purple), and then a completely green run.

Intermittent failures == needle in a haystack