Thursday, November 22, 2007

Blind Optimism?

This morning, when I got to this computer, my screen saver had crashed. As this particular device is a rather beaten-up (although originally well-specced) dot.compost laptop, running a Windows OS, this is hardly a surprise.

However, I am running BOINC, and the application that had crashed was the UK Met Office climate prediction model. This needs nearly a month of dedicated processor time and is hideously sensitive to errors - both mathematical and operational. It is the latter that seems to be the problem for my contribution (the former may render the whole thing pointless). Both file-system glitches and other errors - this morning's was an illegal memory access - regularly ruin runs of these extensive calculation sets. For comparison, SETI@home takes under 6 hours per work package and, on my (albeit much newer and considerably faster) Mac, Einstein@Home packages take about 10 hours.

Here is the dilemma - splitting the calculation sets / work packages into smaller chunks will clearly result in some loss of efficiency. However, given the (linearly increasing with duration?) chance of failure and the associated loss of the already committed calculation time, there must be a maximally efficient package duration. Especially as you are having these things run on machines where you, the organisers, have minimal control over patching and other operating parameters, I would expect this to be much less than 633 hours ...
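For the curious, here is a back-of-the-envelope sketch of that trade-off in Python. The numbers are entirely mine - an assumed crash rate of one per 200 elapsed hours, half an hour of fixed overhead per package, and the pessimistic assumption that a failed run loses all its progress - not anything published by the project:

# Sketch only: find the package duration that maximises useful work
# per wall-clock hour, assuming crashes arrive at a constant rate per
# elapsed hour and a failed run loses everything.
import math

FAILURE_RATE = 1 / 200     # assumed: one fatal glitch per ~200 elapsed hours
OVERHEAD_HOURS = 0.5       # assumed: fixed download/setup/upload cost per attempt

def efficiency(package_hours: float) -> float:
    """Useful hours delivered per wall-clock hour committed."""
    p_complete = math.exp(-FAILURE_RATE * package_hours)
    # Mean run time of a failed attempt (exponential truncated at the package length).
    mean_failed = (1 / FAILURE_RATE
                   - package_hours * p_complete / (1 - p_complete))
    expected_failures = (1 - p_complete) / p_complete
    expected_cost = (expected_failures * (OVERHEAD_HOURS + mean_failed)
                     + OVERHEAD_HOURS + package_hours)
    return package_hours / expected_cost

best = max(range(1, 1001), key=efficiency)
print(f"best package length ~{best} h, efficiency {efficiency(best):.1%}")
print(f"633 h package, efficiency {efficiency(633):.1%}")

With my made-up figures the sweet spot comes out at a day or so of processing, and the 633-hour package wastes most of the time committed to it - change the assumed crash rate and the optimum moves, but the shape of the curve does not.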

Techie note: neither the number of cores per processor nor the number of processors is particularly relevant here (except in so far as they carry the normal workload of the machine) - it is the actual elapsed duration of the specific work unit that matters, not the package throughput.
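To put rough numbers on that, using the same assumed crash rate as above (one per 200 elapsed hours - my guess, not a measurement): a machine with four cores quadruples throughput, but the survival odds of any single work unit still depend only on how long it has to stay alive.

# Sketch: survival chance of one work unit versus its elapsed duration.
import math

FAILURE_RATE = 1 / 200   # assumed crashes per elapsed hour, as above

for elapsed_hours in (6, 10, 633):   # SETI@home, Einstein@Home, climate model
    survival = math.exp(-FAILURE_RATE * elapsed_hours)
    print(f"{elapsed_hours:>4} h package: {survival:.1%} chance of finishing cleanly")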


