Knuth's Challenge

Analyze everything your computer does in one second

Donald Knuth presented the following challenge in 1989:


My challenge problem is simply this: Make a thorough analysis of everything your computer does during one second of computation. The computer will execute several hundred thousand instructions during that second; I'd like you to study them all. The time when you conduct this experiment should be chosen randomly; for example, you might program the computer itself to use a random number generator to decide just what second should be captured and recorded.

...

I want to urge everyone who has the resources to make such a case study to do so, and to compare notes with each other afterward, because I am sure the results will be extremely interesting; they will tell us a lot about how we can improve our present use of computers.

...

Here are some of the questions I would like to ask about randomly captured seconds of computation:

...

Knuth, Donald E. "Theory and Practice." Theoretical Computer Science 90 (1991), 12-14.
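
The random selection Knuth describes is easy to script. Below is a minimal Python sketch, under my own assumptions: a one-day window, and a Linux-only `perf` invocation standing in for whatever capture tool you actually settle on.

    import random
    import subprocess
    import time

    # Pick a uniformly random second within the next 24 hours, letting the
    # machine itself decide which second gets captured, as Knuth suggests.
    WINDOW_SECONDS = 24 * 60 * 60      # assumption: a one-day window
    delay = random.randrange(WINDOW_SECONDS)

    print(f"capturing in {delay} seconds...")
    time.sleep(delay)

    # Placeholder capture step: a system-wide perf sample of exactly one
    # second (Linux only). Swap in a full instruction tracer here.
    subprocess.run(["perf", "record", "-a", "-o", "one-second.data",
                    "--", "sleep", "1"], check=True)

Note that `perf record` samples rather than logging every instruction, so this only sketches the scheduling half of the experiment; the capture half is the hard part.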


Has it been done?

I've searched the web a few times over the last couple of years to see whether anyone has responded to the challenge. So far, I've come up with basically nothing. If you know of any published results from this kind of study, please share!

There are definitely people who profile their applications, generate flame graphs, and step through code line by line in a debugger. But what about the entire system? Is there a utility I can use to log every instruction an operating system executes on bare metal during one second? Can it be done with an emulator like QEMU?
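
The emulator route looks the most tractable, though I haven't verified it end to end: QEMU's TCG plugin interface ships a contrib plugin, execlog, that logs each guest instruction as it executes. A Python sketch of driving it, with placeholder paths and a placeholder disk image:

    import subprocess

    # Sketch: boot a guest under QEMU with the contrib "execlog" TCG
    # plugin, which logs every guest instruction executed. Paths below
    # are placeholders; adjust for your QEMU build and guest image.
    QEMU = "qemu-system-x86_64"
    PLUGIN = "./contrib/plugins/libexeclog.so"   # built with the QEMU tree
    IMAGE = "guest.img"                          # hypothetical guest disk

    subprocess.run([
        QEMU,
        "-drive", f"file={IMAGE},format=raw",
        "-plugin", PLUGIN,     # load the instruction logger
        "-d", "plugin",        # route plugin output through QEMU's log
        "-D", "insn.log",      # ...into this file
    ], check=True)

Two caveats: execlog traces from boot and produces enormous output, so trimming to one second would take extra work, and a second of emulated time is not a second on real hardware.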

Is it still interesting?

Knuth's four original questions seem marginally relevant today. It might be shocking if an erroneous program were discovered, but it could certainly happen. We may find that the running programs are full of nontrivial theoretical results, or that none of them are. And far more than "several hundred thousand" instructions now occur within one second.
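
A back-of-envelope estimate of the scale, using illustrative figures rather than measurements:

    # Rough estimate of instructions retired in one second on a modern
    # desktop. All three figures are assumptions, not measurements.
    clock_hz = 3.0e9   # 3 GHz core clock
    ipc = 2.0          # sustained instructions per cycle (workload-dependent)
    cores = 8          # busy cores

    insns = clock_hz * ipc * cores
    print(f"{insns:.1e} instructions per second")   # 4.8e+10

That is roughly five orders of magnitude beyond Knuth's estimate.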

The computing landscape has changed a lot since 1989. Windows has grown up a bit. Linux was born. We now have a collection of open source operating systems. The World Wide Web is a thing. Computers are now everywhere.

Everything has changed!

In addition to Knuth's questions, I will propose some of my own. How has that one second of runtime changed over the years? Are compilers better now than they were 5 years ago? 10? If we captured one random second every year, would we find things improving or getting worse?

There is also great diversity in computing platforms. Can we take one random second of execution from every type of device, from a cloud server to a thermostat to a pacemaker? Which devices could benefit most from better application of known theory?

Renewing the challenge

Let's develop a repeatable way to sample one second of execution from a variety of machines.

Just knowing the machine instructions that happened during that time period isn't enough -- we'll need the source, all the way up the stack.

It's going to be a lot of instructions, and combing through them all by hand won't be practical. We'll need some tooling to help inspect what happened and tie it up into a coherent narrative.
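
As a first pass, I imagine something like the sketch below: a Python summarizer over a hypothetical trace format of one executed instruction per line, written as a hex program counter, a tab, and the disassembly. Any real tracer's output would need its own parser.

    from collections import Counter

    # Summarize a (hypothetical) instruction trace: one line per executed
    # instruction, formatted as "<hex pc>\t<mnemonic and operands>".
    def summarize(path: str, top: int = 10) -> None:
        pcs = Counter()        # hottest addresses
        mnemonics = Counter()  # instruction mix
        total = 0
        with open(path) as trace:
            for line in trace:
                pc, _, disasm = line.rstrip("\n").partition("\t")
                pcs[pc] += 1
                op = disasm.split()[0] if disasm.strip() else "?"
                mnemonics[op] += 1
                total += 1

        print(f"{total} instructions")
        print("hottest addresses:", pcs.most_common(top))
        print("instruction mix:  ", mnemonics.most_common(top))

    summarize("one-second.trace")   # placeholder filename

From counts like these we could start climbing the stack: map hot addresses to symbols, symbols to source lines, and source lines to the programs and libraries that were running.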

Do you want to try it?

I'd love to hear about a successful test. Or we could work on doing one together. If you have ideas, related experience, or just want to say hello, you can email me at:

gaxun@mail.gaxun.net

Comments sent with "Knuth's Challenge" in the subject line will be posted to this page, with personally identifying information removed.

Comments

This post has generated a number of comments; view them here:

Knuth's Challenge Comments

Think about it!

