Slow Chess Blitz by Jonathan Kreuzer

Mainpage (with download) | Using Slow Chess | Playing Style | Programming Details | Program History |

Playing Style: (Blitz WV)

Overview: When I started the Slow Chess 2 series, I concentrated on tactical ability, because at that time I didn't know much about chess strategy, but I was very interested in game-tree search. Each new version, however, accumulated new positional knowledge. The latest, Blitz WV, has a much slower NPS, but is balanced between positional and tactical ability. Playing strength in games has always been a secondary concern, but luckily what I work on seems to correlate well with strength. One of my hopes was for Slow Chess to be used by strong (or weak) players for analysis, so I tried to make it solve certain positions quickly and give reasonably accurate evaluations. I haven't done many comp vs. comp games; mostly when I did, I followed the games and attempted to correct anything that looked like dumb play even to me, a rather weak chess player.

The Attack: I wouldn't describe Slow Chess as an attacking player, since it was never my intention to create an attacker. A lot of people like aggressive chess engines, but there's no shortage of well-done aggressive engines out there. I wanted Slow to attack mainly when there was an obvious weakness in the enemy king position, but still to prefer forcing the game into a (probably) won endgame rather than continuing an attack. King safety values are quite small; I wanted them conservative but accurate. If I did chess programming professionally, I'd probably spend the time to make king safety more accurate, and thus feel confident enough to raise the values. I also tried to give Slow a strong ability to execute a tactical attack. I tuned for this partially by using tactical test suites, including one of my own made from Slow comp-comp games and positions I found on the internet. Slow is usually very fast at finding mates by attack (fastmates.epd). One thing I noticed is that Slow will often appear aggressive against weak computer opponents who don't know to keep their king safe, but more conservative against strong programs.
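To illustrate what a small, conservative king-safety term can look like, here is a minimal sketch of a pawn-shield penalty. This is a hypothetical example, not Slow Chess's actual evaluation; the penalty constant and the board model (just a set of pawn files) are assumptions for illustration. The deliberately low value reflects the "conservative but accurate" approach described above.

```python
# Hypothetical king-safety sketch -- NOT Slow Chess's actual evaluation.
# Penalty (in centipawns) per missing pawn in the three-file shield
# around the castled king; kept deliberately small, as described above.
MISSING_SHIELD_PAWN_PENALTY = 12

def king_safety_penalty(king_file, own_pawn_files):
    """Return a centipawn penalty for holes in the king's pawn shield.

    king_file: file (0-7) of the friendly king.
    own_pawn_files: set of files (0-7) that still contain a friendly pawn.
    """
    penalty = 0
    for f in (king_file - 1, king_file, king_file + 1):
        if 0 <= f <= 7 and f not in own_pawn_files:
            penalty += MISSING_SHIELD_PAWN_PENALTY
    return penalty

# King on g1 (file 6) with f-, g-, h-pawns intact: no penalty.
print(king_safety_penalty(6, {0, 1, 2, 5, 6, 7}))  # -> 0
# Same king after the g- and h-pawns are gone: a modest penalty.
print(king_safety_penalty(6, {0, 1, 2, 5}))        # -> 24
```

A real engine would weight this by attacker proximity and material left on the board, which is where the accuracy (and the confidence to raise the values) would come from.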

Complexity: As anyone who's looked at the 2.96 source has probably noted, I didn't try to make a simple, elegant, and strong program. When in doubt, I left in any extension or piece of chess knowledge of dubious value to strength, and that's still the philosophy behind the latest version. The reason is that I feel this makes for a more interesting personality. It also helps me feel that Slow is very distinct from any other chess engine out there (even if it isn't necessarily obvious), which is important to me. Slow Chess will find some moves quite fast (e.g. some mates, or specific positional endgame wins). My hope is to sometimes have someone think "wow, how'd Slow do that?" Every once in a while, though, it will be nearly blind to a particular move. A long time ago, when I made 2D adventure games, I'd always add little animations to the backgrounds that people wouldn't always see, but they were extra touches that were sometimes appreciated, and that's the same reason I add so much silly stuff to the Slow engine.

The Endgame: Endgame ability has also been hugely improved in recent versions. That's not to say it's great; I've come to the conclusion it's impossible not to capture straight into a clearly lost pawn endgame =) I like adding random tidbits of endgame knowledge; for example, Slow knows this is a draw (8/2K4k/4B3/8/1P5P/4b3/8/8 w - -). The bitbases in the latest versions were part of this, as they probably have little (or no?) effect on strength, but help in analyzing or playing a few positions, such as (8/1K6/2P1kp2/8/6r1/1R6/8/8 b - -) or (8/7p/p4kp1/P7/5PP1/4K3/3R3r/8 b - -). In the endgame, knowledge can solve many positions much faster than search, so I've worked more on endgame knowledge than on endgame search.
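The simplest form of hard-coded endgame knowledge is material-based draw detection. The sketch below is an illustration of the general idea, not Slow Chess's actual rules: it reads the piece-placement field of a FEN string and flags material where no mate is ever possible (bare kings, or a lone minor piece vs. a bare king). Note that the drawn bishop position quoted above would not be caught by this check; it needs deeper, position-specific knowledge of the kind the text describes.

```python
# Illustrative hard-coded draw knowledge -- not Slow Chess's actual rules.

def material_from_fen(fen):
    """Return a sorted string of all piece letters in a FEN, kings excluded."""
    board = fen.split()[0]  # first FEN field is the piece placement
    return "".join(sorted(c for c in board if c.isalpha() and c.lower() != "k"))

def is_trivial_material_draw(fen):
    """True if neither side can ever mate: K vs K, KB vs K, or KN vs K."""
    return material_from_fen(fen) in ("", "B", "b", "N", "n")

print(is_trivial_material_draw("8/8/4k3/8/8/3BK3/8/8 w - -"))       # True: KB vs K
print(is_trivial_material_draw("8/2K4k/4B3/8/1P5P/4b3/8/8 w - -"))  # False: a draw, but
                                                                    # needs deeper knowledge
```

An engine would score such positions as exactly 0.0 without searching, which is precisely why knowledge beats search in many endgames.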

Test Suite Results:

These results are for SlowChess Blitz WV2.1 running on an Athlon 64 3500. I should point out that no test suite is that meaningful in predicting gameplay results. Since WAC.epd is too easy for modern programs on modern hardware, I chose ecmgcp and arasan5. You can get both .epd position files from the Arasan Test Page.
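For readers unfamiliar with how such suites are scored: each line of an .epd file holds a position plus opcodes, most importantly "bm" (best move) and "id". A test runner feeds each position to the engine and counts it solved if the engine's move matches a "bm" move within the time limit. Below is a hedged sketch of just the parsing step, with a made-up record for demonstration; the engine interface itself is omitted, since Slow Chess's is not documented here.

```python
# Sketch of EPD record parsing for test suites like ecmgcp.epd or arasan5.epd.
# The record in the example is a made-up demonstration, not from either suite.

def parse_epd_line(line):
    """Split an EPD record into (position_fields, best_moves, record_id)."""
    fields = line.strip().split(None, 4)
    position = " ".join(fields[:4])   # piece placement, side, castling, en passant
    best_moves, record_id = [], None
    for op in fields[4].split(";") if len(fields) > 4 else []:
        op = op.strip()
        if op.startswith("bm "):
            best_moves = op[3:].split()
        elif op.startswith("id "):
            record_id = op[3:].strip('"')
    return position, best_moves, record_id

line = ('r1bqkb1r/pppp1ppp/2n2n2/4p3/2B1P3/5N2/PPPP1PPP/RNBQK2R w KQkq - '
        'bm Ng5; id "demo.001";')
pos, bm, rid = parse_epd_line(line)
print(bm, rid)  # -> ['Ng5'] demo.001
```

A full runner would loop over the file, give the engine the stated time per position, and tally how many "bm" matches arrive before each cutoff, producing exactly the kind of tables shown below.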

ecmgcp.epd test suite, 166 out of 183 solved in 10 seconds.

Time in seconds:              1    2    3    4    5    6    7    8    9   10
Total # solved before time: 109  134  140  147  151  155  160  162  164  166

arasan5.epd test suite, 125 out of 141 solved in 30 seconds

Time in seconds:             1   2   3   4   5   10   15   20   30
Total # solved before time: 38  58  72  79  87  100  111  121  125

Playing Strength (old): (description for version 2.93)
I'm a poor chess player myself; when I played against Slow Chess, I usually played bullet time-control (2 min + 1 sec/move) games against it at 2-ply, and lost more than I won. I tested an old, much weaker version on the ICC once, where it could take on IMs/FMs (chess masters) at fast time controls. The games were very fast with no increment, though, and were decided on time quite a bit. The few recent results I have against strong human players suggest that in blitz Slow is quite good, although once in a while it gets into a horrible position (blocked pawns, trapped pieces) and might lose. The strongest human opponents almost always went into complicated/open/interesting positions instead of playing anti-computer chess; I've seen Slow play a lot of really poor games in blocked positions. It's hard to give an accurate guess of playing strength against computers, so I'll just say that from test games at blitz time controls it's stronger than average for a free program.

I tested SlowChess 2.93 on the Win At Chess (wacnew.epd) test suite on an AMD Athlon 2700+ computer. This suite consists of 300 tactical problems that are usually easy for a computer. It can give an idea of a range of strength but isn't that meaningful. Here are the results:

Time in seconds:              1    2   11
Total # solved before time: 297  298  299

I also tried SlowChess 2.93 on the ecmgcp.epd test suite (on an AMD Athlon 2700+). This suite is much tougher, so it gives a better idea of the range of strength of a program. Again, though, I want to point out that no test suite is that meaningful in predicting gameplay results.

Time in seconds:             1   2    3    4    5    6    7    8    9   10
Total # solved before time: 74  98  111  121  129  137  141  146  150  150