Okay, a bit more information on how I came up with these opening bids. I rate the openings by how much “information” a bid gives compared to how much bidding space it takes. I used the entropy concept from information theory (defined below), applied to the distribution of best contracts. The basic story is that a bid is a “good” bid if it concentrates its distribution of best contracts. So a bid defined as having a lot of spades will tend to have a high percentage of 2S/3S/4S best contracts… and so that is a good bid, provided of course that it’s not so specific that it leaves a lot of junk for the other bids to handle.
An example. Start with a large sample of double-dummy hands. Before any bidding begins, when you hold a random hand, your distribution of best contracts looks like this:
P 2.2%
1C 0.8%
1D 1.2%
1H 1.7%
1S 2.2%
1N 1.5%
2C 2.7%
2D 3.3%
2H 4.2%
2S 5.8%
2N 3.0%
3C 4.6%
3D 5.3%
3H 6.2%
3S 7.3%
3N 3.6%
4C 3.2%
4D 3.9%
4H 5.7%
4S 6.5%
4N 3.3%
5C 1.9%
5D 2.3%
5H 3.2%
5S 3.7%
5N 2.0%
6C 0.9%
6D 1.1%
6H 1.5%
6S 1.5%
6N 1.7%
7C 0.2%
7D 0.3%
7H 0.3%
7S 0.5%
7N 0.7%
In information theory, entropy measures how much uncertainty there is in a distribution. Before any bidding starts there is quite a bit of uncertainty as to the best contract. To get the entropy, you multiply the chance of each contract by the logarithm of that chance, add them all up, and negate the sum. If you use base 2 for the log, then the number you get will be in units of bits. So you take -(2.2% * log(2.2%) + 0.8% * log(0.8%) + …) and so on. I get 4.83 bits as my starting entropy.
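As a sanity check, the entropy calculation can be reproduced directly from the table above. This is just the standard Shannon formula applied to the printed percentages (which are rounded, so the result only matches to rounding):

```python
import math

# Best-contract distribution before any bidding (percentages from the table above)
dist = {
    "P": 2.2, "1C": 0.8, "1D": 1.2, "1H": 1.7, "1S": 2.2, "1N": 1.5,
    "2C": 2.7, "2D": 3.3, "2H": 4.2, "2S": 5.8, "2N": 3.0,
    "3C": 4.6, "3D": 5.3, "3H": 6.2, "3S": 7.3, "3N": 3.6,
    "4C": 3.2, "4D": 3.9, "4H": 5.7, "4S": 6.5, "4N": 3.3,
    "5C": 1.9, "5D": 2.3, "5H": 3.2, "5S": 3.7, "5N": 2.0,
    "6C": 0.9, "6D": 1.1, "6H": 1.5, "6S": 1.5, "6N": 1.7,
    "7C": 0.2, "7D": 0.3, "7H": 0.3, "7S": 0.5, "7N": 0.7,
}

def entropy_bits(percentages):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero entries."""
    ps = [v / 100.0 for v in percentages if v > 0]
    return -sum(p * math.log2(p) for p in ps)

print(entropy_bits(dist.values()))  # close to the 4.83 quoted above
```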
Then you use whatever means you want to decide how each hand bids (it doesn’t have to be a NN). Measure the entropy of each bid definition and weight them according to how often that bid is actually made. For example, my network’s 1S opener now has the following distribution of best contracts:
P 0.1%
1C 0.1%
1D 0.1%
1H 0.9%
1S 0.3%
1N 0.1%
2C 1.0%
2D 1.0%
2H 5.3%
2S 1.6%
2N 0.7%
3C 2.6%
3D 2.2%
3H 10.1%
3S 2.0%
3N 2.1%
4C 3.9%
4D 2.0%
4H 18.7%
4S 2.5%
4N 2.9%
5C 0.3%
5D 3.4%
5H 15.0%
5S 0.9%
5N 1.5%
6C 0.9%
6D 1.7%
6H 9.1%
6S 0.8%
6N 2.8%
7C 0.7%
7D 0.7%
7H 1.4%
7S 0.1%
7N 0.2%
And its entropy is 4.13. Concentrating the best contracts into as few contracts as possible is basically what we’re trying to accomplish here. The concentration into heart contracts has reduced the entropy.
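The weighting step described earlier (measure each bid definition’s entropy, then weight by how often the bid is made) is just an expectation over openings. A sketch, where all the frequencies and entropies except the 1S value of 4.13 are made-up placeholders, not the author’s measured numbers:

```python
# Hypothetical opening frequencies and per-bid entropies, only to show the
# weighting; the real numbers come from running the bidder over the sample.
openings = {
    # opening: (fraction of hands opening this, entropy of that bid's
    #           best-contract distribution, in bits)
    "P":  (0.45, 4.60),
    "1C": (0.15, 4.20),
    "1D": (0.12, 4.25),
    "1H": (0.10, 4.18),
    "1S": (0.10, 4.13),  # the 1S entropy measured above
    "1N": (0.08, 3.90),
}

# Expected remaining uncertainty after the opening bid
weighted_entropy = sum(freq * h for freq, h in openings.values())
print(weighted_entropy)
```

A lower weighted entropy means the set of openings, taken together, leaves less uncertainty about the best contract than the 4.83 bits you start with.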
But unfortunately this is only half the story. The entropy tells us how many bits of information we need to land in the right contract, but we usually will not have enough time/space to fully exchange the needed info. We need a measure of how many bits of information are available to us during bidding. We’re only defining opening bids here, so how will we know how good the opening bids are without defining all of the follow-ups, rebids, etc.? Most of this concept comes from Matt Ginsberg; he and I described it in several RGB posts some years ago. I also used a similar concept in the bidding system design contest I held, but I’ve modified it slightly. It is a way to quantitatively measure the information passed by your openings without having to define what your follow-ups are.
If your best contract is 1H, how much info can you exchange to get there? Well if you start with an opening pass, there are exactly 4 bidding sequences:
P-1H
P-1C; 1H
P-1D; 1H
P-1C; 1D-1H
4 sequences is 2 bits of information (2^2 = 4). To get to 1S after an opening pass there are 8 sequences, or 3 bits. Basically, each level gives you 1 more bit. But our bidding is never 100% efficient, for a couple of reasons. First, we can’t define rules that use the space with full effectiveness, because we have to work with features like suit length, points, honor placement, etc., and we only know our own hand, not the other three players’. Second, our opponents will sometimes bid, taking away our bidding space. So you have to set your efficiency to some low number (I actually used 10%). That means to stop in 1S, I don’t have 3 bits of info available, I only have 0.3.
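The sequence counting above has a simple closed form: every bid strictly between the opening and the final contract can either be made or skipped, so with k intermediate steps there are 2^k uncontested sequences, i.e. k bits before the efficiency discount. A sketch:

```python
import math

# Bid ladder as used in the text: an opening pass, then 1C up to 7N
LADDER = ["P"] + [f"{level}{strain}" for level in range(1, 8)
                  for strain in ("C", "D", "H", "S", "N")]

def sequences_to(opening, contract):
    """Count non-competitive ascending sequences from `opening` to `contract`.
    Each bid strictly between them may be made or skipped: 2^(steps - 1)."""
    diff = LADDER.index(contract) - LADDER.index(opening)
    if diff < 0:
        return 0   # contract is below the opening: unreachable
    if diff == 0:
        return 1   # the opening already is the contract
    return 2 ** (diff - 1)

def bits_available(opening, contract, efficiency=0.10):
    """Usable information, after discounting by the efficiency factor."""
    seqs = sequences_to(opening, contract)
    return efficiency * math.log2(seqs) if seqs > 0 else 0.0

print(sequences_to("P", "1H"))   # the 4 sequences listed above
print(bits_available("P", "1S"))  # 0.3 bits at 10% efficiency
```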
Each best contract will have a certain number of bits needed to find it (the entropy) and a certain number of bits available. Usually we have fewer bits available than we need, so we’ll miss the contract some of the time. Plus our opening might have overbid: if we open 1S and the best contract is 1H, we’ll never reach it. So to sum it all up: I’m looking for the set of opening bids that, when I compare the amount of info each bid gives with the amount of info I need (and have available) on follow-ups, maximizes the chance that I’ll find the best contract.
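One plausible way to turn that comparison into a number is to say each missing bit of information halves the chance of landing in the best contract, with overbid openings scored as hopeless. To be clear, this exact shortfall-to-probability mapping and the toy numbers are my guesses for illustration, not the author’s actual objective:

```python
def find_probability(bits_needed, bits_avail, reachable=True):
    """Crude model: each bit of shortfall halves the chance of reaching
    the best contract; an overbid opening can never recover."""
    if not reachable:
        return 0.0
    shortfall = max(0.0, bits_needed - bits_avail)
    return 2.0 ** (-shortfall)

def expected_success(openings):
    """openings maps each opening to (frequency, bits needed, bits available,
    fraction of its hands whose best contract is still reachable)."""
    return sum(freq * reach * find_probability(need, avail)
               for freq, need, avail, reach in openings.values())

# Toy inputs; in practice every number would come from the double-dummy sample
toy = {"P": (0.6, 4.60, 0.45, 0.95),
       "1S": (0.4, 4.13, 0.30, 0.90)}
score = expected_success(toy)
```

Maximizing a score like this over candidate opening-bid definitions is the optimization the paragraph above describes.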
To make it feasible to get a solution, I partially eliminated competitive bidding. The model does expect that it will occasionally get interference (it has low efficiency and so might not have enough bits of information in future rounds of bidding), but it does not go out of its way to make it hard for the opponents to bid. Hopefully your preempts will take care of most of this problem. I purposely used a low efficiency for follow-up bids to counter the fact that these definitions aren’t trying to interfere.
In the end, it’s not perfect, but no measuring method ever will be. But it is a way to quantitatively define how good your openings are. I haven’t seen a better method to rate bids, but I’d love to hear about one.
This was a fun experiment, and take it for what you will. It was also fun to watch the preferences evolve over time. The initial conditions are totally random bids. Then it realizes that if all the opening bids are pretty much random, then it might as well start with a pass so that it has the most bidding space with its follow-ups. So it starts to evolve towards passing almost all the time. But then it realizes that it’s not getting any differentiation with always passing… so it begins exploring other bids, starting with 1C, then 1D, etc.
Tysen