Would you join a group effort to write a new simulation? Appeal for a bridge program that is not a GIB clone.
#1
Posted 2012-July-22, 20:17
A previous topic revealed that a number of BBOers are involved in writing Bridge simulations. Now I believe that completing a really worthwhile simulation in a reasonable timespan is beyond an individual programmer's capacity. Consequently I am calling for volunteers to pool their efforts into producing the ultimate definitive bridge program.
Let me spell out briefly what I have in mind. Forgive the obvious tautologies, and forgive me if you find my style insultingly simple: I am concentrating entirely on setting out my points as clearly as possible.
First I do not see much point in producing another random simulation program even if we could improve on the existing crop. We all know the problems and limitations of GIB and any significant improvement would only bring it closer to double-dummy cheating.
Thus I feel that the only worthwhile approach is rule-based, for both bidding and play. The trouble with a rule-based simulation is that until the knowledge base is complete, the standard of bidding and play is woefully low, even lower than for a random simulation written over a comparable period.
Because of this I suggest we divide the simulation into two parts:
First we write a shell program covering user interaction, card handling, and graphics, and also setting up the shell of a rule-based expert program, with both bidding systems and play based on open-ended knowledge bases which could be entered, corrected, and expanded by the end user.
At this stage anyone who wishes could drop out of the joint effort, without letting anyone down, and continue development on his/her own: perhaps a limited simulation such as a bidding system comparison, or a play aid.
I would hope however that enough participants would continue so that several bidding systems could be entered simultaneously, and the play could be completed in good time.
So there it is. I think the project would be worthwhile and might well shape the future of computer programming.
It only remains to pose the question: Would you be prepared to participate in such a project?
#2
Posted 2012-July-22, 21:08
It's not really the direction in which my bridge programming interests lie, personally.
#3
Posted 2012-July-22, 23:01
Quote
Bridge programs used to rely on rules, then GIB came along, and they all changed. This seems like a pretty big hint that the winning direction is not what you're proposing.
#4
Posted 2012-July-22, 23:13
Not sure how a bridge program does not rely on rules in a broad sense of the word.
I have long argued that judgment and experience are just a better set of rules.
Not perfect, just better.
#5
Posted 2012-July-22, 23:53
Antrax, I take it you are querying my reference to "double-dummy cheating". By this I mean peeking at the unseen hands. As random simulation improves, it moves closer to double-dummy play. I have no objection to a program cheating to improve performance and give me an acceptable game, but it is not exactly simulating expert play. Or at least I hope not.
I freely admit that random simulations produce results at a certain level more quickly than rule-based systems (see my opening post), but I think further improvement is limited and a rule-based system will ultimately outperform random simulation.
Mike777, I would hate to get involved in semantics or philosophy, but is it not true to say that random simulation is the antithesis of pragmatism (i.e. a rule-based system)?
#6
Posted 2012-July-22, 23:55
I wish you would.
In any event I did not say that.
My only point is that rule-based systems have a lot going for them. Not perfect, but pretty good... just improve the rules over time.
------------------
Is your point that a random system wins?
#7
Posted 2012-July-23, 01:26
#8
Posted 2012-July-23, 02:11
fuburules3, on 2012-July-23, 01:26, said:
Put like that, I probably am being presumptuous. My point is that if you build on the shoulders of giants you can reach great heights.
Would you not concede that a program with open code which can be continually improved by new participants could have unlimited potential? That is my particular new insight.
#9
Posted 2012-July-23, 04:12
I'm half wondering if my design for a bidding engine would be able to win the computer world championships eventually, and if it does I would rather take the money. But the rest of it I'd be more than happy to help build, for sure.
ahydra
#10
Posted 2012-July-23, 04:34
Scarabin, on 2012-July-22, 23:53, said:
Seriously, it seems to me (and fuburules3 put it well) that if domain experts decided that rule-based approaches are inferior to simulation, it's probably good to heed their advice, if your goal is to create a competitive program.
#11
Posted 2012-July-23, 05:57
Scarabin, on 2012-July-22, 20:17, said:
[...] First I do not see much point in producing another random simulation program even if we could improve on the existing crop. We all know the problems and limitations of GIB and any significant improvement would only bring it closer to double-dummy cheating. [...]
Why mention GIB? There is a much better program on the market: Jack.
GIB is pathetic compared to Jack.
#12
Posted 2012-July-23, 09:46
mike777, on 2012-July-22, 23:13, said:
Not sure how a bridge program does not rely on rules in a broad sense of the word.
I have long argued that judgment and experience are just a better set of rules.
Not perfect, just better.
GIB uses a combination of rules and simulations. During bidding it uses the Meadowlark bidding rules to decide on bids and determine what the other players' bids show, but many decisions can be overridden by simulations. During play it relies almost entirely on simulations (exceptions are made for things like standard honor leads).
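To picture that split - a rule table proposing candidate calls, with a simulation allowed to override the choice - here is a rough Python sketch. It is not GIB's code; every rule, threshold, and function in it is invented for illustration, and the simulation step is only a placeholder:

import random

# A "rule" is just a condition on (hcp, suit lengths) plus the call it suggests.
OPENING_RULES = [
    (lambda hcp, ln: 15 <= hcp <= 17 and max(ln.values()) <= 5, "1NT"),
    (lambda hcp, ln: hcp >= 12 and ln["S"] >= 5,                "1S"),
    (lambda hcp, ln: hcp >= 12 and ln["H"] >= 5,                "1H"),
    (lambda hcp, ln: hcp >= 12,                                 "1D"),
    (lambda hcp, ln: True,                                      "Pass"),
]

def rule_candidates(hcp, lengths):
    """All calls whose conditions are met, in the rules' priority order."""
    return [call for cond, call in OPENING_RULES if cond(hcp, lengths)]

def simulated_score(call, hcp, lengths):
    # Placeholder for the real thing: deal hands consistent with the auction
    # so far and score each candidate (e.g. with a double-dummy solver).
    # Here it just returns a number so the sketch runs end to end.
    return random.random()

def choose_call(hcp, lengths, use_simulation=True):
    candidates = rule_candidates(hcp, lengths)
    if not use_simulation or len(candidates) == 1:
        return candidates[0]                # pure rule-based decision
    # Otherwise the simulation may override the rules' first choice.
    return max(candidates, key=lambda c: simulated_score(c, hcp, lengths))

print(choose_call(16, {"S": 3, "H": 4, "D": 3, "C": 3}, use_simulation=False))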
#13
Posted 2012-July-23, 13:58
Can I suggest starting with bidding only? It is a smaller problem, it is GIB's weak area, and it is well suited to a primarily rule-based approach with some simulation. Also, development of bidding rules is probably the area where the combined wisdom of the forums can add the most value.
#14
Posted 2012-July-23, 17:15
To Ahydra: Thanks for your support. We will have to wait to see if my proposal gains enough support to get off the ground. Good luck with the bidding engine.
To Antrax: I appreciate your humour but I would really appreciate your examining my actual proposal.
To Advanced: I chose GIB as being familiar to most BBOers and because Barmar gives us insights into its actual methods. I agree Jack is very professional but so is Wbridge5 and both have serious limitations and, I find, infuriating defects.
To Barmar: Thanks for clearing up the misunderstanding. An ounce of fact outweighs tons of speculation, if you will forgive my mixed metaphor.
To Nigel_k: Glad to have your support but at the risk of losing it I have to confess that what you propose is already, at least partially, available. See my review of Oxford Bridge (State of the Art 5 in Bridge Material Review).
May I ask you all to bear with me if I restate my proposal:
I am asking for volunteers to build a shell program dealing with the basics of hand display, hand entry, hand analysis, etc, and also set up an expert system to deal with three knowledge bases.
The knowledge bases would comprise:
hand evaluation,
bidding systems,
play system - plans and strategies, tactics and methods.
The knowledge bases would be entered in normal language, with a limited vocabulary, and would benefit from being a group effort, although individual bidding systems could be contributed separately.
As regards the play system, a complete rule-based system would take years to write, and in the interim one could use a double-dummy program.
The only use I would propose for random simulation is the provision of judgment in placing the final contract in cases where the rules do not cover this.
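To give a feel for what a plain-language knowledge base with a limited vocabulary might look like, here is a very rough Python sketch. It is purely illustrative - the rule syntax, the vocabulary, and every name in it are invented for the example - but it shows how user-entered rules could be parsed and matched at run time:

import re

# Each user-entered line: a point range, an optional length condition, and
# the suggested call.  The whole "system" can be corrected or extended by
# editing the text, without touching the program.
USER_RULES = """
with 15-17 points and no 5+ major bid 1NT
with 12-21 points and 5+ spades bid 1S
with 12-21 points and 5+ hearts bid 1H
with 0-11 points bid Pass
"""

RULE_PATTERN = re.compile(
    r"with (\d+)-(\d+) points(?: and (no )?(\d)\+ (\w+))? bid (\w+)")

def parse_rules(text):
    rules = []
    for line in text.strip().splitlines():
        m = RULE_PATTERN.match(line.strip())
        if not m:
            raise ValueError("cannot understand rule: " + line)
        lo, hi, negate, length, suit, call = m.groups()
        rules.append({"lo": int(lo), "hi": int(hi), "negate": bool(negate),
                      "length": int(length) if length else None,
                      "suit": suit, "call": call})
    return rules

def rule_matches(rule, hcp, lengths):
    if not (rule["lo"] <= hcp <= rule["hi"]):
        return False
    if rule["length"] is None:
        return True
    if rule["suit"] == "major":
        has_length = max(lengths["S"], lengths["H"]) >= rule["length"]
    else:                                   # "spades", "hearts", ...
        has_length = lengths[rule["suit"][0].upper()] >= rule["length"]
    return not has_length if rule["negate"] else has_length

def suggest_call(rules, hcp, lengths):
    for rule in rules:                      # first matching rule wins
        if rule_matches(rule, hcp, lengths):
            return rule["call"]
    return "Pass"                           # default when nothing matches

rules = parse_rules(USER_RULES)
print(suggest_call(rules, 16, {"S": 3, "H": 3, "D": 4, "C": 3}))   # prints 1NT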
That's it. I think it's possible and worthy of support. Let's debate this as fully as possible, including possible variations, but not set up paper tigers based on what I could have said but did not. Thanks again.
#15
Posted 2012-July-24, 13:11
nigel_k, on 2012-July-23, 13:58, said:
Can I suggest starting with bidding only? It is a smaller problem, it is GIB's weak area, and it is well suited to a primarily rule-based approach with some simulation. Also, development of bidding rules is probably the area where the combined wisdom of the forums can add the most value.
I also think this is the best approach. I've had very few problems with the rule-based approach in the early stages of the auction. Even on invitational auctions, rules with KNR point evaluation can be effective. Competitive auctions probably need either simulations or another valuation method that adjusts the value of a hand based on the opposition bidding.
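As a toy illustration of that last point (not KNR, which is far more involved): a basic 4-3-2-1 count plus one crude example of an adjustment driven by the opposition's bidding, with all the specifics invented for the example:

HCP = {"A": 4, "K": 3, "Q": 2, "J": 1}

def evaluate(hand, opponents_suit=None):
    """hand: dict suit -> string of ranks, e.g. {"S": "AQ52", "H": "KJ7", ...}"""
    points = sum(HCP.get(r, 0) for ranks in hand.values() for r in ranks)
    if opponents_suit:
        ranks = hand.get(opponents_suit, "")
        # Example adjustment only: queens and jacks in the opponents' suit
        # are counted as worth less when we end up declaring elsewhere.
        points -= sum(1 for r in ranks if r in "QJ")
    return points

hand = {"S": "AQ52", "H": "KJ7", "D": "Q84", "C": "963"}
print(evaluate(hand))                      # 12
print(evaluate(hand, opponents_suit="D"))  # 11 after devaluing the diamond queen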
#16
Posted 2012-July-24, 16:20
For me the point of writing a bridge playing (or analyzing) program is to discover something new, or something human players get wrong. If it's based on an "expert system" and a "huge database of knowledge" then it's limited to the opinions humans already hold about the game.
So, for example, if there is a rule in constructive bidding, "with 12-14 hcp, if it goes 1D - 1S we raise to 2S with 4 spades", then it's instantly not interesting because there is nothing new we can learn from it. On the other hand, if simulations or a similar process show the bid to be effective, then that's something - even if it's only confirmation in this somewhat obvious case.
I want my program to answer questions like: "Is it better to bash 3NT here or to go via Stayman and try to discover a 4-4 major fit?", "If I bid 3NT here, how often will I make it in real play?", or "What is the difference in EV between preempting to 3S and 4S here?"
Once it's rule-based it will just tell me what the common wisdom is, which - especially since you are proposing that many people contribute to the knowledge base - is about useless in my mind.
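The sort of experiment being asked for might be structured roughly like this in Python. It is only a framework sketch: the hand and the constraint on partner are arbitrary examples, and double_dummy_tricks is a stub standing in for a real double-dummy solver (such as a DDS binding):

import random

RANKS = "AKQJT98765432"
DECK = [s + r for s in "SHDC" for r in RANKS]
HCP = {"A": 4, "K": 3, "Q": 2, "J": 1}

def hcp(cards):
    return sum(HCP.get(c[1], 0) for c in cards)

def random_deal(my_hand, partner_ok):
    """Deal the other 39 cards at random until partner's hand satisfies the
    constraint inferred from the auction so far."""
    rest = [c for c in DECK if c not in my_hand]
    while True:
        random.shuffle(rest)
        partner, lho, rho = rest[:13], rest[13:26], rest[26:]
        if partner_ok(partner):
            return partner, lho, rho

def double_dummy_tricks(my_hand, partner, lho, rho):
    # STUB: a real program would call a double-dummy solver here and return
    # the number of tricks declarer takes in 3NT on this layout.
    return random.randint(6, 11)

def nt_score(tricks):
    """Non-vulnerable duplicate score for a 3NT contract."""
    return 400 + 30 * (tricks - 9) if tricks >= 9 else -50 * (9 - tricks)

my_hand = ["SA", "SK", "SQ", "S5", "S2", "HA", "H7", "H4",
           "DK", "DJ", "D3", "C8", "C6"]

def partner_ok(partner):
    return 12 <= hcp(partner) <= 14          # e.g. partner showed 12-14 HCP

def estimate_3nt(trials=200):
    made, total = 0, 0
    for _ in range(trials):
        partner, lho, rho = random_deal(my_hand, partner_ok)
        tricks = double_dummy_tricks(my_hand, partner, lho, rho)
        made += tricks >= 9
        total += nt_score(tricks)
    return made / trials, total / trials     # make percentage, average score

print(estimate_3nt())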
#17
Posted 2012-July-24, 16:25
#18
Posted 2012-July-24, 16:27
catch22, on 2012-July-24, 13:11, said:
Agree completely with your conclusions. I think the approach I propose is very flexible and would accommodate this. As soon as the initial shell is complete (and you would need an initial shell for any bridge program; I think of it as the user input/output part), those who wish could concentrate on entering the knowledge base for whatever bidding system they choose.
#19
Posted 2012-July-24, 16:55
bluecalm, on 2012-July-24, 16:20, said:
For me the point of writing a bridge playing (or analyzing) program is to discover something new, or something human players get wrong. If it's based on an "expert system" and a "huge database of knowledge" then it's limited to the opinions humans already hold about the game.
So, for example, if there is a rule in constructive bidding, "with 12-14 hcp, if it goes 1D - 1S we raise to 2S with 4 spades", then it's instantly not interesting because there is nothing new we can learn from it. On the other hand, if simulations or a similar process show the bid to be effective, then that's something - even if it's only confirmation in this somewhat obvious case.
I want my program to answer questions like: "Is it better to bash 3NT here or to go via Stayman and try to discover a 4-4 major fit?", "If I bid 3NT here, how often will I make it in real play?", or "What is the difference in EV between preempting to 3S and 4S here?"
Once it's rule-based it will just tell me what the common wisdom is, which - especially since you are proposing that many people contribute to the knowledge base - is about useless in my mind.
For sure, but I think you have answered this point yourself in a post on a different topic. The quickest answer to the sort of alternatives you envisage must come from "common wisdom". If the program has a very accurate play engine (and at present this virtually means double dummy) you could bid the hand twice(?) and compare the final results. Of course this would not allow for the possibility of confusing the opponents and getting a better result with a technically inferior bid. Your original suggestion of rule-based, very accurate opening leads followed by double-dummy play would be a step further.
This may look like "using a sledge-hammer to crack a nut", but how would you go about comparing different methods of hand evaluation, for instance?
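One way such a comparison might be set up (again illustrative Python only: the two methods are deliberately simple stand-ins, the "one trick per three points" mapping is a naive yardstick, and the double-dummy call is a stub for a real solver):

import random

RANKS = "AKQJT98765432"
DECK = [s + r for s in "SHDC" for r in RANKS]
HCP = {"A": 4, "K": 3, "Q": 2, "J": 1}

def deal_partnership():
    cards = DECK[:]
    random.shuffle(cards)
    return cards[:13], cards[13:26]          # the two hands of one partnership

def method_a(hand):
    """Plain 4-3-2-1 count."""
    return sum(HCP.get(c[1], 0) for c in hand)

def method_b(hand):
    """4-3-2-1 count plus a point for each card over four in a suit."""
    lengths = [sum(1 for c in hand if c[0] == s) for s in "SHDC"]
    return method_a(hand) + sum(max(0, l - 4) for l in lengths)

def double_dummy_tricks(north, south):
    # STUB: a real comparison would ask a double-dummy solver how many
    # tricks this partnership can take in its best strain.
    return random.randint(5, 13)

def mean_error(method, trials=1000):
    """Average gap between a naive 'one trick per three points' prediction
    and the double-dummy result."""
    total = 0.0
    for _ in range(trials):
        north, south = deal_partnership()
        predicted = (method(north) + method(south)) / 3.0
        total += abs(predicted - double_dummy_tricks(north, south))
    return total / trials

print("method A:", mean_error(method_a))
print("method B:", mean_error(method_b))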
#20
Posted 2012-July-25, 03:08
And how are you going to do artificial conventions like Stayman without rules?