Re: Do or do not.... - Joel - 12/03/2013 02:19:56 AM

View original post
The second example seems better to me. A big issue in teaching math, one you reference below, is that different people do conceptualize the operations and relations differently, so illustrations that convey them to some people are often useless to others.

Yes, but there are common methods: if I show someone 4x4 and they say 20, I know what the mental error probably is and can test for it pretty quickly. Once I know the error I can pick from a wide array of tools to help fix it. People who use something totally different are rare enough that you won't often encounter them; there are usually only a handful of methods people use.

True, but that handful tends to get screwed hard, precisely because there are no well-known methods of teaching them and the rarity of their optimal learning style makes the chance of any developing remote.
View original post
That is a very interesting phenomenon in itself, and one that has always fascinated me: That most people look at a randomly distributed group of up to five items and think "number," but look at anything else and think "lots." Most people can make it to six if presented that many objects arranged like the vertices of a hexagon, but must count the sides/corners of anything larger to know their number.

I think overall it is more representative of a threshold between conscious and subconscious thought. There's still subconscious thought and memorization going on, because people tend to remember 4x4=16 rather than 4+4+4+4=16, which explains why removing one or two dots from a 4x4 grid would significantly slow down their answer. They can 'count' the stars on an 8x8=64 grid a lot faster than the US flag's setup of two overlapping grids, 6x5 and 5x4, 30+20=50. Of course, I see the flag as two rectangular grids overlapping, to be counted separately and then added; others might view it as 4 double rows of 11 stars plus half a double row left over, 4x11+6=50. The first method will likely be the better one, except where the grid has a very high ratio of width to length, in which case the double-row method is probably faster.

For the record, I tend to conceive of it best as alternating lines with one fewer in the second, especially if I must reproduce it; then I need only remember which comes first, and my visual memory is at least good enough to say, "the long one." I just do not consciously think of them as rectangles; obviously your conceptualization as two rectangles offers the advantage of knowing from the start where to stop, though mine is probably easier to remember.

The thing is, though, with very rare exceptions most people draw a blank with any arrangement of >5 objects. I actually stumbled across that observation in one of my old Encyclopedia Americanas, in the article on "awareness," but usage of the word and understanding of what it conveys have altered so much in the fifty-odd years since that article that it may be hard to link the phenomenon with that term now. I meant to post a thread about this a year or two ago, but a search for "awareness" turned up nothing except the temptation to necro, so I guess it never went beyond a thought. Type a series of lines like:

....................
..
......
.
..........
....
........
.....

and most people will be able to instantly identify every other one, but ONLY every other one. A few people might get 6, but I doubt even 1% of people would spot 10 without counting, despite the fact we see ten objects every time we glance at our hands. Even choosing an easily recognizable way to represent >5 objects is not easy unless we put them in multiple rows (so we can count the rows); we could use the vertices of a hexagon for 6, and add a central point for 7, but after that, well, OK, we could do the same with a heptagon, but how many people recognize even a regular heptagon without counting the sides/vertices?
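If you want to test that on yourself or others, a throwaway script can generate such lines; this is purely illustrative, with the lengths and line count picked arbitrarily:

import random

# Print lines of 1-20 dots in random order; try to name each count at a
# glance before counting dot by dot. Most people are instant up to ~5.
for _ in range(8):
    print("." * random.randint(1, 20))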

We just do not do well with large numbers, which is why most people have a very difficult time conceptualizing truly huge ones. A million is a thousand thousands, a trillion is a million millions; that is about as precise as most of us can get. Subconsciously or otherwise it just does not register as anything but "a crapton," or perhaps "a crapton more than a crapton." About the only people who begin to grasp the difference between $10 million in assets vs. $10 billion are the ones who actually possess one or the other.


View original post
Interesting; makes me wonder what I was missing learning remedial handwriting while my classmates were, well, LEARNING. :blush: I have mixed feelings about formal geometry; exposure to formal logic was helpful, but it is definitely far more intuitive to me on a numerical basis.

I wasn't even aware geometry included a logic section. I'm also in the bad handwriting club, but never had remedial, just bad grades. :P

Plane geometry is SOLELY logic; what frustrated me for a long time was its utter lack of numbers. Telling a 13-year-old to do math without numbers produces ":confused:"

I only got remedial handwriting in sixth grade; after that I think most teachers gave me up as a lost cause. I will never forget the time in AP US History when our teacher went around telling each of us the one area she thought we should brush up on just before the test; her comment to me was basically "you know ALL of it; concentrate—hard—on writing neatly and legibly." And that was despite printing everything; I do not even bother with cursive, because it is as painful for me to write as for others to read. I got a 5, so I guess block letters were good enough. ;)


View original post
Logical, though standardization promotes communication; a discovery's profundity matters little if the discoverer cannot "show their work" and transmit it to others.

Absolutely on the latter; the best way to master a subject is to teach it. On standardization I only agree in part: I think it better to teach all major standards, for more flexibility and fewer 'oops' mental barriers to trip over.

Well, the more standards known for teaching the same principle, the more accessible it is. And you are right that more valid approaches to the same principle reduce the prevalence of blind spots.
View original post
I'd almost have to see it written out and annotated to grasp it. Remember that a lot of card players casually think in a parallel of base 13, superbase 4, but never view it that way and never apply it to anything but cards, even though they cheerfully make card analogies to life. If our clocks consisted of 4 periods (morning, afternoon, evening, and night) divided into 13 segments of roughly 28 minutes, each of 52 'minutes' of 32 seconds subdivided into 52 'seconds' of 0.6 normal seconds, you could be almost assured that card games and time would have all sorts of common analogies and comparisons. "I'll meet you at club king for the film, I might be a suit late though," referring to a period of about half an hour and saying he might be about 6 or 7 minutes late. Or alternatively expressions like 'high noon' could work their way into cards. Any sort of competitive game or religious ritual is going to encourage those involved to rapidly assimilate the concept even if it has no outside parallel or logic, and I think predispose them to try to graft it onto the outside world wherever there is any perceived overlap. Witness that 2d10 or d% is used to get a well-known concept, but a d20, with no daily-use equivalent, generates its own expressions, like 'natural 20!' or 'fuck, rolled a 1!' or even snake-eyes and boxcars. I don't think a game or religious divination would lead to adaptation for math or practical use, but I could easily see existing math or a common concept being brought into a game the way a d% is.

Though I feel everything I wrote here is confused, rambling, and very much a massive digression :P Do not feel obliged to reply point for point.
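To make the quoted card clock concrete anyway, here is a minimal sketch of the conversion it implies; the pairing of suits with periods and the rank names are my own invented labels, not anything from the post:

# The quoted base-13/superbase-4 clock: 4 periods ("suits") of 13
# segments ("ranks") = 52 segments a day, each about 27.7 real minutes.
SUITS = ["clubs (night)", "diamonds (morning)", "hearts (afternoon)", "spades (evening)"]
RANKS = ["ace", "2", "3", "4", "5", "6", "7", "8", "9", "10", "jack", "queen", "king"]

def card_time(sec):
    # sec = integer seconds since midnight; integer math avoids float edge cases.
    seg = sec * 52 // 86400                      # which of the 52 segments
    suit, rank = divmod(seg, 13)
    minute = (sec * 52 % 86400) * 52 // 86400    # the ~32-second 'minute'
    return f"{RANKS[rank]} of {SUITS[suit]}, 'minute' {minute}"

print(card_time(12 * 3600))  # noon -> "ace of hearts (afternoon), 'minute' 0"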


It is something of a sickness with me. :P I will say THIS card player does NOT think in base 13, superbase 4, even when playing. Perhaps I SHOULD, but my memory is not in good enough shape to count every card; usually I just count honors, so I know what is high in each suit, and distribution, so I do not lead anything CERTAIN to be trumped (or worse, give the bad guys a ruff-sluff). Sometimes that gets me in trouble once all the honors are gone and I cannot remember if an 8 or 7 or whatever is good or a higher non-honor spot is still out there. One such occasion proved especially embarrassing because I had lost count of the distribution as well, which left me wondering if my heart 8 (or 7, forget which) was high when it was not just the high heart, but the LAST heart. :blush:

From what I can tell, most people tend to think in terms of "un/somewhat/very likely," and do not go further absent the incentive you reference. In AD&D a natural 20 is a crit success and 1 is a crit fail (or vice versa), while in GURPS a 3 or 4 is a crit success and a 17 or 18 is a crit fail.* Most people will look at that and think "makes sense; criticals are supposed to be rare, or at least uncommon"; some might even opine that the ability to produce either with two numbers rather than one makes them more common in GURPS. However, the chance of rolling 20 (or 1) on a d20 is a fairly respectable 5%, while the chance of rolling 17 or 18 on 3d6 is <2%—even though there are 4 times as many ways to do it! People who are not veteran gamers (or mathematicians) seldom realize that.
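Brute force bears this out; a few lines of Python enumerating all 216 outcomes of 3d6 confirm both the <2% figure and the footnote's count of ways to roll above 15:

from itertools import product

rolls = [sum(dice) for dice in product(range(1, 7), repeat=3)]  # all 216 3d6 outcomes
print(sum(r >= 17 for r in rolls))   # 4 ways -> 4/216, about 1.85%
print(sum(r > 15 for r in rolls))    # 10 ways to roll 16-18 (see the footnote)
print(1 / 20)                        # 0.05: the humble 5% d20 crit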

Anyway, to see it written out and annotated, try the link below.

*GURPS further complicates things because a natural 17 is only a "normal" failure for skills >15, and any natural roll 10+ below an unmodified skill is a crit success. Both incentivize buying skills past 15, which would otherwise be almost pointless, since there are only 10/216 ways to roll >15 on 3d6.


Ah, GURPS... the only RPG mentioned in the conversation thus far :P

As it should be, since it is the only one WORTH mention. :| I am a bit surprised you have no further comment on cards, though; bridge is far more mathematically fascinating than dice are. I only know one person with the math and card background to debate my position that the "8 ever, 9 never" finesse rule is wrong, and he refuses to take the bait. If you care to google "8 ever 9 never finesse" you should see quickly what I mean:

With 9 cards in the suit but not the queen, playing for the drop is better than an IMMEDIATE finesse, but playing the king or ace first drops all singleton queens (on the left 1/2^4 of the time and on the right 1/2^4 of the time) while preserving the ability to THEN finesse the jack and 10 into the king (in the other 14 cases). A finesse is 50/50 by definition (the queen is either "onside" or not), so playing the ace and then finessing wins (1+1+7)/16 times, or 56.25%.

On the other hand, playing for the drop throughout only works when the queen is singleton (12.5%) or doubleton (37.5%), which ironically makes it "the inferior 50% chance." We can hedge a bit by noting that playing the ace immediately exposes a 4-0 onside split in time to take the finesse, but that still only brings the drop to parity with the delayed finesse, and it is still the finesse—NOT the drop—that wins the queen. Essentially, it is a different way of looking at the same process first described; if one played the ace, saw the suit split 4-0 with the queen onside, but then led the king anyway to play for the now impossible drop, the queen would be lost.
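For anyone who wants to check that arithmetic, a quick enumeration under the same simplified model (each of the four missing cards independently equally likely to sit on either side, vacant places ignored):

from itertools import product

finesse_line = drop_line = 0
# layout[0] is the queen, layout[1:] the three spot cards; True = onside.
for layout in product([True, False], repeat=4):
    q_onside, spots_onside = layout[0], sum(layout[1:])
    q_length = 1 + (spots_onside if q_onside else 3 - spots_onside)
    if q_length == 1 or q_onside:   # ace drops a stiff queen; otherwise the finesse needs her onside
        finesse_line += 1
    if q_length <= 2:               # ace and king together only drop a singleton or doubleton queen
        drop_line += 1
print(finesse_line / 16, drop_line / 16)   # 0.5625 vs 0.5, matching the 56.25% and 50% above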

Such has been my contention for about 20 years, but I am inclined to view it differently after looking over the following link, which weighs the probability of an opponent's holdings in the OTHER three suits against the probability in the suit missing the queen: http://www.durangobill.com/BrSplitHowTo.html

He concludes, rightly I think, that factoring in the three seemingly "irrelevant" suit distributions actually alters the all-important "percentage play." Not by much (<1.7%), but by enough to break what had been dead even: Suddenly the delayed finesse wins only 56.2% of the time, less than before, while the drop works 57.9% of the time, much more. What is particularly interesting is that the difference owes to the chance of a 4-0 or 3-1 split decreasing and that of a 2-2 split increasing, even though bridge "conventional math" says, "suits missing an even number of cards most likely divide unevenly; suits missing an odd number of cards most likely divide as evenly as possible." This rule (which Pascal's Triangle quickly demonstrates) still holds (3-1 splits are still more likely than 2-2s overall), but by less than before, which makes the difference.

GENERALLY SPEAKING, the chance of a given suit distribution can be roughly calculated at the table as nCr/2^n, where n is the number of cards missing and r the number in a given opponent's hand. Once play begins, each player can always see two hands (his own and dummy's), leaving declarer to ask, "which defender has x?" and the defenders, "does declarer or my partner have x?" Thus it is a nice game for those who enjoy math, because the percentage play is set in stone from the first card led, though it is often hard to find (and rarely assured of success; nothing infuriates me more than a low-percentage holding setting my contract, or making an opponent's, when the most probable one would do the opposite).
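A worked example of that rule of thumb for the four cards missing the queen, set against the true a priori numbers the durangobill link computes properly:

from math import comb

n = 4  # cards missing in the suit
for r in range(n + 1):
    print(f"{r}-{n - r} split: ~{comb(n, r) / 2 ** n:.1%}")
# Prints ~6.2% / 25.0% / 37.5% / 25.0% / 6.2%. The true a priori splits
# are closer to 4.8% / 24.9% / 40.7% / 24.9% / 4.8%, which is why this
# is only a quick table-side estimate, not the real percentage play.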

The first thing they taught me was "if you can count to 13, you can play bridge"; the second was "play is easy: BIDDING is hard." In fact, many playing rules of thumb exist for those with neither the card nor the math experience to derive them. The best BIDDING rule is "always trust your partner; never save your partner," because there is nothing more annoying (or fun to watch) than two partners pushing their contract ever higher arguing over the trump suit. "Gee, y'all make 4 Spades easily; 6 Diamonds is kinda hopeless though. Or 6 Spades. Six of anything, really. Can we play for money next time...? :)" The only other bidding rule I esteem as highly is "never open a four-card major," but the Grim Reaper should take care of the last four-card-major player soon (if he has not already).


View original post
Well, your odds of rolling a 17 or 18 on 3d6 are 4 in 216, just under 2% as you say, but so many systems let you 'drop the lowest' or 'reroll one,' and those massively change the outlier spreads. Classic D&D: you needed a 17 CHR for a paladin, so you had a 4 in 216 chance of qualifying, or about 6 times that if you got to assign your rolls to stats. With 'drop the lowest' there are 1296 possible combinations instead of 216, and 21 of them yield an 18 while 54 more yield a 17. Your odds of rolling 17+ jump to just under 6% instead of just under 2%. At the other end, 3 and 4 have only 1 and 4 combinations respectively, since the lowest die is discarded. Your average value shifts from 10.5 to about 12.2, and the curve no longer mirrors around the middle: on 3d6 your odds of getting a 9 equal those of a 12, but dropping the lowest doesn't just move the middle up, it makes the distribution asymmetric. Your odds of getting a 7 or less, previously as probable as 14+, drop below 6% themselves; getting the four lowest numbers is less probable than getting the highest two. So many games, especially for stat building, use that method, and even a lot of dice-intensive games offer 'drop the lowest' as a special ability, without most players realizing how game-breaking it is. It's like abilities that let you reroll a save versus ones that add +5 to a die roll: it's more than passing difficult to figure out which is better, especially since it depends on the circumstance.
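The quoted figures check out; enumerating every roll takes only a few lines (a minimal sketch, with names of my own choosing):

from itertools import product
from statistics import mean

def totals(num_dice, drop_lowest=False):
    return [sum(roll) - (min(roll) if drop_lowest else 0)
            for roll in product(range(1, 7), repeat=num_dice)]

plain, dropped = totals(3), totals(4, drop_lowest=True)
print(mean(plain), mean(dropped))               # 10.5 vs ~12.24
print(dropped.count(18), dropped.count(17))     # 21 and 54 combinations
print(sum(t >= 17 for t in dropped) / 1296)     # ~5.8%, vs 4/216 ~ 1.85% on 3d6
print(sum(t <= 7 for t in dropped) / 1296)      # ~5.7%, no longer mirroring 14+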

Rerolls or dropping the lowest do greatly change things; I picked up the latest edition of Talisman a few months ago, and it now allows rerolls via "fate," which is game BENDING at the very least. Many confrontations I would have blithely skipped when possible before are now far less intimidating. I think many games offer that drop vs. add variety just to make min/maxing that much more difficult, but min/maxing is like that phrase Chris Berman used to love: You can't stop it, only contain it. I have to admit, though, once I found bridge, dice lost a lot of their appeal. That did not stop me working out a formula to calculate the odds of rolling an x on y dz, but I am usually all about the cards when I play these days. Maybe I should try the hopelessly dated Twilight: 2000 again; it used cards as RNGs a lot.
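For the curious, here is a sketch of one such formula, the standard inclusion-exclusion count; I am not claiming it is the same derivation mentioned above, only that it produces the odds in question:

from math import comb

def ways(x, y, z):
    # Number of ways to roll a total of x on y z-sided dice, by standard
    # inclusion-exclusion over dice forced past z; divide by z**y for odds.
    if not y <= x <= y * z:
        return 0
    return sum((-1) ** k * comb(y, k) * comb(x - z * k - 1, y - 1)
               for k in range((x - y) // z + 1))

print(ways(17, 3, 6), ways(18, 3, 6))   # 3 and 1, matching the crit math above
print(ways(20, 3, 6))                   # 0: you cannot roll a 20 on 3d6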
Honorbound and honored to be Bonded to Mahtaliel Sedai
Last First in wotmania Chat
Slightly better than chocolate.

Love still can't be coerced.
Please Don't Eat the Newbies!

LoL. Be well, RAFOlk.
This math has no numbers, by definition.