Learning math?

I've found that I struggle to read & say math when I encounter it in computer science (papers, proofs, etc.). This makes it hard to process them: even if I know what the symbols mean, I may not know how to "say" them. I found Logitext (http://logitext.mit.edu/logitext.fcgi/main) helpful, but even then, with some of the symbols (the turnstile, for instance), I was uncertain whether that's really how they're pronounced or whether it's said some other way.

What books, online resources, tutorials, reference sheets, etc. are helpful for learning & being able to read math - specifically math used in computer science & programming language design?

Thanks.

You mean _pronounce_? As in

You mean _pronounce_? As in "A'" is "A prime"?

Yes, "pronounce".

Yes, "pronounce".

If so I suggest watching

If so I suggest watching online lectures for a bit, and see how professors talk while writing on the board. There are great courses available as video from schools like MIT.

I have no idea either, but I

I have no idea either, but I suppose I make up my own words when I read it. For instance, I pronounce "B, C |- A, B" as "Given (a proof of) B and C, can we prove A or B?" I have no name for the horizontal bar (or line) between proofs, but I think of it as a logical implication.
I might be completely wrong here, of course.
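
To make that concrete, here is the same thing written out in LaTeX, with my made-up readings attached as comments:

    B, C \vdash A, B
        % my reading: "given (a proof of) B and C, can we prove A or B?"

    \frac{B, C \vdash A \qquad B, C \vdash B}{B, C \vdash A \wedge B}
        % the horizontal bar I read as an implication: "if everything above
        % the line is provable, then the thing below the line is provable"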

You aren't really supposed

You aren't really supposed to read mathematical notation out loud in your head like that. Instead, if you must, translate it into natural-language sentences; the turnstile really just means "derives" or something like that ("if we have 'a' and 'b' in some environment 'e', then we get 'c'!").

Otherwise, to be efficient, you have to suck the notation into your head as if it was a language you understood without translation. This is why it is useful to not use a new notation, since sucking in every new notation basically requires gaining fluency in a new language.
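
The typing judgments you see all over PL papers work the same way; a rough sketch of the usual sentence-translation (the exact wording varies by author):

    \Gamma \vdash e : \tau
        % "in environment (context) Gamma, the expression e has type tau"

    \Gamma, x : \sigma \vdash e : \tau
        % "if we extend Gamma with x of type sigma, then e has type tau"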

I agree. I don't pronounce

I agree. I don't pronounce |- "turnstile". I don't pronounce it at all. When I see it, I think "OK, the thing on the left can prove the thing on the right".

Still, it is useful to know that it's called turnstile, for when other people call it that. So the original question still stands!

In that case, I just copy

In that case, I just copy LaTeX's math-mode terminology :)
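
The command names themselves double as a rough pronunciation guide; a few of the usual suspects (readings vary, but these are widely understood):

    \vdash       % the turnstile; read "proves", "entails", or just "v-dash"
    \models      % the double turnstile; read "models" or "satisfies"
    \forall      % "for all"
    \exists      % "there exists"
    \in          % "in", or "is an element of"
    \subseteq    % "is a subset of"
    \mapsto      % "maps to"
    \Rightarrow  % "implies" (in logic; elsewhere just "double right arrow")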

Not much help for the automath but....

one of the advantages of learning this at a university is that you hear the "names" spoken by the knowledgeable again and again in the lecture hall and the tutorial room.

Why don't you just ask?

By now, there should be more than enough online fora to ask what the correct pronunciation of a construct is.

This is a learning-style issue.

The issue here is that for some people (maybe 20%? I don't know, really), visual "notation" is much harder and slipperier than some other preferred modes of interacting with the material.

Every instructor knows that the more different sensory modes of interacting with the material you can engage, the better people will learn it. This is because different people learn best in different sensory modes.

This is why (some) people are yelling for "visual" programming languages, for Pete's sake, when most of us are entirely happy with text, and why (some) people rail against the "unpronounceable and therefore unrememberable" names in so many languages' standard keywords and libraries. Too many PL people dismiss this as stupidity, or as a brain unsuited to programming, but often it really is just an interaction-mode issue.

The issue here is that some learn more sight-centrically, and have trouble getting a grip on straight textual languages. Some learn more audio-centrically, and have trouble getting a grip on languages with symbols they can't pronounce (or worse, symbols pronounced exactly alike that mean different things). Math notation tends to be very symbol-heavy, and a lot of the symbols, like "script A" and "bold A" and "lower-case a", tend to mean different things even while the instructor pronounces them exactly the same way (as "A").

If your learning style is audio-centric or, like mine, text-centric, as opposed to sight-centric, math the way most universities teach it is a nightmare, even for really smart people.

I don't know how else to explain this, except to say that something in a mode that's not one of the ones your brain is equipped to deal well with is slippery, indistinct, hard to remember, and hard to form associations with.

My learning style is text-centric. At some fundamental level my brain does not distinguish among that script A, that boldface A, that regular A, that Fraktur A, that lowercase a, that Greek capital Alpha, the Angstrom sign, and about a dozen other things that all read at a glance as 'A'. Several reputedly excellent instructors tried, and failed, to teach me things that relied on those letters being used as symbols that meant different, but maddeningly similar or related, things; they swam around in my brain and would not stay distinct, and I thought I just sucked at math.

But I eventually learned that math is easy (for me, anyway) if I force every equation and proof through a conversion to text, one symbol at a time, using DIFFERENT TEXT for every symbol. I have to start with a (text) list describing all the symbols in use, and then one at a time go down the list to see which description or which drawing most closely matches the thing I'm looking at in the notation. Having done so I write an alternate symbol (with distinct text) for each one, usually five characters long or so, and carefully rebuild the statement.

Usually it winds up in a form that looks a lot more like pseudocode than standard math notation, with textually distinct symbols, no nontextual symbols beyond a static set of about twelve symbols that ALWAYS mean exactly the same thing no matter who uses them, and infixes transformed into prefixes to show precedence and nesting levels. After learning to do that, I aced every kind of math they could throw at me, although it always took me longer to take proper notes than it took most students. Hey, it turns out I don't suck at math after all, in fact math is easy and fun! I just suck at handling the notation!

And then there was the epiphany moment: Hey, look, programming languages! It's math, in text notations! I can do this!

OP has a similar issue, except his style is audio-centric instead of text-centric. He needs to rewrite (or, maybe "hear") all of this notation using words (different words, with different pronunciations) instead of symbols. He's asking what words will make the things he says understandable to conventionally trained mathematicians.

Unfortunately, I don't know any conventionally trained mathematician that will even ATTEMPT to understand something other than symbols in some near-standard notation. Hell, even in programming language theory, which is about various kinds of math-in-text-notation, they switch to a symbol notation when doing math about the properties of these different mathematical systems as opposed to actually using them. So I don't think there *is* any set of words like that.

If there are any blind mathematicians who actually *learned* mathematics while blind, I would love to read the stories of what they had to go through to extract information from conventionally trained instructors.

Getting someone to even mention that the script capital A and the boldface A in this equation were supposed to mean different things took a heroic effort. I had no idea that there was even a subject there for a question that I needed to ask, except that the way I was reading it the instructor had given two simultaneous contradictory definitions for A, and no matter which of them you took to be correct the damned equation was obviously false! I kept hammering at understanding why nobody else thought it was obviously false, even gave a very simple, straightforward proof that it was false, and kept demanding to know *which step* of my proof was wrong and *why* it was wrong, until we eventually got to the issue.

Ray

and to mention an obvious point....

So, yeah, you can probably guess how I feel about UML.

notation

I'm somewhat interested in the notation you settled on, could you give the details?

It's nothing terribly

It's nothing terribly amazing. I "borrowed" it mostly from programming languages I had seen at the time, so it looks a lot like Lisp. I use infix notation for binary operations only, parenthesize aggressively assuming almost nothing about operator precedence, and switch to prefix notation inside parens for any operation that isn't binary (well, the parens may get omitted for unary operations on single terms).

As to the symbols themselves ... the usual thing with set theory, where capital A means a set and lowercase a means a member of a set and script A means a subset of the set, gets transformed into symbols like 'aset', 'amem' and 'asubs' respectively. I use square brackets [] to refer to parallel elements when dealing with parallel collections. For example aset[x] and bset[x] would refer to associated members of parallel sets aset and bset, usually for all x meeting some criteria.
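
So, as a small example of the flavor, the distributive law

    A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)

gets rebuilt as something like

    ((aset ∩ (bset ∪ cset)) = ((aset ∩ bset) ∪ (aset ∩ cset)))

with every binary operation kept infix but fully parenthesized, and anything that isn't binary (a for-all, say) pulled out front in prefix position, as in "(∀ amem ...)".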

I know the Greek alphabet, but if a Greek letter looks even a little bit like some Latin letter, I usually don't recognize it as Greek unless I'm concentrating hard and being very careful, so Kappas, Chis, etc., get converted to the written-out letter names. I recognize lower-case lambda, for example, as text, but uppercase Lambda always gets read as "L", so I have to write it out.

When letters have grown to have a particular meaning in my mind, the way lower-case lambda means function definition for example, I absolutely will not use those letters for an unrelated operation. If a letter like lower-case lambda is being "misused" in some context, I have to rewrite it as something else, otherwise the "text" will mean the wrong thing to me.

Script versions of greek or other non-latin characters are the worst of the worst. They tend to be unrecognizable, unordered, unmemorable squiggles to me, and I can't reliably associate anything with them because they all run together in my mind. I draw them very carefully in a notebook at about triple size, write some arbitrarily-chosen label like "glork" or "zomble" under them, and then painstakingly use that as my standard translation of that particular squiggle in that particular domain. I have to refer back to the "squiggle notebook" a lot. I use different labels (different notebooks) in different domains, because sometimes the same squiggles get used differently in different domains (and of course no instructor would ever deign to *mention* explicitly that they are being used differently) and in that case using the same name would mislead me into a blind assumption that something is the same thing or at least has the same properties.

If script versions of greek letters are being used under the assumption that they will be associated with other presentations of the same greek letters, then I have to write that down in the "squiggle notebook" and use tags for them that extend the names of the greek letter. But this is hard for me to do, harder than any other aspect of the notation conversion. It's the experience most people would have when trying to recognize particular rocks presented in unfamiliar contexts, and distinguish them from other members of a large set of nondescript rocks, and remember which other members of that set of nondescript rocks they are associated with.

The math symbols that I actually use are those whose meanings are strongly associated with a single operation in my mind. Usually that means either that I learned them in grade school and have practiced enough with them that they are "text" to me, or that I have seen them consistently and frequently used for exactly one operation and rarely for anything else.

They are + and - for addition and subtraction, ¬ for logical negation, ± for approximation bounds, ∈ for member-of, ∋ for such-that, = for equality, ∀ for for-all, ∃ for there-exists, × for multiplication and ÷ or / for division, ∩ for intersection, ∪ for union, < and > for comparisons, | for choice-of, ⋁ for logical-or, ⋀ for logical-and, and maybe a few others I haven't thought of right now.

But I generally extend them with disambiguators if they refer to operations on any domain other than that which they are most closely identified with, and never use them at all for any operations that don't have exactly the same logical properties as the operations they're most closely identified with.

+, for example, is strictly addition. If we're talking about some non-numeric addition, it gets extended as, for example, vector+ or tensor+ or something. That's fine as long as + is referring to something that has the properties (associative, commutative, identity element, inverse operation in subtraction, etc.) of addition on numbers.

If + meaning addition-on-numbers isn't even a relevant idea in some domain and there is exactly one operation in that domain that behaves like it, then I can stand to use the bare + sign for the other operation that behaves like it. + as a symbol for something that doesn't behave like addition, however, is _Just_Plain_Wrong_ on a level I can't even communicate, and the + sign isn't even in the symbols I use for operations like that.

× for multiplication and ÷ or / for division are, again, strictly for these operations on numbers, or for operations that behave exactly like them when multiplication and division on numbers aren't relevant to the domain. But these signs are among the most frequently elided and heavily overloaded in all of math, so I'm "weird" and have to actually write the symbol between two things that are getting multiplied, or, with parens, in front of more-than-two things getting multiplied. I write extended symbols for so-called multiplications on different types or that mix different types (say, of a number by a vector or matrix) and write unrelated symbols for so-called multiplications that don't have exactly the same properties (associative, commutative, distributive, identity element, reversible by a division operation, etc.) as multiplication on numbers.

I use "crossprod" never × for cartesian products; cross-products need a sub-operation for joining the elements, and nothing I would ever write with the × sign does. "(aset (crossprod ×) bset)" refers to a cross-product whose members are made by multiplying the members of the original aset by the members of the original bset, while "(aset (crossprod conc) bset)" refers to a cross-product whose members are made by concatenating the members of aset with the members of bset.

etc. Aside from diagnosing and analyzing my particular set of cognitive quirks, is this helpful?

Ray

notation

It is very helpful because a hobby of mine is designing better systems.
I have no trouble recognizing arbitrary pictures, but I HATE it when they make the name of something non-arbitrary, so that alpha-renaming fails. My other big frustration is when they make it unclear which function they are applying to which arguments. But your convention for unary, binary, and higher-arity functions is very sensible.

I was reading the first half of the first volume of Knuth's "The Art of Computer Programming", and when he was talking about simplifying expressions containing operations on arrays of elements, I had to draw out the array and color in the sections according to how many times each area was represented in the results before the simplifications made sense to me.

A while ago on the rgrd newsgroup you said you had some text files full of ideas for creatures and items, if it isn't too bothersome could you send a file or two to andershorn@mailinator.com ? I really enjoyed the few ideas you shared.

Another helpful suggestion

Look up the HTML character entities for various things, or the Unicode code points for mathematical symbols; they'll give you a good idea of what a given dingbat or whatzit is called.
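
For example, Python's unicodedata module will hand you the official Unicode name of any symbol (a minimal sketch):

    import unicodedata

    # Print each symbol alongside its official Unicode name
    for ch in "⊢⊨∀∃∈λΓ":
        print(ch, unicodedata.name(ch))

    # ⊢  RIGHT TACK                  (the turnstile; HTML entity &vdash;)
    # ⊨  TRUE                        (the double turnstile)
    # ∀  FOR ALL
    # ∃  THERE EXISTS
    # ∈  ELEMENT OF
    # λ  GREEK SMALL LETTER LAMDA
    # Γ  GREEK CAPITAL LETTER GAMMA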