
Programming Languages as Mathematical Representations

Hi Folks,

To those of you with math backgrounds....

A random conversation about math and machine learning led me to wonder: LISP & lambda calculus aside, is it reasonable to view programming languages as mathematical representations?

After all, a program represents a set of logical/mathematical relationships, whether descriptive, a model, or a series of operations, not unlike the way a set of differential equations can model a physical system (and be used to predict its behavior).

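To make the comparison concrete, here's the sort of parallel I have in mind (just a sketch in Python; the oscillator, the constants, and the forward-Euler stepping are arbitrary choices for illustration):

    # The same harmonic oscillator, two ways.
    #
    # As differential equations:
    #   dx/dt = v
    #   dv/dt = -(k/m) * x
    #
    # As a program: the same relationships, stepped forward in time.

    def simulate(x=1.0, v=0.0, k=1.0, m=1.0, dt=0.01, steps=1000):
        """Predict the oscillator's state by applying the relationships repeatedly."""
        for _ in range(steps):
            a = -(k / m) * x               # same relationship as dv/dt = -(k/m) x
            x, v = x + v * dt, v + a * dt  # advance position and velocity one step
        return x, v

    print(simulate())

Either form seems to encode the same relationships; one is read analytically, the other is executed to predict behavior.
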
Beyond the symbology and grammar, are there fundamental differences between a set of equations and a program?

I ask this partly out of intellectual curiosity, and partly because, when it comes to analyzing and modeling systems, my mind tends to think in code rather than formulas. I kind of wonder whether there's really something fundamentally different about the thought process, or whether it's more akin to the difference between, say, differential and integral forms.

Opinions?

Thanks,

Miles Fidelman