Today I learned a new term from a blog post:
The blog post mostly rags on performance issues, but I'm more interested in the complexity implications. I've been through this myself, and honestly, the more powerful the language, the worse my peak abstraction got. Only when I moved to a less expressive language (C# rather than Scala) did I have an incentive to keep things simple.
Has anyone else found themselves in an abstraction trap and come back down as they grew as a programmer? What can we do in PL design to avoid, or at least discourage, overuse of abstraction? Is this a case where less might be more?
Kalani Thielen's The Algebra of Data, and the Calculus of Mutation is a very good explanation of ADTs, and also scratches the surface of Zippers:
(hat tip to Daniel Yokomizo, who used to be an LtU member...)
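For readers who haven't met zippers before, here is a minimal sketch of the idea in Haskell (my own illustration, not code from the article): a list zipper is an algebraic data type holding a focused element plus its left context (stored reversed) and right context, which makes movement and edits at the focus O(1).

```haskell
-- A list zipper: a cursor into a list.
-- The left context is stored reversed so moving left is O(1).
data Zipper a = Zipper [a] a [a]
  deriving (Eq, Show)

-- Enter a list, focusing its first element (fails on empty lists).
fromList :: [a] -> Maybe (Zipper a)
fromList []     = Nothing
fromList (x:xs) = Just (Zipper [] x xs)

-- Move the focus one step left or right, if possible.
left, right :: Zipper a -> Maybe (Zipper a)
left  (Zipper (l:ls) x rs) = Just (Zipper ls l (x:rs))
left  _                    = Nothing
right (Zipper ls x (r:rs)) = Just (Zipper (x:ls) r rs)
right _                    = Nothing

-- Edit the focused element in place.
modify :: (a -> a) -> Zipper a -> Zipper a
modify f (Zipper ls x rs) = Zipper ls (f x) rs

-- Rebuild the plain list from the zipper.
toList :: Zipper a -> [a]
toList (Zipper ls x rs) = reverse ls ++ x : rs
```

For example, `fromList [1,2,3] >>= right` focuses the 2, and `modify (*10)` then yields a zipper whose `toList` is `[1,20,3]` — a local mutation expressed purely functionally, which is the "calculus of mutation" angle the article develops.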