Denominated Values - Part numeric and symbolic

Well, as long as "how I spent my summer vacation" is an interesting topic, I've got one. I've recently been working on and thinking about a programming language type that I haven't seen implemented before. I guess "denominated values" is the best name for it. A denominated value is a real number, plus a quasi-symbolic representation of what that number denominates - meters, hectares, seconds, joules, and so on.

The semantics are just what you'd expect: multiply two denominated values together and you get another denominated value. Divide a length by a time and you get a speed. Divide a speed by a time and you get an acceleration. Multiply a distance by a distance and you get an area. Add two distances and you get a distance. Add a distance to a time and you get an error. Which is the whole point, because the idea is error checking. Most of the error checks can be free for most compilable languages - the denomination semantics exist during compilation, but can get boiled out of the executable (except for debug builds). If the program passes the denomination checks during compilation, it doesn't need to check again at run time, and certainly not on every single operation.

I was remembering college physics, where misunderstanding the problem often led to doing the math and realizing I was trying to apply some mathematical operation to values whose denominations were incompatible under that operation. Or the answer came out with the wrong denomination. Or I blew a unit conversion.

Of course, denominations coming out right is no guarantor of correctness. Consider the popular rule "71 divided by the annual percent interest rate equals the doubling time." This happens to come out fairly close for small interest rates, but it is not a mathematically correct construction and it fails hard the further out of its range you get. This is a mistake someone would plausibly make, because they probably learned the rule when they were ten and never realized that the mathematical reasoning behind it is fundamentally wrong. It is exactly the kind of mistake we would not want to make - but it comes out correct in denomination. Interest rate is in {thing per time} per {thing}, so the unit {thing} cancels out and interest rate turns out to be denominated in {inverse time}. So 71 (a unit-less value) divided by a value in {inverse time} yields a value denominated in {time}, which is exactly how the answer is interpreted. You can assign this wrong answer to a denominated variable and never know there was a problem - you print it out in years and it "looks right."

And correct operations on denominated values do not necessarily guarantee correct units either. They guarantee denominated values that *can* be expressed in the correct units, but are still prone to humorous errors. There's also a lot of stuff still missing, and a lot of things turn out to be the "same" units even though they represent entirely different things. Even if we have done everything right, some values come out in {inverse time} and can be expressed as interest rates, angular velocities, or frequencies; whatever units you tell the system to express such a value in, any inverse-time value will express without an error, because all three are the same fundamental units (seconds to the minus one). They shouldn't be interchangeable, and I don't really know how or whether that kind of error ought to be sorted out.

Implementation details: the constructor takes a real number and constructs a unit-less denominated value, and there is a matching 'conversion' that yields a real number if and only if the denominated value being converted is also unit-less. So you can do things like this:
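Something like the following sketch, in C++. The DV abbreviation, the constructor from a real, the unit-less conversion, and the INCH/CENTIMETER constants are the ones described above; the exact names (SECOND, real_value) and the operator signatures are illustrative assumptions, not the actual code:

    // implicit promotion of reals to unit-less allows DV*DV implementation
    // to be used, producing a DV answer.
    DV width    = 2.0 * INCH;               // same denominated value as 5.08 * CENTIMETER
    DV duration = 0.5 * SECOND;
    DV speed    = width / duration;         // denominated in {length}/{time}

    double ratio = real_value(width / (2.0 * INCH));  // fine: the quotient is unit-less
    // double bad  = real_value(width);               // rejected: width is not unit-less
    // DV nonsense = width + duration;                // rejected: incompatible denominations

In a release build all of this boils down to ordinary arithmetic on doubles; the commented-out lines are the ones the debug-mode checks would reject.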
The simple implementation so far: the number is a 64-bit double and the denomination is an array of 8 bytes. The denominations, and all operations and tests on them, are #ifdef'd to exist in debug mode only. The first byte of the denomination is devoted to the order-of-ten magnitude, -127 to +127. The remaining 7 bytes are each devoted to the power of one fundamental unit, allowing the fundamental units to have powers from -127 to 127. (I am not using #FF; during implementation it signified overflow.)

Addition and subtraction require that the unit powers be equal, and scale one of the values by the difference in magnitude before doing the operation. Multiplication and division add or subtract every field of the denomination, including the magnitude.

Internally, values are represented in metric units, with various unit constants like INCH being denominated values representing the metric equivalent of one of that unit. So someone who creates a denominated value as 2*INCH gets the same denominated value as if they'd written 5.08*CENTIMETER. There are also a lot of unit constants that use more than one of the denomination powers (WATT has a +1 power for joules and a -1 power for seconds, for instance).

I've created accessors that get the numeric value as a standard real number regardless of whether the denomination is unit-less, or that get the denomination as a 64-bit integer. I'm not sure this was a good idea. I keep getting sucked into using them to do "denomination casts" whose value or hazard I can't really guess.
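For concreteness, the layout and the multiplication rule described above would look roughly like this; the struct and field names are my own illustration rather than the actual code, and overflow handling is omitted:

    #include <cstdint>

    struct Denomination {
        int8_t magnitude;    // order-of-ten magnitude, -127 to +127
        int8_t power[7];     // power of one fundamental unit each, -127 to +127
                             // (the #FF bit pattern is reserved to signal overflow)
    };

    struct DV {
        double value;        // always stored in metric base units
    #ifdef DEBUG
        Denomination denom;  // compiled out of release builds
    #endif
    };

    // Multiplication adds every field of the denomination, including the
    // magnitude; division subtracts them instead.
    DV operator*(DV a, DV b) {
        DV r;
        r.value = a.value * b.value;
    #ifdef DEBUG
        r.denom.magnitude = a.denom.magnitude + b.denom.magnitude;
        for (int i = 0; i < 7; ++i)
            r.denom.power[i] = a.denom.power[i] + b.denom.power[i];
    #endif
        return r;
    }

Addition and subtraction would instead check that the seven unit powers match, rescale one operand by the difference in magnitude, and then add or subtract the doubles.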
Programming Languages for a Programmable World (what's on those tablets in Westworld, anyway?)

It occurs to me that, as our world becomes more programmable, we need better languages for programming the world around us. Perhaps we can start discussing what those languages, and the run-time environments behind them, might look like.

We're in the Anthropocene: human activity has been the dominant influence on climate and the environment.
We shape the face of the planet, and with almost 7 billion of our 8 billion people carrying smartphones, we have reached the "singularity" - we have become globally connected rendering engines of an increasingly plastic reality. In a world of complex, self-organizing & adaptive systems, our dreams emerge into our collective subconsciousness (Facebook, Twitter, the Evening News), and we proceed to make those dreams real through our speech & actions. And so far, we've been letting our Ids run the show - perhaps because we simply don't have the language, or the processes, to be more deliberate about negotiating the world we want to live in.

The notion of a Holodeck has been expanded greatly in recent years - "Westworld," "Ready Player One," "Free Guy" - and we get closer and closer to the notion of designing & scripting the world around us. Theme parks, "reality TV," LARPs, cons built around fictional universes, large-scale LVC (Live, Virtual, Constructive) military exercises... all bring us closer to the point where we can deliberately design & script the world around us at the stroke of a key. But we're still living in a world where the pen rules. Westworld shows us engineers reprogramming the world from their tablets. Parzival & Blue Shirt Guy pull up virtual consoles with super-user privileges. But, so far, the designs are conceptual - the GUIs are fictional, as is the code behind them. It's time that we start developing those interfaces & the run-time environments behind them. The Internet is increasingly the SCADA system for our planetary systems - it's time to start developing a new generation of protocols & languages.

Shakespeare wrote, "All the world's a stage, and all the men and women merely players." John wrote, "In the beginning was the Word, and the Word was with God, and the Word was God." In the introduction to the Whole Earth Catalog, Stewart Brand wrote, "We are as gods and might as well get good at it." Learning to be better gods starts with improving our vocabulary, grammar, and diction. We need better design & scripting languages for shaping the world around us. And then we can talk about Design & Engineering Processes.

We have MATLAB for systems modeling. We have business plans, program plans, contracts, budgets & schedules for building systems. We have "mission orders" for military campaigns. But the closer we get to the daily experience of life, the more nebulous our language becomes - we're back to natural languages that are not particularly useful for defining scenes, scenarios, characters, or behaviors - for describing or scripting the stuff of reality. We use natural languages to write course catalogs & syllabi for universities, and to write conference programs for events. We write shooting scripts for movies. Unity is simply not up to real-world tasks like setting up an improv scene to be played out by avatars. Playing Game Master is still an art, practiced by individuals.

If we are to truly master our reality, to write our own scripts, and to live together as cast & crew in each other's games, we need better language for discussing & negotiating our roles & lines - rules of engagement & order for "reality jamming," if you will - ideally ones that let us all "Live Long & Prosper" in a sustainable world of "Infinite Diversity in Infinite Combinations." We seem to have gotten pretty good at setting up for a "Forever War" (or perhaps one ending in Mutually Assured Destruction). Now we need to get good at setting up for a "Million Year Picnic." The question is...
what do those languages look like? What's the computational model behind them? What's the run-time bridge between thought, rendering the scenery, and setting up the action? This is a community of language designers - perhaps we can start the discussion here. Your thoughts, please!

A Manufacturer's Perspective on PL Progress

This may be the first post about manufacturing ever to appear on LtU. I offer it because, sometimes, we find lessons about our own fields in the fields of our neighbors. One or two of you will have noticed (perhaps even appreciated) my eight-year hiatus from LtU. I've spent the last seven years working on a viable business model for American manufacturing through on-demand digital production. Testing ideas commercially has a way of making you look at them in somewhat different terms, and it seems to me that there are parallels between what has happened in manufacturing and what has happened in programming language adoption.

I recently rediscovered Milton Silva's post from April of 2020, "Why is there no widely accepted progress for 50 years?", which prompted me to write this. The notion of "progress" is replete with observer bias: we readily notice advances in things that serve our objectives, and readily ignore advances in things that are not relevant to us on the day we happen to learn about them. So perhaps a better question might be: "Why has there been no uptake of PL innovations over the last 50 years?" Thomas Lord nailed the biggest issue in his response to Silva: "it's because capital, probably." Because the parallels are direct, I'd like to give my view of why there has been no substantive progress in manufacturing in the last 50 years. With apologies to international readers, I'll focus on how production evolved in the United States because I know that history better; I don't mean to suggest for a minute that progress only occurred here.

The Path to Mass Production

In the US, the so-called "American System" of repeatable fabrication using standardized parts is commonly misattributed to Eli Whitney. Whitney mainly deserves credit for collaborating with Thomas Jefferson to copy the work of Honoré Blanc. Blanc, a French gunsmith, demonstrated rifles made from standardized parts in 1777. Blanc's methods were rejected by European craftsmen, who correctly viewed them as a threat to their livelihoods and to the established European social structure. Unable to convince Blanc to emigrate to America, Jefferson (then Ambassador to France) wrote letters to the American Secretary of War and to his friend Eli Whitney describing Blanc's methods. Though unsuccessful in his attempts to copy them, Whitney did an excellent job of marketing Blanc's ideas in America. Among others, he found a strong supporter in George Washington.

Standardized parts took a very long time to succeed, driven initially through the efforts of the US Ordnance Department. Lore about Singer, McCormick, Pope, and Western Wheel Works notwithstanding, standardized parts did not yield cost-effective production until Ford's assembly lines generated standardized part consumption at scale. In the mid-1920s, Alfred Sloan created the flexible assembly line, enabling him to create the concept of a "model year" for vehicles. With this invention, Sloan resolved the Achilles heel of mass production by removing the challenge of satiation. In both production and consumption, "more is better" became a driving thought pattern of the American 20th century.
Through two world wars and several major regional conflicts, American mass production powered the economic growth of the country and a significant part of the world.

Death Rattles

Viewed in retrospect, the re-entry of Toyota into the US market in 1964 with the Toyota Corona offered warning signs that mass production was in trouble. Unable to match the capital resources of American manufacturers, and unable to tolerate the costs of errors incurred in Deming's statistical quality control methods, Toyota had come up with the Toyota Production System, now known as lean manufacturing. US manufacturers had become complacent in their dominance, and were slow to consider or even recognize the threat this represented. By 1974, the Toyota Corolla had become the best-selling car in the world, and Detroit finally started to wake up.

By then it was too late, because 1974 also marked the global introduction of the standardized cargo container. With this invention, the connection between sea and rail shipping became relatively seamless across the world, and the per-unit cost of international shipping became negligible. Labor and environmental costs became the principal variable costs in production, and both were cheaper overseas than they were in the United States. Standardized cargo shipping abruptly erased the geographical partitions that had protected US markets from foreign competition. By buffering transportation with inventory, the delays of foreign production could be hidden from consumers. Mass production began to move inexorably to countries that had lower labor and environmental costs for producers. Basic arithmetic should have told us at this point that US mass production was in mortal peril: American salaries could not be reduced, so something else needed to change if manufacturing was to survive in America. Which prompts the question: why has there been no substantive innovation in US manufacturing since 1974?

So What Happened?

There were several factors in the failure of U.S. mass production:
There's obviously more to the story, and many other factors that contributed significantly in one example or another. The point, I think, is that when we succeed, researchers and early-stage engineers tend to be standing at the beginning of social processes that are exponentially more expensive than we realize and that take a very long time to play out. Change is very much harder than we imagine.

Thomas Lord, an LtU regular, dies at 56

The full obituary is at the Berkeley Daily Planet: Thomas Lord, 1966-2022. He is survived by his wife Trina Pundurs, mother Luanna Pierannunzi, uncle Christopher Lord, aunt Sharlene Jones, and many cousins and extended family.

Basic building blocks of a programming language

I've tried to understand category theory, and while I can't say I've got the mathematical skills to understand it, I think I grasp what it intends to do: it is basically a way of building bigger things out of smaller things as flexibly as possible. Designing my language, I was inspired by this thought and started to think along those lines. That means trying to work out what the basic building blocks are from which to build larger blocks - i.e., where do I start? In an attempt to figure this out, I thought about what languages typically try to do. What I have found so far:

1. Calculate data, i.e. from data produce some other data - in other words, functions.

The idea here is to make these elements as versatile as possible, so they can be combined without restriction as long as the combination is reasonable. Allow functions to generate other functions, hierarchies and/or patterns. Allow hierarchies to contain functions, other hierarchies and/or patterns. Allow patterns to describe functions, hierarchies and/or other patterns. First of all, do you agree that these three could be used as basic building blocks, or is there something missing or wrong? Secondly, the pattern. I can see patterns used in what seem like two distinctly different ways. One is that you write template-like code, which you could see as a pattern. Perhaps those two cases of patterns should be named differently? Any thoughts on this?

My article on state machines and DSL evolution

I've been unhappy with state machines, activity diagrams, and BPMN as tools for modelling processes for a long time. In the article linked below I've used the same principles that took us from flat languages (Asm, Fortran 66, BASIC, etc.) to structured programming, to create a behavior model that is something like a behavioral equivalent of Martin Fowler's state machine model. https://dzone.com/articles/evolving-domain-specific-languages (Teaser) The end result for the sample is the following:

    LOOP {
        ESCAPE doorOpened {
            DO lockPanel, unlockDoor
            WAIT doorClosed
            ALL {
                WAIT lightOn
            } AND {
                WAIT drawOpened
            }
            DO lockDoor, unlockPanel
            WAIT panelClosed
        }
    }

By const at 2022-05-05 12:54 | LtU Forum
Do we need exactly two binding constructs?

I've recently been thinking about the relation between objects and lexical environments. Objects, especially under a prototype-based object system, look suspiciously similar to a lexical environment:

    slot-ref                    <-> environment-ref
    slot-set!                   <-> environment-set!
    add-slot!                   <-> environment-define
    parent(s)-of/delegate(s)-of <-> environment-parent(s)

However, one problem stands in the way of unifying those two binding constructs completely: the slot-scoping problem. If I simply use symbols (as in the environment case) to designate slots, there's nothing to prevent two authors from coming up with the same name (say 'x), and they can clash, especially in the presence of multiple delegation/inheritance. Therefore I figure I must use slot objects rather than symbols to designate slots:

    (within-environment user-1
      (define x (make-slot)))

    (within-environment user-2
      (define x (make-slot))
      (make-object x 1 user-1:x 2))

and... now environments bind symbols to slot objects, and objects bind slot objects to values. This all looks fine, except that it makes me itch that I need two almost identical constructs and I can't unify them into one! Are those two binding constructs exactly what we need - no more, no less?

Cicada language -- a new dependently typed language

Cicada language is a new dependently typed language. The aim of the cicada project is to help people understand that developing software and developing mathematics are increasingly the same kind of activity, and that people who practice these developments can learn from each other and help each other in very good ways. Homepage at: cicada-lang.org

By xieyuheng at 2021-12-21 20:14 | LtU Forum
CUE: An open-source data validation language
The application of set-theoretic types is becoming more and more mainstream, as they turn out to be very effective in (partially) typing dynamic languages (e.g. TypeScript). Other popular examples are Scala 3 and Kotlin, together with less mainstream examples such as Ceylon, Typed Racket, and the Avail programming language. I dabbled in Avail a few years ago and was very impressed by Avail's type-driven compiler, but not so much by its (extreme) DSL support. In contrast to Avail, I believe CUE is much more pragmatic while not allowing any 'tactical' compromises. CUE has its origins at Google: the ability to compose CUE code/specifications is really useful at the scale at which Google operates. The creator of CUE - Marcel van Lohuizen - recently quit his job at Google to become a full-time developer of CUE. I think Marcel has created something really interesting, so I understand his decision!