Haskell for Mac
Available here, with Hacker News and Reddit discussions ongoing. Even though I'm not a big fan of Haskell, I'm pretty excited about this. It represents a trend where PL is finally taking the holistic programmer experience seriously, and a move toward interactivity in program development that takes advantage of (a) our rich type systems, and (b) our increasing budget of computer cycles. Even the fact that they are trying to sell this is good: if people can get used to paying for tooling, that will encourage even more tooling via a healthy market feedback loop. The only drawback is the MAS sandbox; app stores need to learn how to accept developer tools without crippling them.

Python, Machine Learning, and Language Wars. A Highly Subjective Point of View
A nice article that describes the tradeoffs made in choosing a PL for a domain-specific field.
By Sean McDirmid at 2015-08-25 02:11
Another "big" question
To continue the interesting prognostication thread, here is another question. Several scientific fields have become increasingly reliant on programming, ranging from sophisticated data analysis to various kinds of standard simulation methodologies. Thus far most of this work is done in conventional languages, with R being the notable exception: a language mostly dedicated to statistical data analysis. However, as far as statistical analysis goes, R is general purpose -- it is not tied to any specific scientific field [clarification 1, clarification 2]. So the question is whether you think that in the foreseeable future (say 5-15 years) at least one scientific field will make significant use (over 5% market share) of a domain-specific language whose functionality (expressiveness) or correctness guarantees are specific to the scientific enterprise of that field.
It might be interesting to connect the discussion of this question to issues of "open science", the rise of post-publication peer review, reproducibility, and so on.
word2vec
So I made some claims in another topic that the future of programming might be intertwined with ML. I think word2vec provides some interesting evidence for this claim. From the project page:
Word Cosine distance
-------------------------------------------
spain 0.678515
belgium 0.665923
netherlands 0.652428
italy 0.633130
switzerland 0.622323
luxembourg 0.610033
portugal 0.577154
russia 0.571507
germany 0.563291
catalonia            0.534176

Of course, we totally see this inferred relationship as a "type" for country (well, if you are OO inclined). Type then is related to distance in a vector space. These vectors have very interesting type-like properties that manifest as inferred analogies; consider:
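A toy sketch of the idea in TypeScript (the embeddings below are made up for illustration, not real word2vec output): similarity is cosine distance, and an analogy like "capital-of" becomes a direction in the vector space that plain arithmetic can follow.

```typescript
// Toy sketch: words as vectors, "type" as nearness in the space.
// These 3-dimensional embeddings are fabricated so the classic
// analogy arithmetic works out exactly.
type Vec = number[];

const dot = (a: Vec, b: Vec) => a.reduce((s, x, i) => s + x * b[i], 0);
const cosine = (a: Vec, b: Vec) =>
  dot(a, b) / (Math.sqrt(dot(a, a)) * Math.sqrt(dot(b, b)));
const sub = (a: Vec, b: Vec) => a.map((x, i) => x - b[i]);
const add = (a: Vec, b: Vec) => a.map((x, i) => x + b[i]);

// Hypothetical embeddings: axes 1-2 encode the country, axis 3 "capital-ness".
const emb: Record<string, Vec> = {
  france: [1, 0, 0],
  paris:  [1, 0, 1],
  spain:  [0, 1, 0],
  madrid: [0, 1, 1],
};

// paris - france + spain lands exactly on madrid: the "capital-of"
// relation behaves like a typed relationship between vectors.
const guess = add(sub(emb.paris, emb.france), emb.spain);
console.log(cosine(guess, emb.madrid)); // 1 (exact, by construction)
```

With real word2vec vectors the match is approximate rather than exact, so one would rank all words by cosine similarity to `guess` instead of expecting equality.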
I believe this could lead to some interesting augmentation in PL, in that types could then be used to find useful abstractions in a large corpus of code. But it probably requires an adjustment in how we think about types. The approach is also biased toward OO types, but I would love to hear alternative interpretations.

Unstructured casting considered harmful to security
Unstructured casting (e.g. in Java, C#, C++, etc.) can be harmful to security. Structured casting consists of the following:
1. Casting self to an interface implemented by this Actor
2. Upcasting:
   a) an Actor of an implementation type to the interface type of the implementation
   b) an Actor of an interface type to the interface type that was extended
3. Conditional downcasting of an Actor of an interface type to an extension interface type. (An implementation type cannot be downcast because there is nothing to which to downcast.)
Claim: all other casting is unstructured and should be prohibited.
Edit: The above was clarified as a result of a perceptive FriAM comment by Marc Stiegler
Actor DepositOnlyAccount[initialBalance:Euro] uses SimpleAccount[initialBalance]。
implements Account using
deposit[anAmount] →
⍠Account⨀SimpleAccount.deposit[anAmount]¶
// use deposit message handler from SimpleAccount (see below)
getBalance[ ] → ⦻¶ // always throw exception
withdraw[anAmount:Euro] → ⦻§▮ // always throw exception
As a result of the above definition, DepositOnlyAccount⊒Account and
getBalance[ ] ↦ ⦻, // always throws exception
withdraw[ ] ↦ ⦻, // always throws exception
deposit[Euro] ↦ Void▮
The above makes use of the following:
Interface Account with
getBalance[ ]↦Euro,
deposit[Euro]↦Void,
withdraw[Euro]↦Void▮
Actor SimpleAccount[startingBalance:Euro]
myBalance ≔ startingBalance。
// myBalance is an assignable variable
// initialized with startingBalance
implements Account using
getBalance[ ] → myBalance¶
deposit[anAmount] →
Void // return Void
afterward myBalance ≔ myBalance+anAmount¶
// the next message is processed with
// myBalance reflecting the deposit
withdraw[anAmount:Euro]:Void →
(anAmount > myBalance) �
True ⦂ Throw Overdrawn[ ] ⍌
False ⦂ Void // return Void
afterward myBalance ≔ myBalance–anAmount ⍰§▮
// the next message is processed with updated myBalance
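A rough rendering of the distinction in TypeScript terms (my reading of the claim, not Hewitt's notation; the `audit` method is a hypothetical implementation-only capability that the interface deliberately withholds): an unchecked `as` cast is unstructured and can forge authority, while a conditional downcast makes the test explicit and failable.

```typescript
// Sketch of structured vs. unstructured casting, under the assumptions above.
interface Account { deposit(amount: number): void; }

class DepositOnlyAccount implements Account {
  constructor(private balance: number) {}
  deposit(amount: number) { this.balance += amount; }
  audit(): number { return this.balance; } // implementation-only authority
}

const acct: Account = new DepositOnlyAccount(100);

// Unstructured cast: unchecked at runtime, forges access the
// interface type withheld -- the hidden balance leaks out.
const forged = acct as DepositOnlyAccount;
console.log(forged.audit()); // 100

// Conditional (structured) downcast: the test is explicit and can fail,
// so the extra authority is at least visible and checkable in the code.
if (acct instanceof DepositOnlyAccount) {
  acct.deposit(10);
}
```

In a language following the claim above, only the conditional form would be admitted; the `as` form would be rejected outright.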
Harnessing Curiosity to Increase Correctness in End-User Programming
Harnessing Curiosity to Increase Correctness in End-User Programming. Aaron Wilson, Margaret Burnett, Laura Beckwith, Orion Granatir, Ledah Casburn, Curtis Cook, Mike Durham, and Gregg Rothermel. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '03). (ACM paywalled link.)
Via a seminar on Human Factors in Programming Languages, by Eric Walkingshaw. To quote Eric's blurb:
OcaPic: Programming PIC microcontrollers in OCaml
Most embedded systems development is done in C. It's rare to see a functional programming language target any kind of microcontroller, let alone an 8-bit microcontroller with only a few kB of RAM. But the team behind the OcaPic project has somehow managed to get OCaml running on a PIC18 microcontroller. To do so, they created an efficient OCaml virtual machine in PIC assembler (~4kB of program memory) and used some clever techniques to postprocess the compiled bytecode: reducing heap usage, eliminating unused closures, reducing indirections, and compressing the bytecode representation. Even if you're not interested in embedded systems, you may find some interesting ideas there for reducing overheads or dealing with constrained resource budgets.

Nullable type is needed to fix Tony Hoare's "billion dollar mistake"
In an expression: [illustrations not captured]
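As a rough sketch of the general idea (TypeScript's union types standing in for the proposal, whose details are not captured here): a nullable type makes the "no value" case part of the type, so the checker forces it to be handled before the value can be used.

```typescript
// Sketch, assuming strictNullChecks: the null case is explicit in the
// type, and the compiler rejects any use that has not ruled it out.
// findUser is a hypothetical lookup for illustration.
function findUser(id: number): string | null {
  return id === 1 ? "alice" : null;
}

const u = findUser(2);
// u.toUpperCase();  // rejected by the type checker: u may be null
const name = u === null ? "<none>" : u.toUpperCase();
console.log(name); // "<none>"
```

This is exactly the discipline an unrestricted null reference lacks: in Hoare's original design, every reference is implicitly nullable and no check is forced.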
In a pattern: [illustrations not captured]
Edited for clarity.

Eric Lippert's Sharp Regrets
In an article for InformIT, Eric Lippert runs down his "bottom 10" C# language design decisions:
The "lessons learned in retrospect" for each one are nicely done. Big questions
So, I've been (re)reading Hamming's The Art of Doing Science and Engineering, which includes the famous talk "You and Your Research". That's the one where he recommends thinking about the big questions in your field. So here's one that we haven't talked about in a while.
It seems clear that more and more things are being automated, machine learning is improving, systems are becoming harder to tinker with, and so on. So for how long are we going to be programming in ways similar to those we are used to, which have been with us essentially since the dawn of computing? Clearly, some people will be programming as long as there are computers. But is the number of people churning code going to remain significant? In five years - of course. Ten? Most likely. Fifteen - I am not so sure. Twenty? I have no idea.
One thing I am sure of: as long as programming remains something many people do, there will be debates about static type checking. Update: To put this in perspective - LtU turned fifteen last month. Wow. Update 2: Take the poll!