(This researcher seems to me (an admittedly ignorant neophyte) to have quite a bit of interesting work on GC, among other interesting things.)
P.S. I'm less interested in supporting OO everywhere than I am in having kick-ass GCs; I assume that using something more FP-style with a GC like this would never be less performant anyway.
A real time collector for reconfigurable hardware seems kinda like a nice little 'hardware' implementation.
Here is the idea: popular wisdom dictates that functional languages need garbage collection, but is it really true? For example, C++-style move semantics seem to solve the problem of returning a closure from a function: you create an object containing a copy of all the local variables and functions, then swap the local copy of the handle with the calling function's allocation, so that when the function returns it destroys the empty handle from the caller, leaving the caller with the handle to the closure.
On this basis any acyclic datatype can be stored in the heap, but with its lifetime managed from the stack handle (this is what RAII is doing in the C++ STL).
I guess this is a degenerate case of reference counting, where we limit references to two types: a unique 'owning' reference (let's call it a handle, to disambiguate), which releases the memory when it goes out of scope (it is unique and un-copyable, so there is no need to count references), and an 'ephemeral' reference (let's call it a pointer), which is restricted in that it cannot be 'leaked' to a scope shorter-lived than the scope in which the handle exists. This all sounds a lot like Ada access variables, but note the change from the scope in which the handle was created to any scope in which the handle exists, since returning handles by move semantics is possible. This allows a constructor to return a handle into a scope that is also allowed to hold pointers to the object.
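In C++ terms the 'handle' is roughly `std::unique_ptr` and the 'pointer' is a raw non-owning pointer. A minimal sketch of the two-reference scheme (note that C++ itself does not enforce the restriction that the ephemeral pointer must not outlive the handle; that check would have to be added, as the post suggests):

```cpp
#include <iostream>
#include <memory>

struct Node { int value; };

// The "handle": unique and un-copyable; it frees the heap object when
// it leaves scope. Returning it transfers ownership to the caller by
// move, so no reference counting is needed.
std::unique_ptr<Node> make_node(int v) {
    return std::make_unique<Node>(Node{v});
}

// The "pointer": an ephemeral, non-owning reference. By convention it
// may only be passed to scopes that are shorter-lived than the handle.
int read(const Node* p) { return p->value; }

int main() {
    auto handle = make_node(42);              // owning handle in this scope
    std::cout << read(handle.get()) << "\n";  // ephemeral pointer, shorter-lived
}   // handle goes out of scope here and releases the memory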
It doesn't sound like it really needs anything new, just a combination of Ada's access variables and the mechanism for converting closures into objects described above.
What potential problems might there be? Could this work, or is there some overlooked flaw?
GPUs as an Opportunity for Offloading Garbage Collection sounds kinda cool to me.
Mainly I've been trying to learn from anybody who might know: can moving a GC (especially in languages like OCaml) off to another core have good results, like making pauses less obvious in interactive apps?
Annual Peter Landin Semantics Seminar: On correspondences between programming languages & semantic notations: 8th Dec 2014
BCS FACS - Annual Peter Landin Semantics Seminar 2014
Date/Time: Monday 8 December 2014, 6.00pm - 8.30pm
Venue: BCS, First Floor, The Davidson Building, 5 Southampton Street, London, WC2E 7HA
Cost to attend: Free of charge, but, please book your place via the BCS online booking system.
Book Online: https://events.bcs.org/book/1170/
Speaker: Prof. Peter Mosses, Swansea University
Peter Landin (1930 - 2009) was a pioneer whose ideas underpin modern computing. In the 1950s and 1960s, Landin showed that programs could be defined in terms of mathematical functions, translated into functional expressions in the lambda calculus, and their meaning calculated with an abstract mathematical machine. Compiler writers and designers of modern-day programming languages alike owe much to Landin's pioneering work.
Each year, a leading figure in computer science will pay tribute to Landin's contribution to computing through a public seminar. This year's seminar is entitled "On correspondences between programming languages and semantic notations" and will be given by Prof. Peter Mosses (Swansea University).
50 years ago, at the IFIP Working Conference on Formal Language Description Languages, Peter Landin presented a paper on “A formal description of ALGOL 60”. In it, he explained “a correspondence between certain features of current programming languages and a modified form of Church’s λ-notation”, and suggested using that as the basis for formal semantics. He regarded his formal description of ALGOL 60 as a “compiler” from ALGOL abstract syntax to λ-notation.
10 years later, denotational semantics was well established, and two denotational descriptions of ALGOL 60 had been produced as case studies: one in the VDM style developed at IBM-Vienna, the other in the continuations-based style adopted in Christopher Strachey’s Programming Research Group at Oxford.
After recalling Landin’s approach, I’ll illustrate how it differs from denotational semantics, based on the ALGOL 60 descriptions. I’ll also present a recently developed component-based semantics for ALGOL 60, involving its translation to an open-ended collection of so-called fundamental constructs. I’ll assume familiarity with the main concepts of denotational semantics.
Closing date for bookings is 8 December @ 5pm. No more bookings will be taken after this date.
Grammar of Graphics (Vega) for declarative static semantics + FRP (Flapjax) for declarative temporal semantics!
An interesting paper by Oney, Myers, and Brandt in this year's UIST. Abstract:
I sometimes get a bee in my bonnet to look for tools to do model-driven design and the like. So, I found a list of verification and synthesis tools. The mind boggles. For little people such as myself, I wish there were a table somewhere that listed the range of features and then showed what each system did or did not support. I want stuff that would help with application development: user interface and state machine modeling (e.g. an audio playback engine + the UI controlling it) and code generation (and round-tripping). Anybody know of such guidance for the layman?
The latest video in Dyalog's library - Depth-First Tree-Search in APL - is now available (https://www.youtube.com/watch?v=DsZdfnlh_d0)
The classic depth-first search algorithm is explored using APL, a simple and concise array notation with higher-order functions. The presentation highlights APL's incremental development style, using simple steps and culminating in a purely functional solution for the N-Queens problem. The informative style of this presentation brings clarity to this advanced topic, making it accessible even to those who are still near the start of their APL journey.
Once you've seen the video, why not examine the code used in greater detail and try the expressions for yourself in the online tutorial at http://tryapl.org/ (Learn tab > Depth-first search)
Has anyone used Datalog or RDF as a basis for anything beyond model-driven development, like projectional editing or unikernel generation?
By the way, no, this is not a "homework question". Yes, I did research it. I think some people on here can probably point me to some interesting links that weren't obvious on Google.
Has anyone used Datalog or RDF as a basis for something beyond basic model-driven development, like projectional editing or unikernel generation?
It seems like Google is hitting on some model-driven development work that uses RDF, but what I'm really interested in is generating the whole program, or even an entire system, based on some kind of semantic model, not just representing a domain model with it.
Unikernels: Rise of the Virtual Library Operating System -- this is what is making me wonder about this type of thing again. I think having a high-level description of all layers of the whole system makes sense, but not everyone is going to want to use OCaml, and practically speaking you will probably want to be able to generate program code, or at least configuration files, for specific languages or existing machines/VMs or interpreters.
The idea is to have some kind of RDF schema, semantic model, or common Datalog database of facts and rules, then use that as a basis for modeling a program or entire system and generating the program code from it.
Or better, have different programming languages defined using this common set of rules and facts, so that programming-language code can be automatically translated back into the common representation and then processed with another tool or edited via a particular interactive projection.
The basic idea is that, assuming the whole system is open source, all of the different programming languages, database formats, programs, and data representations are defined on top of some common semantics in a logic format like Datalog. This should make it much easier for different systems (such as programming languages, databases, or applications) to work together.
I am wondering especially if someone has applied an approach like that to projectional editing or unikernel generation or maybe both together.