Graydon Hoare: 21 compilers and 3 orders of magnitude in 60 minutes
In 2019, Graydon Hoare gave a talk to undergraduates (PDF of slides) trying to communicate a sense of what compilers look like from the perspective of people who build them for a living. I've been aware of this talk for over a year and meant to submit a story here, but was overcome by the sheer number of excellent observations. I'll just summarise the groups he uses:
I really recommend spending time working through these slides. While I was familiar with much of the material, enough was new, and I really appreciated the well-made points, the shout-outs to projects that deserve more visibility (such as Nanopass compilers and CakeML), and the presentation of the Futamura projections, a famously tricky concept, at the undergraduate level.

Google Brain's Jax and Flax
Google's AI division, Google Brain, has two main products for deep learning: TensorFlow and Jax. While TensorFlow is the better known of the two, Jax can be thought of as a higher-level language for specifying deep learning algorithms while automatically eliding code that doesn't need to run as part of the model. Jax evolved from Autograd, and is a combination of Autograd and XLA. Autograd "can automatically differentiate native Python and Numpy code. It can handle a large subset of Python's features, including loops, ifs, recursion and closures, and it can even take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments, as well as forward-mode differentiation, and the two can be composed arbitrarily. The main intended application of Autograd is gradient-based optimization."

Flax is then built on top of Jax, and allows for easier customization of existing models. What do you see as the future of domain-specific languages for AI? (A small illustration of what forward-mode differentiation means follows this post.)

By Z-Bo at 2021-01-15 13:59 | Implementation | Python | Scientific Programming | Software Engineering | other blogs | 63180 reads
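A quick terminological aside on the Jax item above: "forward-mode differentiation" is easy to illustrate with dual numbers. The sketch below is plain Haskell and has nothing to do with Jax's implementation (Jax traces Python/NumPy code and compiles it with XLA; there, jax.grad, jax.jacfwd and jax.jacrev play the corresponding roles); it only shows what the term means.

```haskell
-- Forward-mode automatic differentiation via dual numbers: carry a value and
-- its derivative together, and let ordinary arithmetic propagate both.
data Dual = Dual { val :: Double, deriv :: Double }

instance Num Dual where
  Dual x dx + Dual y dy = Dual (x + y) (dx + dy)
  Dual x dx * Dual y dy = Dual (x * y) (x * dy + dx * y)   -- product rule
  negate (Dual x dx)    = Dual (negate x) (negate dx)
  abs    (Dual x dx)    = Dual (abs x) (dx * signum x)
  signum (Dual x _)     = Dual (signum x) 0
  fromInteger n         = Dual (fromInteger n) 0

-- Differentiate a function written with ordinary numeric operations,
-- by seeding the input with derivative 1.
diff :: (Dual -> Dual) -> Double -> Double
diff f x = deriv (f (Dual x 1))

-- diff (\x -> x * x + 3 * x) 2  ==  7.0
```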
Applications of Blockchain to Programming Language Theory
Let's talk about Blockchain. The goal of this forum topic is to highlight its usefulness to programming language theory and practice. If you're familiar with existing research efforts, please share them here. In addition, feel free to generate ideas for how Blockchain could improve languages and developer productivity. As one tasty example: Blockchain helps to formalize thinking about mutual knowledge and common knowledge, and potentially to think about sharing intergalactic computing power through vast distributed computing fabrics. If we can design contracts in such a way as to maximize the use of mutual knowledge while restricting common knowledge to situations where you have to "prove your collateral", third-party transactions could eliminate a lot of back-office burden. But there might be benefits in other areas of computer science from such research as well. Some language researchers, like Mark S. Miller, have always dreamed of Agoric and the Decades-Long Quest for Secure Smart Contracts. Some may also be aware that verification of smart contracts is an important research area, because of the notorious theft of a purse via a logic bug in an Ethereum smart contract.

By Z-Bo at 2020-04-13 14:38 | Fun | Implementation | Semantics | 4 comments | other blogs | 103364 reads
Tensor Considered Harmful
Tensor Considered Harmful, by Alexander Rush
Thanks to Edward Z. Yang for pointing me to this "Considered Harmful" position paper.

By Z-Bo at 2019-06-27 14:26 | Critiques | Implementation | Teaching & Learning | 6 comments | other blogs | 78475 reads
Selective Functors
From Andrey Mokhov's twitter feed:
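The quoted tweet isn't reproduced above. For context, a sketch of the interface at the heart of the idea, following Mokhov's selective applicative functors paper:

```haskell
import Data.Bool (bool)

-- Like Applicative, but the second effect may be skipped depending on the
-- value produced by the first, while both effects remain visible statically.
class Applicative f => Selective f where
  select :: f (Either a b) -> f (a -> b) -> f b

-- Any Applicative admits a law-abiding (but never-skipping) implementation.
selectA :: Applicative f => f (Either a b) -> f (a -> b) -> f b
selectA x f = (\e g -> either g id e) <$> x <*> f

-- Derived combinator: run an effect only when a condition holds.
whenS :: Selective f => f Bool -> f () -> f ()
whenS cond act = select (bool (Right ()) (Left ()) <$> cond) (const <$> act)
```

The ability to skip an effect at run time while still exposing it to static analysis is what makes the abstraction attractive for applications such as build systems.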
By Z-Bo at 2019-03-05 17:12 | Functional | Implementation | Meta-Programming | 1 comment | other blogs | 68439 reads
Safe Dynamic Memory Management in Ada and SPARK
Safe Dynamic Memory Management in Ada and SPARK, by Maroua Maalej, Tucker Taft, Yannick Moy:
Systems programmers among you might be interested in some new developments in Ada, where they propose to add ownership types to Ada's pointer/access types, to improve the flexibility of the programs that can be written and whose safety can be automatically verified. Automated verification of these safety properties is a key goal of the SPARK Ada subset.

By naasking at 2018-07-26 19:42 | Implementation | Type Theory | 1 comment | other blogs | 49507 reads
"C Is Not a Low-level Language"David Chisnall, "C Is Not a Low-level Language. Your computer is not a fast PDP-11.", ACM Queue, Volume 16, issue 2.
Includes a discussion of various ways in which modern processors break the C abstract machine, as well as some interesting speculation on what a "non-C processor" might look like. The latter leads to thinking about what a low-level language for such a processor should look like.

By Allan McInnes at 2018-07-04 03:09 | History | Implementation | Parallel/Distributed | 12 comments | other blogs | 39890 reads
Compiling a Subset of APL Into a Typed Intermediate Language
by Martin Elsman, Martin Dybdal

Traditionally, APL is an interpreted language ... In this paper, we present a compiler that compiles a subset of APL into a typed intermediate representation, which should serve as a practical and well-defined intermediate format for targeting parallel architectures through a large number of existing tools and frameworks. The intermediate language is conceptually close to the language Repa. It supports shape-polymorphic functions and types that classify shapes. The compiler takes a simplified approach to certain aspects of APL. Following other APL compilation approaches, the compiler is based on lexical (i.e., static) identifier scoping and has no support for dynamic compilation (APL execute).

The terseness of APL is legendary, for good or bad. I keep finding more and more papers by the Haskell community (and especially GHC contributors) working on efficient (parallel) arrays in Haskell.
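The paper's typed intermediate language isn't shown here, but since the abstract compares it to Repa, here is a minimal Repa sketch of what "shape-polymorphic functions" buys you: one function that works over arrays of any rank. This illustrates the Repa library only, not the paper's intermediate representation.

```haskell
{-# LANGUAGE FlexibleContexts #-}
import Data.Array.Repa as R

-- Shape-polymorphic: works for vectors, matrices, or higher-rank arrays.
sumOfSquares :: (Shape sh, Source r Double) => Array r sh Double -> Double
sumOfSquares = sumAllS . R.map (^ (2 :: Int))

main :: IO ()
main = do
  let vec = fromListUnboxed (Z :. (3 :: Int))               [1, 2, 3]          -- rank 1
      mat = fromListUnboxed (Z :. (2 :: Int) :. (3 :: Int)) [1, 2, 3, 4, 5, 6] -- rank 2
  print (sumOfSquares vec)  -- 14.0
  print (sumOfSquares mat)  -- 91.0
```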
Exploiting Vector Instructions with Generalized Stream Fusion
By Geoffrey Mainland, Roman Leshchinskiy, and Simon Peyton Jones. A.k.a. "Haskell beats C".
Our ideas are implemented in modified versions of the GHC compiler and vector library. Benchmarks show that high-level Haskell code written using our compiler and libraries can produce code that is faster than both compiler- and hand-vectorized C.

This paper continues the promising line of research started in 1990 by Wadler (at least, that was how I learned of deforestation). Of course, there has been a lot of development since then, but this specific paper introduces the interesting idea of multiple representations, potentially changing the game. (A brief sketch of classic stream fusion, the starting point that the paper generalises, follows this post.)

By Andris Birkmanis at 2017-12-22 03:33 | Implementation | other blogs | 44354 reads
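For readers who haven't met stream fusion before, here is a minimal sketch of the classic formulation (Coutts, Leshchinskiy and Stewart) that the paper generalises; it is not the paper's "generalized" bundle representation.

```haskell
{-# LANGUAGE ExistentialQuantification #-}

-- A stream is a step function plus a hidden state: no intermediate lists.
data Step s a = Done | Skip s | Yield a s
data Stream a = forall s. Stream (s -> Step s a) s

streamList :: [a] -> Stream a
streamList xs0 = Stream next xs0
  where
    next []       = Done
    next (x : xs) = Yield x xs

unstream :: Stream a -> [a]
unstream (Stream next s0) = go s0
  where
    go s = case next s of
      Done       -> []
      Skip s'    -> go s'
      Yield x s' -> x : go s'

mapS :: (a -> b) -> Stream a -> Stream b
mapS f (Stream next s0) = Stream next' s0
  where
    next' s = case next s of
      Done       -> Done
      Skip s'    -> Skip s'
      Yield x s' -> Yield (f x) s'

sumS :: Num a => Stream a -> a
sumS (Stream next s0) = go 0 s0
  where
    go acc s = case next s of
      Done       -> acc
      Skip s'    -> go acc s'
      Yield x s' -> go (acc + x) s'
```

Together with GHC rewrite rules that cancel adjacent streamList/unstream pairs (elided here), a pipeline like sumS (mapS (*2) (streamList xs)) compiles to a single loop. Roughly speaking, the paper's generalisation replaces the single Stream representation with a bundle of representations so that consumers, including SIMD-friendly ones, can pick whichever compiles best.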
Implementing Algebraic Effects in C
Implementing Algebraic Effects in C, by Daan Leijen:
Another great paper by Daan Leijen, this time on a C library with immediate practical applications at Microsoft. The applicability is much wider, though, since it's an ordinary C library for defining and using arbitrary algebraic effects. It looks pretty usable and is faster and more general than most of the C coroutine libraries that already exist. It's a nice addition to your toolbox for creating language runtimes in C, particularly since it provides a unified, structured way of creating and handling a variety of sophisticated language behaviours, like async/await, in ordinary C with good performance. There has been considerable discussion here of C and low-level languages with green threads, coroutines and so on, so hopefully others will find this useful! (A tiny illustration of the effects-and-handlers programming model, in Haskell rather than C, follows this post.)

By naasking at 2017-07-27 13:50 | Effects | Implementation | Lambda Calculus | Semantics | other blogs | 35360 reads
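The library itself is C and its API isn't reproduced here. Purely to illustrate the effects-and-handlers programming model the post refers to, here is a minimal free-monad-style sketch in Haskell; the names are invented for the example and bear no relation to the C library's interface.

```haskell
{-# LANGUAGE GADTs #-}

-- A computation is either a pure result, or an operation plus a continuation.
data Eff op a where
  Pure :: a -> Eff op a
  Op   :: op x -> (x -> Eff op a) -> Eff op a

instance Functor (Eff op) where
  fmap f (Pure a) = Pure (f a)
  fmap f (Op o k) = Op o (fmap f . k)

instance Applicative (Eff op) where
  pure = Pure
  Pure f <*> x = fmap f x
  Op o k <*> x = Op o (\r -> k r <*> x)

instance Monad (Eff op) where
  Pure a >>= f = f a
  Op o k >>= f = Op o (\r -> k r >>= f)

-- One effect: ask the environment for an Int (a "reader"-style operation).
data Ask x where
  Ask :: Ask Int

ask :: Eff Ask Int
ask = Op Ask Pure

-- A handler gives the operation its meaning, here by supplying a constant.
runAsk :: Int -> Eff Ask a -> a
runAsk _ (Pure a)   = a
runAsk n (Op Ask k) = runAsk n (k n)

example :: Int
example = runAsk 20 $ do
  x <- ask
  y <- ask
  pure (x + y + 2)   -- 42
```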