A problem about programming with macros vs Kernel F-exprs

I love Kernel F-exprs, which look better than ordinary macros in almost every way. However, I have trouble implementing the following using an F-expr:

    (defmacro f () (make-array 1))           ;; Common Lisp style macro
    (define g (lambda () (list (f) (f))))    ;; uses the macro at two call sites
The two occurrences of (f) each expand into the array created when that call site is macro-expanded, so every call to g returns the same pair of distinct arrays. I can't think of a way to replicate this behavior using Kernel F-exprs. Any ideas?

Google Brain's Jax and Flax

Google's AI division, Google Brain, has two main products for deep learning: TensorFlow and Jax. While TensorFlow is the better known of the two, Jax can be thought of as a higher-level language for specifying deep learning algorithms while automatically eliding code that doesn't need to run as part of the model. Jax evolved from Autograd, and is a combination of Autograd and XLA. Autograd "can automatically differentiate native Python and Numpy code. It can handle a large subset of Python's features, including loops, ifs, recursion and closures, and it can even take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments, as well as forward-mode differentiation, and the two can be composed arbitrarily. The main intended application of Autograd is gradient-based optimization."

Flax is then built on top of Jax, and allows for easier customization of existing models.

What do you see as the future of domain-specific languages for AI?

By Z-Bo at 2021-01-15 13:59 | Implementation | Python | Scientific Programming | Software Engineering
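To make the quoted Autograd description concrete, here is a minimal sketch of the same capabilities in Jax's public API: reverse-mode gradients, higher-order derivatives, composition with forward-mode differentiation, and XLA compilation via jit. The function f and the input value are invented for illustration.

    import jax
    import jax.numpy as jnp

    def f(x):
        # a scalar-valued function of a scalar argument
        return jnp.sin(x) * x ** 2

    df = jax.grad(f)         # reverse-mode (backpropagation): df/dx
    d2f = jax.grad(df)       # derivatives of derivatives...
    d3f = jax.grad(d2f)      # ...of derivatives
    dfwd = jax.jacfwd(df)    # forward-mode composed with reverse-mode

    fast_f = jax.jit(f)      # compile f through XLA

    x = 1.5
    print(df(x), d2f(x), d3f(x), dfwd(x), fast_f(x))

And a tiny Flax (flax.linen) module, to show the kind of model definition the post says Flax makes easy to customize; the MLP class, its layer sizes, and the dummy input are likewise made up for this sketch.

    import jax.numpy as jnp
    import flax.linen as nn
    from jax import random

    class MLP(nn.Module):
        hidden: int

        @nn.compact
        def __call__(self, x):
            x = nn.Dense(self.hidden)(x)   # parameters are created lazily by Flax
            x = nn.relu(x)
            return nn.Dense(1)(x)

    model = MLP(hidden=16)
    dummy = jnp.ones((1, 4))
    params = model.init(random.PRNGKey(0), dummy)   # initialize parameters
    y = model.apply(params, dummy)                   # run the model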