
MuniHac 2020: Austin Huang - Hasktorch: Differentiable Functional Programming in Haskell

Title: Hasktorch: Differentiable Functional Programming in Haskell
Speaker: Austin Huang

Optimization over function composition is the unifying feature of machine learning with neural networks. Training a neural network relies on differentiable layers, where each layer implements a pure function. Higher-order functions such as differentiation, JIT optimization, distillation, and hyperparameter optimization are used in the process of building neural networks. Thus, neural networks can be considered "differentiable functional programming".
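To make this view concrete, here is a tiny self-contained sketch in plain Haskell (not the Hasktorch API): layers are pure functions, a network is their composition, and differentiation appears as a higher-order function, with a central finite difference standing in for real automatic differentiation.

-- Illustrative only; Hasktorch works on libtorch tensors with true autograd.

-- A "layer" is just a pure function.
type Layer = Double -> Double

dense :: Double -> Double -> Layer   -- weight, bias
dense w b x = w * x + b

relu :: Layer
relu = max 0

-- A network is ordinary function composition.
network :: Layer
network = dense 0.5 (-1) . relu . dense 2 1

-- Differentiation as a higher-order function (finite difference stand-in).
diff :: (Double -> Double) -> Double -> Double
diff f x = (f (x + h) - f (x - h)) / (2 * h)
  where h = 1e-6

main :: IO ()
main = do
  print (network 3.0)       -- forward pass: 2.5
  print (diff network 3.0)  -- derivative of the composed network at 3.0: 1.0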

Despite this, popular neural network frameworks today are implemented in imperative programming languages. The goal of Hasktorch is to advance the use of typed functional programming for machine learning. Hasktorch is a library for tensor math and differentiable programming in Haskell. It shares the C++ libtorch backend used by PyTorch and serves three primary objectives (a hedged usage sketch follows the list below):

1. Research into new functional programming methodology for developing and representing models, aiming to be more productive or to lead to algorithmic innovations.
2. Building machine learning systems that are more reliable, both in the model itself and in its integration with the software in which the model is embedded.
3. Disseminating new ideas from typed, pure functional programming for machine learning to other languages and machine learning ecosystems.
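
For orientation only, a minimal usage sketch of the untyped Hasktorch API might look roughly like the following; the module name Torch and the functions asTensor, makeIndependent, toDependent, grad, and sumAll are recalled from memory and may differ across hasktorch versions.

import Torch  -- umbrella module of the hasktorch package (assumed)

main :: IO ()
main = do
  -- a parameter tensor that participates in autograd
  w <- makeIndependent (asTensor ([3.0, -1.0] :: [Float]))
  let x = asTensor ([2.0, 4.0] :: [Float])
      -- the forward pass is a pure tensor expression
      y = sumAll (toDependent w * x)
      -- differentiation is a higher-order operation over that expression
      [gw] = grad y [w]
  print y   -- 3*2 + (-1)*4 = 2
  print gw  -- gradient of y with respect to w, i.e. x = [2.0, 4.0]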

Prerequisites
This is an intro talk, so relatively little background knowledge is assumed.

MuniHac 2020, September 12 / https://munihac.de/
TNG Technology Consulting GmbH / https://www.tngtech.com
