
The impact of differentiable programming: how ∂P is enabling new science in Julia

Fully incorporating differentiable programming (∂P) into the Julia language has enabled composability between modern machine learning techniques and existing high performance computing (HPC) modeling and simulation without sacrificing expressivity. Most notably, this has meant that small neural networks can be embedded within larger models whose other behaviors are fully understood and can be concretely represented. Smaller neural networks, in turn, are easier to train and interpret. It has also enabled complex computations to be embedded within cost functions for fast and robust reinforcement learning. In this talk, we’ll walk through several concrete examples and demonstrate how the combination of ∂P with Julia’s generic programming has enabled powerful and expressive new models.
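The core idea the abstract describes — taking gradients through an entire hand-written simulation, including a small neural component embedded inside it — can be sketched with forward-mode dual numbers. This is an illustrative toy in Python, not the Julia/Zygote machinery the talk actually covers; the spring model, the one-neuron "network", and all parameter values are invented for the example.

```python
import math

class Dual:
    """Forward-mode AD value: carries f(x) and f'(x) together.
    This is the simplest mechanism behind differentiable programming."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule: (fg)' = f'g + fg'
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__
    def __neg__(self):
        return Dual(-self.val, -self.der)

def tanh(x):
    t = math.tanh(x.val)
    return Dual(t, (1.0 - t * t) * x.der)  # chain rule through tanh

def neural_correction(x):
    # A one-neuron "network" embedded in the physics model,
    # standing in for the small neural networks the abstract mentions.
    # Weights 0.3 and -0.2 are arbitrary illustrative constants.
    return Dual(-0.2) * tanh(Dual(0.3) * x)

def simulate(k, steps=10, dt=0.1):
    """Explicit-Euler spring simulation with an embedded neural term.
    Because every operation is overloaded on Dual, the derivative of
    the final position with respect to k falls out automatically."""
    pos, vel = Dual(1.0), Dual(0.0)
    for _ in range(steps):
        acc = -k * pos + neural_correction(pos)
        vel = vel + dt * acc
        pos = pos + dt * vel
    return pos

# Seed der=1.0 on k to get d(final position)/dk through the whole loop.
g = simulate(Dual(2.0, 1.0))
print("final position:", g.val, " d/dk:", g.der)
```

The same seeding trick differentiates the loss of a controller or cost function with respect to any model parameter, which is what makes gradient-based training of models containing both physics and neural components possible.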

Presented by Matt Bauman

Video "The impact of differentiable programming: how ∂P is enabling new science in Julia" from the ACM SIGPLAN channel
Video information
November 23, 2020, 21:07:21
Duration: 01:09:58