Reading and Discussion Club

This seminar is an opportunity to study mathematical topics in deep learning.

Topics may come from any area of mathematics, statistics, or computer science, and are treated at varying levels of depth.

To propose topics or express interest, write to mpetrache@mat.uc.cl

Format: 30-minute paper presentation, then 1 hour of discussion and clarification of questions. Bringing your laptop helps!

This seminar, running since 2023 with a new audience, continues the reading group held in 2020. That group was organized jointly with academics and researchers in Partial Differential Equations and Machine Learning at the Universidad de Chile, with the goal of learning together about research topics in Machine Learning for solving equations.

2024-05-08
17:00-18:30 (30 min presentation + 1 h discussion)
German Pizarro. CENIA
A New Approach for Self-Supervised Learning on Images -- https://arxiv.org/abs/2301.08243
Room 1, Mathematics Department
2024-04-24
17:00-18:30 (30 min presentation + 1 h discussion)
Sebastian Sanchez. UC Chile
Fourier Neural Operator for Parametric Partial Differential Equations -- https://arxiv.org/pdf/2010.08895.pdf
Room 1
2024-04-17
17:00-18:30 (30 min presentation + 1 h discussion)
Rafael Elberg. UC Chile
CLIP: How to Connect Images and Text Intelligently -- Base paper: https://arxiv.org/abs/2103.00020
Room 1
2024-04-10
17:00-18:30 (30 min presentation + 1 h discussion)
Nicolas Alvarado. UC Chile
Training Hyperbolic Neural Networks -- https://www.math.uci.edu/~jxin/hffnn_gd_camc_oct_2023.pdf
Room 1
2024-04-03
17:00-18:30 (30 min presentation + 1 h discussion)
Felipe Urrutia. Universidad de Chile
Use of Information Theory in Natural Language Processing -- https://tinyurl.com/23Vhc3Z9 -- https://arxiv.org/pdf/2308.12562.pdf
Room 1
2024-03-20
17:00 - 18:00hrs.
Mircea Petrache. UC Chile
Initial review: neural networks, training, basic architectures (CNN, GNN, RNN, Transformers, Diffusion)
Room 1
Abstract:
We recall / introduce neural networks and their architectures.

This session is especially useful for those who have not seen neural networks before.

The presentation will be very quick and is no substitute for individual study; more introductory materials are listed below.

Material:

Notes covering training, CNNs, and RNNs: https://arxiv.org/pdf/2304.05133.pdf

Somewhat more advanced/mathematical notes, without emphasis on particular architectures: https://arxiv.org/abs/2105.04026

Notes on GNNs (graph neural networks): http://web.stanford.edu/class/cs224w/slides/04-GNN2.pdf

Transformers (visual introduction): https://jalammar.github.io/illustrated-transformer/

Diffusion (introductory/survey paper): https://arxiv.org/pdf/2306.04542.pdf
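As a complement to the notes above, here is a minimal, self-contained sketch of the training loop those materials cover (forward pass, backpropagation, gradient descent), written from scratch for illustration. The problem (XOR), network size, and hyperparameters are arbitrary choices, not taken from any of the linked notes.

```python
import numpy as np

# Toy example: fit a one-hidden-layer network to XOR with plain gradient
# descent. Architecture and hyperparameters are illustrative choices.
rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.0], [1.0], [1.0], [0.0]])  # XOR labels

W1 = rng.normal(0, 1, (2, 16)); b1 = np.zeros(16)   # hidden layer (tanh)
W2 = rng.normal(0, 1, (16, 1)); b2 = np.zeros(1)    # output layer (sigmoid)
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass for the binary cross-entropy loss:
    # the gradient with respect to the output logits is (p - y).
    d_logit = (p - y) / len(X)
    dW2 = h.T @ d_logit; db2 = d_logit.sum(axis=0)
    d_h = (d_logit @ W2.T) * (1.0 - h ** 2)  # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ d_h; db1 = d_h.sum(axis=0)
    # Gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

pred = (sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int).ravel()
print(pred)
```

The same loop, with minibatches and automatic differentiation in place of the hand-written gradients, is what frameworks such as PyTorch automate.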

2024-03-14
17:30-18:30hrs.
Mircea Petrache. UC Chile
We fix the format and propose topics to cover during the semester
Room 2
Abstract:
We will consider topics that combine mathematical tools with deep-learning applications and ideas.

A first idea is that
-- each session has a 30-minute presentation of a paper or topic,
-- followed by up to 1 hour of discussion, which may also cover complementary material and go into the details of the paper.
-- For each session it helps if at least 2-3 people have read the paper and are familiar with it.

In this first session, we will discuss the following:
-- whether the above is a good format,
-- which topics we are interested in covering (i.e., which topics to look for papers on),
-- and those interested in mathematics + deep learning will introduce themselves.
2023-11-27
16:45 - 17:20hrs.
Valentina Navarro and Daniela Oviedo. UC Chile
Introduction to Deep Learning for Genomes
Room 5
Abstract:
Final presentation associated with the course MAT2320 "Introduction to Mathematical Methods for Deep Learning"
2021-01-15
15:00hrs.
Pavlos Protopapas. Harvard University
TBA
https://uchile.zoom.us/j/82959938410?pwd=V0lCM3VRT3A4TmRqQ2V3cWhCRjliUT09
2020-11-13
15:00hrs.
Javier Castro. U. de Chile (DIM)
Deep backward schemes for high-dimensional nonlinear PDEs
https://uchile.zoom.us/j/86346682467?pwd=NlJkdGc0TTQzQ2lqNVV3NmM0bjMyQT09
Abstract:
The paper "Deep backward schemes for high-dimensional nonlinear PDEs" will be discussed:
https://arxiv.org/abs/1902.01599
2020-11-06
15:00hrs.
Jocelyn Dunstan. U. de Chile
Physics-Informed Deep Learning
Zoom: https://uchile.zoom.us/j/86346682467?pwd=NlJkdGc0TTQzQ2lqNVV3NmM0bjMyQT09 Password: request from M. Petrache
Abstract:
The relevant articles will be discussed.
2020-10-30
15:00hrs.
Gonzalo Mena. Oxford University
Neural Ordinary Differential Equations
https://uchile.zoom.us/j/89083121892?pwd=cHovbUhRalBHelpqckpyU3NXZnQwQT09 (request the password from Mircea Petrache)
Abstract:
The following papers are discussed:

https://arxiv.org/abs/1806.07366
https://arxiv.org/abs/2002.08071
2020-10-23
15:00hrs.
Mircea Petrache. PUC
Deep Neural Networks from the Renormalization Group Point of View
Zoom: https://uchile.zoom.us/j/86346682467?pwd=NlJkdGc0TTQzQ2lqNVV3NmM0bjMyQT09 Password: request from M. Petrache
2020-10-16
15:00hrs.
Joaquin Fontbona. U. de Chile
Mean-Field Interpretation of Deep Learning Algorithms (1)
Zoom: https://uchile.zoom.us/j/86346682467?pwd=NlJkdGc0TTQzQ2lqNVV3NmM0bjMyQT09 Password: request from M. Petrache
2020-10-01
12:00hrs.
Nicolas Valenzuela. U. de Chile
On "Solving high-dimensional partial differential equations using deep learning", by Jiequn Han, Arnulf Jentzen, and Weinan E (part 2)
https://uchile.zoom.us/j/89083121892?pwd=cHovbUhRalBHelpqckpyU3NXZnQwQT09 (request the password from Mircea Petrache)
Abstract:
"Developing algorithms for solving high-dimensional partial differential equations (PDEs) has been an exceedingly difficult task for a long time, due to the notoriously difficult problem known as the “curse of dimensionality.” This paper introduces a deep learning-based approach that can handle general high-dimensional parabolic PDEs. To this end, the PDEs are reformulated using backward stochastic differential equations and the gradient of the unknown solution is approximated by neural networks, very much in the spirit of deep reinforcement learning with the gradient acting as the policy function. Numerical results on examples including the nonlinear Black–Scholes equation, the Hamilton–Jacobi–Bellman equation, and the Allen–Cahn equation suggest that the proposed algorithm is quite effective in high dimensions, in terms of both accuracy and cost. This opens up possibilities in economics, finance, operational research, and physics, by considering all participating agents, assets, resources, or particles together at the same time, instead of making ad hoc assumptions on their interrelationships."
2020-09-24
12:30hrs.
Claudio Munoz. U. de Chile
On "Solving High-Dimensional Partial Differential Equations Using Deep Learning", by Jiequn Han, Arnulf Jentzen, and Weinan E (part 1)
https://uchile.zoom.us/j/89083121892?pwd=cHovbUhRalBHelpqckpyU3NXZnQwQT09 (request the password from Mircea Petrache)
Abstract:
"Developing algorithms for solving high-dimensional partial differential equations (PDEs) has been an exceedingly difficult task for a long time, due to the notoriously difficult problem known as the “curse of dimensionality.” This paper introduces a deep learning-based approach that can handle general high-dimensional parabolic PDEs. To this end, the PDEs are reformulated using backward stochastic differential equations and the gradient of the unknown solution is approximated by neural networks, very much in the spirit of deep reinforcement learning with the gradient acting as the policy function. Numerical results on examples including the nonlinear Black–Scholes equation, the Hamilton–Jacobi–Bellman equation, and the Allen–Cahn equation suggest that the proposed algorithm is quite effective in high dimensions, in terms of both accuracy and cost. This opens up possibilities in economics, finance, operational research, and physics, by considering all participating agents, assets, resources, or particles together at the same time, instead of making ad hoc assumptions on their interrelationships."
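The BSDE reformulation described in the abstract above can be illustrated in a drastically simplified form. The sketch below replaces the paper's per-timestep neural networks with linear features in X_t and the stochastic-gradient training with a direct least-squares solve (both simplifications are mine, chosen for brevity, and are not how the paper proceeds). The toy problem, a 1-D heat equation u_t + 0.5 u_xx = 0 with terminal condition g(x) = x^2, has the known solution u(0, x0) = x0^2 + T, which the recovered intercept Y_0 should approximate.

```python
import numpy as np

# Simplified sketch of the deep-BSDE idea: write the terminal value as
#   g(X_T) = Y_0 + sum_k Z_k * dW_k        (driver f = 0 for the heat equation)
# and fit Y_0 together with Z_k(x) ~ a_k + b_k * x by least squares.
rng = np.random.default_rng(0)
T, n_steps, n_paths, x0 = 1.0, 20, 4000, 0.0
dt = T / n_steps

# Simulate the forward process X_t = x0 + W_t on a time grid.
dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
X = x0 + np.cumsum(dW, axis=1)
X_prev = np.hstack([np.full((n_paths, 1), x0), X[:, :-1]])  # X at step start

# Features: an intercept (playing the role of Y_0 = u(0, x0)) plus, per step,
# dW_k and X_k * dW_k, i.e. a linear-in-x stand-in for the network Z_k(x).
features = np.hstack([np.ones((n_paths, 1)), dW, X_prev * dW])
target = X[:, -1] ** 2  # terminal condition g(X_T) = X_T^2

coef, *_ = np.linalg.lstsq(features, target, rcond=None)
y0 = coef[0]  # estimate of u(0, x0); the exact value is x0^2 + T
print(f"estimated u(0,0) = {y0:.3f}  (exact: {x0**2 + T})")
```

Because the true gradient here, u_x(t, x) = 2x, is exactly linear, the linear features suffice; for genuinely nonlinear PDEs this is where the paper's neural networks and SGD training become necessary.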