Seminars


Optimal Search of Neural Networks in the Context of Supervised Learning:

In this talk, Mateo presents his thesis work, co-advised by Adolfo, in which he proposes a principled approach to selecting neural network architectures by optimizing over the structure itself, not just the weights. The project introduces a rigorous framework that models the space of possible architectures as a graph, where each node is a family of networks. Leveraging the Metropolis-Hastings algorithm and concepts from statistical learning theory such as structural risk minimization, Mateo shows how to traverse this space intelligently, balancing predictive accuracy against model complexity. He develops theoretical guarantees of universal consistency in both classification and regression settings, and presents synthetic experiments in which the method outperforms predefined architectures, albeit at increased computational cost. This is a rich proposal at the intersection of statistics, theoretical computer science, and machine learning, ideal for those interested in the mathematical foundations of designing robust and efficient neural networks.
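The search strategy described above can be illustrated with a minimal sketch. This is not the thesis's actual implementation: the architecture encoding (a tuple of hidden-layer widths), the neighborhood moves, and the toy `score` function standing in for "validation risk plus an SRM-style complexity penalty" are all assumptions made for illustration. A real run would train each candidate network and measure held-out error.

```python
import math
import random

def complexity(arch):
    # SRM-style penalty proxy: total parameter count of the network,
    # assuming input dimension 2 and output dimension 1 (illustrative).
    sizes = [2] + list(arch) + [1]
    return sum(a * b for a, b in zip(sizes, sizes[1:]))

def score(arch):
    # Toy stand-in for penalized empirical risk. Its optimum is a
    # single hidden layer of width 8; a real implementation would
    # replace `fit` with a trained network's validation error.
    fit = sum((w - 8) ** 2 for w in arch) + 5 * abs(len(arch) - 1)
    return fit + 0.01 * complexity(arch)

def neighbors(arch):
    # Edges of the architecture graph: widen/narrow one layer,
    # append a new layer, or drop the last layer.
    moves = []
    for i, w in enumerate(arch):
        if w > 1:
            moves.append(arch[:i] + (w - 1,) + arch[i + 1:])
        moves.append(arch[:i] + (w + 1,) + arch[i + 1:])
    moves.append(arch + (4,))
    if len(arch) > 1:
        moves.append(arch[:-1])
    return moves

def metropolis_search(start=(4, 4), steps=2000, temp=1.0, seed=0):
    # Metropolis-style walk over the graph: always accept improvements,
    # accept worsenings with probability exp(-delta / temp). (A full
    # Metropolis-Hastings scheme would also correct for the asymmetric
    # proposal distribution; that correction is omitted here.)
    rng = random.Random(seed)
    current, best = start, start
    for _ in range(steps):
        proposal = rng.choice(neighbors(current))
        delta = score(proposal) - score(current)
        if delta <= 0 or rng.random() < math.exp(-delta / temp):
            current = proposal
        if score(current) < score(best):
            best = current
    return best

print(metropolis_search())
```

The walk trades exploration (accepting worse architectures at positive temperature) against exploitation, mirroring the accuracy-versus-complexity balance described in the abstract.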

Details:

Speaker:

Mateo Alejandro Rodríguez Ramírez

Date:

May 15, 2025

Video:

Optimal Search of Neural Networks in the Context of Supervised Learning

YouTube – Quantil Matemáticas Aplicadas

Attachments

Not available
