Research and development - Seminars
In this talk, Mateo presents his thesis work, co-advised by Adolfo, in which he proposes a principled approach to selecting neural network architectures by optimizing over the structure itself, not just the weights. The project introduces a rigorous framework that models the space of possible architectures as a graph, where each node is a family of networks. Leveraging the Metropolis-Hastings algorithm and concepts from statistical learning theory such as structural risk minimization, Mateo shows how to traverse this space intelligently, balancing predictive accuracy against model complexity. He develops theoretical guarantees of universal consistency in both classification and regression settings, and presents synthetic experiments in which the method outperforms predefined architectures, albeit at increased computational cost. This is a rich proposal at the intersection of statistics, theoretical computer science, and machine learning, ideal for those interested in the mathematical foundations of designing robust and efficient neural networks.
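The core idea described above, a Metropolis-Hastings walk over a graph of architecture families scored by accuracy minus a complexity penalty, can be sketched in a few lines. This is a minimal illustration, not the talk's actual method: the grid of `(depth, width)` families, the synthetic `penalized_score`, and all parameter values are invented for the example, and the complexity term is only a stand-in for a structural-risk-minimization penalty.

```python
import math
import random

def metropolis_hastings_search(graph, score, start, steps=1000, temperature=0.1, seed=0):
    """Walk over a graph of architecture families, keeping the best node seen.

    graph: dict mapping each node to its list of neighbouring nodes.
    score: callable returning penalized fitness (higher is better), e.g.
           validation accuracy minus a complexity penalty (SRM-style).
    """
    rng = random.Random(seed)
    current = start
    best, best_score = current, score(current)
    for _ in range(steps):
        proposal = rng.choice(graph[current])
        delta = score(proposal) - score(current)
        # Hastings correction for asymmetric neighbourhood sizes:
        # q(current|proposal)/q(proposal|current) = |N(current)|/|N(proposal)|.
        ratio = math.exp(delta / temperature) * len(graph[current]) / len(graph[proposal])
        if rng.random() < min(1.0, ratio):
            current = proposal
            if score(current) > best_score:
                best, best_score = current, score(current)
    return best, best_score

def make_grid(max_depth=6, max_width=6):
    """Toy architecture space: nodes are (depth, width) families on a grid."""
    g = {}
    for d in range(1, max_depth + 1):
        for w in range(1, max_width + 1):
            nbrs = []
            for dd, dw in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nd, nw = d + dd, w + dw
                if 1 <= nd <= max_depth and 1 <= nw <= max_width:
                    nbrs.append((nd, nw))
            g[(d, w)] = nbrs
    return g

def penalized_score(node):
    """Synthetic stand-in: accuracy that peaks at depth 3, minus a size penalty."""
    depth, width = node
    accuracy = 1.0 - 0.05 * (depth - 3) ** 2
    complexity_penalty = 0.01 * depth * width
    return accuracy + 0.02 * min(width, 4) - complexity_penalty

graph = make_grid()
best, best_score = metropolis_hastings_search(graph, penalized_score, start=(1, 1))
print(best, round(best_score, 3))
```

The acceptance rule always takes improving moves and accepts worsening ones with Boltzmann probability, so the walk can escape locally attractive but overly complex families; tracking the best node visited plays the role of the final model-selection step.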
YouTube – Quantil Matemáticas Aplicadas