

About Me


I am currently an Assistant Professor in the Department of Statistics at the University of Wisconsin–Madison. From 2015 to 2018, I was a Prager Assistant Professor (a postdoctoral position) at Brown University. I finished my Ph.D. in mathematics at Carnegie Mellon University in 2015, and I received my Bachelor's degree in mathematics from Universidad de Los Andes in Bogotá, Colombia, in 2010.



My academic interests lie at the intersection of applied analysis, applied probability, statistics, and machine learning. 


When studying a given data analysis methodology, my approach is to first seek a well-posed continuum analogue that can serve as an idealized, population-level counterpart. The population-level methodologies that I typically study take the form of variational or geometric problems in continuum non-parametric settings. Studying these idealized methodologies requires a combination of tools from PDE theory, geometric measure theory, and ODEs in spaces of probability measures endowed with optimal transport distances. Through rigorous analysis, a goal of my research is to connect finite-data approaches with their idealized continuum counterparts, with the intention of translating properties of the continuum setting to the discrete one. I have taken this general perspective to provide insights into graph-based methodologies for learning, such as spectral clustering, graph-cut clustering, and supervised learning approaches that rely on the solution of graph differential equations. My work has allowed my collaborators and me to establish strong links between learning problems and geometric problems studied in the mathematics literature, as well as to rigorously motivate the choice of algorithms used for optimization and uncertainty quantification in learning problems (for example, taking a Bayesian perspective and using MCMC algorithms).
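To make the graph-based setting concrete, here is a minimal sketch of two-way spectral clustering on a point cloud: build a Gaussian similarity graph on the samples, form the graph Laplacian, and split the data by the sign of the Fiedler vector. The data, bandwidth, and all parameter values below are illustrative choices, not taken from any specific paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two well-separated clusters of sample points in the plane
# (an illustrative toy data set).
n = 60
X = np.vstack([rng.normal([0.0, 0.0], 0.3, size=(n, 2)),
               rng.normal([5.0, 0.0], 0.3, size=(n, 2))])

# Gaussian similarity graph on the data; the bandwidth eps is a
# tuning choice.
eps = 1.0
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-D2 / (2 * eps ** 2))
np.fill_diagonal(W, 0.0)

# Unnormalized graph Laplacian L = D - W.
L = np.diag(W.sum(axis=1)) - W

# Fiedler vector: eigenvector of the second-smallest eigenvalue of L
# (numpy's eigh returns eigenvalues in ascending order).
vals, vecs = np.linalg.eigh(L)
fiedler = vecs[:, 1]

# Two-way spectral clustering: split by the sign of the Fiedler vector.
labels = (fiedler > 0).astype(int)
```

The continuum analogue of this discrete procedure involves eigenfunctions of a weighted Laplace operator, which is the kind of connection between finite-data algorithms and population-level problems described above.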


More recently, I have been interested in exploring the connection between adversarial learning and regularization. I have explored this in the setting of neural networks, where one can draw connections with optimal control theory, and in non-parametric classification, where one can draw connections with geometric problems involving perimeter functionals. In the non-parametric setting, for example, one can formulate an adversarial problem not as one where a fixed value of the robustness parameter is chosen, but rather as one where an entire ensemble of problems is analyzed simultaneously. When the robustness parameter is turned off, the solution to the adversarial problem reduces to the classical Bayes classifier, and the idea is to study the dynamics that the associated decision boundary must obey in order to produce the solutions to all indexed adversarial problems as the robustness parameter (now interpreted as time) increases. These dynamics can naturally be described as geometric evolution equations.
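Schematically, with all symbols below serving as illustrative notation rather than the precise setup of any particular paper, the ensemble of adversarial problems can be written as a family of set optimization problems indexed by a robustness budget:

```latex
% Family of adversarial classification problems indexed by the
% budget \varepsilon; an adversary may perturb x within a ball of
% radius \varepsilon before the classifier A is applied.
A_\varepsilon \in \operatorname*{arg\,min}_{A \subseteq \mathbb{R}^d}
  \; \mathbb{P}\big( \exists\, \tilde{x} \in B_\varepsilon(x) \, : \,
  \mathbf{1}_{A}(\tilde{x}) \neq y \big),
\qquad A_0 = \text{a Bayes classifier}.

% Heuristic small-\varepsilon picture linking robustness to
% regularization: adversarial risk \approx Bayes risk plus a
% (weighted) perimeter penalty on the decision boundary.
R_\varepsilon(A) \;\approx\; R_0(A) + \varepsilon \, C \, \mathrm{Per}(A)
```

Tracking how the boundary of $A_\varepsilon$ moves as $\varepsilon$ grows, with $\varepsilon$ playing the role of time, is what gives rise to the geometric evolution equations mentioned above.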

Other ongoing work investigates analytical frameworks for improving methodologies used to fuse, prune, or search for neural networks. For the task of neural architecture search, my collaborators and I have completed preliminary work introducing an analytical framework that lets us propose gradient-based algorithms for exploring a space of network architectures.

In the past I have used Bayesian methods for inference and MCMC computation in the context of uncertainty quantification for inverse problems arising in physics and engineering. In such problems, the main computational challenge is the cost of evaluating an expensive forward map. This challenge is fundamental to the Bayesian inversion of complex models, since vanilla MCMC sampling methods require repeated evaluation of the forward map.
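As a toy illustration of why the forward map dominates the cost, here is a minimal random-walk Metropolis sampler for a Bayesian inverse problem. The forward map below is a cheap stand-in for what would, in practice, be an expensive PDE solve, and every parameter (noise level, prior, step size, chain length) is an illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy forward map G(u): in a real inverse problem this would be an
# expensive simulation (e.g. a PDE solve); here it is a cheap stand-in.
def forward_map(u):
    return np.array([u ** 2, np.sin(u)])

# Synthetic data: observe G at a "true" parameter, plus Gaussian noise.
u_true = 1.2
sigma = 0.1
y = forward_map(u_true) + sigma * rng.normal(size=2)

def log_posterior(u):
    # Gaussian likelihood around G(u) plus a standard normal prior on u.
    misfit = y - forward_map(u)   # one forward-map evaluation per call
    return -0.5 * misfit @ misfit / sigma ** 2 - 0.5 * u ** 2

# Random-walk Metropolis: every iteration costs one forward-map call.
u, lp = 0.0, log_posterior(0.0)
samples = []
for _ in range(5000):
    u_prop = u + 0.25 * rng.normal()
    lp_prop = log_posterior(u_prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        u, lp = u_prop, lp_prop
    samples.append(u)

# Discard burn-in before summarizing the chain.
posterior_mean = np.mean(samples[1000:])
```

Each iteration of the loop calls forward_map exactly once inside log_posterior, so the wall-clock cost of the chain scales linearly with the cost of a single forward solve, which is exactly the bottleneck described above.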
