My Penn Grad Talks presentation on "How To Know What You Don't Know" is now on YouTube!
You can check it out here: www.youtube.com/watch?v=JhRb...
@leonardofg
Applied Mathematics and Computational Science PhD candidate @ UPenn, advised by Prof. Paris Perdikaris. I research applications of deep learning to science and engineering, with a focus on Uncertainty Quantification, Operator Learning and PINNs!
Now that I have that presentation ready, I might do a quick recording of it to post online later. Be on the lookout for that if the paper piqued your interest 🙂
(6/6)
2) Proposing NEON, a neural network architecture for operator learning with built-in uncertainty quantification via Epinets. Compared to GPs and deep ensembles, NEON achieved the best performance, at times requiring ~40x fewer trainable parameters than DeepONet ensembles.
(5/6)
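For readers unfamiliar with Epinets, the core idea behind this kind of built-in uncertainty can be sketched in a few lines: a base network's prediction is adjusted by a small auxiliary network conditioned on a random "epistemic index" z, and the spread of predictions across draws of z serves as the uncertainty estimate. This is only a toy NumPy sketch of that idea (random linear maps stand in for trained networks; all names here are illustrative, not NEON's actual architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for trained weights (illustrative only).
W_base = rng.normal(size=(4, 1))
W_epi = rng.normal(size=(4, 1))

def base_net(x):
    # The base prediction, shared across all epistemic indices.
    return x @ W_base

def epinet(features, z):
    # Small auxiliary term conditioned on the epistemic index z;
    # its variation across z encodes epistemic uncertainty.
    return (features @ W_epi) * z

x = rng.normal(size=(1, 4))          # one input point
zs = rng.normal(size=(100,))         # draws of the epistemic index

# One forward pass per index draw; the ensemble-like spread comes
# from a single base network plus a cheap additive head.
preds = np.array([(base_net(x) + epinet(x, z)).item() for z in zs])
print(preds.mean())  # mean prediction
print(preds.std())   # epistemic spread (nonzero across index draws)
```

The appeal over deep ensembles is that the index-conditioned head is tiny compared to training many independent copies of the base model.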
1) Formulating the Leaky Expected Improvement (L-EI) acquisition function for Bayesian optimization, which is provably similar to traditional EI, but significantly easier to optimize via gradient-based methods.
(4/6)
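The paper gives the precise definition of L-EI; as a hedged illustration of why a "leaky" acquisition helps gradient-based optimization, here is a toy Monte-Carlo EI with a leaky-ReLU-style relaxation (the `mc_ei` helper and `alpha` slope are my own illustrative names, not necessarily the paper's formulation):

```python
import numpy as np

def mc_ei(samples, best, alpha=0.0):
    """Monte-Carlo Expected Improvement (minimization convention).

    samples: posterior draws of the objective at a candidate point
    best:    incumbent best observed value
    alpha:   leak slope; alpha=0 recovers plain MC-EI, alpha>0 gives
             a leaky variant that keeps a nonzero slope even when no
             draw improves on the incumbent.
    """
    imp = best - samples                          # improvement per draw
    leaky = np.where(imp > 0, imp, alpha * imp)   # leaky-ReLU relaxation
    return leaky.mean()

# When no draw improves on the incumbent, plain EI is exactly zero,
# a flat region where gradient-based optimizers stall; the leaky
# variant stays slightly negative and still carries slope information.
samples = np.array([1.0, 2.0, 3.0])  # all worse than the incumbent
best = 0.5
print(mc_ei(samples, best))              # 0.0: flat, no gradient signal
print(mc_ei(samples, best, alpha=0.01))  # small negative value: signal survives
```

This mirrors the stated motivation: a provably similar surrogate for EI whose landscape is friendlier to gradient-based optimizers.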
I also gave a talk on my recent paper “Composite Bayesian Optimization in function spaces using NEON—Neural Epistemic Operator Networks”. You can read the full paper here: www.nature.com/articles/s41...
The paper’s main contributions are:
(3/6)
We had 8 wonderful talks from researchers from both industry and academia. Topics included Bayesian Optimization, data-driven solutions to PDEs, climate forecasting, ecological modeling, and more! Thanks to all the speakers for accepting the invitation and traveling to present their work!
(2/6)
Had a great time attending the Computational Science & Engineering conference organized by #SIAM this week in Fort Worth, TX! 🤠
Earlier this week I organized a two session mini-symposium on “Uncertainty Quantification for Scientific Machine Learning” (#UQ4SciML for short). #SciML #AI4Science
(1/6)
You can watch the recorded livestream, including all the talks, here (mine starts around the 3:39:00 mark):
www.youtube.com/live/dya_05f...
(4/4)
#MachineLearning #UncertaintyQuantification #ScienceCommunication #SciML #AI4Science #TED #PhDStudent
Making technical math research accessible to a broad audience wasn’t easy, but it was incredibly rewarding. Huge thanks to everyone who attended and to my fellow speakers for their great talks!
(3/4)
In my talk, "How to Know What You Don't Know", I spoke about Uncertainty Quantification—how knowing what we don’t know makes machine learning models more trustworthy and less overconfident, especially in science and engineering.
(2/4)
Ever since I was a tween, I’ve loved watching TED talks. So getting to give my own TED-style talk at this year's Penn Grad Talks (a competition of 8-minute talks by UPenn grad students) was a thrill! Not only that, but I was chosen as the Audience Choice winner out of 20 incredible talks!!🏆 🥳
(1/4)
There'll be a panel of judges, but there's also an "Audience Choice" prize, which you'll be able to vote for during the talks (in-person or virtual), so I'd appreciate any help on that end 👀👀
(3/3)
The title of my talk is "How To Know What You Don't Know", and I'll be talking about Uncertainty Quantification (UQ) and how important it is in the current era of AI models being deployed in the real world.
(2/3)
Happy to share that this Friday at 3pm I'll be speaking as a finalist of the Penn Grad Talks competition! 🥳🥳
It is a competition of 8-minute TED-style talks by Penn grad students. You can attend it in-person at the Penn Museum, or virtually via the livestream: www.youtube.com/watch?v=dya_...
(1/3)
I recently gave a talk on the latter of these papers (ActNet: Deep Learning Alternatives of the Kolmogorov Superposition Theorem), which you can check out here:
youtu.be/pQKayyvNo5E
I'll be giving a talk on the NEON paper soon at the SIAM CSE conference. Be on the lookout for a video on it soon!
To learn more about my recent research, feel free to check out my two most recent papers!
- Neural Epistemic Operator Networks (NEON): www.nature.com/articles/s41...
- ActNet: Deep Learning Alternatives of the Kolmogorov Superposition Theorem (ICLR 2025 Spotlight Paper): arxiv.org/abs/2410.01990
- Within SciML, my main focus is on a) Uncertainty Quantification (UQ), i.e., models that know what they don't know; and b) building accurate and efficient surrogate models via Operator Learning and Physics-Informed Neural Networks (PINNs), i.e., models that integrate data & scientific knowledge.
- I'm a PhD candidate in Applied Math & Computational Science at the University of Pennsylvania.
- I am originally from São Paulo, Brazil! 🇧🇷
- I work on developing deep learning methods for science and engineering applications, an area often called Scientific Machine Learning (SciML), or AI4Science.
Hello all! Just created this account to share some of my work and connect with other researchers! In the thread below, I'll share a few things about me, but first, here are some links!
- Google Scholar: scholar.google.com/citations?us...
- LinkedIn: www.linkedin.com/in/leonardo-...