News

Jun 17, 2025 PAPER 📝 We found a peculiar structure in the loss landscape: slowly decreasing loss channels that converge to a minimum at infinite norm in parameter space. Our new paper describes how these channels emerge from saddle points induced by permutation symmetries. Functionally, these minima at infinity implement Gated Linear Units via combinations of standard artificial neurons. Check out the paper: Flat Channels to Infinity in Neural Loss Landscapes.
Jun 05, 2025 POSTER 🏞️ Presented a poster at Frontiers in NeuroAI, Boston, on optimization challenges in recovering connectivity from activity.
May 28, 2025 PAPER 📝 How degenerate is the solution space of identical RNNs trained on the same task? Check out our new paper on measuring and controlling degeneracy of RNNs. Fun collaboration with the Rajan Lab.
Mar 26, 2025 POSTER 🏞️ Presented a poster at Kempner Institute’s Spring Into Science meeting.
Mar 01, 2025 🌎 I am currently based in Boston for a 4-month visit to Kanaka Rajan’s lab. Reach out if you want to have a chat!
Sep 02, 2024 TALK 🎤 I gave a talk on Expand-and-Cluster for the EfficientML reading group – link to video.
Jul 21, 2024 TALK 🎤 POSTER 🏞️ In Vienna for ICML ‘24 and a visit to Tim Vogels’ lab at IST Austria.
May 20, 2024 POSTER 🏞️ In Trieste for the Youth in High Dimensions meeting.
May 02, 2024 PAPER 📝 Expand-and-Cluster is accepted to ICML 2024!

🔍 We identify the weights of a neural network from simple input-output queries.
⚠️ Symmetries and overparameterised loss landscapes are heavily involved.
Feb 08, 2024 TALK 🎤 Gave a talk and visited the laboratory of Jakob Macke at the Tübingen AI Center, Germany.
Jan 31, 2024 TALK 🎤 Gave a talk at the annual Swiss Computational Neuroscience meeting.
Sep 26, 2023 POSTER 🏞️ In Berlin for a poster at the Bernstein Conference.
Sep 01, 2023 TALK 🎤 Best presentation award at the NeuroLeman meeting 🎖️
Jun 06, 2023 TALK 🎤 Gave a talk and visited the groups of Angelika Steger and Joao Sacramento at ETH Zürich.