News
| Dec 09, 2025 | PAPER Our work on in-memory computing with memristive devices is out in Nature Machine Intelligence! We push the boundaries of what's computable in-memory, delegating inference, error computation, and updates to the hardware to minimize slow and inefficient cross-talk with a generic CPU. Have a look at Actor-critic networks with analogue memristors mimicking reward-based learning. |
|---|---|
| Dec 01, 2025 | POSTER In San Diego for NeurIPS '25, presenting our work on Flat Channels to Infinity, Degeneracy in RNNs, and Augmentation Techniques for Expand-and-Cluster. |
| Nov 25, 2025 | PAPER Our extension of Expand-and-Cluster to overparameterized networks has been accepted at the NeurIPS UniReps workshop. We develop data augmentation techniques specifically designed to extract informative signals from black-box networks. Check out Data Augmentation Techniques to Reverse-Engineer Neural Network Weights from Input-Output Queries. |
| Oct 01, 2025 | POSTER In Frankfurt for the Bernstein conference, presenting two posters on RNN solution degeneracy and on toy models of identifiability for neuroscience. |
| Sep 15, 2025 | PAPER Measuring and Controlling Solution Degeneracy across Task-Trained Recurrent Neural Networks has been accepted to NeurIPS as a spotlight! |
| Sep 15, 2025 | PAPER Flat Channels to Infinity in Neural Loss Landscapes has been accepted to NeurIPS as a poster! |
| Sep 03, 2025 | TALK Gave a talk about the channels to infinity in loss landscapes on the Ploutos platform – link to video |
| Aug 03, 2025 | Currently in Woods Hole, MA (US) for the MIT Brains, Minds and Machines summer school. |
| Jun 17, 2025 | PAPER We discover channels of slowly decreasing loss in network loss landscapes that lead to minima at infinite parameter norm. In the limit, these solutions implement Gated Linear Units using standard neurons. These channels are parallel to lines of saddle points generated by permutation symmetries. Read the paper: Flat Channels to Infinity in Neural Loss Landscapes. |
| Jun 05, 2025 | POSTER Presented a poster at Frontiers in NeuroAI, Boston, on the optimization challenges of recovering connectivity from activity. |
| May 28, 2025 | PAPER How degenerate is the solution space of identical RNNs trained on the same task? Check out our new paper on measuring and controlling the degeneracy of RNNs. Fun collaboration with the Rajan Lab. |
| Mar 26, 2025 | POSTER Presented a poster at the Kempner Institute's Spring Into Science meeting. |
| Mar 01, 2025 | I am currently based in Boston for a 4-month visit to Kanaka Rajan's lab. Reach out if you want to have a chat! |
| Sep 02, 2024 | TALK I gave a talk on Expand-and-Cluster for the EfficientML reading group – link to video. |
| Jul 21, 2024 | TALK POSTER In Vienna for ICML '24 and a visit to Tim Vogels' lab at IST Austria. |
| May 20, 2024 | POSTER In Trieste for the Youth in High Dimensions meeting. |
| May 02, 2024 | PAPER Expand-and-Cluster has been accepted to ICML 2024! We identify the weights of a neural network from simple input-output queries. Symmetries and overparameterised loss landscapes are heavily involved. |
| Feb 08, 2024 | TALK Gave a talk and visited the laboratory of Jakob Macke at the Tübingen AI Center, Germany. |
| Jan 31, 2024 | TALK Gave a talk at the annual Swiss Computational Neuroscience meeting. |
| Sep 26, 2023 | POSTER In Berlin for a poster at the Bernstein conference. |
| Sep 01, 2023 | TALK Best presentation award at the NeuroLeman meeting! |
| Jun 06, 2023 | TALK Gave a talk and visited the group of Angelika Steger and Joao Sacramento at ETH Zürich. |