Jaisidh Singh
I am an ML grad student at the University of Tübingen
exploring the limits of what neural networks can learn.
My current research investigates hypernetworks in collaboration with ELLIS and Samsung Research Montreal.
In the past, I have worked on
→ compositional limitations in multi-modal neural networks,
→ private information leakage in generative neural networks.
Email / CV / Scholar / Twitter / LinkedIn / Github / Blog
Selected publications
- Learning the Power of “No”: Foundation Models with Negations
[WACV 2025] - Jaisidh Singh*, Ishaan Shrivastava*, Richa Singh, Mayank Vatsa, Aparna Bharati
(*: equal contribution) [Project Page] [Preprint] [GitHub: 27x⭐️]
- SynthProv: Interpretable Framework for Profiling Identity Leakage
[WACV 2024] - Jaisidh Singh, Harshil Bhatia, Richa Singh, Mayank Vatsa, Aparna Bharati
[Paper] [Poster] [Presentation]
Pet projects
- pytorch-mixtures [GitHub: 20x⭐️]
  A minimalist library for popular Mixture-of-Experts (MoE) and Mixture-of-Depths (MoD) layers in PyTorch.
- lora-clip [GitHub: 24x⭐️]
  A library for easily inserting LoRA layers into CLIP.
- Neural Art [GitHub: 8x⭐️]
  An app bringing Fast Neural Style Transfer to your phone.
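As a flavour of what pytorch-mixtures covers, here is a minimal sketch of token-level top-k expert routing in plain PyTorch. The `TopKMoE` class and all names here are illustrative assumptions, not the library's actual API:

```python
import torch
import torch.nn as nn

class TopKMoE(nn.Module):
    """Sketch of top-k MoE routing: a learned gate scores experts per token,
    and each token's output is a softmax-weighted sum of its top-k experts."""
    def __init__(self, dim=32, n_experts=4, k=2):
        super().__init__()
        self.gate = nn.Linear(dim, n_experts)
        self.experts = nn.ModuleList(nn.Linear(dim, dim) for _ in range(n_experts))
        self.k = k

    def forward(self, x):                       # x: (tokens, dim)
        scores = self.gate(x)                   # (tokens, n_experts)
        w, idx = scores.topk(self.k, dim=-1)    # pick k experts per token
        w = torch.softmax(w, dim=-1)            # renormalise over chosen experts
        out = torch.zeros_like(x)
        for j in range(self.k):                 # dispatch each routing slot
            for e, expert in enumerate(self.experts):
                mask = idx[:, j] == e           # tokens routed to expert e in slot j
                if mask.any():
                    out[mask] += w[mask, j, None] * expert(x[mask])
        return out

moe = TopKMoE()
print(moe(torch.randn(8, 32)).shape)  # torch.Size([8, 32])
```

The double loop is written for readability; real MoE layers batch the dispatch instead of iterating over experts.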
Notes + learning
- Tutorial for "DINo: Continuous PDE forecasting with INRs"
I recently wrote an in-depth tutorial on this physics-informed ML paper.
- Auto-Decoder in JAX [GitHub Gist]
A quick and easy walkthrough of the auto-decoding process with JAX.
- TokenFormer in PyTorch [GitHub Gist]
A crisp implementation of the TokenFormer layer in PyTorch.
- Differential Transformer in PyTorch [GitHub Gist]
A simple snippet for multi-head differential attention in PyTorch.
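To give a taste of the last item, here is a minimal single-head sketch of differential attention, where the difference of two softmax attention maps cancels common-mode attention noise. The function name and weight shapes are illustrative assumptions, not the gist's actual code:

```python
import torch
import torch.nn.functional as F

def diff_attention(x, Wq, Wk, Wv, lam=0.5):
    """Differential attention: subtract a second softmax map (scaled by lam)
    from the first, so noise attended by both maps cancels out."""
    d = Wv.shape[1]
    q1, q2 = (x @ Wq).chunk(2, dim=-1)  # Wq projects to 2*d: two query halves
    k1, k2 = (x @ Wk).chunk(2, dim=-1)  # likewise two key halves
    v = x @ Wv
    a1 = F.softmax(q1 @ k1.transpose(-2, -1) / d**0.5, dim=-1)
    a2 = F.softmax(q2 @ k2.transpose(-2, -1) / d**0.5, dim=-1)
    return (a1 - lam * a2) @ v

x = torch.randn(1, 16, 64)
Wq, Wk, Wv = torch.randn(64, 128), torch.randn(64, 128), torch.randn(64, 64)
print(diff_attention(x, Wq, Wk, Wv).shape)  # torch.Size([1, 16, 64])
```

In the full model, lam is a learned per-layer parameter and the layer is multi-head with a group-norm on the output; this sketch keeps only the core subtraction.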