Jaisidh Singh
I am an ML grad student at the University of Tübingen
exploring the limits of what neural networks can learn.
My current research at ELLIS, in collaboration with Samsung Research Montreal, investigates hypernetworks.
In the past, I have worked on (i) compositionality in multimodal models, and (ii) private-information leakage in generative models.
Email / Resume / Google Scholar / Twitter / LinkedIn / GitHub
Selected research
- Hyper-Align: Efficient Modality Alignment via Hypernetworks
ICLR WSL Workshop 2025 (Poster)
Jaisidh Singh, Diganta Misra, Boris Knyazev, Antonio Orvieto
[Poster]
- Learning the Power of “No”: Foundation Models with Negations
WACV 2025 (Poster)
Jaisidh Singh*, Ishaan Shrivastava*, Richa Singh, Mayank Vatsa, Aparna Bharati
[Project Page] [Preprint] [GitHub]
- SynthProv: Interpretable Framework for Profiling Identity Leakage
WACV 2024 (Poster)
Jaisidh Singh, Harshil Bhatia, Richa Singh, Mayank Vatsa, Aparna Bharati
[Paper] [Poster] [Presentation]
Refer to my Google Scholar for a complete list (* indicates equal contribution).
Pet projects, notes, etc.
- pytorch-mixtures [GitHub]
A minimalist library for popular Mixture-of-Experts (MoE) and Mixture-of-Depths (MoD) layers in PyTorch (routing sketched below).
- lora-clip [GitHub]
A small library that makes it easy to insert LoRA layers into CLIP (see the sketch below).
- Tutorial for "DINo: Continuous PDE forecasting with INRs"
An in-depth tutorial on this cool physics-informed ML paper.
- Auto-Decoder in JAX [GitHub Gist]
A quick and easy walkthrough of the auto-decoding process in JAX (minimal example below).
- TokenFormer in PyTorch [GitHub Gist]
A crisp implementation of the TokenFormer layer in PyTorch (core idea sketched below).
- Differential Transformer in PyTorch [GitHub Gist]
A simple snippet for multi-head differential attention in PyTorch (a simplified single-head sketch appears below).
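
Below are minimal sketches of the techniques behind some of these projects. First, the top-k routing at the heart of MoE layers like those in pytorch-mixtures. Every name here (SimpleMoE, num_experts, top_k) is an illustrative choice of my own, not the library's API:

```python
import torch
import torch.nn as nn

class SimpleMoE(nn.Module):
    """Toy top-k Mixture-of-Experts layer: a router scores each token
    and dispatches it to its k highest-scoring expert MLPs."""
    def __init__(self, dim: int, num_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.router = nn.Linear(dim, num_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        tokens = x.reshape(-1, x.shape[-1])               # flatten (batch, seq) into tokens
        gates, idx = self.router(tokens).topk(self.top_k, dim=-1)
        gates = gates.softmax(dim=-1)                     # normalize over the chosen experts
        out = torch.zeros_like(tokens)
        for e, expert in enumerate(self.experts):
            token_ids, slot = (idx == e).nonzero(as_tuple=True)
            if token_ids.numel():                         # tokens routed to expert e
                out.index_add_(0, token_ids,
                               gates[token_ids, slot, None] * expert(tokens[token_ids]))
        return out.reshape_as(x)
```

Mixture-of-Depths applies the same routing machinery along the depth axis, letting low-scoring tokens skip a block entirely via the residual stream.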
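The trick behind lora-clip is wrapping frozen linear layers with a trainable low-rank update. A minimal sketch, assuming illustrative names (LoRALinear, rank, alpha) rather than the library's actual interface:

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen pretrained linear layer plus a trainable low-rank delta:
    y = Wx + (alpha / rank) * B(A(x)), with B initialized to zero."""
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)                       # freeze the pretrained weights
        self.lora_a = nn.Linear(base.in_features, rank, bias=False)
        self.lora_b = nn.Linear(rank, base.out_features, bias=False)
        nn.init.zeros_(self.lora_b.weight)                # the delta starts at zero
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * self.lora_b(self.lora_a(x))
```

Insertion into CLIP then amounts to walking the module tree and swapping the attention projection nn.Linear layers for wrapped ones.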
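The auto-decoding walkthrough boils down to this: there is no encoder, so at test time a latent code is fit to each new observation by gradient descent through the frozen decoder. A toy JAX sketch, with a linear decoder standing in for the trained network:

```python
import jax
import jax.numpy as jnp

def decoder(params, z):
    # stand-in for a trained decoder network
    w, b = params
    return jnp.tanh(w @ z + b)

def loss(z, params, target):
    # reconstruction error of the decoded latent
    return jnp.mean((decoder(params, z) - target) ** 2)

def fit_latent(params, target, dim=16, steps=200, lr=1e-1):
    z = jnp.zeros(dim)                        # initialize the latent at the prior mean
    grad_fn = jax.jit(jax.grad(loss))         # gradient w.r.t. z only; the decoder stays frozen
    for _ in range(steps):
        z = z - lr * grad_fn(z, params, target)
    return z

key = jax.random.PRNGKey(0)
params = (0.25 * jax.random.normal(key, (32, 16)), jnp.zeros(32))
z_star = fit_latent(params, target=0.5 * jnp.ones(32))
```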
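The TokenFormer layer ("Pattention") replaces a fixed linear projection with attention between input tokens and learnable key/value parameter tokens, so capacity can grow by appending parameter tokens. A rough sketch; the GeLU-over-normalized-scores step is my approximation of the paper's modified softmax, not a verified reproduction:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Pattention(nn.Module):
    """Token-parameter attention: inputs attend over learnable
    key/value parameter tokens instead of a weight matrix."""
    def __init__(self, dim_in: int, dim_out: int, num_param_tokens: int = 64):
        super().__init__()
        self.key_params = nn.Parameter(0.02 * torch.randn(num_param_tokens, dim_in))
        self.value_params = nn.Parameter(0.02 * torch.randn(num_param_tokens, dim_out))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        scores = x @ self.key_params.t()                  # (batch, seq, num_param_tokens)
        scores = F.normalize(scores, dim=-1) * scores.shape[-1] ** 0.5
        attn = F.gelu(scores)                             # stand-in for the modified softmax
        return attn @ self.value_params                   # (batch, seq, dim_out)
```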
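Differential attention computes the attention map as the difference of two softmax maps, which cancels common-mode attention noise. A simplified single-head sketch; the paper reparameterizes lambda from learnable vectors and adds per-head normalization, both omitted here:

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class DiffAttention(nn.Module):
    """Single-head differential attention (no masking, for brevity):
    out = (softmax(Q1 K1^T) - lambda * softmax(Q2 K2^T)) V."""
    def __init__(self, dim: int, head_dim: int = 64, lambda_init: float = 0.8):
        super().__init__()
        self.q = nn.Linear(dim, 2 * head_dim, bias=False)   # two query groups
        self.k = nn.Linear(dim, 2 * head_dim, bias=False)   # two key groups
        self.v = nn.Linear(dim, head_dim, bias=False)
        self.lam = nn.Parameter(torch.tensor(lambda_init))
        self.scale = 1.0 / math.sqrt(head_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        q1, q2 = self.q(x).chunk(2, dim=-1)
        k1, k2 = self.k(x).chunk(2, dim=-1)
        a1 = F.softmax(q1 @ k1.transpose(-2, -1) * self.scale, dim=-1)
        a2 = F.softmax(q2 @ k2.transpose(-2, -1) * self.scale, dim=-1)
        return (a1 - self.lam * a2) @ self.v(x)           # differential map times values
```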