Thennal D K [ˈθɛnnel]


I’m Thennal (any/all), a CS undergrad at the Indian Institute of Information Technology Kottayam, conducting natural language processing research since 2018. I’m interested in what makes language/speech models tick, and how to make them tick better. In particular:

  1. How do large pretrained models form their internal representations, and how does each component update them?
  2. There are a lot of pretrained and finetuned models available publicly. Can we use them to build better models?
  3. The field has a significant evaluation and benchmarking problem, particularly for non-English languages. How can we do better?

I also like running, fungi, anything produced by Supergiant Games, and Japanese music. Go watch Etsuko Yakushimaru’s I’m Humanity, and then read about it.

news

Jan 22, 2025 Our paper on ASR evaluation metrics was accepted to NAACL Findings 2025!
Oct 18, 2024 Two new preprints, from my internship with the University of Hamburg and my collaboration with Jesin James from the University of Auckland.
Feb 20, 2024 Paper accepted at LREC-COLING 2024! Excited to go there in May and present our work, Fisher Mask Nodes for Language Model Merging.
Feb 17, 2024 Got the DAAD WISE scholarship for an internship with the University of Hamburg!


selected publications

  1. Large Language Models Are Overparameterized Text Encoders
    Thennal D K, Tim Fischer, and Chris Biemann
    In Proceedings of the 10th Workshop on Representation Learning for NLP (RepL4NLP-2025), May 2025
  2. Fisher Mask Nodes for Language Model Merging
    Thennal D K, Ganesh Nathan, and Suchithra M S
    In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), May 2024
  3. Advocating Character Error Rate for Multilingual ASR Evaluation
    Thennal D K, Jesin James, Deepa Padmini Gopinath, and 1 more author
    In Findings of the Association for Computational Linguistics: NAACL 2025, Apr 2025