Ashok Thillaisundaram is a scientific researcher in machine learning and natural language processing. His work focuses on improving the ability of machine learning models to understand and process natural language.
Thillaisundaram has a verified email address at microsoft.com.
Thillaisundaram has made significant contributions to machine learning, particularly in natural language processing and in modelling the hierarchical structure of language.
In "A Hierarchical Transformer for Unsupervised Parsing", Thillaisundaram and colleagues extend the transformer model to learn hierarchical representations, adapting the ordering mechanism of Shen et al. (2018) to the self-attention module of the transformer architecture. Trained on language modelling and then applied to unsupervised parsing, the model achieves reasonable results on the WSJ10 dataset, with an F1 score of about 50%.
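The ordering mechanism of Shen et al. (2018) is built around a cumulative softmax ("cumax") that produces monotonically increasing gate values. One way such gates could modulate self-attention can be sketched as follows; this is a minimal NumPy sketch under assumed details, not the paper's actual formulation (the `gate_logits` input and the renormalisation step are assumptions):

```python
import numpy as np

def cumax(logits):
    """Cumulative softmax: softmax followed by a cumulative sum,
    yielding monotonically non-decreasing gates in [0, 1]."""
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)
    return np.cumsum(probs, axis=-1)

def ordered_self_attention(queries, keys, values, gate_logits):
    """Scaled dot-product attention whose weights are modulated by
    cumax-derived gates (hypothetical variant for illustration)."""
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)                 # (n, n)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)         # row-wise softmax
    gates = cumax(gate_logits)                             # (n, n) monotone gates
    modulated = weights * gates
    modulated /= modulated.sum(axis=-1, keepdims=True)     # renormalise rows
    return modulated @ values                              # (n, d)
```

The monotone gates bias each position toward attending over contiguous higher-level spans, which is the intuition behind using ordering to induce hierarchy.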
In "Biomedical relation extraction with pre-trained language representations and minimal task-specific architecture", Thillaisundaram presents a system for extracting "gene - function change - disease" triples, building on the BERT language model, which learns contextual language representations from a large unlabelled corpus. The system outperforms a random baseline despite class imbalance, demonstrating that fine-tuning a pre-trained model for a specific task can be effective with little task-specific machinery.
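The "minimal task-specific architecture" idea can be illustrated with a small sketch: a single classification head over a contextual sentence representation. Here the BERT encoder is abstracted away as a precomputed embedding, and the weight names and label set are hypothetical, not taken from the paper:

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def classify_relation(sentence_embedding, W, b, labels):
    """Score candidate relation labels for one
    'gene - function change - disease' candidate.

    sentence_embedding: contextual representation of the sentence
    (e.g. a pooled encoder output); W, b: the task-specific head."""
    logits = sentence_embedding @ W + b
    probs = softmax(logits)
    return labels[int(np.argmax(probs))], probs
```

In a fine-tuning setup, both the encoder and this head would be trained jointly on the labelled relation data; only the head is task-specific.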
Thillaisundaram's work in this area focuses on domain-specific knowledge graphs and relation extraction, citing the lack of standard datasets for understanding financial relations and the need for more interpretable data in biomedical knowledge bases.
Ashok Thillaisundaram is a researcher in the fields of machine learning and natural language processing. Thillaisundaram has made contributions to the development of machine learning models for natural language understanding and biomedical relation extraction. He currently works at Microsoft.
Where Thillaisundaram received his undergraduate degree is not publicly documented; his verified email address is at microsoft.com.
Thillaisundaram's research focuses on the intersection of machine learning and natural language processing, with a particular emphasis on biomedical applications. One of his key contributions is in the area of unsupervised parsing, where he proposed a hierarchical transformer model that can learn the hierarchical structure of natural language, outperforming other models that process text sequentially.
Thillaisundaram has also worked on extracting relationships between biomedical entities, such as genes and diseases, from large-scale literature using pre-trained language representations with minimal task-specific architecture. This work was presented at the 2019 BioNLP Open Shared Tasks, where his system, which extends the BERT language model, performed well relative to a random baseline despite its simple setup.
Additionally, Thillaisundaram has proposed a method for constructing large-scale biomedical knowledge bases from scratch by extracting interpretable patterns from biomedical literature. This approach enables domain experts to discover new facts and relationships without the need for training data or hand-crafted rules, aiding in drug discovery and other practical applications.
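The pattern-based extraction idea can be illustrated with a toy sketch: interpretable surface patterns that a domain expert can read, vet, and edit directly. The patterns below are invented examples for illustration, not those produced by the actual system:

```python
import re

# Hypothetical interpretable patterns mapping surface forms to relations.
# In the approach described above, candidate patterns would be proposed
# from the literature and rapidly annotated by domain experts.
PATTERNS = [
    (re.compile(r"(?P<gene>\w+) mutations (?:are associated with|cause) "
                r"(?P<disease>[\w' ]+)"),
     "associated_with"),
    (re.compile(r"(?P<gene>\w+) is overexpressed in (?P<disease>[\w' ]+)"),
     "overexpressed_in"),
]

def extract_triples(sentence):
    """Return (gene, relation, disease) triples matched in a sentence."""
    triples = []
    for pattern, relation in PATTERNS:
        for m in pattern.finditer(sentence):
            triples.append((m.group("gene"), relation, m.group("disease")))
    return triples
```

Because each pattern is a human-readable rule rather than a learned weight, experts can audit exactly why a fact entered the knowledge base, which is the interpretability benefit the work emphasises.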
Thillaisundaram has co-authored papers with Julien Fauqueur, Theodosia Togia, and others.
Thillaisundaram, A. (n.d.). A Hierarchical Transformer for Unsupervised Parsing.
Thillaisundaram, A. (2019). Biomedical relation extraction with pre-trained language representations and minimal task-specific architecture.
Fauqueur, J., Thillaisundaram, A., & Togia, T. (n.d.). Constructing large-scale biomedical knowledge bases from scratch with rapid annotation of interpretable patterns.