Sitemap

A list of all the posts and pages found on the site. For the robots out there, an XML version is available for digesting as well.

Pages

About me

Posts

Future Blog Post

less than 1 minute read

Published:

This post will show up by default. To disable scheduling of future posts, edit _config.yml and set future: false.

Blog Post number 4

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 3

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 2

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Blog Post number 1

less than 1 minute read

Published:

This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.

Portfolio

Multi-event commonsense in event coreference resolution

Event coreference models cluster mentions pertaining to the same real-world event. Recent models rely on contextualized representations to recognize coreference among lexically or contextually similar mentions. However, models typically fail to leverage commonsense inferences, which is particularly limiting for resolving lexically divergent mentions. We are working on a model that extends event mentions with temporal commonsense inferences. Given a complex sentence with multiple events, e.g., “the man killed his wife and got arrested”, with the target event “arrested”, our model generates plausible events that happen before the target event, such as “the police arrived”, and after it, such as “he was sentenced”. We generate such inferences by fine-tuning GPT-3 on a limited set of human annotations. We show that incorporating these inferences into an existing event coreference model improves its performance, and we analyze the coreference cases in which such temporal knowledge is required.
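
As an illustration only, the sketch below shows how a mention could be extended with generated before/after events; generate_inferences is a hypothetical stand-in for the fine-tuned GPT-3 generator, and the special tokens are placeholders rather than the model's actual input format.

```python
# Sketch: augment an event mention with temporal commonsense before scoring
# coreference. `generate_inferences` is a hypothetical stand-in for the
# fine-tuned GPT-3 generator; the canned outputs below are just examples.
from typing import List


def generate_inferences(sentence: str, trigger: str, relation: str) -> List[str]:
    """Return events that plausibly happen `relation` ('before'/'after') the target event."""
    canned = {"before": ["the police arrived"], "after": ["he was sentenced"]}
    return canned[relation]


def augment_mention(sentence: str, trigger: str) -> str:
    """Concatenate generated before/after events onto the mention context."""
    before = "; ".join(generate_inferences(sentence, trigger, "before"))
    after = "; ".join(generate_inferences(sentence, trigger, "after"))
    return f"{sentence} [BEFORE] {before} [AFTER] {after}"


print(augment_mention("The man killed his wife and got arrested.", "arrested"))
```

In this sketch, the augmented text is what a coreference encoder would consume in place of the raw mention context.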

Visual commonsense generation and incorporation

We are working on a model that generates commonsense inferences from combined image and textual cues. For this, we are extending VisualCOMET to reason about entities (e.g., an oven is used for baking) in addition to events (e.g., before shooting, the person loaded a gun). We plan to build this using existing vision-language resources rather than collecting expensive human annotations for each image.
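
Purely to illustrate the target output format (not the model itself), here is a sketch of the two kinds of inferences we aim for, with illustrative class and field names:

```python
# Sketch of the inference types we aim to generate; class and field names are
# illustrative only, not an existing API.
from dataclasses import dataclass


@dataclass
class EventInference:
    """VisualCOMET-style inference about an event in the scene."""
    event: str      # e.g. "the person is shooting"
    relation: str   # "before", "after", or "intent"
    inference: str  # e.g. "the person loaded a gun"


@dataclass
class EntityInference:
    """Our proposed extension: inference about an entity in the scene."""
    entity: str     # e.g. "oven"
    relation: str   # e.g. "used for"
    inference: str  # e.g. "baking"


examples = [
    EventInference("the person is shooting", "before", "the person loaded a gun"),
    EntityInference("oven", "used for", "baking"),
]
print(examples)
```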

Open and automated machine learning

I am a core contributor to OpenML, an open-source machine learning platform for reproducible and collaborative research. My major contributions are in the Python framework openml-python, the OpenML core, and the new version of OpenML. My work spans building research infrastructure, evaluating and comparing models, defining metrics, running AutoML experiments, and visualizing datasets. github
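
For context, a minimal example of the kind of reproducible experiment openml-python supports (the task ID and model here are chosen arbitrarily):

```python
# Fetch an OpenML task, run a scikit-learn model on it, and optionally publish
# the run back to the server for others to reproduce and compare.
import openml
from sklearn.ensemble import RandomForestClassifier

task = openml.tasks.get_task(31)  # a classification task; ID chosen arbitrarily
clf = RandomForestClassifier(n_estimators=100, random_state=0)

run = openml.runs.run_model_on_task(clf, task)  # evaluates per the task's CV splits
print(run)

# run.publish()  # requires an OpenML API key; uploads the run for comparison
```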

Towards explainable active learning in hate speech detection

I developed explainable batch-mode active learning algorithms for sentiment analysis, in particular for hate-speech datasets. I built interactive tools and conducted experiments to evaluate the effects of explainability and uncertainty on active learning in a human-in-the-loop setting. I experimented with the explainability aspect using different backbones such as SVM, LSTM, and BERT. github
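
As a generic illustration of one ingredient (not the exact algorithm from this project), here is uncertainty-based batch selection with an SVM backbone on synthetic data:

```python
# Generic uncertainty-based batch selection for active learning (illustrative only).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
labeled = np.arange(20)                                  # small initial labeled pool
unlabeled = np.setdiff1d(np.arange(len(X)), labeled)

clf = SVC(probability=True, random_state=0).fit(X[labeled], y[labeled])

# Least-confidence scores on the unlabeled pool.
proba = clf.predict_proba(X[unlabeled])
uncertainty = 1.0 - proba.max(axis=1)

# Pick the most uncertain batch to send to the human annotator.
batch_size = 10
query = unlabeled[np.argsort(-uncertainty)[:batch_size]]
print("Indices to annotate next:", query)
```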

Active meta-learning

I worked on building efficient meta-learning approaches for automated machine learning (auto-sklearn) that selectively acquire training data for the meta-learner. In meta-learning, we typically try to predict which algorithms (or network architectures) to use for a given task. Gathering training data for these meta-models is expensive, since each data point requires training a model. Active learning (or other clever selection strategies) can pick the most informative experiments to run.
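
As a toy sketch of the idea (not auto-sklearn's actual meta-learning pipeline), a surrogate trained on completed experiments can suggest the next experiment to run via ensemble disagreement:

```python
# Toy active meta-learning loop: a random-forest surrogate predicts performance
# from (task meta-features, encoded algorithm choice), and we run the experiment
# whose prediction the ensemble is least certain about. Illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
candidates = rng.normal(size=(200, 6))   # meta-features + encoded algorithm choice
observed_idx = list(range(10))           # experiments already run
scores = rng.uniform(size=10)            # their observed performance

surrogate = RandomForestRegressor(n_estimators=50, random_state=0)
surrogate.fit(candidates[observed_idx], scores)

pool = [i for i in range(len(candidates)) if i not in observed_idx]
# Disagreement across trees serves as an uncertainty estimate.
per_tree = np.stack([t.predict(candidates[pool]) for t in surrogate.estimators_])
uncertainty = per_tree.std(axis=0)

next_experiment = pool[int(np.argmax(uncertainty))]
print("Run this (task, algorithm) configuration next:", next_experiment)
```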

Talks

Teaching

Teaching experience 1

Undergraduate course, University 1, Department, 2014

This is a description of a teaching experience. You can use markdown like any other post.

Teaching experience 2

Workshop, University 1, Department, 2015

This is a description of a teaching experience. You can use markdown like any other post.