## Imaginative Walks: Generative Random Walk Deviation Loss for Improved Unseen Learning Representation

We propose a novel loss for generative models, dubbed GRaWD (Generative Random Walk Deviation), to improve learning representations of unexplored spaces. We show that our loss can improve unseen class representation quality on four text-based ZSL benchmarks on the CUB and NABirds datasets.…
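The abstract does not spell out the loss, but the name suggests scoring random walks that start at generated "unseen" samples and step through seen-class territory. Below is a minimal numpy sketch of that general idea; the function name, the uniform-landing objective, and the step count are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def random_walk_deviation(gen_feats, seen_centers, steps=3, eps=1e-12):
    """Toy sketch: start a walk at generated features, step among seen-class
    centers via softmax similarities, and measure how far the landing
    distribution is from uniform. A value near 0 means the walk ends up
    maximally undecided among seen classes (i.e., the generated points
    'deviate' from seen-class regions). Illustrative only."""
    # Transition probabilities from generated points to seen-class centers.
    sim = gen_feats @ seen_centers.T                    # (n_gen, n_seen)
    p = np.exp(sim - sim.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    # Transition probabilities among the seen-class centers themselves.
    s = seen_centers @ seen_centers.T
    t = np.exp(s - s.max(axis=1, keepdims=True))
    t /= t.sum(axis=1, keepdims=True)
    for _ in range(steps):
        p = p @ t                                       # propagate the walk
    uniform = np.full(p.shape[1], 1.0 / p.shape[1])
    # KL(landing || uniform), averaged over generated points.
    kl = np.sum(p * (np.log(p + eps) - np.log(uniform)), axis=1)
    return float(kl.mean())
```

Generated points equidistant from all seen centers yield a near-zero score, while a point sitting on a seen-class center is penalized.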

## Understanding Synonymous Referring Expressions via Contrastive Features

Referring expression comprehension aims to localize objects identified by natural language descriptions. Each object can be described by synonymous sentences with paraphrases, and such variety in language has a critical impact on learning a comprehension model. We develop an end-to-end trainable framework to learn contrastive features on the image and object instance levels.…
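As a rough illustration of contrastive features at the object-instance level, here is a generic InfoNCE-style loss that pulls a sentence embedding toward its target object and away from the other objects in the image. All names and shapes are assumptions for the sketch, not the paper's actual framework:

```python
import numpy as np

def contrastive_ref_loss(sent_feats, obj_feats, targets, tau=0.1):
    """Each sentence embedding should score its target object instance
    higher than the other object instances. Generic InfoNCE sketch."""
    # Cosine-normalize both sides.
    s = sent_feats / np.linalg.norm(sent_feats, axis=1, keepdims=True)
    o = obj_feats / np.linalg.norm(obj_feats, axis=1, keepdims=True)
    logits = (s @ o.T) / tau                       # (n_sent, n_obj)
    logits -= logits.max(axis=1, keepdims=True)    # numeric stability
    log_p = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Negative log-likelihood of the matching object for each sentence.
    return float(-log_p[np.arange(len(targets)), targets].mean())
```

Under such a loss, two synonymous sentences for the same object are both driven toward that object's feature, which implicitly pulls the paraphrases together.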

## Simple Type Theory is not too Simple: Grothendieck's Schemes without Dependent Types

We report on a formalization of schemes in the proof assistant Isabelle/HOL. Schemes are sophisticated mathematical objects in algebraic geometry introduced by Grothendieck in 1960. We show how the powerful dependent types of Coq or Lean can be traded for a minimalist apparatus called locales.…

## Grammatical Error Generation Based on Translated Fragments

We perform neural machine translation of sentence fragments in order to create large amounts of training data for English grammatical error correction. Our method aims at simulating mistakes made by second language learners. A model trained on data created using our proposed method is shown to outperform a baseline model on test data with a high proportion of errors.…

## SelfReg: Self-supervised Contrastive Regularization for Domain Generalization

The proposed approach uses only positive data pairs, thus resolving various problems caused by negative pair sampling. The proposed method shows performance comparable to conventional state-of-the-art alternatives. We propose a class-specific domain perturbation layer (CDPL), which makes it possible to effectively apply mixup augmentation even when only positive data pairs are used.…
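A positive-pair-only regularizer can be sketched as a simple feature-alignment term between two samples that share a class but come from different domains; no negatives are needed, which is the property the abstract highlights. This is a simplified stand-in for SelfReg's losses, not the paper's implementation:

```python
import numpy as np

def positive_pair_alignment(feats_a, feats_b):
    """Pull together the (cosine-normalized) features of two same-class
    samples drawn from different domains. Returns the mean squared
    distance between the paired unit vectors: 0 when perfectly aligned,
    up to 4 when pointing in opposite directions. Illustrative sketch."""
    a = feats_a / np.linalg.norm(feats_a, axis=1, keepdims=True)
    b = feats_b / np.linalg.norm(feats_b, axis=1, keepdims=True)
    return float(np.mean(np.sum((a - b) ** 2, axis=1)))
```

Because every pair is positive, there is no risk of the false negatives that arise when same-class samples are accidentally pushed apart by negative sampling.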

## Identifying Helpful Sentences in Product Reviews

Online shopping has gained momentum and has become an important venue for customers wishing to save time and simplify their shopping process. We suggest a novel task of extracting a single representative helpful sentence from a set of reviews for a given product.…

## WASSA@IITK at WASSA 2021: Multi-task Learning and Transformer Finetuning for Emotion Classification and Empathy Prediction

This paper describes our contribution to the WASSA 2021 shared task on Empathy Prediction and Emotion Classification. The broad goal of this task was to model an empathy score, a distress score, and the overall level of emotion of an essay written in response to a newspaper article associated with harm to someone.…

## Efficient Retrieval Optimized Multi-task Learning

Recently, there have been significant advances in neural methods for tackling knowledge-intensive tasks such as open-domain question answering. Using our framework, we achieve comparable or better performance than recent methods on QA while drastically reducing the number of parameters.…