Distributional semantics is a broad area of research based on the distributional hypothesis, which states that linguistic items with similar distributions have similar meanings. The area develops theories and methods for quantifying and categorizing semantic similarities between linguistic items based on their distributional properties in large samples of language data.
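The hypothesis can be illustrated with a minimal sketch: represent each word by a vector of co-occurrence counts over a toy corpus and compare vectors with cosine similarity. The corpus, window size, and function names here are purely illustrative assumptions, not part of the project's actual pipeline.

```python
from collections import Counter
from math import sqrt

# Toy corpus; in practice these counts come from large samples of language data.
corpus = [
    "the cat drinks milk",
    "the dog drinks water",
    "the cat chases the dog",
    "the dog chases the cat",
]

def context_vector(target, sentences, window=2):
    """Count words appearing within +/- `window` tokens of `target`."""
    vec = Counter()
    for s in sentences:
        toks = s.split()
        for i, t in enumerate(toks):
            if t == target:
                lo, hi = max(0, i - window), min(len(toks), i + window + 1)
                for j in range(lo, hi):
                    if j != i:
                        vec[toks[j]] += 1
    return vec

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[k] * v[k] for k in u)
    nu = sqrt(sum(x * x for x in u.values()))
    nv = sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# "cat" and "dog" occur in similar contexts, so their vectors are close;
# "cat" and "milk" do not, so their similarity is much lower.
sim_cat_dog = cosine(context_vector("cat", corpus), context_vector("dog", corpus))
sim_cat_milk = cosine(context_vector("cat", corpus), context_vector("milk", corpus))
```

Words with similar distributions ("cat"/"dog") end up with a noticeably higher similarity than distributionally dissimilar pairs ("cat"/"milk"), which is exactly the regularity the hypothesis describes.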
As part of our research, we investigate several aspects of distributional semantics to address open problems in NLP such as novel word sense detection and relation extraction.
- Novel Word Sense Detection
Description: In the era of Big Data, the rapid exchange of information on the web causes words to acquire new meanings, a phenomenon known as linguistic shift. Detecting these novel senses is therefore a crucial and challenging task for any natural language processing application that depends on an accurate semantic representation of words. In this project, we present a novel approach based on network features to improve the precision of novel word sense detection.
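One common network-based view of sense detection, sketched below under hypothetical names and a toy corpus (this is an illustration of the general idea, not the project's actual method), clusters the ego network of a word in a co-occurrence graph: each connected component among the word's neighbours is a rough proxy for one sense, and a component that appears only in a newer corpus flags a candidate novel sense.

```python
from collections import defaultdict
from itertools import combinations

STOPWORDS = {"the", "a", "at"}  # illustrative stoplist

def cooccurrence_graph(sentences):
    """Undirected graph linking content words that share a sentence."""
    adj = defaultdict(set)
    for s in sentences:
        toks = {t for t in s.split() if t not in STOPWORDS}
        for u, v in combinations(toks, 2):
            adj[u].add(v)
            adj[v].add(u)
    return adj

def sense_clusters(adj, target):
    """Connected components of the target's ego network (target removed).
    Each component is a rough proxy for one sense of the target."""
    neigh = adj.get(target, set())
    seen, clusters = set(), []
    for start in neigh:
        if start in seen:
            continue
        comp, stack = set(), [start]
        while stack:
            n = stack.pop()
            if n in comp:
                continue
            comp.add(n)
            stack.extend((adj[n] & neigh) - comp)
        seen |= comp
        clusters.append(comp)
    return clusters

# Older snapshot: "mouse" only as the animal.
old = [
    "the mouse ate the cheese",
    "the cat chased the mouse",
    "cheese trap catches the mouse",
]
# Newer snapshot adds computer-related usage.
new = old + [
    "click the mouse button",
    "the mouse cursor moved",
    "button click moved the cursor",
]

n_old = len(sense_clusters(cooccurrence_graph(old), "mouse"))
n_new = len(sense_clusters(cooccurrence_graph(new), "mouse"))
# An extra cluster in the newer corpus flags a candidate novel sense.
```

Comparing cluster inventories across two time-stamped corpora turns sense induction into novel sense detection: a cluster with no counterpart in the earlier snapshot is a candidate for a newly emerged meaning.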
- Lexical Relation Detection
Description: Distinguishing lexical relations has been a long-standing pursuit in the natural language processing (NLP) domain. Recently, distributional semantic models have been used extensively, in some form or other, to detect lexical relations such as hypernymy, meronymy, and co-hyponymy. While considerable effort has gone into detecting the hypernymy relation, the problem of co-hyponymy detection has rarely been investigated. In this project, we propose a novel supervised model in which various network measures are used to identify the co-hyponymy relation with high accuracy, outperforming state-of-the-art models by a significant margin.
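The general recipe can be sketched as follows: compute simple network measures for a candidate word pair from a co-occurrence graph and use them as features for a supervised classifier. The corpus, the specific measures, and the function names below are illustrative assumptions, not the project's actual feature set.

```python
from collections import defaultdict
from itertools import combinations

STOPWORDS = {"the", "a", "is", "at"}  # illustrative stoplist

def cooccurrence_graph(sentences):
    """Undirected graph linking content words that share a sentence."""
    adj = defaultdict(set)
    for s in sentences:
        toks = {t for t in s.split() if t not in STOPWORDS}
        for u, v in combinations(toks, 2):
            adj[u].add(v)
            adj[v].add(u)
    return adj

def pair_features(adj, w1, w2):
    """Toy network measures for a candidate co-hyponym pair
    (feature names are illustrative, not the project's actual features)."""
    n1, n2 = adj.get(w1, set()), adj.get(w2, set())
    shared, union = n1 & n2, n1 | n2
    return {
        "jaccard": len(shared) / len(union) if union else 0.0,
        "common_neighbors": len(shared),
        "degree_diff": abs(len(n1) - len(n2)),
    }

corpus = [
    "the cat is a pet animal",
    "the dog is a pet animal",
    "the dog barks at the cat",
    "the table is wooden furniture",
]
adj = cooccurrence_graph(corpus)
cohyp = pair_features(adj, "cat", "dog")       # co-hyponyms share many neighbours
unrelated = pair_features(adj, "cat", "table") # unrelated pair shares none
```

Co-hyponyms such as "cat" and "dog" tend to have strongly overlapping neighbourhoods, so features like the Jaccard overlap separate them from unrelated pairs; in a full system these feature vectors would be fed to a supervised classifier trained on labelled pairs.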
- Abhik Jana
- Animesh Mukherjee
- Pawan Goyal
Publications: None yet