Proposed D-RNNLM, a novel language modeling approach for code-switched text, and explored techniques to effectively train RNNLMs in low-resource scenarios. Accepted as a short paper at EMNLP 2018. Formulated a probabilistic framework for combining two monolingual language models. Accepted at Interspeech 2018.
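To illustrate the general idea of combining two monolingual language models probabilistically, the sketch below uses simple linear interpolation of two toy unigram models. This is only a minimal, assumed example for intuition; the interpolation weight, the unigram models, and the helper names (`unigram_lm`, `interpolate`) are illustrative and are not taken from the Interspeech 2018 paper, whose actual framework may differ.

```python
from collections import Counter

# Hedged sketch: linearly interpolate two monolingual LMs.
# All corpora and weights here are toy assumptions.

def unigram_lm(tokens):
    """Build a unigram probability table from a token list."""
    counts = Counter(tokens)
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def interpolate(lm_a, lm_b, lam):
    """Return P(w) = lam * P_a(w) + (1 - lam) * P_b(w)."""
    def prob(w):
        return lam * lm_a.get(w, 0.0) + (1 - lam) * lm_b.get(w, 0.0)
    return prob

# Toy monolingual corpora (e.g. English and Hindi tokens)
lm_en = unigram_lm(["the", "cat", "the"])
lm_hi = unigram_lm(["billi", "ghar"])

# Combined model for code-switched text
p = interpolate(lm_en, lm_hi, lam=0.5)
```

A real system would typically condition the mixing weight on context (e.g. the predicted language of the next token) rather than using a fixed constant.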