Please use this identifier to cite or link to this item:
http://localhost:8080/xmlui/handle/123456789/5028
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Purnachandra Rao, Alapati | - |
dc.contributor.author | Joy Christy, Lawrance | - |
dc.contributor.author | Sambath, Priya | - |
dc.contributor.author | Rekha, Murugan | - |
dc.contributor.author | Manikandan, Rengarajan | - |
dc.contributor.author | Infant Raj, I | - |
dc.contributor.author | Kiran Bala, B | - |
dc.date.accessioned | 2024-06-24T05:48:05Z | - |
dc.date.available | 2024-06-24T05:48:05Z | - |
dc.date.issued | 2024-04-14 | - |
dc.identifier.isbn | 979-8-3503-5306-8 | - |
dc.identifier.uri | https://ieeexplore.ieee.org/document/10544031/authors#authors | - |
dc.description.abstract | In the evolving landscape of language education, this paper delves into the intersection of Natural Language Processing (NLP) and education technology to address the unique challenges faced by non-native speakers, particularly children, in acquiring English proficiency. Leveraging the potential of Cross-Lingual Transfer Learning, the proposed methodology, implemented in Python, aims to enhance language learning outcomes through the innovative use of the NNCES corpus. The NNCES corpus, featuring 50 Telugu-speaking children aged 8 to 12 engaged in English language learning, serves as a rich dataset for exploring cross-lingual transfer learning strategies. The paper introduces a Multilingual Transfer Learning with Domain Adaptation (MTL-DA) strategy that fine-tunes pre-trained multilingual models on the NNCES corpus. This strategic adaptation enables the models to discern the linguistic nuances, phonetic variations, and semantic context inherent in NNCES. The methodology involves a comprehensive pipeline encompassing dataset collection, preprocessing, feature extraction, and model training. Min-Max Normalization is applied to the acoustic features, and Mel-frequency cepstral coefficients (MFCCs) and word embeddings from pre-trained models are integrated into a holistic feature vector. The fine-tuned models exhibit superior performance in English language learning tasks, achieving an accuracy of 99.6%. Comparative analysis with existing methods reveals a significant improvement, with the proposed method surpassing the others by 3.6%. | en_US |
dc.language.iso | en_US | en_US |
dc.publisher | Institute of Electrical and Electronics Engineers Inc. | en_US |
dc.title | CROSS-LINGUAL TRANSFER LEARNING IN NLP: ENHANCING ENGLISH LANGUAGE LEARNING FOR NON-NATIVE SPEAKERS | en_US |
dc.type | Other | en_US |
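The abstract above summarises a feature pipeline that combines Min-Max normalised MFCCs with word embeddings from a pre-trained multilingual model into a single "holistic" feature vector. The paper states the method is implemented in Python but does not name libraries, so the sketch below is only a minimal illustration of that pipeline: the choice of librosa for MFCC extraction, scikit-learn's MinMaxScaler for normalisation, the sentence-transformers encoder `paraphrase-multilingual-MiniLM-L12-v2`, and all file names and function names are assumptions, not details taken from the paper.

```python
# Hypothetical sketch of the feature pipeline described in the abstract:
# MFCCs from a child's utterance, Min-Max normalisation of the acoustic
# features, and a multilingual text embedding, concatenated into one vector.
# Library and model choices are illustrative assumptions.
import numpy as np
import librosa
from sklearn.preprocessing import MinMaxScaler
from sentence_transformers import SentenceTransformer


def acoustic_features(wav_path, n_mfcc=13, sr=16000):
    """Return a Min-Max normalised MFCC summary vector for one utterance."""
    signal, sr = librosa.load(wav_path, sr=sr)
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=n_mfcc)   # (n_mfcc, frames)
    # Min-Max normalise each coefficient track to [0, 1]; in a full pipeline
    # the scaler would be fit on the whole training corpus, not per utterance.
    scaled = MinMaxScaler().fit_transform(mfcc.T)                 # (frames, n_mfcc)
    # Summarise each coefficient over time with its mean and standard deviation.
    return np.concatenate([scaled.mean(axis=0), scaled.std(axis=0)])


def holistic_feature_vector(wav_path, transcript, encoder):
    """Concatenate acoustic features with a multilingual text embedding."""
    text_vec = encoder.encode(transcript)                         # pre-trained embedding
    return np.concatenate([acoustic_features(wav_path), text_vec])


if __name__ == "__main__":
    # Placeholder file name and transcript standing in for an NNCES-style sample.
    encoder = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")
    vec = holistic_feature_vector("child_utterance_001.wav", "I am going to school", encoder)
    print(vec.shape)
```

In such a setup, the resulting vectors would then feed the MTL-DA fine-tuning stage the abstract describes, i.e. adapting a pre-trained multilingual model to the NNCES domain; that stage is not sketched here because the paper's record gives no further implementation detail.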
Appears in Collections: | 4. Conference Paper (12) |
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
CROSS-LINGUAL TRANSFER LEARNING IN NLP ENHANCING ENGLISH LANGUAGE LEARNING FOR NON-NATIVE SPEAKERS.docx |  | 245.4 kB | Microsoft Word XML |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.