Shoelace pattern-based speech emotion recognition of the lecturers in distance education: ShoePat23
dc.authorid | 0000-0002-3298-0109 | en_US |
dc.contributor.author | Tanko, Dahiru | |
dc.contributor.author | Doğan, Şengül | |
dc.contributor.author | Demir, Fahrettin Burak | |
dc.contributor.author | Baygın, Mehmet | |
dc.contributor.author | Şahin, Şakir Engin | |
dc.contributor.author | Tuncer, Türker | |
dc.date.accessioned | 2022-03-15T12:22:49Z | |
dc.date.available | 2022-03-15T12:22:49Z | |
dc.date.issued | 2022 | en_US |
dc.department | MTÖ University, Akçadağ Vocational School, Department of Computer Technologies | en_US |
dc.description.abstract | Background and objective: We are living in the pandemic age, and many educational institutions have shifted to distance education to ensure learning continuity while curtailing the spread of the Covid-19 virus. Automated speech emotion classification models can be used to measure a lecturer's performance during a lecture. Material and method: In this work, we collected a new lecturer speech dataset to detect three emotions: positive, neutral, and negative. The dataset is divided into five-second segments; each segment is treated as one observation, yielding 9541 observations in total. To classify these emotions automatically, a hand-modeled learning approach with a comprehensive feature extraction method is presented. For feature extraction, a shoelace-based local feature generator, called the Shoelace Pattern, is introduced. This feature extractor generates low-level features. To further improve its feature generation capability, the tunable Q-factor wavelet transform (TQWT) is used to create sub-bands. The Shoelace Pattern generates features from the raw speech and the sub-bands, and the proposed feature extraction method selects the top four feature vectors and merges them to obtain the final feature vector. Using neighborhood component analysis (NCA), the 512 most informative features are chosen and classified with a support vector machine (SVM) classifier under 10-fold cross-validation. Results: The proposed learning model based on the Shoelace Pattern (ShoePat23) attained 94.97% and 96.41% classification accuracies on the collected speech databases, respectively. Conclusions: The findings demonstrate the success of ShoePat23 for speech emotion recognition. Moreover, this model has been used in a distance education system to assess the performance of lecturers. | en_US |
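The abstract outlines a four-stage pipeline: TQWT sub-band decomposition, Shoelace Pattern local feature extraction, NCA selection of 512 features, and SVM classification with 10-fold cross-validation. Below is a minimal sketch of such a pipeline, not the authors' implementation. Assumptions are labeled in the comments: the exact shoelace crossing order is defined only in the paper, so the `pairs` list is hypothetical; `pywt.wavedec` (an ordinary discrete wavelet transform) stands in for TQWT; mutual-information ranking stands in for NCA-based selection; and toy random signals replace the 9541 real five-second segments.

```python
# Hedged sketch of a ShoePat23-style pipeline. Stand-ins (assumptions):
# - pywt.wavedec (plain DWT) replaces the paper's TQWT decomposition
# - the `pairs` crossing order is hypothetical; the real order is in the paper
# - mutual-information ranking replaces NCA-based selection of 512 features
import numpy as np
import pywt  # PyWavelets
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def local_pattern_features(x, width=9):
    """256-bin histogram of 8-bit codes from pairwise comparisons per window."""
    win = np.lib.stride_tricks.sliding_window_view(np.asarray(x, float), width)
    pairs = np.array([(0, 8), (1, 7), (2, 6), (3, 5),
                      (0, 4), (4, 8), (1, 5), (3, 7)])  # hypothetical crossings
    bits = (win[:, pairs[:, 0]] > win[:, pairs[:, 1]]).astype(np.int64)
    codes = bits @ (1 << np.arange(8))                  # one 8-bit code per window
    hist = np.bincount(codes, minlength=256).astype(float)
    return hist / hist.sum()

def extract_features(segment, levels=4):
    """Local-pattern histograms from the raw segment plus its wavelet sub-bands."""
    bands = [np.asarray(segment)] + pywt.wavedec(segment, "db4", level=levels)
    return np.concatenate([local_pattern_features(b) for b in bands])

# Toy stand-in for the 9541 five-second lecturer segments (3 emotion classes).
rng = np.random.default_rng(0)
X = np.stack([extract_features(rng.standard_normal(16000)) for _ in range(60)])
y = np.arange(60) % 3  # positive / neutral / negative, balanced

clf = make_pipeline(
    SelectKBest(mutual_info_classif, k=512),  # paper keeps 512 features via NCA
    StandardScaler(),
    SVC(kernel="rbf"),                        # paper classifies with an SVM
)
print(cross_val_score(clf, X, y, cv=10).mean())  # paper uses 10-fold CV
```

On real data, each five-second audio segment would replace the random vectors, and the feature selector would be fit within each cross-validation fold, as the pipeline above already ensures.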
dc.identifier.citation | Tanko, D., Dogan, S., Demir, F. B., Baygin, M., Sahin, S. E., & Tuncer, T. (2022). Shoelace pattern-based speech emotion recognition of the lecturers in distance education: ShoePat23. Applied Acoustics, 190, 108637. | en_US |
dc.identifier.doi | 10.1016/j.apacoust.2022.108637 | |
dc.identifier.endpage | 9 | en_US |
dc.identifier.issn | 0003-682X | en_US |
dc.identifier.issue | 108637 | en_US |
dc.identifier.scopus | 2-s2.0-85123631772 | en_US |
dc.identifier.startpage | 1 | en_US |
dc.identifier.uri | https://hdl.handle.net/20.500.12899/643 | |
dc.identifier.volume | 190 | en_US |
dc.identifier.wos | WOS:000807404100009 | en_US |
dc.identifier.wosquality | Q1 | en_US |
dc.indekslendigikaynak | Web of Science | en_US |
dc.indekslendigikaynak | Scopus | en_US |
dc.institutionauthor | Şahin, Şakir Engin | |
dc.language.iso | en | en_US |
dc.publisher | Elsevier Ltd | en_US |
dc.relation.ispartof | Applied Acoustics | en_US |
dc.relation.publicationcategory | Article - International Peer-Reviewed Journal - Institutional Faculty Member | en_US |
dc.rights | info:eu-repo/semantics/closedAccess | en_US |
dc.subject | Distance education | en_US |
dc.subject | NCA | en_US |
dc.subject | Shoelace Pattern | en_US |
dc.subject | Speech emotion recognition | en_US |
dc.subject | SVM | en_US |
dc.title | Shoelace pattern-based speech emotion recognition of the lecturers in distance education: ShoePat23 | en_US |
dc.type | Article | en_US |