Show simple item record

dc.contributor.author: Johnston, Lincoln en_NZ
dc.identifier.citation: Johnston, L. (2004, November 12). Automatic mood detection from electronic music data (Dissertation, Bachelor of Commerce with Honours). Retrieved from
dc.description.abstract: Automatic mood detection from music has two main benefits. Firstly, knowing the mood in advance allows possible enhancement of the music experience (such as mood-based visualizations); secondly, it makes 'query by mood' from music data-banks possible. This research is concerned with the automatic detection of mood from the electronic music genre, in particular drum and bass. The methodology was relatively simple: first sampling the music, then giving the music a human pre-classification (used for training a classifier) via a point on a Thayer's-model mood map. Low-level signal processing features, mel frequency cepstral coefficients, psychoacoustic features and pitch image summary features were then extracted from the samples. These were verified as useful via self-organising maps and ranked via the feature selection techniques of information gain, gain ratio and symmetric uncertainty. The verified features were then used as training and testing data (via cross-validation) for a 3-layer perceptron neural network. Two approaches to feature extraction were used because the first approach performed poorly in self-organising-map-based cluster analysis, and the mood classification scheme was simplified from 25 moods to four. The main difference between the two approaches, however, was the feature extraction window duration and the features used. The second approach's features were used to train the neural network, and classification was performed with accuracy rates no less than 84%. Out of this research comes an understanding of how one human's approximated perception can be captured, and a demonstration of its use for determining mood classifications from music. en_NZ
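The feature-ranking step described in the abstract (scoring candidate features by information gain against mood labels before training the perceptron) can be sketched as below. This is an illustrative reconstruction under stated assumptions, not the dissertation's code: the feature names, mood labels, and discretisation are invented for the example, and only the information-gain criterion (one of the three techniques named in the abstract) is shown.

```python
# Hypothetical sketch of information-gain feature ranking for mood labels.
# All feature/label names here are illustrative, not from the dissertation.
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy (bits) of a list of discrete labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """Reduction in label entropy after splitting on a discrete feature."""
    n = len(labels)
    cond = 0.0
    for v in set(feature_values):
        subset = [lab for fv, lab in zip(feature_values, labels) if fv == v]
        cond += (len(subset) / n) * entropy(subset)
    return entropy(labels) - cond

# Toy data: two discretised audio features and a simplified four-quadrant
# mood label set (the abstract reduces 25 moods to four).
moods    = ["exuberant", "exuberant", "anxious", "anxious", "calm", "calm"]
features = {
    "tempo":    ["fast", "fast", "fast", "fast", "slow", "slow"],  # separates calm
    "loudness": ["loud", "soft", "loud", "soft", "loud", "soft"],  # uninformative
}

# Rank features by information gain, best first.
ranked = sorted(features, key=lambda name: information_gain(features[name], moods),
                reverse=True)
print(ranked)  # ['tempo', 'loudness']
```

In the dissertation's pipeline the surviving top-ranked features would then form the training inputs for the 3-layer perceptron; gain ratio and symmetric uncertainty are normalised variants of the same quantity.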
dc.subject: detection of mood en_NZ
dc.subject: mel frequency cepstral coefficient en_NZ
dc.subject: perceptron neural network en_NZ
dc.subject: mood classifications from music en_NZ
dc.subject.lcsh: T Technology (General) en_NZ
dc.subject.lcsh: Q Science (General) en_NZ
dc.subject.lcsh: M Music en_NZ
dc.title: Automatic mood detection from electronic music data en_NZ
otago.school: Information Science en_NZ
thesis.degree.name: Bachelor of Commerce with Honours en_NZ
thesis.degree.grantor: University of Otago en_NZ
otago.collection: Dissertations en_NZ
otago.openaccess: Abstract Only
dc.identifier.eprints: 383 en_NZ
dc.description.references:
Ahrendt, P., Meng, A., Larsen, J. "Decision time horizon for music genre classification using short time features". Submitted for EUSIPCO, 2004.
Bishop, C. M. "Neural Networks for Pattern Recognition". Oxford University Press, 1995.
Cheng, K., Nazer, B., Uppuluri, J., Verret, R. "Beat This: A Beat Synchronization Project". Owlnet Group, Rice University, 2003. (Retrieved on 7 May 2004 from )
Dahlhaus, C. (trans. Gjerdingen, R. O.). "Studies in the Origin of Harmonic Tonality". Princeton University Press, ISBN 0691091358, 1990.
Demuth, H. and Beale, M. "Neural Network Toolbox for Use with Matlab Documentation". MathWorks, 1998.
Deva, B. C. "Psychoacoustics of Music and Speech". I. M. H. Press Ltd, 1967.
Golub, S. "Classifying recorded music". Unpublished masters thesis, University of Edinburgh, 2000. (Retrieved 28 May 2004 from )
Grimaldi, M., Cunningham, P., Kokaram, A. "An Evaluation of Alternative Feature Selection Strategies and Ensemble Techniques for Classifying Music". Workshop in Multimedia Discovery and Mining, ECML/PKDD03, Dubrovnik, Croatia, September 2003.
Haykin, S. "Neural Networks: A Comprehensive Foundation". Prentice Hall, Upper Saddle River, N.J., 1999.
Healey, J., Picard, R., and Dabek, F. "A new affect-perceiving interface and its application to personalized music selection". Technical Report 478, Massachusetts Institute of Technology, Media Laboratory Perceptual Computing Section, 1998.
Huron, D. and Aarden, B. "Cognitive Issues and Approaches in Music Information Retrieval". In S. Downie and D. Byrd (eds.), 2002. (Retrieved on 7 May 2004 from http:// )
Juslin, P. N. "Cue utilization in communication of emotion in music performance: Relating performance to perception". Journal of Experimental Psychology, 26, pp. 1797-1813, 2000.
Kohonen, T. "Self-Organising Maps", Second Edition. Springer, 2001.
Krumhansl, C. L. "Cognitive Foundations of Musical Pitch". Oxford Psychology Series 17, Oxford University Press, New York and Oxford, 1990.
Larsen, J. "Introduction to Artificial Neural Networks". IMM, 1999.
Leman, M., Lesaffre, M., Tanghe, K. "An Introduction to the IPEM Toolbox for Perception-Based Music Analysis". Mikropolyphonie - The Online Contemporary Music Journal, Volume 7, 2001.
Yu, L. and Liu, H. "Efficiently handling feature redundancy in high-dimensional data". Proceedings of the Ninth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Washington, D.C., August 24-27, 2003.
Liu, D., Lu, L. and Zhang, H. J. "Automatic mood detection from acoustic music data". International Symposium on Music Information Retrieval, Baltimore, Maryland, USA, 2003.
Logan, B. "Mel Frequency Cepstral Coefficients for Music Modeling". In Proc. of the International Symposium on Music Information Retrieval 2000, Plymouth, USA, October 2000.
Lyons, A. "Synaesthesia - A Cognitive Model of Cross Modal Association". Consciousness, Literature and the Arts 2, 2001.
McKinney, M. F. and Breebaart, J. "Features for Audio and Music Classification". 4th International Conference on Music Information Retrieval, 2003.
Metois, E. "Musical Sound Information: Musical Gestures and Embedding Systems". PhD thesis, MIT Media Lab, 1996.
Nyquist, H. "Certain topics in telegraph transmission theory". Trans. AIEE, vol. 47, pp. 617-644, April 1928.
Pampalk, E., Rauber, A., Merkl, D. "Content-based organization and visualization of music archives". ACM Multimedia, pp. 570-579, 2002.
McNab, R., Smith, L., Witten, I., Henderson, C. "Tune Retrieval in the Multimedia Library". Multimedia Tools and Applications, 10(2-3), pp. 113-132, 2000.
Schmidt, A. and Stone, T. "Music Classification and Identification System". University of Colorado. (Retrieved on 7 May 2004 from http://www.flwvd.dhs.org/school/MusicRecognitionDatabase.pdf)
Schubert, E., Wolfe, J. and Tarnopolsky, A. "Spectral centroid and timbre in complex, multiple instrumental textures". International Conference on Music Perception and Cognition, Northwestern University, Illinois, pp. 654-657, 2004.
Scott, P. and Widrow, B. "Music Classification using Neural Networks". Stanford University. (Retrieved on 7 May 2004 from )
Shannon, C. "Communication in the presence of noise". Proc. Institute of Radio Engineers, vol. 37, no. 1, pp. 10-21, January 1949.
Slaney, M. "Auditory Toolbox" (Tech. Rep. No. 1998-010). Interval Research Corporation, 1998. (Retrieved 28 May 2004 from )
Sondhi, M. M. "New Methods of Pitch Extraction". IEEE Trans. Audio and Electroacoustics, Vol. AU-16, No. 2, pp. 262-266, June 1968.
Li, T., Ogihara, M. and Li, Q. "A comparative study on content-based music genre classification". In Proc. ACM SIGIR '03, Toronto, Canada, July 2003, pp. 282-289.
Thayer, R. E. "The Biopsychology of Mood and Arousal". Oxford University Press, New York, 1989.
Tzanetakis, G. and Cook, P. "Musical Genre Classification of Audio Signals". IEEE Transactions on Speech and Audio Processing, 10(5), pp. 293-302, 2002.
Wessel, D. "Timbre Space as a Musical Control Structure". In Curtis Roads (ed.), "Foundations of Computer Music", MIT Press, pp. 640-657, 1997.
Witten, I. and Frank, E. "Data Mining: Practical Machine Learning Tools with Java Implementations". Morgan Kaufmann, San Francisco, 2000.
Feng, Y., Zhuang, Y., Pan, Y. "Music Information Retrieval by Detecting Mood via Computational Media Aesthetics". Web Intelligence, pp. 235-241, 2003.
Zillman, D. "Mood management in the context of selective exposure theory". In M. E. (Ed.), Communication Yearbook 23, Sage, Thousand Oaks, CA, pp. 103-123, 2000.
en_NZ

Files in this item


There are no files associated with this item.

This item is not available in full-text via OUR Archive.

If you would like to read this item, please apply for an inter-library loan from the University of Otago via your local library.

If you are the author of this item, please contact us if you wish to discuss making the full text publicly available.

