Publication Date

2009

Degree Type

Master's Project

Degree Name

Master of Science (MS)

Department

Computer Science

Abstract

With the ever-growing amount of digital information and multimedia on the World Wide Web and the current trend toward personalized technology, users want a more intuitive way of finding related information, and not just any information but relevant information that is personal to them. One way to personalize and filter information is to extract its mood (affect), allowing the user to search based on their current mood. The artificial intelligence field has done extensive research and continues to discover and improve mood extraction techniques for each distinct medium. This paper explores how to link and integrate mood extraction across several distinct media, audio, image, and text, by utilizing a common emotion model that is customizable to the user. The project allows the user to provide an input in one medium and find a matching output in a different medium based on default settings or user customization.
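The abstract's core idea, per-medium mood extractors feeding a shared emotion model so that an input in one medium can be matched to output in another, can be sketched as follows. This is a minimal illustrative sketch, not the project's actual implementation: the valence/arousal emotion model, the toy keyword-based text extractor, and the pre-tagged library are all assumptions chosen for brevity.

```python
# Hypothetical sketch: linking per-medium mood extraction through a
# common emotion model (valence/arousal), then matching an input's
# mood to the closest item of a different medium. All names, the toy
# extractor, and the example library are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class Emotion:
    valence: float  # unpleasant (-1.0) .. pleasant (+1.0)
    arousal: float  # calm (0.0) .. excited (1.0)

def distance(a: Emotion, b: Emotion) -> float:
    """Euclidean distance in the shared emotion space."""
    return ((a.valence - b.valence) ** 2 + (a.arousal - b.arousal) ** 2) ** 0.5

def mood_from_text(text: str) -> Emotion:
    """Toy text extractor: keyword counts mapped into the common model."""
    words = text.lower().split()
    pos = sum(w in ("joy", "happy", "love") for w in words)
    neg = sum(w in ("sad", "gloomy", "lonely") for w in words)
    total = max(pos + neg, 1)
    return Emotion(valence=(pos - neg) / total, arousal=0.5)

# A small library of items in other media, pre-tagged with emotions
# (in the real system these tags would come from audio/image extractors).
LIBRARY = {
    "audio/upbeat_song.mp3":  Emotion(valence=0.8, arousal=0.9),
    "audio/slow_ballad.mp3":  Emotion(valence=-0.6, arousal=0.2),
    "image/sunny_beach.jpg":  Emotion(valence=0.9, arousal=0.5),
    "image/rainy_street.jpg": Emotion(valence=-0.5, arousal=0.3),
}

def match(input_mood: Emotion, medium: str) -> str:
    """Return the library item of the requested medium closest in mood."""
    candidates = {k: v for k, v in LIBRARY.items() if k.startswith(medium)}
    return min(candidates, key=lambda k: distance(input_mood, candidates[k]))

mood = mood_from_text("a happy day full of joy and love")
print(match(mood, "audio"))  # nearest audio item in the emotion space
```

Because every extractor targets the same emotion space, adding a new medium only requires a new extractor function; the matching step is unchanged, which is the advantage of the common-model design the abstract describes.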