Music plays a significant role in improving and elevating one's mood, as it is an important source of entertainment and inspiration. Recent studies have shown that humans respond and react to music positively, and that music has a strong impact on brain activity. Nowadays, people often prefer to listen to music that matches their mood and interests. This work presents a system that suggests songs to users based on their state of mind. In this system, computer vision components determine the user's emotion from facial expressions. Once the emotion is recognized, the system suggests a song suited to it, saving the user the time otherwise spent selecting and playing songs manually. The conventional method of playing music according to a person's mood requires human interaction; migrating to computer vision technology enables automation of this process. To achieve this goal, an algorithm classifies human facial expressions and plays a music track matching the detected emotion, reducing the effort and time needed to manually search a song list based on one's present state of mind. Facial features are extracted using the Haar Cascade algorithm, and the person's expressions are classified with a CNN. An inbuilt camera captures the facial expressions, which reduces the design cost of the system compared to other methods.
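The pipeline the abstract describes (camera frame → Haar cascade face detection → CNN emotion classification → song suggestion) can be sketched roughly as below. This is a minimal illustration, not the authors' implementation: the emotion labels, the playlist contents, and the stubbed-out classifier are all assumptions, since the abstract does not specify them, and the real system would run a trained CNN on the face region cropped by the detector.

```python
# Illustrative sketch of an emotion-to-music recommender.
# In the full system, a frame from the inbuilt camera would pass through
# OpenCV's Haar cascade face detector (cv2.CascadeClassifier) and the
# cropped face would be fed to a trained CNN; both steps are stubbed here.

# Hypothetical emotion-to-playlist mapping (labels and songs are assumptions).
EMOTION_PLAYLISTS = {
    "happy":   ["Upbeat Track 1", "Upbeat Track 2"],
    "sad":     ["Calm Track 1", "Calm Track 2"],
    "angry":   ["Soothing Track 1"],
    "neutral": ["Ambient Track 1"],
}

def classify_emotion(face_pixels):
    """Placeholder for the CNN classifier.

    A real implementation would normalize the cropped face region and
    run it through a trained network returning one of the labels above.
    Here it returns a fixed label so the pipeline is runnable.
    """
    return "happy"

def suggest_song(face_pixels):
    """Map the detected emotion to the first song of its playlist,
    falling back to a neutral playlist for unknown labels."""
    emotion = classify_emotion(face_pixels)
    playlist = EMOTION_PLAYLISTS.get(emotion, EMOTION_PLAYLISTS["neutral"])
    return playlist[0]

print(suggest_song(None))  # → Upbeat Track 1
```

Swapping the stub for an actual detector-plus-CNN front end would not change the recommendation logic, which is why the mapping step is shown separately here.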
IRJIET, Volume 7, Issue 5, May 2023 pp. 273-277