A Semantic Classification of Images by Predicting Emotional Concepts from Visual Features

Authors

  • Tamil Priya.D
  • Divya Udayan.J

Abstract

Classifying images by emotion is a non-trivial task and remains a challenging issue for the Content-Based Image Retrieval (CBIR) technique within the framework of an Emotion-Based Image Retrieval (EBIR) system. This paper discusses an emotion prediction system that automatically predicts the semantic meaning of human emotion from visual features, easing the process of feature extraction. The emotion evoked by an image is subjective, and this subjectivity plays a major role in the image retrieval process. The main motivation of this study is to predict human emotion from an image. In the proposed work, color, texture, and shape are used as feature concepts to predict the emotional semantics associated with an image. Features are extracted with the dominant color structure descriptor for color, the homogeneous texture descriptor (edge histogram) for texture, and a region-based or contour-based descriptor (shape spectrum) for shape. A deep convolutional neural network (Deep CNN) then classifies images into six basic emotion classes (anger, disgust, fear, happiness, sadness, and surprise) using color emotion factors. Real-world wallpaper, textile, and painting databases are used for training and testing. Experimental results of the approach are evaluated and compared against other classifier algorithms using precision, recall, and accuracy. The prediction system also measures arousal, valence, and dominance values in order to assess the performance of the EBIR system.
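As a rough illustration of the classification stage described in the abstract, the sketch below shows a minimal deep CNN that maps an RGB image to scores over the six basic emotion classes. This is a hypothetical PyTorch example, not the architecture reported in the paper; the layer sizes, input resolution, and class ordering are assumptions made only for illustration.

```python
import torch
import torch.nn as nn

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]  # assumed ordering

class EmotionCNN(nn.Module):
    """Minimal CNN sketch: convolutional feature extractor + linear classifier."""
    def __init__(self, num_classes: int = 6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),   # RGB input -> 32 feature maps
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                       # global average pooling
            nn.Flatten(),
            nn.Linear(64, num_classes),                    # logits for the 6 emotion classes
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = EmotionCNN()
    batch = torch.randn(4, 3, 128, 128)                   # 4 RGB images, 128x128 (assumed size)
    probs = torch.softmax(model(batch), dim=1)
    print(EMOTIONS[int(probs[0].argmax())])                # predicted emotion for the first image
```

In a full pipeline, the network would be trained with cross-entropy loss on the labeled wallpaper, textile, and painting images, and predictions would be scored with precision, recall, and accuracy as described above.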

Published

2019-11-22

Section

Articles