Clothes Retrieval Using M-AlexNet With Mish Function and Feature Selection Using Joint Shannon’s Entropy Pearson’s Correlation Coefficient
Blog Article
The online retrieval of clothes-related images is crucial because finding exact items matching a query image in a large collection is highly challenging. However, significant variations in clothes images degrade visual search retrieval accuracy. Another problem affecting retrieval accuracy is the high dimensionality of the feature vectors obtained from pre-trained deep CNN models. This research aims to enhance clothes retrieval training and test accuracy by two means.
First, features are extracted using a modified AlexNet (M-AlexNet), in which the ReLU activation function is replaced with the self-regularized Mish activation function because of its non-monotonic nature. M-AlexNet with Mish is trained on the CIFAR-10 dataset using the SoftMax classifier. The second contribution is reducing the dimensionality of the feature vectors obtained from M-AlexNet.
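The Mish activation mentioned above is defined as x · tanh(softplus(x)); unlike ReLU it is smooth and non-monotonic, dipping slightly below zero for small negative inputs. A minimal NumPy sketch (the function itself, independent of any particular network):

```python
import numpy as np

def mish(x):
    """Mish activation: x * tanh(softplus(x)).

    np.logaddexp(0, x) computes softplus log(1 + e^x) in a
    numerically stable way, avoiding overflow for large x.
    """
    return x * np.tanh(np.logaddexp(0.0, x))
```

Note the non-monotonic dip: for example, `mish(-1.0)` is more negative than `mish(-3.0)`, so the function decreases and then rises again on the negative axis, which is the property the article highlights as the reason for replacing ReLU.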
Feature dimensionality is reduced by selecting the top $k$-ranked features and removing redundant ones using the proposed Joint Shannon’s Entropy Pearson’s Correlation Coefficient (JSE-PCC) technique, which enhances clothes retrieval performance. To evaluate the efficacy of the suggested methods, comparisons are performed against other deep CNN models such as baseline AlexNet, VGG-16, VGG-19, and ResNet50 on DeepFashion2, MVC, and the proposed Clothes Image Dataset (CID). Extensive experiments indicate that M-AlexNet with Mish attains 85.15%, 82.04%, and 83.65% accuracy on the DeepFashion2, MVC, and CID datasets, respectively. Hence, M-AlexNet with the proposed feature selection technique surpasses the baseline results by margins of 5.11% on DeepFashion2, 1.95% on MVC, and 3.51% on CID.
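The article does not spell out the JSE-PCC scoring formula, but the stated idea, rank features by information content and prune redundant ones, can be sketched as follows. This is an illustrative combination of per-feature Shannon entropy ranking with Pearson-correlation-based redundancy removal, not the paper's exact method; the histogram bin count and correlation threshold are assumptions chosen for the example:

```python
import numpy as np

def select_features(X, k, corr_thresh=0.9, bins=16):
    """Sketch of entropy-ranked, correlation-pruned feature selection.

    X           : (n_samples, n_features) feature matrix.
    k           : number of features to keep.
    corr_thresh : assumed redundancy cutoff on |Pearson correlation|.
    bins        : assumed histogram bin count for entropy estimation.
    Returns the indices of the selected feature columns.
    """
    n, d = X.shape

    def shannon_entropy(col):
        # Estimate Shannon entropy from a discretized histogram.
        hist, _ = np.histogram(col, bins=bins)
        p = hist / hist.sum()
        p = p[p > 0]
        return -(p * np.log2(p)).sum()

    # Rank all features by entropy, highest (most informative) first.
    scores = np.array([shannon_entropy(X[:, j]) for j in range(d)])
    order = np.argsort(scores)[::-1]

    kept = []
    for j in order:
        if len(kept) == k:
            break
        # Keep a feature only if it is not strongly Pearson-correlated
        # with any feature already selected.
        if all(abs(np.corrcoef(X[:, j], X[:, i])[0, 1]) < corr_thresh
               for i in kept):
            kept.append(j)
    return kept
```

For instance, if one column of `X` duplicates another, their Pearson correlation is 1.0, so at most one of the pair survives, which matches the article's goal of dropping redundant dimensions before retrieval.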