Authors: Cherednichenko, O. Yu.; Vovk, M.; Ivashchenko, O.
Issued: 2020
Citation: Cherednichenko, O., Vovk, M., & Ivashchenko, O. (2020). Item Matching Based on Collection and Processing Customer Perception of Images. ICT in Education, Research and Industrial Applications. Integration, Harmonization and Knowledge Transfer (ICTERI 2020): proceedings of the 16th International Conference. CEUR Workshop Proceedings, 2732, 329–337.
ISSN: 1613-0073
URI: https://dspace.mipolytech.education/handle/mip/215

Abstract: The number of sellers and goods sold on e-marketplaces is growing, so the volume of data stored and processed by e-commerce information systems is increasing drastically. The development of high-performance solutions is therefore highly relevant. This paper presents an approach to item matching based on human perception of item images. The main goal of the study is to build a model for assessing the similarity of items. The paper describes a software product for comparing product images collected on online trading platforms, in which the user evaluates the products visually. The developed software implements crowdsourced data collection based on the comparator identification method. This method involves an experiment in which the user is shown two images and asked to compare them, yielding a binary reaction.
The results demonstrate the promise of the mobile client application as part of an item-matching system that aims to optimize product search on the Internet.

Language: en
Keywords: product matching; crowdsourcing; mobile application; customer perception; comparator identification
Title: Item Matching Based on Collection and Processing Customer Perception of Images
Type: Article
ORCID: https://orcid.org/0000-0002-9391-5220
ORCID: https://orcid.org/0000-0003-4119-5441
ORCID: https://orcid.org/0000-0003-3636-3914
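The abstract does not give implementation details, but the data-collection scheme it describes (a user is shown a pair of images and produces a binary reaction) can be illustrated with a minimal sketch. The function name, pair representation, and the idea of treating the fraction of "same item" reactions as a similarity estimate are all assumptions for illustration, not the authors' actual method:

```python
from collections import defaultdict

def aggregate_reactions(reactions):
    """Aggregate binary user reactions into a per-pair similarity score.

    reactions: iterable of (image_a, image_b, same) tuples, where `same`
    is the user's binary reaction (True = the images show the same item).
    Returns a dict mapping each unordered image pair to the fraction of
    reactions that judged the pair to be the same item.
    """
    votes = defaultdict(lambda: [0, 0])  # pair -> [same_count, total]
    for a, b, same in reactions:
        pair = tuple(sorted((a, b)))     # presentation order is irrelevant
        votes[pair][0] += int(same)
        votes[pair][1] += 1
    return {pair: same_n / total for pair, (same_n, total) in votes.items()}

# Hypothetical example: three crowdsourced comparisons of two product images
reactions = [
    ("img_001.jpg", "img_002.jpg", True),
    ("img_002.jpg", "img_001.jpg", True),
    ("img_001.jpg", "img_002.jpg", False),
]
similarity = aggregate_reactions(reactions)
# similarity[("img_001.jpg", "img_002.jpg")] == 2/3
```

In a real deployment the reactions would arrive from the mobile client application described in the paper; here they are hard-coded purely to show the aggregation step.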