Abstract
Subjective complexity judgements play a crucial role in perception. Complexity is a major driver of aesthetic judgements, alongside other cognitive characteristics such as memorability and engagement. Many studies have tried to identify the factors that influence the perceived complexity of images. Despite an overall lack of consensus, these studies suggest that features from different levels of abstraction jointly contribute to complexity. We developed objective computational complexity measures to predict the subjective complexity of synthetic and natural images, and tested them in two studies. In study 1, we generated 2D binary pixel patterns using cellular automata, calculated multiple objective statistical measures, including density, entropy, local spatial complexity, Kolmogorov complexity, intricacy and asymmetry, and related them to subjective ratings of complexity. The low-level measure of local spatial complexity, which assesses the mean information gain of pixel pairs, and the high-level measure of intricacy, which counts the number of connected components in an image, together best predicted subjective complexity. In study 2, we collated four publicly available datasets (RSIVL, VISC, Savoias and IC9600) that report subjective complexity ratings for natural images, including scenes and artworks. We adopted an object-centric approach, quantifying complexity as a function of the objects in an image. We found that subjective complexity was well explained by a low-level measure, the number of objects, extracted using a hierarchical segmentation model (SAM), and a high-level measure, the number of named classes in the image, extracted using a semantic segmentation model (FC-CLIP). Across both studies, we applied algorithmic, information-theoretic and deep-learning-based methods to quantify the complexity of images. Through our work, we advocate for the use of similar computational methods in the aesthetics community, enabling large-scale, reproducible and broadly interpretable analyses of experimental data.
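To make the study 1 measures concrete, the sketch below shows one plausible operationalisation of the two best-performing predictors for a binary pixel pattern: intricacy as a count of connected components (here counting both foreground and background regions under 4-connectivity, an assumption not specified in the abstract) and local spatial complexity as the mean information gain, read here as the conditional entropy of a pixel given an adjacent pixel. This is an illustrative sketch, not the paper's exact implementation; the function names and design choices are ours.

```python
import numpy as np
from scipy import ndimage


def intricacy(pattern: np.ndarray) -> int:
    """Count connected components in a 2D binary pattern.

    Assumption: both foreground and background regions are counted,
    using 4-connectivity.
    """
    n_fg = ndimage.label(pattern)[1]       # regions of ones
    n_bg = ndimage.label(1 - pattern)[1]   # regions of zeros
    return n_fg + n_bg


def local_spatial_complexity(pattern: np.ndarray) -> float:
    """Mean information gain H(B | A), in bits, over horizontally and
    vertically adjacent pixel pairs (A, B) of a binary pattern.

    Assumption: 'mean information gain of pixel pairs' is read as the
    conditional entropy of a pixel given one adjacent neighbour.
    """
    pairs = np.concatenate([
        np.stack([pattern[:, :-1].ravel(), pattern[:, 1:].ravel()], axis=1),  # horizontal pairs
        np.stack([pattern[:-1, :].ravel(), pattern[1:, :].ravel()], axis=1),  # vertical pairs
    ])
    # Joint distribution over the four possible binary pair values.
    joint = np.zeros((2, 2))
    for a, b in pairs:
        joint[a, b] += 1
    joint /= joint.sum()
    marginal = joint.sum(axis=1)
    # H(B | A) = H(A, B) - H(A)
    h_joint = -np.sum(joint[joint > 0] * np.log2(joint[joint > 0]))
    h_marg = -np.sum(marginal[marginal > 0] * np.log2(marginal[marginal > 0]))
    return h_joint - h_marg


# Example on a random 32x32 binary pattern (stand-in for a cellular-automaton output).
rng = np.random.default_rng(0)
pattern = (rng.random((32, 32)) > 0.5).astype(int)
print(intricacy(pattern), local_spatial_complexity(pattern))
```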