Graphic codes across times and cultures consistently share certain visual characteristics. According to the ecological hypothesis, this is because glyphs reflect the input statistics to which our visual system has adapted. We computationally model this hypothesis by employing a drawing-based signaling game between two AI models to explore the factors that affect empirical regularities in the surface form of artificially evolved glyphs and their similarity to human visual signs. In our first experiment, we investigate the effect of the models' perception system on glyph line orientation and symmetry. We find that these characteristics are shaped by the input statistics of the data used to pre-train the models and, to a lesser extent, by canvas shape and the models' architectural properties. Our second experiment analyzes the grapho-phonemic mapping that emerges when we integrate representations learned by a deep learning model trained for speech conversion into our setup.
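The abstract gives no implementation details, so the following is only a minimal, hypothetical sketch of what a two-agent drawing-based signaling game loop can look like in PyTorch. The names Sender, Receiver, NUM_REFERENTS, and CANVAS are assumptions, and the sender here emits raw pixels; a full setup of the kind described would typically use a stroke-based differentiable renderer and a pre-trained perception model on the receiver side.

    # Hypothetical sketch of a drawing-based signaling game: a sender renders a
    # glyph for a referent, a receiver tries to recover the referent from the glyph.
    import torch
    import torch.nn as nn

    NUM_REFERENTS = 10       # assumed number of concepts to communicate
    CANVAS = (1, 28, 28)     # assumed canvas shape (channels, height, width)

    class Sender(nn.Module):
        """Maps a one-hot referent to a rasterized glyph on the canvas (pixel output)."""
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(NUM_REFERENTS, 256), nn.ReLU(),
                nn.Linear(256, CANVAS[1] * CANVAS[2]), nn.Sigmoid(),
            )
        def forward(self, referent_onehot):
            return self.net(referent_onehot).view(-1, *CANVAS)

    class Receiver(nn.Module):
        """Stand-in 'perception system': maps a glyph to logits over referents."""
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Flatten(),
                nn.Linear(CANVAS[1] * CANVAS[2], 256), nn.ReLU(),
                nn.Linear(256, NUM_REFERENTS),
            )
        def forward(self, drawing):
            return self.net(drawing)

    sender, receiver = Sender(), Receiver()
    optim = torch.optim.Adam(list(sender.parameters()) + list(receiver.parameters()), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    for step in range(1000):
        referents = torch.randint(0, NUM_REFERENTS, (32,))
        onehots = nn.functional.one_hot(referents, NUM_REFERENTS).float()
        drawings = sender(onehots)         # sender produces a glyph per referent
        logits = receiver(drawings)        # receiver guesses the referent from the glyph
        loss = loss_fn(logits, referents)  # communication succeeds when the guess matches
        optim.zero_grad()
        loss.backward()
        optim.step()

In a sketch like this, the regularities studied in the paper (line orientation, symmetry, grapho-phonemic structure) would be measured on the glyphs the sender converges to, and the receiver would be swapped for models pre-trained on different input statistics to test the ecological hypothesis.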