Object-specific semantic coding in human perirhinal cortex.
Category-specificity has been demonstrated in the human posterior ventral temporal cortex for a variety of object categories. Although object representations within the ventral visual pathway must be sufficiently rich and complex to support the recognition of individual objects, little is known about how specific objects are represented. Here, we used representational similarity analysis to determine which kinds of object information are reflected in fMRI activation patterns and to uncover the relationship between categorical and object-specific semantic representations. Our results show a gradient of informational specificity along the ventral stream, from representations of image-based visual properties in early visual cortex to categorical representations in the posterior ventral stream. A key finding was that object-specific semantic information is uniquely represented in the perirhinal cortex, which was also increasingly engaged for objects that are more semantically confusable. These findings suggest a key role for the perirhinal cortex in representing and processing object-specific semantic information, a function that is most critical for highly confusable objects. Our findings extend current distributed models by showing coarse dissociations between objects in posterior ventral cortex, and fine-grained distinctions between objects supported by the anterior medial temporal lobes, including the perirhinal cortex, which serve to integrate complex object information.
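
The core logic of representational similarity analysis can be sketched as follows. This is a generic illustration, not the authors' analysis pipeline: the data here are random placeholders, and the array sizes (20 objects, 100 voxels, 50 model features) are arbitrary assumptions. The idea is to compute a representational dissimilarity matrix (RDM) over object pairs for both the fMRI activation patterns and a candidate model (e.g., visual or semantic features), then compare the two RDMs with a rank correlation.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rdm(patterns):
    """Representational dissimilarity matrix as a condensed vector:
    1 - Pearson correlation between every pair of row patterns
    (rows = conditions/objects, columns = voxels or model features)."""
    return pdist(patterns, metric="correlation")

# Placeholder data (hypothetical): 20 objects x 100 voxels for the brain
# region, 20 objects x 50 dimensions for a candidate feature model.
rng = np.random.default_rng(0)
brain_patterns = rng.normal(size=(20, 100))
model_features = rng.normal(size=(20, 50))

brain_rdm = rdm(brain_patterns)   # 20*19/2 = 190 pairwise dissimilarities
model_rdm = rdm(model_features)

# Second-order comparison: rank-correlate the two RDMs. A high rho means
# the region's activation geometry mirrors the model's feature geometry.
rho, p = spearmanr(brain_rdm, model_rdm)
print(f"RSA correlation: rho={rho:.3f}")
```

In practice one would compute such model correlations in a searchlight or region-of-interest analysis across the ventral stream, which is how a gradient from visual to categorical to object-specific semantic information can be mapped.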