Introduction: Recent research has reported that adding non-existent diacritical marks to a word produces a minimal reading cost compared to the intact word. Here we examined whether this minimal reading cost is due to: (1) the resilience of letter detectors to perceptual noise (i.e., the cost would be small and comparable for words and nonwords) or (2) top-down lexical processes that normalize the percept for words (i.e., the cost would be larger for nonwords).

Methods: We designed a letter detection experiment in which a target stimulus (either a word or a nonword) was presented intact or with extra non-existent diacritics [e.g., amigo (friend) vs. ãmîgô; agimo vs. ãgîmô]. Participants had to decide which of two letters was in the stimulus (e.g., A vs. U).

Results: Although the task involved lexical processing, with responses being faster and more accurate for words than for nonwords, we found only a minimal advantage in error rates for intact stimuli versus those with non-existent diacritics. This advantage was similar for words and nonwords.

Discussion: The letter detectors in the word recognition system appear to be resilient to non-existent diacritics without the need for feedback from higher levels of processing.