With 40,000 words in the average vocabulary, how can speakers find the specific words that they want so quickly and easily? Cumulative semantic interference in language production provides a clue: when naming a large series of pictures, with a few mammals sprinkled about, naming each subsequent mammal becomes slower and more error-prone. Such interference mirrors predictions from an incremental learning algorithm applied to meaning-driven retrieval from an established vocabulary, suggesting retrieval benefits from a constant, implicit, re-optimization process (Oppenheim et al., 2010). But how quickly would a new mammal (e.g. paca) engage in this re-optimization? In this experiment, 18 participants studied 3 novel and 3 familiar exemplars from each of six semantic categories, and immediately performed a timed picture-naming task. Consistent with the learning model's predictions, naming latencies revealed immediate cumulative semantic interference in all directions: from new words to new words, from new words to old words, from old words to new words, and from old words to old words. Repeating the procedure several days later produced similar-magnitude effects, demonstrating that newly acquired words can be immediately semantically integrated, at least to the extent necessary to produce typical cumulative semantic interference. These findings extend the Dark Side model's scope to include novel word production, and are considered in terms of mechanisms for lexical selection.
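The error-driven learning idea behind cumulative semantic interference can be illustrated with a toy simulation. The sketch below is loosely inspired by the incremental learning account cited above, not a reimplementation of the published model: the word set, parameters (learning rate, initial weights), and the retrieval-difficulty proxy are all illustrative assumptions. Each word shares a category feature ("mammal") with its semantic neighbors; naming one word strengthens feature-to-target connections and weakens feature-to-competitor connections, so each subsequent category member faces stronger competition.

```python
# Toy sketch of cumulative semantic interference via error-driven learning.
# All names, parameters, and the latency proxy are illustrative assumptions.

LR = 0.2       # learning rate (assumed)
BASE_W = 0.3   # initial feature -> word connection weight (assumed)

# Each word activates one shared category feature plus one unique feature.
words = ["dog", "cat", "paca", "horse"]
features_of = {w: {"mammal", w} for w in words}
all_features = {f for fs in features_of.values() for f in fs}

# feature -> word weights, initially favoring the correct mapping.
weights = {f: {w: (BASE_W if f in features_of[w] else 0.0) for w in words}
           for f in all_features}

def name_picture(target):
    """Name one picture; return a retrieval-difficulty proxy."""
    active = features_of[target]
    act = {w: sum(weights[f][w] for f in active) for w in words}
    # Latency proxy: total competitor activation relative to the target.
    latency = sum(a for w, a in act.items() if w != target) / act[target]
    # Delta-rule update on the active features: push the target's
    # activation toward 1 and each competitor's toward 0.
    for f in active:
        for w in words:
            desired = 1.0 if w == target else 0.0
            weights[f][w] += LR * (desired - act[w])
    return latency

latencies = [round(name_picture(w), 3) for w in words]
print(latencies)  # proxy rises with each same-category naming
```

Running the sequence of same-category namings yields a monotonically increasing difficulty proxy, the qualitative signature of cumulative semantic interference; a newly learned word like "paca" inherits the same shared-feature competition as established words, which is the integration question the experiment tests.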