Neuroanatomy for Speech-Language Pathology and Audiology 2nd Ed. Matthew H Rouse

Chapter 12. The Neurology of Language

CHAPTER PREVIEW

Language is a code we use to express ourselves through speech or writing. We not only express language but also receive it through listening/comprehending and reading. Language is complex, and many brain regions appear to be involved in our production and comprehension of it. We will survey the neural substrates of language in this chapter.

IN THIS CHAPTER

In this chapter, we will . . .

 Define language

 Discuss three components of language—content, form, and use

 Survey how language comprehension and production may be processed in the neurological system

 Survey several language disorders, including aphasia, alexia, and agraphia

LEARNING OBJECTIVES

1. The learner will define language.

2. The learner will list and define the components of language.

3. The learner will outline how language is thought to be neurologically processed in comprehending, reading, speaking, and writing.

4. The learner will describe the following disorders: aphasia, alexia, and agraphia.

CHAPTER OUTLINE

 Introduction

 A Definition of Language

 The Components of Language

 The Neural Basis of Language

 Auditory Comprehension of Language

 Visual Comprehension of Language

 The Oral Production of Language

 The Written Expression of Language

 Language Disorders

 Aphasia

 Alexia

 Agraphia

 Conclusion

 Summary of Learning Objectives

 Key Terms

 Draw It to Know It

 Questions for Deeper Reflection

 Case Study

 Suggested Projects

 References

► Introduction

There are an estimated 6,000 to 7,000 different languages in the world. Students spend numerous hours learning at least one language in addition to their native language; thus, language is not only something we use every day but also something we think about intentionally as we learn other languages. What is language? The purpose of this chapter is to define language, sketch out its components, discuss the neurology of language, and survey several language disorders.

► A Definition of Language

There are eight main characteristics of all languages to consider before defining what we mean by the term language (FIGURE 12-1).

1. Language is a code; it is a system of symbols used for transmitting messages.

2. Language is used to represent ideas about the world. If person A wanted to communicate with person B, A could bring all the objects necessary and point to them to convey a thought, but language makes this task much easier. We use words to represent these real objects; thus we do not need objects at hand to talk about them.

3. Language is conventional, meaning that it is shared by a speaking community. A nonconventional language is a dead language, such as Latin.

4. Language is systematic. Languages have rules and regulations for how the code is arranged, which makes them predictable and learnable.

5. Languages use mostly arbitrary symbols to communicate with others. For example, the word dog does not look like a dog or sound like a dog. This word is an arbitrary set of phonemes that is understood to represent a furry, domesticated, four-legged creature that barks. (Note: onomatopoeias are an exception to this [e.g., the word buzz represents the sound “buzz”].)

6. Languages are generative in that speakers are continually creating novel utterances. For example, when I teach my classes, what I say one semester is never exactly what I say the next semester; I am always creating new sentences even though the topics are the same semester after semester.

7. Languages are dynamic in that they change over time. Old English developed into Middle English, which then went through the Great Vowel Shift and emerged as Modern English.

8. Languages have universal characteristics. All languages have nouns, verbs, adjectives, and, most of all, rules.

Taking all these characteristics into consideration, a working definition of language can be developed before discussing the neural basis of language. The following is an adaptation of Bloom and Lahey’s (1978) definition of language: Language is a generative and dynamic code containing universal characteristics whereby ideas about the world are expressed through a conventional system of arbitrary symbols for communication. Having defined language, it is now time to discuss the basic components of language before taking on the difficult task of outlining a neurology of language.

► The Components of Language

Language consists of three basic components: content, form, and use (FIGURE 12-2). Content (i.e., semantics) refers to word meaning. For example, if one says the word dog, there is meaning attached to this word (i.e., a furry, domesticated, four-legged creature that barks). Language form, or grammar, refers to the shape or structure of language. This involves the language's sound structure (phonology), word structure (morphology), and phrase/sentence structure (syntax). Lastly, language use, also known as pragmatics, refers to how we practically use language with other human beings. The two primary ways are through monologues (e.g., telling a story) and through dialogues, or conversation. Just from laying out these three components and some of their subcomponents, one can see the incredible complexity of language. It is simply amazing that humans can perform such a complex feat, often with little thought about how to do it. We just open our mouths and it usually flows out (BOX 12-1).
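For readers who like to study from structured notes, the three components and their subcomponents can be collected into a small sketch. This is purely a mnemonic device; the labels come from the chapter, while the dictionary layout and the name `LANGUAGE_COMPONENTS` are invented for illustration:

```python
# A study-aid summary of the three components of language.
# Terminology follows Bloom and Lahey (1978) as presented in the chapter;
# the data structure itself is only an illustrative mnemonic.
LANGUAGE_COMPONENTS = {
    "content": {"also_known_as": "semantics", "concerns": "word meaning"},
    "form": {
        "also_known_as": "grammar",
        "subcomponents": {
            "phonology": "sound structure",
            "morphology": "word structure",
            "syntax": "phrase/sentence structure",
        },
    },
    "use": {
        "also_known_as": "pragmatics",
        "concerns": "monologue (e.g., storytelling) and dialogue (conversation)",
    },
}

# Example lookup: which component does syntax fall under?
for component, details in LANGUAGE_COMPONENTS.items():
    if "syntax" in details.get("subcomponents", {}):
        print(component)  # prints "form"
```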

FIGURE 12-1 Characteristics of language.

FIGURE 12-2 The three components of language.

► The Neural Basis of Language

How does the brain comprehend and produce language? The short answer is no one really understands all the inner workings of how the brain does this. The longer answer is that although a comprehensive understanding of the brain’s language functioning still eludes neuroscientists, there is evidence of brain regions important to language skills.

If one were to peruse textbooks on neuroanatomy, one would find references to the Wernicke-Geschwind model of language processing, named after Karl Wernicke (1848-1905) and Norman Geschwind (1926-1984). This model uses a connectionist framework, connecting the following structures in language tasks: Broca's area, the arcuate fasciculus, Wernicke's area, the angular gyrus, and the supramarginal gyrus. It has to be said up front that this model is woefully inadequate in explaining the complex process of language comprehension and expression, as will be explained later. However, it, along with newer research on the ventral and dorsal streams of language, is useful for getting a general sense of the neurological flow of language during various language tasks.

Important language regions of the cortex have been known since the time of Wernicke. These include the inferior frontal gyrus (Brodmann areas [BAs] 44, 45), the superior temporal gyrus (BAs 41, 42, 22), some of the middle temporal gyrus (BAs 20, 21, 37, 38), and the inferior parietal lobe (BAs 39, 40). Together, these are known as the perisylvian region because they all border the sylvian fissure, also known as the lateral fissure (FIGURE 12-4). These areas, among others, will be discussed in the following sections. We will first focus on receptive language modalities by looking at auditory comprehension and reading. We will then shift our focus to language expression modalities and explore the oral production of language (i.e., verbal formulation) as well as its written expression (TABLE 12-1).

BOX 12-1 Is American Sign Language a Language?

For those who are deaf and for whom language does not always flow out of the mouth, a manual sign language, like American Sign Language (ASL), is the preferred communication modality (FIGURE 12-3). Is a manual communication system like ASL a true language? Does it have content, form, and use? The simple answer is “yes.” It is obvious that ASL has content, or meaning, since each sign conveys meaning to those who observe it. It is also obvious that ASL has use, or pragmatics, since it is used in conversations and in telling stories. The harder question is: Does ASL have form in the way that spoken English has form? The answer is that, yes, it does.

For example, in English phonology, we talk about three major features of consonants: manner, place, and voice. ASL has features akin to these: the configuration of the hand, the movement of the hands/arms in the signing space, and the location of the hands in the signing space. For the sign “mother,” the fingers and thumb are flared out and the thumb taps the chin. The sign's configuration is the fingers and thumb spread out, its movement is the tapping, and its location is the chin. ASL also has both bound and free morphemes, as English does. For example, extra gestures can be added to a sign to alter its meaning (e.g., negation through hand turning). Finally, ASL has syntax because it has rules for ordering signs. Specifically, it uses a subject-verb-object ordering similar to English. In conclusion, ASL has content, form, and use and is thus a language.

FIGURE 12-3 The ASL sign for “mother” or “mom.”

FIGURE 12-4 The perisylvian region. Notice how important language areas congregate around the lateral or Sylvian fissure.

TABLE 12-1 An Overview of Language Modalities

Input: Modalities of Language Reception

Output: Modalities of Language Expression

Auditory comprehension

Verbal formulation

Reading comprehension

Written expression

Auditory Comprehension of Language

Peripheral Auditory System

When someone asks us a question (e.g., “What is your name?”), how is this auditory information processed (FIGURE 12-5)? The auditory information in this question makes its way through the peripheral auditory system first. The auditory components of the question are located and collected by the pinna and then funneled to the tympanic membrane. The sound waves vibrate the tympanic membrane, changing this acoustic energy into mechanical energy. These mechanical vibrations are sent through the ossicles to the oval window of the cochlea. The footplate of the stapes rocks in and out of the window, causing waves in the cochlear perilymph fluid. A second energy change takes place here as mechanical energy becomes hydraulic energy. The perilymphatic waves cause waves in the cochlear duct, which deflect the hair cells in its organ of Corti. As these hair cells are deflected, they depolarize and generate a chemical-electrical signal. The change from hydraulic energy to chemical-electrical energy is the third and final energy change.
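The three energy changes described above form a fixed sequence, which can be sketched as a short pipeline. The code is only a study aid; the structure names and energy labels are taken from the text, while the helper `energy_changes` is invented for illustration:

```python
# The peripheral auditory system as a chain of (structure, energy form) stages,
# following the text: acoustic -> mechanical -> hydraulic -> chemical-electrical.
TRANSDUCTION_CHAIN = [
    ("pinna and ear canal", "acoustic"),
    ("tympanic membrane and ossicles", "mechanical"),
    ("cochlear perilymph", "hydraulic"),
    ("hair cells of the organ of Corti", "chemical-electrical"),
]

def energy_changes(chain):
    """Return each energy conversion as (from_energy, to_energy, site)."""
    return [
        (chain[i][1], chain[i + 1][1], chain[i + 1][0])
        for i in range(len(chain) - 1)
    ]

for src, dst, site in energy_changes(TRANSDUCTION_CHAIN):
    print(f"{src} -> {dst} at the {site}")
```

Running the sketch lists exactly three conversions, matching the "third and final energy change" noted above.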

The cochlear branch of the cochleovestibular nerve (cranial nerve VIII) picks up the chemical-electrical signal produced in the organ of Corti and transmits it to the cochlear nucleus of the brainstem. The cochlear nucleus routes this information through various brainstem structures to the thalamus, which, in turn, routes the signal to the primary auditory cortex (PAC).

Temporal Lobe Processing

In regard to the cerebral cortex, primary analysis of the auditory information (i.e., phonological analysis) begins at the Heschl gyrus, where the PAC is located. This area shows activation to any type of sound (i.e., both speech and nonspeech sounds). The PAC analyzes the primary auditory signal, with the left PAC sensitive to speech sound characteristics (i.e., distinctive features) and the right PAC sensitive to pitch. After this first processing step, the information is sent in two directions. First, it is sent to the planum temporale (PT) on the posterior superior temporal gyrus (i.e., Wernicke's area) via a short-range rostral fiber pathway. Wernicke's area was once thought to be where meaning was attached to speech, but it might act as more of a hub, drawing information from a surrounding network of parietal (BAs 39, 40) and temporal regions (BAs 21, 37) during the meaning-attachment process. Second, the auditory information is sent to the planum polare, which is situated anterior to the Heschl gyrus, via a short-range caudal fiber pathway. (Compare these short-range pathways to the long-range pathways discussed in the next section.) This region may be involved in acoustic analysis, but the planum polare's role is not understood at this time. Phonological processing and semantic processing have been mentioned so far, but what happens in syntactic processing?

FIGURE 12-5 The auditory comprehension of language. 1. The ear converts acoustic energy into electrochemical energy and transmits it to the brainstem's cochlear nuclear complex (CNC) via cranial nerve VIII. 2. The CNC sends this information to the thalamus. 3. The thalamus relays the information to the primary auditory cortex (PAC) for signal processing. 4. The PAC routes to Wernicke's area and other temporal areas for meaning attachment. 5. Wernicke's area projects to Broca's area for higher-level syntactic processing.

Syntax has to do with the rules that guide phrase and sentence structure in a given language. For example, the sentence “I kicked the red ball” is in correct syntactic order for English, but not for Spanish. In Spanish, the sentence would be “I kicked the ball red” because adjectives come after the word they describe in Spanish, but before it in English. Semantics has to do with the meaning of words. Neurologically, what part or parts of the brain appear to be involved in syntactic and semantic processing? Researchers have performed studies examining which parts of the brain are activated when subjects are exposed to grammatically correct and incorrect sentences. From these and other studies, a picture of syntactic processing begins to emerge.

First, the superior temporal gyrus appears to be involved in the processing of syntactic structure. Second, the posterior temporal lobe is thought to be activated in processing a verb and its arguments. An argument involves the syntactic and semantic relationship between a noun phrase and a verb. For example, syntactically intransitive verbs require a subject (e.g., Bill giggled), whereas transitive verbs require a subject and an object (e.g., Bill built a house). In terms of semantics, a subject is an agent (i.e., one who does an action), whereas an object is a patient (i.e., one who undergoes an action).

Connections Between the Frontal and Temporal Lobes

In terms of connectivity within the perisylvian region, there are two sets of pathways—two dorsal pathways and two ventral pathways (FIGURE 12-6). Dorsal pathway 1 connects Wernicke’s area (BA 22) to the premotor cortex (BA 6) via two axonal tracts, the arcuate fasciculus and the superior longitudinal fasciculus.

FIGURE 12-6 Fronto-temporal language regions and their connections via two dorsal and two ventral pathways. Dorsal pathways 1 (DP1) and 2 (DP2) both involve the arcuate fasciculus (AF) and the superior longitudinal fasciculus (SLF). DP1 connects Wernicke's area (BA 22) to the premotor cortex (BA 6), while DP2 connects Wernicke's area to Broca's area (BA 44). Ventral pathway 1 involves the extreme fiber capsule system (EFCS) and connects the superior temporal gyrus (BAs 41, 42) to Broca's area (BA 45). Ventral pathway 2 uses the uncinate fasciculus (UF) to connect the anterior superior temporal gyrus to the frontal operculum.

Dorsal pathway 2 connects Wernicke’s area to the pars opercularis of Broca’s area (BA 44) via the same two tracts associated with dorsal pathway 1. Ventral pathway 1 connects the superior temporal gyrus (i.e., PAC, BA 41, BA 42) to the pars triangularis of Broca’s area (BA 45) using an axonal tract called the extreme fiber capsule system. Finally, ventral pathway 2 connects the anterior superior temporal gyrus to the frontal operculum via the uncinate fasciculus. This dizzying array of connections illustrates one important point: The superior temporal gyrus is highly connected to the inferior frontal gyrus (Friederici & Gierhan, 2013).
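The four pathways are easier to keep straight in tabular form. The following sketch restates the connections as a dictionary; the endpoints and tracts follow the text, while the layout and the name `PATHWAYS` are simply a review aid:

```python
# Fronto-temporal language pathways (after Friederici & Gierhan, 2013).
PATHWAYS = {
    "dorsal 1": {
        "from": "Wernicke's area (BA 22)",
        "to": "premotor cortex (BA 6)",
        "tracts": ["arcuate fasciculus", "superior longitudinal fasciculus"],
    },
    "dorsal 2": {
        "from": "Wernicke's area (BA 22)",
        "to": "Broca's area, pars opercularis (BA 44)",
        "tracts": ["arcuate fasciculus", "superior longitudinal fasciculus"],
    },
    "ventral 1": {
        "from": "superior temporal gyrus (BAs 41, 42)",
        "to": "Broca's area, pars triangularis (BA 45)",
        "tracts": ["extreme fiber capsule system"],
    },
    "ventral 2": {
        "from": "anterior superior temporal gyrus",
        "to": "frontal operculum",
        "tracts": ["uncinate fasciculus"],
    },
}

# Both dorsal pathways share the same two long-range tracts.
assert PATHWAYS["dorsal 1"]["tracts"] == PATHWAYS["dorsal 2"]["tracts"]
```

Every entry runs from the temporal lobe to the inferior frontal region, which is the one important point this dizzying array of connections illustrates.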

The ventral pathways are thought to facilitate the attachment of meaning to sounds and sound combinations, whereas the dorsal pathways support auditory-motor integration. The dorsal and ventral pathways are considered “dual stream,” meaning that information flows back and forth on them.

Frontal Lobe Processing

It appears that when syntax becomes complex, the inferior frontal gyrus is recruited to the comprehension task via the pathways mentioned previously. Broca's area (BAs 44, 45) shows activation during complex syntactic activities. This observation accords with clinical evidence from patients with Broca's aphasia, who not only are expressively agrammatic but also struggle with higher-level comprehension tasks such as passive constructions (e.g., “The leopard was killed by the lion. Which animal died?”). It is possible that Broca's area also contributes working memory to the task of comprehension (Dapretto & Bookheimer, 1999). Working memory is a type of temporary, scratch-pad-like holding space that we use to work out information (e.g., a math problem). Experientially, this seems true: when someone says something complex to us, we need mental space to work it out. Patients with Broca's aphasia would appear to lack this space for complex or odd grammatical constructions.

Visual Comprehension of Language

We can comprehend language through our auditory system, but we can also visually comprehend language through reading. We now turn to the topic of reading to explore how our eyes gather visual information and pass it through the visual pathways to cortical areas for processing and how the left hemisphere then decodes written language (FIGURE 12-7).

FIGURE 12-7 The visual comprehension of language. 1. Visual information is projected to the thalamus's lateral geniculate nucleus via the optic tracts. 2. The thalamus projects back to the occipital lobe's visual areas (BAs 17-19) for visual processing via the geniculocalcarine tract. 3. The visual areas project a dorsal stream of vision (i.e., the "where" of vision) to the parietotemporal reading system and a ventral stream of vision (i.e., the "what" of vision) to the occipitotemporal reading system. 4. An anterior reading system area is activated in silent reading and in decoding infrequently used words.

The Eyes

Our eyes are precious resources that we use to navigate the world around us. We also use them as a first step in the process of reading because they gather written visual information and then pass that information on to the visual pathways.

The eyes have three layers: the external, middle, and inner layers (FIGURE 12-8). The external layer contains a tough, white covering called the sclera and, at the front of the eye, the transparent cornea, which is densely supplied with sensory receptors; a thin mucous membrane, the conjunctiva, covers the sclera's exposed surface. The middle layer of the eye is continuous with the pia mater of the meninges. This layer contains the choroid, whose blood vessels nourish the retina. It also contains the ciliary body, whose muscles shape the lens and control the optics of the eye. The iris is also part of the middle layer. It is made of smooth muscle with an opening in the middle, called the pupil, through which light passes. The inner layer is the retina, which is continuous with the brain through the optic nerve. The retina contains three layers of cells: a photoreceptor layer, a bipolar cell layer, and a ganglion cell layer. Photoreceptors come in two varieties, rods and cones. Rods are most sensitive to blue-green light (not red) and are most useful in low-light situations. Cones come in three varieties: red, green, and blue. Together, cones are sensitive to a wide spectrum of light wavelengths; they provide our color vision and are most useful in bright-light situations.

The photoreceptors transduce light into neural impulses. When light strikes a photoreceptor, it excites a pigment in the photoreceptor. Each pigment is sensitive to a specific wavelength of light. The excitation of the pigment sets off a chemical chain reaction that leads to the creation of an electrical impulse from the photoreceptors to the bipolar cells to the ganglion cells, and eventually to the optic nerve.

The Visual Pathways

The optic nerves exit the eyes at the medial posterior side, carrying fibers from the lateral (temporal) and medial (nasal) halves of each retina (FIGURE 12-9). We have two visual fields, a left visual field and a right visual field. A common misunderstanding is that the left eye is responsible for the left visual field and the right eye for the right visual field. In actuality, both eyes participate in each visual field. The temporal retina of the left eye and the nasal retina of the right eye are responsible for the right visual field, and the right temporal retina and left nasal retina take care of the left visual field. Retinal ganglion cell axons project from the eyes to the optic chiasm.

FIGURE 12-8 The structure of the eye.

FIGURE 12-9 The visual pathway.

The optic chiasm (“crossing”) is a structure between the optic nerves and the optic tracts. The optic nerve fibers from the nasal retinas cross at this point, whereas the fibers from the temporal retinas do not. The optic tracts begin at the optic chiasm and conduct retinal ganglion cell axons to the lateral geniculate nucleus of the thalamus. Ninety percent of these axons terminate at the lateral geniculate nucleus; the remaining 10% project to the pretectal area and the superior colliculus. The pretectal area is a region of the midbrain composed of seven nuclei that make up the subcortical visual system. It is here that the pupillary light reflex is mediated. The superior colliculus is also found in the midbrain; it is involved in the control of eye movements.

From the lateral geniculate nucleus, the geniculocalcarine tract projects to the medial part of the occipital lobe surrounding the calcarine sulcus (BA 17). The fibers from the left visual field are projected to the right part of BA 17, and the right visual field to the left. In other words, the visual fields are received by the cortex in a contralateral fashion.
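The crossing rule just described can be captured in a few lines of toy logic. This is a simplification for study purposes (visual fields and retinal halves reduced to strings), not a model of the actual pathway:

```python
def receiving_hemisphere(visual_field):
    """Each visual field projects to the contralateral occipital cortex."""
    return "right" if visual_field == "left" else "left"

def retinal_halves(visual_field):
    """Return the two retinal halves that view a given visual field.

    The nasal retina of one eye and the temporal retina of the other eye
    serve each field; only the nasal fibers cross at the optic chiasm.
    """
    if visual_field == "right":
        return {("left eye", "temporal retina"), ("right eye", "nasal retina")}
    return {("right eye", "temporal retina"), ("left eye", "nasal retina")}

print(receiving_hemisphere("right"))  # prints "left"
```

Note that both eyes contribute to each visual field, which is why the common "left eye = left field" assumption fails.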

Cortical Processing of Visual Information

Cortical processing involves handling color, location, motion, and depth, and identifying the features of objects. All this processing happens at the same time, a phenomenon known as parallel processing. Visual information is sent from BA 17 to BAs 18 and 19. A dorsal stream of vision arises as information is sent to the superior occipital lobe and inferior parietal lobe for further processing. This dorsal stream represents the “where” of vision (i.e., where an object is located in the visual field). A ventral stream of processing, involving the inferior occipital lobe and the posterior and inferior temporal lobe, processes the “what” of vision, or the identity of an object. Lesions in the ventral stream sometimes result in visual agnosia, the inability to recognize objects despite intact vision.

Cortical Reading Areas

The posterior part of the brain is focused on recognizing patterns, whether they are auditory, visual, or olfactory patterns. We read a book and the words we read form a pattern. If we hear our friend speak, we recognize the pattern of his or her voice. If our neighbor grills a steak, we recognize the smell pattern and can identify what is cooking. Humans have not always read, but as we developed the ability to do so, we took advantage of the pattern-recognition system already built into our brains. Pattern recognition is crucial in reading. As we read, we recognize combinations of letters. For example, the d-o-g pattern is recognized as “dog.”

Shaywitz and Shaywitz (2008) proposed that there are three reading systems in the left hemisphere, one in the anterior portion and two in the posterior portion. The term systems is used rather than areas because these systems encompass more than one Brodmann area.

The first posterior reading system is called the parietotemporal reading system. This system focuses on word analysis, meaning the decoding of words at the phonemic level (i.e., phonics). It is also involved in the comprehension of written and spoken language (Heim & Keil, 2004; Joseph, Nobel, & Eden, 2001). The angular gyrus (BA 39), the supramarginal gyrus (BA 40), and the posterior part of the superior temporal lobe are all part of this system. The other posterior reading system is the occipitotemporal reading system, which is also known as the visual word-form area. As the name implies, this system is concerned with visual word-form recognition (i.e., sight-reading) and rapid access to whole words. In addition, it links printed letters to their corresponding sounds (i.e., grapheme-to-phoneme conversion). Cortical areas involved with this system include the left inferior occipital area, the left inferior-posterior temporal area, and the fusiform gyrus.

In addition to the two posterior reading systems, there is the anterior reading system, which involves Broca’s area as well as the ventral and dorsal premotor areas. This system has long been known to play an important role in word analysis in terms of spoken language syntax and articulation, but it may also play a role in silent reading (Shaywitz & Shaywitz, 2004, 2008) and in decoding infrequently encountered words (Cornelissen, Hansen, Kringelbach, & Pugh, 2010).
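As a quick review, the three reading systems can be summarized side by side. The roles and areas below restate the text; the dictionary format and the name `READING_SYSTEMS` are simply a study-note convention:

```python
# The three left-hemisphere reading systems (Shaywitz & Shaywitz, 2008),
# as summarized in the text.
READING_SYSTEMS = {
    "parietotemporal": {
        "position": "posterior",
        "role": "word analysis (phonemic decoding) and comprehension",
        "areas": ["angular gyrus (BA 39)", "supramarginal gyrus (BA 40)",
                  "posterior superior temporal lobe"],
    },
    "occipitotemporal": {
        "position": "posterior",
        "role": "visual word-form recognition and grapheme-to-phoneme conversion",
        "areas": ["left inferior occipital area",
                  "left inferior-posterior temporal area", "fusiform gyrus"],
    },
    "anterior": {
        "position": "anterior",
        "role": "word analysis, silent reading, decoding infrequent words",
        "areas": ["Broca's area", "ventral premotor area", "dorsal premotor area"],
    },
}

posterior_systems = [name for name, s in READING_SYSTEMS.items()
                     if s["position"] == "posterior"]
print(posterior_systems)  # the two posterior systems
```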

The Oral Production of Language

The Prefrontal Cortex

Our decision to speak to another human most likely originates in the prefrontal cortex (FIGURE 12-10). This structure lies in the frontal lobe anterior to the motor areas. It is highly connected to the parietal, temporal, and occipital lobes, and the rest of the frontal lobe by way of the dorsal pathways (i.e., the arcuate fasciculus and the superior longitudinal fasciculus). In addition, the prefrontal cortex receives information from the limbic system, our emotional center, via the thalamus. Because of this substantial connectivity, the prefrontal cortex is wired for decision making. When a decision to talk is made, the thought is likely transferred to the left inferior frontal cortex for semantic and phonological assembly.

FIGURE 12-10 The oral production of language. 1. Desire and thoughts to communicate originate in the prefrontal cortex and are sent to Broca's area for language encoding and speech planning/programming. 2. Broca's area projects to the supplementary motor area (top of area 6), which activates speech plans. 3. Supplementary motor area relays now-active plans to the primary motor cortex. 4. Primary motor cortex sends plans to the speech muscles for execution.

One reason the prefrontal cortex is thought to help us make decisions, such as when to talk, is the experience of those who have undergone prefrontal lobotomies (FIGURE 12-11). This surgical procedure, which was once performed to help psychiatric patients, disconnects the prefrontal cortex from the rest of the brain. One characteristic of those who have undergone this procedure is that they lack rational decision-making abilities and act more on impulse.

The Left Inferior Frontal Cortex

Anatomically, the left inferior frontal cortex (LIFC) consists of Broca's area (BAs 44, 45) as well as the pars orbitalis. It is associated with syntactic comprehension, as mentioned earlier. It is also involved in language production, though probably in concert with other subregions (e.g., the insular cortex, fusiform gyrus). Within the LIFC, there appears to be language specialization, with the anterior (i.e., pars orbitalis) and ventral (i.e., BA 45, pars triangularis) parts specializing in semantics (e.g., verb selection) and the posterior region (i.e., BA 44, pars opercularis) handling phonology. The posterior dorsal part of the LIFC could be thought of as an extension of the premotor cortex (BA 6), where motor plans for speech are stored. Damage to this area can result in apraxia of speech. In this condition, people cannot pull up the motor plans necessary for speech, even though their language may be intact.

FIGURE 12-11 A prefrontal lobotomy.

From this discussion, Broca's area can be described as necessary, but not sufficient, for properly formed oral language. It networks with other cortical and subcortical areas, such as the pars orbitalis, frontal operculum, insular cortex, and fusiform gyrus, to name just a few. The relationship among these areas can be seen in patients with transcortical motor aphasia, who suffer lesions just anterior to Broca's area. They can understand others and repeat what is said to them, but they struggle to initiate speech. Again, Broca's area is necessary, but not sufficient, for expressive language. In fact, neuroimaging studies have provided evidence that regions anterior to Broca's area (e.g., BA 47) are involved in semantic tasks, such as semantic retrieval (Wagner, Koutstaal, Maril, Schacter, & Buckner, 2000; Wagner, Paré-Blagoev, Clark, & Poldrack, 2001), verb generation (Thompson-Schill, Aguirre, D'Esposito, & Farah, 1999), and judging whether semantic choices are acceptable (Dapretto & Bookheimer, 1999).

The Supplementary Motor Area

After semantic assembly has been performed in the anterior ventral LIFC, the result is most likely passed on to the posterior dorsal LIFC for phonological planning and assembly. From here, the language product is most likely sent to the supplementary motor area (SMA), which has been implicated in internally initiated activity. The product may be sent directly to the SMA or first relayed through subcortical motor regions, such as the basal ganglia and thalamus, before arriving at the SMA. The SMA then initiates the motor plans, much like turning the key in a car turns the car on, making it ready to drive.

The Primary Motor Cortex

The SMA sends the motor information for the intended language production to the primary motor cortex (BA 4). If the SMA turns the car on, the primary motor cortex puts the car into drive by relaying the motor information to the speech muscles via the motor speech system, which includes the pyramidal and extrapyramidal systems, control circuits, and sensory feedback mechanisms.

The Written Expression of Language

Writing, like reading, is a uniquely human function that relies upon our language abilities. Writing is a powerful skill in conveying ideas not only to large groups of people but also to people living in different eras. Sitting on my shelves are many books written by people who have long been dead. These authors may be deceased, but their ideas live on and continue to influence people.

In thinking about writing, there appear to be three key processes involved in our ability to write. First, there is the language involved in our written communication. This means language processing and production are intimately linked with the skill of writing, with the LIFC probably involved in encoding. Second, there is the motor control necessary to manipulate pen in hand in relation to paper. This motor control involves not only the gross movements of the arm and hands but also the precise, fine motor movements of the fingers as they wield a writing instrument. Third, there is significant visuospatial involvement because writing involves vision guiding motor movements in the three-dimensional writing space. Because we have already surveyed the linguistic regions of the brain, we will next look at the visuospatial and motor control aspects of writing (FIGURE 12-12).

Visuospatial Elements: Left Superior Parietal Lobe

As mentioned, writing requires significant visuospatial skills because it involves visual guidance of the hand and fingers in forming graphemes. Graphemes are the written equivalent of phonemes and require the same precision to produce. Neurologically, where does the control of writing reside?

Part of the writing circuit is the somatosensory association cortex (BAs 5, 7), which lies posterior to the primary sensory cortex (BAs 1, 2, 3) and superior to the angular gyrus (BA 39) and supramarginal gyrus (BA 40). It received its name because it coordinates motor and sensory information, including tactile and visual sensation. Several studies using functional magnetic resonance imaging (Harrington, Farias, Davis, & Buonocore, 2007; Katanoda, Yoshikawa, & Sugishita, 2001; Menon & Desmond, 2001) have found the left superior parietal lobe (SPL), part of the somatosensory association cortex, to be active during writing activities. The left SPL is thought to construct graphic images of letters. In other words, as we write, we imagine what letters look like just before and as we write them. The left SPL is also thought to direct the sequence of movements that occur during writing. Harrington et al. (2007) found that drawing activates both the right and left SPLs.

Motoric Elements: Broca's Area and the Premotor Cortex

It has long been known from lesion studies that an area in the premotor cortex (BA 6) adjacent to Broca’s area is important for the motor aspects of writing. This area is known as Exner’s area, named after Sigmund Exner (1846-1926), an Austrian physiologist who first theorized that this region is important to writing.

FIGURE 12-12 The written expression of language. 1. Desire and thoughts originate in the prefrontal cortex and are sent to Broca's area for language encoding. 2. Language-encoded thoughts are sent to the premotor cortex (BA 6; Exner's area) for handwriting motor planning. 3. Motor plans are sent to the primary motor cortex. 4. The primary motor cortex sends writing motor plans to the dominant hand. 5. The left superior parietal lobe coordinates the visuospatial elements of writing.

The main brain areas involved in the motoric aspects of writing appear to be Broca's area (BAs 44, 45), Exner's area in the premotor cortex (BA 6), and the precentral gyrus (BA 4). In fact, Exner's area is anterior to the hand area in the precentral gyrus. The graphemic images generated in the SPL are sent to Broca's area, which has extensive connections with Exner's area. Broca's area is thought to organize the impulses from the SPL and relay them to Exner's area for conversion into graphemic motor plans. These plans are then sequenced in the supplementary motor area (SMA) and sent to the precentral gyrus, which activates the muscle movements of the dominant writing hand (Joseph, 2000; Longcamp, Anton, Roth, & Velay, 2003; Roux et al., 2009).

► Language Disorders

Aphasia

Aphasia is an acquired multimodality language disorder. Acquired means that the condition is caused by a stroke or some other form of brain injury; multimodality means that it affects all language modalities, such as auditory comprehension, verbal formulation, reading, and writing. Aphasia involves language in that people with it struggle to comprehend or use language symbols. Lastly, aphasia is a disorder in that a system, the language system, does not function as it should, and this dysfunction may disable the person with aphasia. Aphasia should not be confused with conditions like mental confusion, memory problems, or dementia. It is not an intellectual disorder, but rather an acquired disorder of language.

One basic way of describing aphasia is through the categories of fluent versus nonfluent aphasia. Typically, fluency is thought to be a category reserved for fluency disorders, like stuttering. In these cases, fluency is thought of as the smoothness with which sounds, words, and sentences are produced. In aphasia, fluency is thought of a bit differently. It is thought of as the effort that goes into speech and the quantity of words produced. The speech of people with fluent aphasia is effortless, and they produce a normal quantity of words, between 100 and 200 words per minute. Their speech is empty, though, because they produce more function words (prepositions, pronouns) than content words (nouns, verbs). People with nonfluent aphasia have the opposite features; speech is effortful and labored, and they produce fewer than 100 words per minute. Their language is rich in content words, but short on function words, which are critical for seeing the relationships between words. These criteria along with others can be explored further in TABLE 12-2.

In addition to fluency, aphasias can be distinguished from each other through other features. For example, a patient’s auditory comprehension and ability to repeat words can be useful pieces of information in differentiating one form of aphasia from another. Naming ability is typically impaired across all forms of aphasia, so it is usually not a helpful indicator of type of aphasia. Impairment in naming is called anomia. Sometimes anomia can be due to motor issues, like those found in Broca’s aphasia. Other times, anomia is due to word selection issues, which is considered true anomia and is the most common type of anomia in aphasia. Finally, in severe cases of aphasia, sometimes anomia is due to entire semantic fields being wiped out, as is the case with semantic anomia (TABLE 12-3).

TABLE 12-2 Fluent Versus Nonfluent Aphasia

Category or Feature | Nonfluent Aphasia | Fluent Aphasia
Articulation | Affected | Normal
Effort | Effortful, labored | Normal
Quantity of words | Sparse | Normal
Words per minute | <100 | 100-200
Phrase length | 1-2 words | 5-8 words
Prosody | Impaired | Normal
Content | Excessive content words, but few function words | Lacks content words, but lots of function words

Data from Davis, G. A. (2006). Aphasiology: Disorders and clinical practice (2nd ed.). Upper Saddle River, NJ: Pearson.

In addition to these features, some types of aphasia include paraphasias. Paraphasias are word or sound substitutions that can be lexical (real words) or nonlexical (nonwords) in nature. An example of a lexical paraphasia is a semantic paraphasia, in which a word related in meaning is substituted for the target word. For example, if the target word is cat, a semantic paraphasia might be dog. An example of a nonlexical paraphasia is a neologism (literally "new word"). If the target word is again cat, a neologistic error might be repuco, a production that is not a known word in English. These and other types of paraphasias are described in TABLE 12-4.
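The paraphasia categories in Table 12-4 reduce to three yes/no questions about an error: Is it a real word? Is it related to the target in meaning? Is it related in sound? As a study aid, that decision procedure can be sketched in a few lines of code; the function name and boolean encoding are illustrative, not from the text.

```python
def classify_paraphasia(is_real_word: bool,
                        related_in_meaning: bool,
                        related_in_sound: bool) -> str:
    """Classify an aphasic naming error along the lines of Table 12-4 (teaching sketch)."""
    if is_real_word:  # lexical paraphasias
        if related_in_meaning and related_in_sound:
            return "mixed"        # cat -> rat
        if related_in_meaning:
            return "semantic"     # cat -> dog
        if related_in_sound:
            return "formal"       # cat -> mat
        return "unrelated"        # cat -> bucket
    # nonlexical paraphasias (nonwords)
    return "phonemic" if related_in_sound else "neologistic"  # cat -> zat / repuco
```

For instance, a patient producing dog for cat has made a real-word error related in meaning but not in sound, so the sketch returns "semantic".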

Of all the features mentioned earlier, the most useful to differentially diagnose aphasia are fluency, auditory comprehension, and word repetition. These three categories lead to eight types of “classical” aphasias, four that are fluent in nature and four that are nonfluent. These types of aphasia are useful for professional communication between speech-language pathologists (SLPs) as well as between SLPs and physicians. However, we should keep in mind that some people do not cleanly fit into these categories and that labels like these only tell so much about a patient. The goal of assessment is to describe the language strengths and weaknesses of the patient, as this information is what we use to plan therapy. These eight aphasias are compared and contrasted in TABLE 12-5. They are also graphically laid out in FIGURE 12-13.
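The three diagnostic features (fluency, auditory comprehension, repetition) can likewise be laid out as a simple decision tree over the eight classical aphasias. This is a teaching sketch only; as noted above, many patients do not fit these bins cleanly, and the function name and boolean encoding are my own, not clinical tooling.

```python
def classify_aphasia(fluent: bool, comprehension_ok: bool, repetition_ok: bool) -> str:
    """Map the three classic features to one of the eight 'classical' aphasias."""
    if not fluent:
        if comprehension_ok:
            # Nonfluent with relatively intact comprehension
            return "transcortical motor" if repetition_ok else "Broca's"
        # Nonfluent with impaired comprehension
        return "mixed transcortical" if repetition_ok else "global"
    if comprehension_ok:
        # Fluent with relatively intact comprehension
        return "anomic" if repetition_ok else "conduction"
    # Fluent with impaired comprehension
    return "transcortical sensory" if repetition_ok else "Wernicke's"
```

For example, a patient who is fluent with relatively intact comprehension but impaired repetition lands on conduction aphasia, matching its hallmark feature described below.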

TABLE 12-3 Types of Anomia

Anomia Type | Does Patient Know Word? | Can Patient Produce Word in Another Modality (e.g., Writing)? | Description of Patient Anomic Error
Word production anomia | Yes | Yes | Labored speech due to motor issues (Broca's aphasia, apraxia of speech)
Word selection anomia | Yes | No | Tip-of-the-tongue phenomenon ("Oh, it's . . . uh . . ."); true anomia
Semantic anomia | No | No | Semantic field lost; no response from the patient; usually accompanied by poor auditory comprehension

Data from Code, C. (1991). The characteristics of aphasia. London, UK: Lawrence Erlbaum Associates.

TABLE 12-4 Classification of Aphasic Paraphasias

Paraphasia Category | Paraphasia Type | Definition | Target | Error
Lexical (real words) | Semantic | Word related to target in meaning | Cat | Dog
Lexical (real words) | Formal | Word related to target in sound | Cat | Mat
Lexical (real words) | Mixed | Word with sound and meaning relationship | Cat | Rat
Lexical (real words) | Unrelated | Word with no apparent relationship to target | Cat | Bucket
Nonlexical (nonwords) | Phonemic | Nonword obviously related in sound | Cat | Zat
Nonlexical (nonwords) | Neologistic | Nonword with no apparent relationship | Cat | Repuco

Data from Davis, G. A. (2007). Aphasiology: Disorders and clinical practice (2nd ed.). Upper Saddle River, NJ: Pearson.

Broca's Aphasia

Broca's aphasia is probably the most well-known form of aphasia thanks to Paul Broca's famous case involving his patient, Tan (discussed in Chapter 1). It occurs in about 27% of aphasic cases, making it the most common form of aphasia (Hoffmann & Chen, 2013). The exact site or sites of damage are debated, but the condition can be due to lesions in Broca's area (BAs 44, 45) and the axonal areas deep to it, as well as in the frontal, parietal, and temporal areas surrounding Broca's area (FIGURE 12-14).

Looking at Table 12-5, Broca's aphasia is characterized as a nonfluent aphasia with relatively intact auditory comprehension but impaired repetition. Struggles in auditory comprehension become apparent as the length and complexity of auditory information increase. Patients are agrammatic, meaning their expressions lack grammatical completeness. For example, a patient might say, "I . . . walk . . . dog . . . park" in place of I walked the dog in the park. Paraphasias are not typically observed in this form of aphasia. Reading comprehension is functional for short, simple material but breaks down with longer and more complex reading passages. Writing is impaired and often follows the pattern of the patient's oral output.

FIGURE 12-13 Types of aphasia.

FIGURE 12-14 The neuropathology of Broca's aphasia. Darker areas mark primary lesion areas; lighter shading denotes other possible areas of involvement.

Despite these challenges, patients with Broca's aphasia are generally good communicators because they can produce meaning-laden content words (e.g., nouns). They also have good awareness, are cooperative, and have some ability to self-correct. Apraxia of speech and dysarthria may co-occur with Broca's aphasia (Davis, 2006; Hedge, 2018; LaPointe, 2005).

Transcortical Motor Aphasia

Hoffmann and Chen (2013) report that all of the transcortical aphasia types together account for 1.8% of all cases of aphasia, so transcortical motor aphasia is exceedingly rare. It is caused by lesions in the frontal lobes above or below Broca’s area and in the supplementary motor area (FIGURE 12-15). Like Broca’s aphasia, it is a nonfluent form of aphasia with relatively preserved auditory comprehension. Unlike Broca’s aphasia, patients have intact repeating abilities. Patients are usually initially mute, but as this subsides, their speech will be echolalic with some paraphasias and agrammatism. Reading comprehension will be relatively intact except for syntactically complex material, but writing will have significant deficits that mirror oral communication deficits (Davis, 2006; Hedge, 2018; LaPointe, 2005).

Mixed Transcortical Aphasia

Another rare transcortical aphasia is mixed transcortical aphasia, which is a mixture of transcortical motor and transcortical sensory aphasia. It is caused by damage to what is called the watershed area, a region that receives blood supply from the small branches at the end of the three cerebral arteries (FIGURE 12-16). Mixed transcortical aphasia is like global aphasia in that patients are severely nonfluent, with severely impaired auditory comprehension, reading, and writing, but with preserved repetition, which often manifests as echolalia (Davis, 2006; Hedge, 2018).

Global Aphasia

Global aphasia is the most devastating type of aphasia to the language system. It accounts for about 19% of aphasia cases (Hoffmann & Chen, 2013). It is caused by extensive damage to the perisylvian language region, which is supplied by the middle cerebral artery (Figure 12-4 and FIGURE 12-17). All language modalities are severely impaired. Co-occurring conditions include hemiparesis or hemiplegia, oral apraxia, apraxia of speech, and hemineglect. If there is a bright spot with this aphasia type, it is that it can morph into a less severe form of aphasia, such as Broca's or Wernicke's aphasia (Davis, 2006; Hedge, 2018; LaPointe, 2005).

FIGURE 12-15 The neuropathology of transcortical motor aphasia (TMA).

FIGURE 12-16 Mixed transcortical aphasia results from damage to the brain's watershed area (see gray shading).

FIGURE 12-17 The neuropathology of global aphasia. The condition is due to extensive damage to the perisylvian language region.

FIGURE 12-18 The neuropathology of Wernicke's aphasia. Darker areas mark primary lesion areas; lighter shading denotes other possible areas of involvement.

Wernicke's Aphasia

Named after the German neuropsychiatrist Karl Wernicke, Wernicke's aphasia is probably the second most well-known form of aphasia after Broca's aphasia. According to Hoffmann and Chen (2013), it is rarer than the transcortical aphasias, accounting for only 1.6% of cases. Damage occurs in the superior temporal gyrus, supramarginal gyrus, or angular gyrus (FIGURE 12-18). It is a fluent aphasia in which verbal output flows easily and effortlessly, so much so that patients are described as having logorrhea, a diarrhea of the mouth. This verbal output is full of semantic paraphasias and neologisms; as a result, the speech of patients with Wernicke's aphasia is said to be empty or devoid of meaning. For example, for the target sentence "The dog needs to go out, so I will take him for a walk," someone with Wernicke's aphasia might say, "You know that smoodle pinkered and that I want to get him round and take care of him like you want before" (The Internet Stroke Center, n.d.). In these patients, auditory comprehension is severely impaired, as are repetition and writing. Reading comprehension is also impaired, though less severely, making it a relative strength compared to the other modalities (Davis, 2006; Hedge, 2018; LaPointe, 2005).

Transcortical Sensory Aphasia

The final form of transcortical aphasia is transcortical sensory aphasia. It is caused by lesions to the temporoparietal region, including the angular gyrus and the posterior temporal lobe (FIGURE 12-19). People with transcortical sensory aphasia are fluent, but their verbal output is filled with paraphasias (semantic and neologistic paraphasias), echolalia, and empty speech. Auditory comprehension is impaired as well as reading and writing, but repetition abilities are preserved. It is interesting that repetition is intact and echolalic behavior is present in a condition with impaired auditory comprehension (Davis, 2006; Hedge, 2018; LaPointe, 2005).

FIGURE 12-19 The neuropathology of transcortical sensory aphasia.

Conduction Aphasia

Conduction aphasia is an extremely rare form of aphasia occurring only in about 1% of cases. Its neuropathology has been controversial, with the traditional site of damage being the arcuate fasciculus. However, damage anywhere between Broca’s area and Wernicke’s area can result in this classification (FIGURE 12-20). The condition itself is characterized by fluent verbal output with some paraphasias at times. Auditory and reading comprehension are also relatively intact, but its hallmark characteristic is impaired repetition. Writing is also impaired (Davis, 2006; Hedge, 2018; LaPointe, 2005).

Anomic Aphasia

Hoffmann and Chen (2013) report that anomic aphasia is the second most common type of aphasia after Broca's aphasia, representing 26% of aphasia cases. Lesion sites include the temporoparietal area, including the angular gyrus, middle temporal gyrus, supramarginal gyrus, and posterior inferior temporal gyrus (FIGURE 12-21). In many ways, anomic aphasia is the least severe form of aphasia because patients are fluent, with preserved auditory comprehension, repetition, reading, and writing. The hallmark characteristic is word selection anomia with semantic paraphasias (Davis, 2006; Hedge, 2018; LaPointe, 2005).

Alexia

Alexia is a term that refers to an acquired disorder of reading. This condition is different from dyslexia, a term this text reserves for developmental problems in acquiring reading skills. Alexia is caused by a stroke, head injury, or some other mechanism. Premorbid reading abilities are normal but then suddenly change due to one of these neurological conditions.

A Dual-Route Reading Model

Riley, Brookshire, and Kendall (2017) have proposed a dual-route model for reading, developed by Coltheart, Rastle, Perry, and Langdon (2001), that is helpful for explaining different types of alexia (FIGURE 12-22). The two routes include a lexical reading route and a nonlexical reading route. The lexical route is our sight-reading route for already-known words, while the nonlexical route is our phonics, or “sounding things out,” route, which is important as we encounter new words.

Figure 12-22 applies the written word example of dog to this model. When we see this written word, our brains first perform a visual analysis of each grapheme in the word. For example, an analysis of d would be that it has a straight vertical line with a curve on its lower left side. Next, our brains do a letter analysis and determine the letter-to-sound correspondences contained in the word. Dog has three letters and three sounds. Another word, shoe, has four letters but only three sounds. If the word is familiar to us and already a part of our lexicon, it moves through the lexical reading route. As the first step of this process, our brains scan our orthographic input lexicon for a match for what word is being seen. Then the word moves into the semantic system for meaning attachment. In our example of dog, the meaning revolves around a smallish, furry, four-legged animal that has a tail and barks. This is the point at which we stop in silent reading because meaning has been attached to what we have read. If we wanted to read out loud, then we would take the further step of running dog through our phonological output lexicon. This is a dictionary of known words we can say. Phonemes are then assembled, and we speak the word out loud. Again, if the word we are trying to read is not in our sight-reading mechanism, then our reading switches to the nonlexical reading route, where graphemes are converted to phonemes and we sound the word out.
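As a study aid, the two routes just described can be caricatured in a few lines of code. The tiny "lexicons" and function names here are illustrative stand-ins for the model's components, not claims about their actual contents.

```python
# Toy sketch of the dual-route reading model (Coltheart et al., 2001), as
# summarized in the text. All dictionary contents are illustrative assumptions.
ORTHOGRAPHIC_INPUT_LEXICON = {"dog"}                     # known written words (sight vocabulary)
SEMANTIC_SYSTEM = {"dog": "smallish furry four-legged animal that barks"}
PHONOLOGICAL_OUTPUT_LEXICON = {"dog": "/d/ /o/ /g/"}     # known spoken word forms
GRAPHEME_TO_PHONEME = {"d": "/d/", "o": "/o/", "g": "/g/", "t": "/t/"}

def read_silently(word: str):
    """Lexical route: attach meaning and stop (silent reading ends here)."""
    if word in ORTHOGRAPHIC_INPUT_LEXICON:
        return SEMANTIC_SYSTEM.get(word)
    return None  # an unfamiliar word has no stored meaning to attach

def read_aloud(word: str) -> str:
    """Sight-read a known word, or sound out an unfamiliar one."""
    if word in ORTHOGRAPHIC_INPUT_LEXICON:
        return PHONOLOGICAL_OUTPUT_LEXICON[word]          # lexical route
    # Nonlexical route: grapheme-by-grapheme conversion ("phonics")
    return " ".join(GRAPHEME_TO_PHONEME.get(g, "?") for g in word)
```

Note how a familiar word like dog is retrieved whole, while a novel string falls through to the slower grapheme-by-grapheme route, mirroring the letter-by-letter reading seen in pure alexia.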

FIGURE 12-20 The neuropathology of conduction aphasia.

FIGURE 12-21 The neuropathology of anomic aphasia. Darker areas mark primary lesion areas; lighter shading denotes other possible areas of involvement.

FIGURE 12-22 The dual-route reading model.

Types of Alexia

Alexia comes in two basic forms—peripheral alexia and central alexia. Peripheral alexia involves reading problems caused by visuospatial and attention problems. There is difficulty in matching the word seen with the word representation stored in the brain. It is nonlinguistic in nature and does not affect the underlying reading system. Central alexia is a linguistic problem that affects the underlying reading system. Examples from each form will be discussed in the following sections. Taking the dual-route model of reading into consideration, peripheral alexia occurs at the beginning of the model (i.e., visual feature analysis and letter analysis), while central alexia is due to a breakdown within the lexical reading route itself (e.g., the semantic system). With the dual-route model of reading and these two general categories of alexia in mind, we can now discuss the subtypes of central and peripheral alexia (TABLE 12-6).

Peripheral Alexias

There are four types of peripheral alexia: pure alexia, neglect alexia, attentional alexia, and visual alexia. Pure alexia is also known as alexia without agraphia, verbal alexia, or letter-by-letter reading. People who suffer from it have difficulty reading, but writing ability is left intact. In fact, they would not be able to read their own writing. In pure alexia there is a disconnection between the visual information taken in (i.e., visual feature analysis and letter analysis) and the left hemisphere’s word recognition system, which lies in the occipitotemporal reading system. In other words, the sight-reading mechanism is damaged. These patients can read, but it has to be done with a letter-by-letter, sounding-out approach (i.e., phonic approach using the nonlexical reading route). This makes reading possible, but also slow and inefficient.

TABLE 12-6 Summary of Peripheral and Central Alexia Subtypes

Alexia Category | Alexia Subtype | Definition | Place of Breakdown in Dual-Route Model
Peripheral alexia | Pure | Sight-reading impaired | Visual feature analysis and letter analysis
Peripheral alexia | Neglect | Ignoring left or right visual field | Letter analysis
Peripheral alexia | Attentional | Poor visual attention | Letter analysis
Peripheral alexia | Visual | Letter or syllable substitutions, additions, or omissions | Letter analysis system and/or the orthographic input lexicon
Central alexia | Phonological | Trouble reading unfamiliar or nonwords | Nonlexical reading route
Central alexia | Surface | Trouble reading words with irregular print-to-sound correspondences | Lexical reading route
Central alexia | Deep | Misreading one word for another | Lexical and nonlexical reading routes
Central alexia | Nonsemantic | Reading without meaning or comprehending | Semantic system in the lexical reading route

Data from Riley, E. A., Brookshire, C. E., & Kendall, D. L. (2017). The acquired disorders of reading. In I. Papathanasiou & P. Coppens (Eds.), Aphasia and related neurogenic communication disorders (pp. 195-218). Burlington, MA: Jones & Bartlett Learning.

Neglect alexia usually occurs because of damage to the right hemisphere. Sufferers fail to identify the initial portions of a word or sentence as they try to read, which is a defect in letter analysis. In other words, they begin reading in the middle of words and/or sentences (e.g., tiger might be read as -ger). People with this form do better reading familiar words because they can predict the word.

The next form, attentional alexia, is usually caused by damage to the prefrontal cortex that helps mediate attention. People with this condition can read single words, but when there are multiple words on a page, they become distracted and unable to read. Like neglect alexia, this is a failure at the letter analysis level. This form of alexia is common in people with head injury.

Lastly, visual alexia is an impairment in the way that written words are perceived and analyzed during reading. It manifests as letter or syllable substitutions, additions, or omissions (e.g., butter instead of better; prince instead of price). This problem is caused by issues with the letter analysis system and/or the orthographic input lexicon.

Central Alexias

Central alexias affect the underlying linguistic reading system. There are four forms to mention: phonological alexia, surface alexia, deep alexia, and nonsemantic reading. Phonological alexia is a relatively mild form of alexia that usually does not affect the reading of real words. If people with this condition do have trouble reading real words, it is usually in misperceiving visually similar words. Their real difficulty comes in reading or sounding out new words or nonwords (i.e., pseudowords). For example, a nonword such as phope might be read as phone. Unfamiliar or new words can often be misperceived as other known words. Phonological alexia is probably due to damage to the parietotemporal reading system, where word analysis occurs. In the dual-route reading model, it is a breakdown in the nonlexical reading route.

In comparison, patients with surface alexia have difficulty reading words with irregular print-to-sound correspondences. For example, they would have difficulty reading words like colonel, yacht, island, and borough. This deficit should be compared with phonological alexia, in which patients usually do not have this difficulty. These patients read words with regular print-to-sound correspondences (e.g., state, hand) well. These issues are thought to be due to problems with the occipitotemporal reading system, which appears to be important in the lexical reading route.

Deep alexia, also known as semantic alexia, is a breakdown in both the lexical and nonlexical reading routes. In this condition, people recognize words as other words. For example, the word castle is perceived as knight. Another example might be the word bird being read as canary. They can also misperceive similar-looking words, like scale versus skate. Concrete words (e.g., chair, table) are read better than abstract words (e.g., fate, destiny, wish). Nonwords cannot be read either, which is a breakdown in the nonlexical reading route. Patients with deep alexia are usually not aware of their errors, and these errors are not circumlocutions due to word finding errors. Deep alexia occurs because of disruptions to the occipitotemporal reading system plus probably the perisylvian language area.

Nonsemantic reading is also called reading without meaning. People with this form of alexia can read real and nonwords, and regular and irregular words without difficulty; however, they do not comprehend what they are reading. In other words, they correctly pronounce but do not understand what they are pronouncing. In the dual-route model, it is a problem with the semantic system in the lexical reading route. Neuroanatomically, these issues are most likely due to lesions in and around Wernicke’s area (BA 22) and perhaps the fusiform gyrus (BA 37) (Hillis, 2004). We joke in my neuroanatomy class that college students have this condition, which becomes evident as they wade through hundreds of pages of reading each semester.

Agraphia

Agraphia (or dysgraphia) is an acquired disorder of writing. Premorbid writing abilities were normal, but after a stroke or other neurological disorder, writing abilities are impaired. Because writing and spelling are crucial to skills of daily living (e.g., writing checks, making grocery lists), agraphia can be a devastating disorder, especially when it co-occurs with other neurogenic language disorders like aphasia and/or alexia.

A Dual-Route Writing Model

Just as there is a dual-route reading model, so there is a dual-route writing model with lexical and nonlexical writing routes (FIGURE 12-23). The lexical writing route is for spelling words we already know, while the nonlexical writing route is for spelling words that are unfamiliar to us. Imagine you are sitting across from a client and you ask him or her to write the word dog. If this person's writing were unimpaired, meaning would be attached to the heard word using the semantic system. His or her brain would then search for a known written word in the orthographic output lexicon to match this auditory stimulus. Next, this item would be sent to the graphemic buffer, a type of working memory meant to store this written word form long enough to write it. The process would then move into the stages of actually preparing to write. The first of these is the allographic conversion stage. Here the word dog would be converted into print or cursive and into uppercase and lowercase as appropriate. The next stage, graphomotor programming, involves motor planning for the hand to make the correct motions with a pen to write the word dog. Graphomotor execution is the final stage, in which the word dog is written. The nonlexical writing route accounts for writing words that are new to a person. This route is used heavily in childhood as we learn to write and spell. This model is used to explain the different types of agraphia in the next section (TABLE 12-7).

FIGURE 12-23 The dual-route writing model.
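The lexical writing route walkthrough above (semantic system, orthographic output lexicon, graphemic buffer, allographic conversion, graphomotor stages) can likewise be caricatured in code. The dictionary contents and function name are illustrative assumptions, and the motor stages are simulated simply as emitting letters.

```python
# Toy sketch of the lexical writing route from the dual-route writing model.
SEMANTIC_SYSTEM = {"dog": "smallish furry four-legged animal that barks"}
ORTHOGRAPHIC_OUTPUT_LEXICON = {"dog": "dog"}   # stored written word forms

def write_to_dictation(heard_word: str) -> str:
    """Write a familiar heard word via the lexical writing route (teaching sketch)."""
    meaning = SEMANTIC_SYSTEM[heard_word]                  # 1. semantic system attaches meaning
    assert meaning is not None
    word_form = ORTHOGRAPHIC_OUTPUT_LEXICON[heard_word]    # 2. orthographic output lexicon match
    graphemic_buffer = list(word_form)                     # 3. graphemic buffer (working memory)
    allographs = [g.lower() for g in graphemic_buffer]     # 4. allographic conversion (case/script)
    return "".join(allographs)                             # 5-6. graphomotor programming + execution
```

A breakdown at any one of the commented stages corresponds to an agraphia subtype in Table 12-7; for example, losing step 3 models graphemic buffer agraphia.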

Types of Agraphia

There are two basic types of agraphia: peripheral and central agraphia. Peripheral agraphia includes writing problems due to visuospatial processing and attentional problems. These occur lower in the dual-route writing model, leaving the patient's underlying core linguistic writing system intact. Central agraphia involves impairment in the underlying linguistic writing system and occurs higher in the dual-route model. Both categories of agraphia are discussed in the following sections along with their subtypes.

Peripheral Agraphias

Graphemic agraphia is a problem with graphomotor execution of writing and spelling due to attentional issues. Patients omit, substitute, add, or transpose letters. This form of agraphia is caused by lesions to the left prefrontal cortex and left parietal lobe. Patients with spatial agraphia have difficulty writing accurately on a horizontal line. They may also write on one side of the paper or the other or will make extra marks on a letter. This form sometimes accompanies visual neglect or hemianopsia. It is a deficit in graphomotor execution usually caused by damage to the right hemisphere. Allographic agraphia involves errors in allographic conversion of lowercase versus uppercase letters (e.g., mR. sMitH). It can also include confusion of print and cursive scripts. Left parieto-occipital lesions are associated with allographic agraphia. The final form of peripheral agraphia is apraxic agraphia, a disorder of graphomotor programming, in which people have impairments carrying out the motor plans for writing. They will struggle to hold a writing utensil correctly and will search and grope to write letters correctly. Apraxic agraphia is associated with the main components of the brain’s writing centers, including SPL, premotor cortex, and SMA damage (Beeson & Rapcsak, 2004).

TABLE 12-7 Summary of Peripheral and Central Agraphia Subtypes

Agraphia Category | Agraphia Subtype | Definition | Place of Breakdown in Dual-Route Model
Peripheral agraphia | Graphemic | Letter or syllable substitutions, additions, or omissions | Graphomotor execution
Peripheral agraphia | Spatial | Writing on a slant and/or including additional marks | Graphomotor execution
Peripheral agraphia | Allographic | Errors in lowercase vs. uppercase and printing vs. cursive | Allographic conversion
Peripheral agraphia | Apraxic | Motor plans for writing graphemes lost | Graphomotor programming
Central agraphia | Deep | Semantic paragraphias (e.g., substituting one word for another) | Semantic system and/or orthographic output lexicon
Central agraphia | Phonological | Trouble writing nonwords | Phonological output lexicon
Central agraphia | Surface | Inability to spell irregular words (e.g., island) | Orthographic output lexicon
Central agraphia | Semantic | Difficulty spelling homophones | Semantic system
Central agraphia | Graphemic buffer | Trouble holding word form in memory in order to write it | Graphemic buffer

Data from Papathanasiou, I., & Csefalvay, Z. (2017). Written language and its impairments. In I. Papathanasiou & P. Coppens (Eds.), Aphasia and related neurogenic communication disorders (pp. 219-244). Burlington, MA: Jones & Bartlett Learning; McNeil, M. R., & Tseng, C. H. (2005). Acquired neurogenic agraphias: Writing problems. In L. L. LaPointe (Ed.), Aphasia and related neurogenic language disorders. New York, NY: Thieme.

Central Agraphias

Central agraphias are linguistically based and occur higher in the dual-route writing model. For example, deep agraphia (or semantic agraphia) is characterized by semantic paragraphias (i.e., substituting one word for another) due to issues with the semantic system. In addition, patients have trouble with spelling, especially homophones (e.g., night versus knight). Phonological agraphia is a relatively mild form of agraphia caused by damage to the nonlexical route's phonological output lexicon. These patients can write regular and irregular words but have difficulty with nonwords or nonconcrete words (e.g., pride). Spelling is affected in both deep and phonological agraphias. These two conditions typically occur due to damage to the language regions of the left hemisphere, including Broca's area, Wernicke's area, and the supramarginal gyrus. Surface agraphia involves impairments in spelling irregular words (e.g., island), while spelling of regular words is preserved. This type of agraphia is due to problems with the orthographic output lexicon. Damage to the extrasylvian temporoparietal regions of the left hemisphere is usually associated with this form of agraphia (Beeson & Rapcsak, 2004). Semantic agraphia is difficulty spelling homophones and is a deficit in the lexical writing route's semantic system. Finally, graphemic buffer agraphia is a deficit in the dual-route model's graphemic buffer system. Patients with this type of agraphia cannot hold written forms in their linguistic working memory long enough to write the word. Shorter words are easier to write than longer words because shorter words require less graphemic buffer space (Papathanasiou & Csefalvay, 2017).

► Conclusion

There are many things we humans take for granted in our lives, and our ability to express ourselves through language may be one of those things. It is certainly extremely complicated neurologically, using a vast array of networked structures. Its complexity is very difficult to capture, as the Wernicke-Geschwind model illustrates. Imaging studies as well as case studies do lend support to some kind of altered form of this model. The major problem with the Wernicke-Geschwind model is that it is too simplistic. First, the size and location of language areas are different from patient to patient. Second, the model does not take into consideration subcortical regions that may be involved in language. Third, the reliance on case and imaging studies may be problematic, especially if these studies are not taking into consideration areas that are subtly involved.

SUMMARY OF LEARNING OBJECTIVES

The following were the main learning objectives of this chapter. The information that should have been learned is below each learning objective.

1. The learner will define language.

 Language is a generative and dynamic code containing universal characteristics whereby ideas about the world are expressed through a conventional system of arbitrary symbols for communication.

2. The learner will list and define the components of language.

 Content (semantics): the meaning of language

 Form (grammar): the form of language

 Phonology: the study of a language’s sound structure

 Morphology: the study of a language’s word structure

 Syntax: the study of a language’s sentence structure

 Use (pragmatics): how language is used practically (e.g., conversation)

3. The learner will outline how language is thought to be neurologically processed in comprehending, reading, speaking, and writing.

 Auditory comprehension: The sounds of spoken language pass through the peripheral auditory system where their acoustic energy is changed into neural impulses. These impulses are conducted through cranial nerve VIII to the cochlear nuclear complex and through the brainstem to the medial geniculate of the thalamus. The thalamus routes the neural impulses to the temporal lobe’s primary auditory cortex where the signal is analyzed. The signal then goes to the planum temporale (Wernicke’s area), which acts as a hub, drawing help from surrounding areas in the meaning attachment process. Syntax appears to be processed by the superior and posterior temporal lobes. If syntax is complex, Broca’s area is recruited into syntax processing.

 Reading: The eyes receive visual information, and photoreceptors in the eyes transduce light into neural impulses. These impulses travel down the visual pathways to the lateral geniculate nucleus of the thalamus. The geniculocalcarine tract projects to the medial part of the occipital lobe surrounding the calcarine sulcus (BA 17). The visual areas of the occipital lobe (BAs 17-19) process the signal and send it to three reading centers for decoding: the parietotemporal system (analysis at the phonemic level), the occipitotemporal reading system (sight-reading), and the anterior reading system (analysis at the syntactic level and silent reading).

 Speaking: The prefrontal cortex is important in the generation of our desire and intention to speak. Our idea is sent to the left inferior frontal cortex where semantic and phonological encoding occurs, probably with the help of surrounding areas (e.g., insular cortex). The plan to speak is sent to the supplementary motor area that initiates the motor plans. These plans are sent to the primary motor cortex, which activates speech muscles via the motor speech system.

 Writing: The prefrontal cortex is important in the generation of our desire and intention to write. Writing occurs through the cooperation of the superior parietal lobe (imagining graphemes and directing sequence of movements) and the following frontal lobe areas: Broca’s area, Exner’s area in the premotor cortex, and precentral gyrus. Motor signals are sent to the hand from the precentral gyrus.

4. The learner will describe the following disorders: aphasia, alexia, and agraphia.

 Aphasia: an acquired multimodality language disorder in which patients are either fluent (produce 100-200 words/minute) or nonfluent (produce fewer than 100 words/minute).

 Alexia: an acquired disorder of reading that can be peripheral or central in nature.

Peripheral alexia involves reading problems due to visuospatial and attention problems. It is nonlinguistic in nature and does not affect the underlying reading system. Central alexia is a linguistic problem that affects the underlying reading system.

 Agraphia: an acquired disorder of writing that can be peripheral or central in nature. Peripheral agraphia includes writing problems due to visuospatial processing and attentional problems. The patient’s underlying core linguistic writing system is left intact. Central agraphia involves impairment in the underlying linguistic writing system.

KEY TERMS

Agraphia, Alexia, Allographic agraphia, Anomia, Anterior reading system, Aphasia, Apraxic agraphia, Attentional alexia, Central agraphia, Central alexia, Conduction aphasia, Content, Deep agraphia, Deep alexia, Dyslexia, Form, Geniculocalcarine tract, Graphemic agraphia, Graphemic buffer agraphia, Language, Morphology, Neglect alexia, Nonsemantic reading, Occipitotemporal reading system, Optic chiasm, Parietotemporal reading system, Peripheral agraphia, Peripheral alexia, Phonological agraphia, Phonological alexia, Phonology, Pragmatics, Prefrontal lobotomies, Pure alexia, Semantic agraphia, Spatial agraphia, Surface agraphia, Surface alexia, Syntax, Visual alexia, Wernicke-Geschwind model

DRAW IT TO KNOW IT

1. Using the following Brodmann map, sketch a flow of neural information in the following tasks: auditory comprehension, visual comprehension, oral production, and written expression of language. You will have to sketch in the peripheral structures, like the ears, eyes, and so on.

2. Draw a diagram that displays the different classical aphasias, including their similarities and differences.

QUESTIONS FOR DEEPER REFLECTION

1. Define language.

2. List and define the components and subcomponents of language.

3. Compare and contrast fluent versus nonfluent aphasia.

4. Fill out the following chart using a positive sign (+) for relatively intact and a negative sign (−) for impaired. See Table 12-5 as a reference.

5. Compare and contrast central alexia versus peripheral alexia. Give examples.

6. Compare and contrast central agraphia versus peripheral agraphia. Give examples.

CASE STUDY

Megan is a 27-year-old television sports reporter who had a sudden onset of difficulty talking on the air. While trying to describe a basketball game that had just ended, she said the following: “well, a very, very heava-ah-heavy-de-bertation tonigh. We had a very deris-derison by let’s go hit teris tazen go for the bit had the pit!” In addition to these speech problems, Megan had difficulty auditorily understanding her producer as he expressed concern for her behavior.

After about 30 minutes, Megan’s speech returned to normal.

1. What communication disorder was Megan experiencing?

2. What specific subtype of this disorder is the most likely type in Megan’s case?

3. What might have caused her to have this episode and then quickly recover?

SUGGESTED PROJECTS

1. Search the scholarly literature and find a case study involving aphasia, alexia, or agraphia. Prepare to present the case to your class.

2. Visit the National Aphasia Association’s website, prepare a summary of the resources available, and share with your class.

3. Pretend you have been asked to write an encyclopedia entry on aphasia, alexia, or agraphia. The entry you write should be no more than 750 words long.

REFERENCES

Beeson, P. M., & Rapcsak, S. Z. (2004). Agraphia. In R. D. Kent (Ed.), The MIT encyclopedia of communication disorders (pp. 233-236). Cambridge, MA: MIT Press.

Bloom, L., & Lahey, M. (1978). Language development and language disorders. London, UK: Pearson.

Cornelissen, P., Hansen, P., Kringelbach, M., & Pugh, K. (2010). The neural basis of reading. Oxford, UK: Oxford University Press.

Dapretto, M., & Bookheimer, S. Y. (1999). Form and content: Dissociating syntax and semantics in sentence comprehension. Neuron, 24(2), 427-432.

Davis, G. A. (2006). Aphasiology: Disorders and clinical practice (2nd ed.). Upper Saddle River, NJ: Pearson.

Friederici, A. D., & Gierhan, S. M. (2013). The language network. Current Opinion in Neurobiology, 23(2), 250-254.

Harrington, G. S., Farias, D., Davis, C. H., & Buonocore, M. H. (2007). Comparison of the neural basis for imagined writing and drawing. Human Brain Mapping, 28(5), 450-459.

Hegde, M. N. (2018). A coursebook on aphasia and other neurogenic language disorders (4th ed.). San Diego, CA: Plural Publishing.

Heim, S., & Keil, A. (2004). Large-scale neural correlates of developmental dyslexia. European Child and Adolescent Psychiatry, 13, 125-140.

Hillis, A. E. (2004). Alexia. In R. D. Kent (Ed.), The MIT encyclopedia of communication disorders (pp. 236-240). Cambridge, MA: MIT Press.

Hoffmann, M., & Chen, R. (2013). The spectrum of aphasia subtypes and etiology in subacute stroke. Journal of Stroke and Cerebrovascular Diseases, 22(8), 1385-1392.

Joseph, J., Noble, K., & Eden, G. (2001). The neurobiological basis of reading. Journal of Learning Disabilities, 34, 566-579.

Joseph, R. (2000). Neuroscience: Neuropsychology, neuropsychiatry, behavioral neurology, brain and mind. New York, NY: Academic Press.

Katanoda, K., Yoshikawa, K., & Sugishita, M. (2001). A functional MRI study on the neural substrates for writing. Human Brain Mapping, 13(1), 34-42.

LaPointe, L. L. (2005). Aphasia and related neurogenic language disorders. New York, NY: Thieme.

Longcamp, M., Anton, J. L., Roth, M., & Velay, J. L. (2003). Visual presentation of single letters activates a premotor area involved in writing. Neuroimage, 19(4), 1492-1500.

Menon, V., & Desmond, J. E. (2001). Left superior parietal cortex involvement in writing: Integrating fMRI with lesion evidence. Cognitive Brain Research, 12(2), 337-340.

Papathanasiou, I., & Csefalvay, Z. (2017). Written language and its impairments. In I. Papathanasiou & P. Coppens (Eds.), Aphasia and related neurogenic communication disorders (pp. 219-244). Burlington, MA: Jones & Bartlett Learning.

Riley, E. A., Brookshire, C. E., & Kendall, D. L. (2017). The acquired disorders of reading. In I. Papathanasiou & P. Coppens (Eds.), Aphasia and related neurogenic communication disorders (pp. 195-218). Burlington, MA: Jones & Bartlett Learning.

Roux, F. E., Dufor, O., Giussani, C., Wamain, Y., Draper, L., Longcamp, M., & Demonet, J. F. (2009). The graphemic/motor frontal area: Exner’s area revisited. Annals of Neurology, 66(4), 537-545.

Shaywitz, S. E., & Shaywitz, B. A. (2004). Reading disability and the brain. Educational Leadership, 61(6), 7-11.

Shaywitz, S. E., & Shaywitz, B. A. (2008). Paying attention to reading: The neurobiology of reading and dyslexia. Development and Psychopathology, 20, 1329-1349.

The Internet Stroke Center. (n.d.). What is aphasia? Retrieved from http://www.strokecenter.org/patients/caregiver-and-patient-resources/aphasia-information/

Thompson-Schill, S. L., Aguirre, G. K., D’Esposito, M., & Farah, M. J. (1999). A neural basis for category and modality specificity of semantic knowledge. Neuropsychologia, 37(6), 671-676.

Wagner, A. D., Koutstaal, W., Maril, A., Schacter, D. L., & Buckner, R. L. (2000). Task-specific repetition priming in left inferior prefrontal cortex. Cerebral Cortex, 10(12), 1176-1184.

Wagner, A. D., Pare-Blagoev, E. J., Clark, J., & Poldrack, R. A. (2001). Recovering meaning: Left prefrontal cortex guides controlled semantic retrieval. Neuron, 31(2), 329-338.
