So, by a process of elimination we’ve decided that norepinephrine (NE) controls attention by sending a signal that multiplies the salience tags in active memory, thus controlling the salience gradient. The more NE, the more likely we are to attend to the most salient potential attendums. How much sense does this make? We’ll look at this two ways: chemically and historically (personally, even). And the chemical argument will divide into two parts: one about evolution, and one looking at traits in humans.

The key chemical fact about NE is that it’s structurally very close to dopamine (DA). In fact, in the synthesis of NE from the amino acid tyrosine, DA is an intermediate step; the brain actually makes DA, for a moment, in the process of making NE. Note that I’m not saying that the brain “makes NE out of DA,” although that’s technically true (and you may read that elsewhere). But that implies that some of an existing, usable cache of DA is being converted to NE, and that’s not at all true. If that were true, the levels of DA and NE would be inversely correlated: if you had a lot of one, you would have only a little of the other. But in fact the levels of the two neuromodulators are positively correlated: if you have a lot of one, you tend to have a lot of the other.

The DA-producing cells all have a pair of enzymes which make DA out of tyrosine. The NE-producing cells add a third enzyme, dopamine β-hydroxylase, which turns the DA into NE. If you have alleles (“genes”) for especially active or inactive versions of either of the first two enzymes, you will thus tend to have high or low levels of both DA and NE. Furthermore, if you think about this chemical chain, you will see that there’s nothing to prevent someone from having very high levels of DA production but very low levels of NE: you’d just need very active variants of the DA-making genes and a weakly productive version of the NE-making one. But the opposite is impossible.
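This asymmetry can be sketched as a toy calculation. Everything here is invented for illustration: the enzyme “activity” values are made-up relative rates, not real kinetics, and `levels` is a hypothetical helper, not anything from the literature. The only point is the ordering constraint.

```python
def levels(e1, e2, dbh, tyrosine=100.0):
    """Return (DA production, NE production) given relative activities
    (0..1) of the two DA-making enzymes (e1, e2) and of dopamine
    beta-hydroxylase (dbh). Purely illustrative numbers."""
    da = tyrosine * min(e1, e2)  # the two-enzyme chain is limited by the slower one
    ne = da * dbh                # NE cells can only convert the DA they make
    return da, ne

# Active DA-making enzymes, weak beta-hydroxylase:
# high DA with low NE -- an allowed combination.
print(levels(0.5, 0.5, 0.25))    # (50.0, 12.5)

# Weak DA-making enzymes, maximally active beta-hydroxylase:
# NE is still capped by the scarce DA, so low DA / high NE can't happen.
print(levels(0.125, 0.25, 1.0))  # (12.5, 12.5)
```

However active the third enzyme is, `ne` can never exceed `da`, which is the forbidden-combination argument in one line.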
If you have very low levels of DA production, that sets an upper limit on how much NE you can produce; the NE-making cells just don’t have enough DA to convert, even if the NE-making enzyme is very active. So folks who make very little DA are forced to make relatively little NE as well. Evolution has thus selected for these two relationships: in general, DA and NE levels are correlated, and specifically, low DA with high NE is a forbidden combination. Does this make sense in terms of our hypothesized roles for each?

The first thing you might want to know is at what point in evolution this relationship was established. And it so happens that four of the five neuromodulators of the control brain go way, way down the evolutionary ladder, and are found in invertebrates. NE is the exception. Invertebrates don’t have NE; they instead have octopamine (OA) serving an apparently analogous role. OA is in the same family of chemicals but does not have DA as a precursor. So in the original neuromodulatory paradigm, which is incredibly ancient, the five chemicals had unrelated manufacturing pathways. But at some more recent evolutionary point (not necessarily when the vertebrates evolved; I’m not sure anyone has ever examined the neurochemistry of hagfish and lampreys, which sit on either side of the invertebrate / vertebrate division), this substitution happened:
Tyrosine -> [one enzyme] -> tyramine -> [dopamine β-hydroxylase] -> OA

became

Tyrosine -> [two enzymes] -> DA -> [dopamine β-hydroxylase] -> NE

It’s important to note the conservation of the dopamine β-hydroxylase enzyme. The neuromodulator filling our hypothesized attention-controlling role went from OA to NE because a different substrate was provided for this enzyme. We can infer a surprising amount about the behavior of our early vertebrate ancestors from this knowledge.

Let’s begin by reminding ourselves of the original purpose of varying the salience gradient: to adapt the strength of attention to the current environment, on the fly. It’s highly adaptive to be able to keep attention focused once you have identified a predator threat, and nearly as adaptive to keep it focused after having identified a food source or potential mate. And it’s highly adaptive to instead keep attention volatile when there is no potential threat, food, or mate in sight, and the environment needs to be scanned and searched for same. So it’s no wonder that some sort of attentional control goes back essentially to the earliest animals.

What can we infer from the substitution of DA for tyramine as the substrate for the enzyme that produced the attention-controlling neuromodulator? The apparent evolutionary purpose of this substitution was to correlate the levels of DA and that chemical, so that organisms with a high or low supply of one would tend to have the same sort of supply of the other. That immediately tells us something crucial: there must already have been different functional alleles for the enzymes involved in the synthesis of DA and OA. Because if every organism had the same allele for each of the four synthesizing enzymes, the levels would already be correlated: they’d be the same for every individual.
And a little thought reveals that we would in fact need variation in both the levels of DA and of OA in order to make correlating them meaningful.

Now, it’s not necessarily true that evolution would accommodate different alleles for these enzymes, and hence different levels of the neuromodulators. If there were a single best level of OA to have, any mutation of the enzymes involved in its synthesis would have rendered the organism less able to compete; the mutation would have been selected against, weeded out. But we know from the shift from OA to NE that more than one allele was present in the population: there were (at the least) high-OA and low-OA organisms, and even though their behavior would have differed as a result, neither had an evolutionary advantage.

And what can we make of that? If we’re right about the role of OA / NE, we are talking about organisms with different innate attention spans. And if different attention spans, and hence different behaviors, were equally adaptive, then we are talking about organisms filling different behavioral niches. In fact it’s not hard to imagine that an organism with a different attention span than its conspecifics would have an evolutionary advantage when hunting or being hunted, which could even mean evolutionary pressure to select for a wide variety of OA levels. Each distinct level of OA would correspond to a different behavioral niche in the great predator / prey dance.

So this tells us a remarkable amount about the sophistication of the behavior of the earliest vertebrates: their environments and predator / prey interactions were complex enough to accommodate multiple behavioral niches. There were organisms with short attention spans and ones with long attention spans. And there were organisms with low DA levels and with high DA levels.
And whatever was mediated by that trait, it was evolutionarily advantageous for the low-DA organisms to have a short attention span (and, to a lesser extent, for the high-DA organisms to have a longer one). So what was the DA trait involved in this adaptation?

We’ve hypothesized that DA turns on phenomenal consciousness, especially the experience of emotion, and hence the intensity of pleasure responses (and almost certainly pain responses as well). But that’s not the only thing DA does. DA initiates movement, and its current relative level represents the organism’s energy reserve or capacity for action. And as we saw quite a while ago, DA holds information in working and active memory, since that provides the simplest way of setting the correct salience tag. This last use of DA would seem to be the best candidate for the trait needing correlation, since it’s the one involved in attention. Individuals with high levels of DA would have larger stores of working and active memory.

Let’s imagine four types of hunting behavior, derived from the four combinations of DA level and attention span. (When hunted, behavioral differences melt away, as every organism gets its attention span driven up to transiently high levels.)

High-DA, long attention: Their ability to keep attention focused on a potential prey situation is rewarded by their large capacity to store potentially relevant details of that environment.

High-DA, short attention: Their propensity to shift attention is rewarded by their ability to store potentially relevant details of multiple different environments.

Low-DA, short attention: Their propensity to shift attention is compatible with their relatively limited ability to store information about the environment. Once they’ve observed as much as they can absorb, if no prey is found, they move on.
(The reason low DA does not put them at an evolutionary disadvantage is that it confers advantages unrelated to attention, such as a diminished conscious experience of pain.)

Low-DA, long attention: OK, this one doesn’t work. They’d be attending to their environment past the point where they could extract and store additional information about it. They’d all get outcompeted by the other three types, and they’d starve.

And that explains why OA was replaced by NE. By tying the strength of the attention span to the size of working and active memory, you eliminate individuals who have the ineffective combination of a low memory capacity but a long attention span. And that would confer an evolutionary advantage on the mutation that caused the correlation. If we start with two equally sized populations of sharks, one of which still uses OA to control attention and one of which has the mutation that substitutes NE, in the next generation the NE sharks will be more prevalent, since some of the OA sharks will have starved. With each passing generation the population imbalance will increase, and eventually the OA sharks would become extinct.
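As a sanity check on this selection argument, here is a deliberately crude simulation. All the numbers are invented (equal starting populations, the four trait combinations equally likely in the OA lineage), and it tracks only the starvation filter while ignoring reproduction and every other selective pressure.

```python
def oa_viable_fraction():
    # In the OA lineage, DA level and attention span vary independently,
    # giving four equally likely combinations; the low-DA / long-attention
    # combination starves, per the argument above.
    combos = [(da_high, attn_long)
              for da_high in (False, True) for attn_long in (False, True)]
    viable = [c for c in combos if not (c[1] and not c[0])]
    return len(viable) / len(combos)  # 3 of 4 combinations survive

oa, ne = 1000.0, 1000.0  # equal starting populations (invented numbers)
for generation in range(10):
    # Each OA generation loses the fatal combination; the NE mutation ties
    # attention span to DA level, so that combination never arises.
    oa *= oa_viable_fraction()

print(round(oa), round(ne))  # 56 1000
```

Even under these cartoon assumptions the imbalance compounds geometrically, which is the point of the shark story: the OA lineage keeps paying a per-generation tax that the NE lineage never pays.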
And at this point you probably wonder what this means for people. That’s the next post.