Eglash, Ron. “Computation, Complexity and Coding in Native American Knowledge Systems.” in Judith Hankes and Gerald Fast (ed) Changing the Faces of Mathematics: Perspectives on Indigenous People of North America. Reston, VA: NCTM 2002.

Computation, Complexity and Coding in Native American Knowledge Systems

Research in the knowledge systems of indigenous societies can be hampered by both cultural and technological assumptions. We see these assumptions at work in many popular television documentaries, where one hears of the “vanishing native” who “lived at one with nature.” Such portraits come from good intentions, but they only serve to further the stereotype of indigenous peoples as historically isolated, alive only in a static past. The idea of “living close to nature” implies concrete rather than abstract thinking, a simplistic “primitive” society taking only the first steps up a supposed ladder of progress. We need to make a special effort to open our eyes to the dynamic histories and technological sophistication of indigenous cultures -- for example, to think about active indigenous ecological knowledge rather than the passive portraits we so often hear, e.g. “Indians lived as part of the ecosystem.” Rather than the illusion of a frozen pre-colonial tradition, we need to see indigenous societies as having always been in a state of change, and to understand more recent features of Native American life as part of that history.

Stephen Jay Gould (1981) describes how biological evolution [1] used to be thought of as a single ladder of progress, but is now seen as a “copiously branching bush.” In the same way, contemporary anthropologists now see cultural evolution as a branching diversity of forms. While European societies may have followed one particular sequence in the development of mathematics, other cultures might have developed mathematical ideas along very different lines. Rather than assuming that Native American mathematics must be restricted to simple counting systems or geometric forms, we should be open to any mathematical pattern that appears, including those which are embedded, emergent, or obscured by difficulties in translation to their western counterparts. Equally important, we should strive to show the interrelationships among such culturally embedded mathematical concepts. This essay will attempt to show how such an approach can open new possibilities through computation, complexity and coding in Native American knowledge systems.

1) Algorithmic complexity and biodiversity

Gary Nabhan, an agricultural researcher who works with Native American growers, notes that sustaining genetic diversity was an important theme in indigenous knowledge systems. In searching for greater crop variety, Nabhan found that those areas with the strongest focus on ceremonial religious practices were also those with the greatest diversity in genetic resources. For example, a rare bean variety was propagated for a winter ceremony, in which it was sprouted in the underground kivas. A rare variety of sunflower had also been maintained, growing weed-like around the fields, because its petals were used to make the yellow ceremonial face-paint. Cutler (1944) found that South American medicine men conducted a ritual in which they propagated “podcorn,” a variety that cannot be grown naturally because each kernel is covered by a heavy husk. Even outside of these ceremonial settings, Native American farmers still cited the religious framework as their reason for maintaining genetic diversity:

On one occasion, I asked a Hopi woman at Munqapi if she selected only the biggest corn kernels of all one color for planting her blue maize.  She snapped back at me, “It is not a good habit to be too picky... we have been given this corn -- small seeds, fat seeds, misshapen seeds -- all of them.  It would show that we are not thankful for what we have received if we plant just certain ones and not others” (Nabhan 1983, p. 7)

Why should Native American religions have such a strong emphasis on maintaining a more complex set of genetic resources?  From a biological point of view, these turn out to be crucial for coping with environmental uncertainty.  The winter ceremony bean has strong resistance against root knot nematodes -- not a typical problem, but in years with a nematode epidemic it could be the key to survival.  The wild sunflower is not edible, but it can cross-fertilize with the cultivated versions, and thus adds more genetic potential for adaptations.  Cutler had similar ideas, and suggested that the figure of Kokopelli, the humped-back flute player of Southwestern iconography, was an image of one of the herbalists of the South American tradition, who are known historically to have traveled north at least as far as Central America, and who still travel with flutes and backpacks today. Kokopelli is also a fertility figure, maintaining the reproduction of all living organisms. We can see how the link between plant diversity and animal diversity -- of strong concern in today's environmental framework -- was part of this traditional knowledge system.

This systematic relation between complexity and uncertainty was not restricted to plant and animal genetics.  Native Americans also made a similar correspondence in the myths of the creator/trickster, Coyote, as we see in the Navajo story of creation:

First Man, First Woman, and Coyote... were not satisfied with the sky. ...So they searched for glittering stones and found some mica dust. First Man placed the Star Which Does Not Move [Polaris] at the top of the heavens. ...Then he placed the four bright stars at the four quarters of the sky. ...Then in a hurry, Coyote scattered the remaining mica dust so it did not fall into exact patterns but scattered the sky with irregular patterns of brilliance (Burland 1968 pp. 93).

While people are creating order, Coyote is creating randomness, tossing bits of rock into the sky.  For Native Americans “randomness” is not merely a colloquial expression: random tosses were precisely gauged in their gambling procedures. Ascher (1990 p. 93) describes a vivid illustration in the Native American game of Dish.  In the Cayuga version of the game six peach stones, blackened on one side, are tossed, and the total numbers landing black side or brown side up are recorded as the outcome.  The traditional Cayuga point scores for each outcome are (rounded to whole numbers) the exact values calculated by probability theory.

In the Navajo myth cited above, Coyote -- acting in his usual haphazard way -- creates rain and brings the seeds of all the plants.  The idea of creating an “irregular” complexity of genetic resources to match the irregular randomness of natural events is thus deeply embedded across many Native American societies and in diverse aspects of their knowledge systems.  It is also a foundational concept for certain measures of complexity used in modern mathematics. Examining the details of these complexity measures can shed light on their relation to Native American knowledge systems.

2) Complexity and Computation: the Kolmogorov-Chaitin measure

The first mathematical model of complexity was developed in the work of A. N. Kolmogorov and G. Chaitin in the late 1950s (see Pagels 1988 for a popular historical review). Noting that some apparently random numbers can be completely determined by a simple algorithm, Kolmogorov and Chaitin proposed that the "algorithmic complexity" of a number is equal to the length of the shortest algorithm required to produce it.  This means that periodic numbers (such as .121212121...) will always have a low algorithmic complexity. Even though the number is infinitely long, the algorithm can simply say "repeat 12 forever." A longer algorithm, like that required to produce pi, would have a higher algorithmic complexity. Truly random numbers (e.g. a string of digits produced by rolling dice) have the highest algorithmic complexity possible, since their only algorithm is the number itself -- for an infinite length, you get infinite complexity.

From a mathematical point of view, this is why the Native American knowledge systems place a strong emphasis on maintaining biodiversity: since algorithmic complexity increases with randomness, Coyote’s random natural events can only be matched by the maximum in genetic resource complexity.  But the parallels may be more than just analogy. Miguel Jiménez-Montaño (1984) published what several researchers have regarded as one of the most workable systems for measuring algorithmic complexity -- a system he developed at the University of Veracruz, Mexico for use on amino acids and genetic sequences. Jiménez-Montaño credits his PhD advisor, Werner Ebeling, as the main scientific inspiration for his research. But he was well aware of this indigenous plant complexity, having been a long-time friend of Mario Vazquez, a distinguished botanist in the Biological Institute at the University of Veracruz who studies archaeological data on the plants used by the indigenous societies of Mexico. Although it may have been only a subliminal or indirect influence, it would not be too much of a stretch to think of Native American complexity concepts as one of the influences on Jiménez-Montaño’s work.

The difficulty in practical application of the Kolmogorov-Chaitin measure is that it is defined over a class of universal symbol-generating systems called “Turing machines,” named after British mathematician Alan Turing.  Jiménez-Montaño’s insight was in applying a more restricted set of procedures (what computation theorists call a “context-free grammar”) for generating the symbols (amino acid sequences), which makes it easier to ensure that one has actually found the minimal-length algorithm.  In a context-free grammar, we begin with a starting symbol S, and production rules which tell you how to replace previous symbols with new ones. A list of “terminal symbols” tells you which symbols cannot be further replaced. For example, given terminal symbols {a,c,t} and rules {S -> cb, b -> ad, d -> t} we get {cat} as the only permissible string in this grammar. To measure the complexity K, we just sum the number of symbols on the right-hand side of the production rules: K = 2 + 2 + 1 = 5.  There is just one more part to the complexity measure: since a string of symbols that repeat should count less (recall what we said previously about the low complexity of periodic numbers), any production rule in which there are n repetitions of a single symbol is counted as 1 + log2n.
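As a quick check, the derivation and the complexity count for this tiny grammar can be reproduced in a few lines of Python. This is a sketch of my own; the dictionary encoding of the rules is an illustration, not Jiménez-Montaño's notation:

```python
# Example grammar from the text: terminals {a, c, t},
# rules S -> cb, b -> ad, d -> t.
RULES = {"S": "cb", "b": "ad", "d": "t"}
TERMINALS = set("act")

def derive(start="S"):
    """Replace nonterminals with their right-hand sides until only terminals remain."""
    string = start
    while any(sym not in TERMINALS for sym in string):
        string = "".join(RULES.get(sym, sym) for sym in string)
    return string

# K: total number of symbols on the right-hand sides of the production rules.
K = sum(len(rhs) for rhs in RULES.values())

print(derive())  # cat
print(K)         # 5
```

The derivation proceeds S -> cb -> cad -> cat, and K = 2 + 2 + 1 = 5, as in the text.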

The application of complexity measures to biology in the work of Jiménez-Montaño was based on amino acid sequences, but we can just as easily apply it to biological features that we can see in the everyday world.  Corn is a particularly good example because it is easy to see the contrast between the single variety of yellow corn we eat and the diverse varieties of "Indian corn" that we use for decorative purposes in the fall.  Geneticist Barbara McClintock discovered transposition -- the release of a chromosome element and its insertion into a new chromosome -- by observing the complex color patterns in Indian corn, and she later worked on NSF projects for preserving the Native American seed stocks threatened by the increasing popularity of yellow corn (Keller 1983).

Real corn patterns are typically quite irregular, but for the sake of our illustration we will pretend that some very regular rows were found, restricted to the colors Yellow, Black, Red, and White. Suppose we compare two rows of 32 kernels each, with the following repeating patterns:

1) YBRW...

2) YBRWBWYB...

Pattern one repeats every four symbols, while pattern two repeats every eight symbols. Intuitively we see pattern two as more complex; but we can confirm that using the complexity measure. The minimum set of production rules can be found by experimentation; they are as follows:

1) S -> a8 (meaning “aaaaaaaa”)
a -> YBRW

The complexity is K = (1+log28) + 4 = 8

2) S -> a4
a -> bc
b -> YBRW
c -> BWYB

The complexity is K = (1+log24) + 2 + 4 + 4 = 13
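Readers can verify both sums with a short script. In this sketch (my own encoding, not from Jiménez-Montaño's paper), a repetition rule such as S -> a8 is written as a pair (symbol, count) and charged 1 + log2(n):

```python
from math import log2

# Grammars for the two corn-row patterns; the tuple ("a", 8) stands for
# the repetition rule S -> a8, which counts as 1 + log2(8).
grammar1 = {"S": ("a", 8), "a": "YBRW"}
grammar2 = {"S": ("a", 4), "a": "bc", "b": "YBRW", "c": "BWYB"}

def complexity(rules):
    K = 0.0
    for rhs in rules.values():
        if isinstance(rhs, tuple):   # n repetitions of one symbol
            _, n = rhs
            K += 1 + log2(n)
        else:                        # ordinary rule: count its symbols
            K += len(rhs)
    return K

print(complexity(grammar1))  # 8.0
print(complexity(grammar2))  # 13.0
```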

This agrees with our intuition about the difference in the complexity of the two patterns. To learn more about the work of Jiménez-Montaño and his explorations of biological complexity, readers might examine his recent publications through the Santa Fe Institute (cf. Cocho et al 1993).   But let’s look back again at the earlier Native American concepts.  How might Jiménez-Montaño’s approach compare to these indigenous knowledge systems?

3) Computation in Ojibway scrolls

The Ojibway societies historically occupied a vast area extending from Manitoba in Canada to Michigan and Minnesota in the U.S.  The Southern Ojibway created a pictographic method for recording their ideas -- primarily those relating to cosmology and ritual narrative -- by etching birchbark strips. Selwyn Dewdney (1975), a researcher at the Glenbow-Alberta Institute in Calgary, has published a beautiful collection of these sacred scrolls, and noted that there were certain numerological patterns evident in them.  In a similar analysis Closs (1986) shows that several of the scrolls suggest groupings by multiples of 4.  Dewdney also noted these patterns, but focused more on multiples of 3 in what he termed “deviant scrolls.”  I think both authors are correct in that multiples do occur, but I want to suggest that the scroll patterns are better explained in terms of the kind of production rule generation system used by Jiménez-Montaño.

Figure 1 at top shows the diagram etched on a birch bark scroll by Ojibway shaman Sikassige, originally published in Hoffman (1891).  The diagram shows four stages of initiation; a shaman’s journey through the “path of life” (Dewdney p. 74). Each stage is represented as a lodge presided over by several officials; the number of these officials increases as the sequence 8, 12, 18, 24.  Since 8 is not divisible by 3 and 18 is not divisible by 4, this sequence is not well explained by the idea of counting by multiples. But this numeric sequence is completely consistent as the result of a production rule generation system based on the groupings of the officials, as shown in figure 1 bottom.  I don’t think Sikassige thought of it in terms of self-generating symbol strings -- for one thing, that would require that the spaces between officials also be represented as symbols, which makes the production rules more arbitrary (as indicated in the note at the bottom of figure 1). But I do suspect he thought of it as self-generating stages of initiation, that is, each stage creating the preconditions for the next one.

Figure 2 (top), another initiation/road of life scroll, is estimated by Dewdney to have been created between 1825 and 1875.  It lends itself quite easily to description by production rule, as we see at the bottom of figure 2.  It is interesting to note that Dewdney used the phrase “terminal symbols” to translate Ojibway shaman Skwekomik’s description of the icons that end the sequence.

Figure 3 (top), the Lac Court Oreilles scroll, has not been fully identified in terms of its function. The lodge officials, or whatever the human-like figures on the edges represent, appear to progress according to a production rule system that is specified by the posts inside each lodge (as shown in figure 3 bottom). The only exception is the final number in the last lodge, which is 25 by the production rule but only 20 in the scroll.  If our guess about the production rule is correct (and there are no guarantees of that), then what happened to the missing 5? Dewdney notes that in the final stage initiates were warned that they may be diverted from the path by evil spirits, often appearing as serpents.  Since 5 pairs of serpents oriented in the “bad” direction of North-South appear in the inner rectangle, they may be all that is left of our 5 missing officials.

4) Complexity and Communication: the Shannon-Weaver measure

While genetic complexity is measured using the mathematical theory of computation, entire ecosystems can be measured for complexity using the mathematical theory of communication. Such measures are increasingly important in keeping track of endangered environments (cf. Whittaker 1975), and they too have parallels in Native American biocultural practices as well as Native American mathematics. Ecologists first began measuring ecosystem complexity by counting the number of species per unit area.  But they found this unreliable; suppose there was an ecosystem in which there were 500 different species of animals, but 99% of the individual animals were of one species? That would mean that if you took a trip there, the probability of seeing anything but the one common species would be quite low; the ecosystem actually has a low species diversity. One way to solve this dilemma is to think of it in terms of communication. At an intuitive level, we can see that communicating the description of a very simple environment is easier than communicating the description of a complex environment. Ecologists applied the mathematical theory of communication, proposed by Claude Shannon and Warren Weaver in the late 1940s, to develop a quantitative measure of ecological complexity based on this intuition.

Shannon and Weaver defined the basic unit of information transmitted in a communication system in terms of probability.  Suppose you are lying in bed on a morning with a 50% chance of snow, and mom calls out “get down to breakfast! Sun’s out!” You have just been provided with some useful information.  Now suppose you are in the middle of a blizzard, with a 99% chance of snow, and your little sister wakes you up to say “look, there is more snow falling today!”  She has provided you with less information, because telling you something you were already pretty sure of is less informative.  The less likely an event, the more information communicated by its symbol. This relation is precisely defined by Shannon and Weaver: each symbol of probability p contributes an amount of information in bits, I = -log2p. To get the average number of bits per symbol (usually referred to as the entropy H) for a given communication system, we need only take the sum of pI for each symbol. In the 1960s ecologists noticed that H would make a good way to evaluate the diversity of an ecosystem, because it combines the contribution a species makes to diversity with its total population [2].  Suppose, for example, we have two ecosystems in which we have three birds: Sparrow, Robin, and Crow. We can think of the probabilities as the population percentages for each:

1) S = .10, R = .15, C =  .75
2) S = .33, R = .33, C = .34

H(1) = (.10*3.322) + (.15*2.737) + (.75*.415) = 1.054 bits
H(2) = (.33*1.600) + (.33*1.600) + (.34*1.556) = 1.585 bits

This confirms our intuition that ecosystem (1), overwhelmed with crows, is not as diverse. The maximum amount of information per symbol is obtained when all symbols are equally probable. More generally, we can see the relation between complexity and communication: the more complex the system, the greater amount of information it takes to convey its description. This leads quite easily to the concept of optimal coding, because you cannot use communication to gauge complexity if some of the description is superfluous. Some Native American aphorisms regarding communication refer to this concept of optimal coding, such as "it does not require many words to speak the truth" (Chief Joseph, Nez Perce). It is often contrasted with the Euro-American tendency to be chatty or overly verbose (cf. Basso 1979).
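The bird-ecosystem entropies above can be checked with a few lines of Python; this is a sketch of my own, implementing H as the sum of p·(-log2 p) defined in the text:

```python
from math import log2

def entropy(populations):
    """Shannon entropy H in bits: sum of p * -log2(p) over each species."""
    return sum(p * -log2(p) for p in populations if p > 0)

eco1 = [0.10, 0.15, 0.75]   # sparrow, robin, crow
eco2 = [0.33, 0.33, 0.34]

print(round(entropy(eco1), 3))  # 1.054 bits
print(round(entropy(eco2), 3))  # 1.585 bits
```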

5) Mathematics in Native American communication

Teachers interested in utilizing Native American knowledge systems in math education will find a rich resource in applications of information theory to indigenous coding practices.  Calculations such as bits per message (entropy) or bits per second (channel capacity), for example, can be carried out for sign language, smoke signals, bead and feather patterns, sand paintings, and other Native communication media (cf. Mallery 1972, Witherspoon and Peterson 1995 for Native American coding examples; Pierce 1980 for an introduction to information theory). As a quick example, let’s compare smoke signals with fire arrows. Mallery notes a wide variety of smoke signal patterns, including reports of some resembling “the telegraphic alphabet.” Most appear to be based on the number of columns of smoke (that is, simultaneous fires) and the length of the column of smoke. Fire arrows, a similar signal system used only during the night, can also be sent in simultaneous groups, and are distinguished by vertical vs. diagonal orientations. Which can convey more code symbols: a smoke system with a maximum of two columns of smoke (disregarding order) and one of three possible column lengths for each fire, or a fire arrow system with a maximum of three arrows and two possible orientations? This is a problem in combinatorics; here are all 9 possible combinations:

Smoke signals (columns Long, Medium, Short): LL, LM, LS, MM, MS, SS, L, M, S. 9 total
Fire arrows (Vertical vs. Diagonal): VVV, VVD, VDD, DDD, VV, VD, DD, V, D. 9 total

Given equal probability for all n = 9 symbols, H = log2n = 3.17.
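The combinatorics can be confirmed by brute-force enumeration; the sketch below uses Python's itertools to list the unordered combinations for each system:

```python
from itertools import combinations_with_replacement
from math import log2

# Smoke: one or two simultaneous columns, each Long, Medium, or Short.
smoke = [c for k in (1, 2)
         for c in combinations_with_replacement("LMS", k)]

# Fire arrows: one to three arrows, each Vertical or Diagonal.
arrows = [c for k in (1, 2, 3)
          for c in combinations_with_replacement("VD", k)]

print(len(smoke), len(arrows))     # 9 9
print(round(log2(len(smoke)), 2))  # 3.17 bits per symbol
```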

Now consider the Apache system (Mallery pp. 538-9), where signals fall into three categories: “attention,” “safety,” and “caution.” Within each category it appears that distinctions can be made based on smoke column length, while the categories themselves are distinguished by the number of simultaneous fires (from one to three). Variation in column length is not entirely clear, but let’s assume that there are two equally probable possibilities, intermittent versus continuous, and that attention signals are sent 55% of the time, safety signals are sent 30% of the time, and caution signals are sent 15% of the time. We can then calculate:

H = -( 2((.55/2)log2(.55/2)) + 2((.30/2)log2(.30/2)) + 2((.15/2)log2(.15/2)) ) = 2.406

If lighting two fires takes twice as long as one, and three fires takes three times as long, what is the optimal assignment of number of fires to categories? Our intuition correctly tells us that optimal coding (maximum rate of information transmission) would require that the frequency of use of a symbol should be inversely proportionate to the speed of its signal, but let’s confirm that mathematically. The information rate R is defined as the average number of bits per second for a given communication system, which is given by dividing H by the average number of seconds per signal.  Using f for the number of seconds it takes to send a signal (create a fire and generate the smoke), we have six possibilities for the information rate:

R1 = 2.406/(.55*f + .30*2f + .15*3f) = 1.504/f
R2 = 2.406/(.55*f + .30*3f + .15*2f) = 1.375/f
R3 = 2.406/(.55*2f + .30*f + .15*3f) = 1.301/f
R4 = 2.406/(.55*2f + .30*3f + .15*f) = 1.119/f
R5 = 2.406/(.55*3f + .30*f + .15*2f) = 1.070/f
R6 = 2.406/(.55*3f + .30*2f + .15*f) = 1.003/f

Code R1 is optimal, and these are the same code assignments as those recorded for the Apache system: “attention” signals use one fire, “safety” signals two, and “caution” signals three. Their own reasoning for this choice may have combined the idea of frequency of use with some concept of the urgency of the message (the more fires, the more important the information), and perhaps even reducing error from false signal detection (three fires being the least likely to happen by accident), but all of these criteria would involve the concept of optimal coding.
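Under the same assumptions (55/30/15 category frequencies, two equally likely column lengths per category), the entropy and the optimal assignment can be reproduced by exhaustive search; this sketch checks all six assignments of fires to categories:

```python
from math import log2
from itertools import permutations

freqs = [0.55, 0.30, 0.15]                       # attention, safety, caution (assumed)
probs = [f / 2 for f in freqs for _ in (0, 1)]   # two equally likely variants each
H = sum(p * -log2(p) for p in probs)             # bits per signal

# Rate = H / average time per signal, in units of f (time to light one fire).
rates = {}
for fires in permutations((1, 2, 3)):            # fires assigned to each category
    avg_time = sum(freq * n for freq, n in zip(freqs, fires))
    rates[fires] = H / avg_time

best = max(rates, key=rates.get)
print(round(H, 3))            # 2.406
print(best)                   # (1, 2, 3): fewest fires for the most frequent signal
print(round(rates[best], 3))  # 1.504
```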

6) Native American communication and the history of computing

Looking at Native American complexity concepts, we found ties to contemporary computation in the work of Jiménez-Montaño. Are there connections through communication as well? Modern computing begins with a synthesis between the kind of production rule generation theory discussed previously, and the engineering of machines for performing the required symbol manipulations. John von Neumann, who created the first synthesis, credited Alan Turing with having contributed the mathematical theory of computation required (see Hodges 1983 pp. 304).  Turing was not only a theorist, however: during WWII the British government required him to work on cryptography, specifically cracking the codes produced by the German army’s “Enigma” machine. While the German cryptographers were no match for Turing, they also failed to crack an American military information system: the Native American “code talkers.”

This was not the first time Native Americans had been asked to provide U.S. military coding. Choctaw men relayed messages during World War I via field telephones in France, and in 1940 the Army Signal Corps ran tests with Comanches from Michigan and Wisconsin. In WWII Choctaw, Kiowa, Winnebago, Creek and Seminole soldiers employed native languages to encrypt radio communications in Europe and North Africa.  The most famous in WWII were the Navajo code talkers (Kawano 1990); working together with army cryptographers they developed a complex system that included both alphabetic symbols and encrypted whole words.  The creation of the Navajo codes, as well as the training techniques, combined indigenous knowledge systems with modern communication theory.

It would be absurd to simply state that Native American code talkers led to the first computer -- any influence on the cryptography that Turing was involved in, and even Turing’s influence on von Neumann, is diffuse and subtle.  But it would be equally absurd to simply write them out of this history.  The process of invention is always situated in the cross-currents of many different intellectual streams, and in the case of inventing the first computers, cryptography -- with all its cultural confluence -- is a part of that process.

WWII was not the end of this connection. At a reservation in Arizona, defense systems engineer Tom Ryan married into a Navajo family, and together with his father John Ryan, a Lockheed senior scientist, began reflecting on the wartime coding effort -- if this kind of involvement was possible in war time, why not during peace? They began the Navajo Technologies Corporation, which combined training at the Navajo Community School in Birdsprings, Arizona with several Department of Defense contracts for creating Ada compilers (Able 1988). This was just one of what would later become an enormous number of links between Native American communities and computing.  In Alaska, for example, environmental conditions promoted the early use of distance learning, and with home computers this resulted in the emergence of a widespread virtual community of indigenous people. The American Indian Computer Art Project features the work of indigenous artists who design in digital media, and even sells a digitizing stylus that is wrapped in traditional bead patterns. A more general web site, NativeTech, is “dedicated to disconnecting the term ‘primitive’ from perceptions of native american technology and art.”

Tribal websites have now blossomed across the internet in a variety of forms, and some have started working in relation to technical projects ranging from linguistics to ethnomathematics. Native Seeds/SEARCH, for example, a botanical organization dedicated to continuation of the indigenous plant stock, has been creating a “cultural memory bank” that will tie Native American farmers to their agricultural contributions. The concept, originating from Philippine ethnobotanist Virginia Nazarea-Sandoval (1996), documents the combination of cultural and biological information about the crops, seeds, farming, and utilization methods. The information, including video interviews, is stored on CD-ROM, with access controlled entirely by the indigenous farmers. Here we have come full circle, as the modern computer is used to help maintain the biogenetic complexity created by Native cultures.

7) Conclusion

While the math portion of ethnomathematics has seen an extraordinary wealth of creative yet rigorous frameworks -- an exploration of mathematical activity that has generated many new teaching resources -- the portrait of culture in ethnomathematics has received much less attention.  In particular, the portraits of Native American “tradition” can imply a static, homogeneous society lost in the distant past. Critiques of such frameworks have been a focus in anthropology for at least a decade (cf. Clifford 1988).  This essay is an attempt to broaden the view of Native American ethnomathematics so that we can, when needed, see change as traditional, “authenticity” as a part of colonial politics, and the artificial worlds of mathematical technologies as a staging ground for sacred space.


[1]. People often confuse biological and cultural evolution. Here are two crucial differences. First, cultural evolution is Lamarckian; we can pass knowledge we acquired on to the next generation, while biological evolution is Darwinian, with the rare lucky mutant having an advantage that is then passed on. Second, the time scales are of different orders of magnitude. Significant biological evolution occurs over millions of years, while dramatic cultural evolution takes no more than a few thousand years. This is why human beings have such a tiny amount of genetic variation: the first modern humans, from their singular origin in Africa, quickly spread across the earth over a few thousand years. Our nearly identical genetic composition is a result of speedy Lamarckian cultural evolution adapting us to these new environments.

[2]. That is, it not only considers how many species there are, but also how much of each species. The “how much” can be measured in many different ways, not just individuals per unit area, and is referred to as the “species importance” (see Whittaker 1975).


Able, Dawn. “The Navajos: using language for two nations.”  Defense Computing, pp. 35-38, May-June 1988.

Ascher, M. Ethnomathematics: a multicultural view of mathematical ideas. Pacific Grove: Brooks/Cole Publishing, 1990.

Basso, Keith. Portraits of “The Whiteman.” New York: Cambridge University Press 1979.

Burland, C. North American Indian Mythology. London: Hamlyn 1968.

Clifford, J.  The Predicament of Culture.  Cambridge:  Harvard University Press, 1988.

Closs, M.P.  “Tallies and the ritual use of number in Ojibway pictography.”  in M.P. Closs (ed) Native American Mathematics, Austin: University of Texas 1986.

Cocho G., Lara-Ochoa F., Jiménez-Montaño M. A. , and Ruis J. L. , "Structural Patterns in Macromolecules." in Wilfred D. Stein and Francisco J. Varela (eds) Thinking About Biology. Santa Fe: Santa Fe Institute 1993.

Dewdney, S.  The Sacred Scrolls of the Southern Ojibway. Toronto: Univ. Toronto Press 1975.

Eglash, R.  "Inferring representation type from spectral estimates of fractal dimension in communication waveforms."  Journal of Social and Evolutionary Structures, vol 16, #4, 1993.

Gould, S.J.  The Mismeasure of Man.  NY: W.W. Norton, 1981.

Hoffman, W.J.  Middéwein or Grand Medicine Society of the Ojibway.  Washington D.C.: 7th report of the U.S. Bureau of Ethnology to the Smithsonian Institution, 1891.

Jiménez-Montaño, M.A. "On the syntactic structure of protein sequences and the concept of grammar complexity." Bull. Math. Bio. 46,4 pp. 641-659., 1984.

Kawano, Kenji. Warriors: Navajo Code Talkers. Flagstaff: Northland Pub, 1990.

Keller, Evelyn F. A Feeling for the Organism. New York: W.H. Freeman 1983.

Mallery, Garrick.  Sign Language Among North American Indians. NY: Mouton 1972.

Nabhan, G. "Kokopelli: the humpbacked flute player." Coevolution Quarterly, pp. 4-11, spring 1983.

Nazarea-Sandoval, Virginia. “Fields of memories as everyday resistance.” Cultural Survival Quarterly, pp. 61-66, spring 1996.

Pagels, H.R. The Dreams of Reason: the computer and the rise of the sciences of complexity. NY:  Simon and Schuster 1988.

Pierce, John. An Introduction to Information Theory. NY: Dover 1980.

Hodges, Andrew. Alan Turing: the Enigma. London: Burnett Books, 1983.

Whittaker, Robert H. Communities and Ecosystems.  New York: MacMillan Publishing 1975.

Witherspoon, Gary and Peterson, Glen.  Dynamic Symmetry and Holistic Asymmetry in Navajo and Western Art and Cosmology.  Bern and New York: Peter Lang Publishing, 1995.