
Paleolithic Diets

pietrosperoni
Posted on 2/2/2012, 16:47

I'm surprised there isn't a section on paleolithic diets.
After all, they're hugely popular in the States.
But does anyone in Italy follow them?
 
Skamall
Posted on 2/2/2012, 18:03

I know practically nothing about them and, what's more, I hear very conflicting opinions:
some claim (though whether this is a widespread view today, I still don't know!) that "once upon a time" we were vegetarians, the prey of animals equipped with fangs and claws that we lacked, and that only in "another time" did we go from prey to hunters by "imitating the other animals".*
Roberto Calasso maintains that this process--from prey to predator--took at least 100,000 years.

*Well, for now the only text I have at hand that supports this "theory" is the literature the Vedas left us, but how can we take them at their word?!

I don't know--does anyone have "reliable" links?!
 
pietrosperoni
Posted on 2/2/2012, 18:19

CODE
I would leave the Vedas in the same section of the library where I keep the Bible, which is certainly not the archaeology, anthropology, or dietetics shelf.

The dietetics links I've collected over the years are here:
www.bibsonomy.org/user/pietrosperoni/diet

This is perhaps the best one I've found:
www.beyondveg.com/nicholson-w/hb/hb-interview1a.shtml

beyondveg is generally a serious site that tries to use as scientific a bibliography as possible.

Here there are also some references to peer-reviewed works:
www.amazon.com/Paleo-Diet-Weight-Healthy-Designed/dp/0470913029/


I marked the message above as CODE because otherwise the forum wouldn't let me send it.
 
Skamall
Posted on 2/2/2012, 19:56

Thanks, I'll gladly go through the links :)

P.S. Both the Vedas and the Bible contain many myths (heheh), and for that reason they can help in interpreting certain passages that even anthropology and the sciences are unsure about. I think including both science and myth is a constructive attitude in studies of these topics--and of many others, of course :)
 
Skamall
Posted on 14/2/2012, 21:17

(A SUMMARY FROM PIETRO'S LINKS, FOR THE ITALIAN TRANSLATION IN ANOTHER THREAD, WHICH I'LL LINK.)
QUOTE
65,000,000 to 50,000,000 B.C.: insectivorous diet.

50,000,000 to 30,000,000 B.C.: mostly frugivorous, becoming more herbivorous towards the end of the period, with lesser items in the diet such as insects, meat, and other plant foods.[9]

30,000,000 to 10,000,000 B.C.: Fairly stable persistence of above dietary pattern.[10]

Approx. 10,000,000 to 7,000,000 B.C.: Last common primate ancestor of both humans and the modern ape family.[11]

Approx. 7,000,000 to 5,000,000 B.C.: after the split between the human line and the apes (including chimps), flesh foods began to assume a greater role on the human side of the primate family.

Approx. 4,500,000 B.C.: First known hominid (proto-human), which may not yet have been fully bipedal. Anatomy and dentition (teeth) are very suggestive of a form similar to that of modern chimpanzees.

Approx. 3,700,000 B.C.: First fully upright bipedal hominid, Australopithecus afarensis, known from the "Lucy" skeleton.

3,000,000 to 2,000,000 B.C.: The Australopithecus line diverges into sub-lines,[17] one of which will eventually give rise to Homo sapiens (modern man).
The divergence appears to have been a response to changing global climate between 2.5 and 2 million years ago, driven by glaciation in the polar regions.[18]
The different Australopithecus lineages thus ate somewhat differing diets, ranging from more herbivorous (meaning high in plant matter) to more frugivorous (higher in soft and/or hard fruits than in other plant parts).

Some meat was eaten in addition to the plant foods and fruits that were the staples.[20]

2,300,000 to 1,500,000 B.C.: Appearance of the first "true humans" (signified by the genus Homo), known as Homo habilis ("handy man"), which still retained tree-climbing adaptations (such as curved finger bones)[21] while subsisting on wild plant foods and scavenging and/or hunting meat. (The evidence for flesh consumption based on cut-marks on animal bones, as well as use of hammerstones to smash them for the marrow inside, dates to this period.[22]) It is thought that they lived in small groups like modern hunter-gatherers, but that the social structure would have been more like that of chimpanzees.[23]

The main controversy about this time period among paleoanthropologists is not whether Homo habilis consumed flesh (which is well established) but whether the flesh they consumed was primarily obtained by scavenging kills made by other predators or by hunting.[24] (The latter would indicate a more developed culture, the former a more primitive one.) While meat was becoming a more important part of the diet at this time...

1,700,000 to 230,000 B.C.: Evolution of Homo habilis into the "erectines,"* a range of human species often collectively referred to as Homo erectus, after the most well-known variant.
Meat in the diet assumed greater importance: teeth microwear studies of erectus specimens have indicated harsh wear patterns typical of meat-eating animals like the hyena,[26] though plants still made up the largest portion of the subsistence.* More typically human social structures made their appearance with the erectines as well.[27]

The erectines were the first human ancestor to control and use fire.
About 900,000 years ago, in response to another peak of glacial activity and global cooling (which broke up the tropical landscape further into an even patchier mosaic), the erectines were forced to adapt to an increasingly varied savanna/forest environment by being able to alternate opportunistically between vegetable and animal foods to survive, and/or move around nomadically.[28]
For whatever reasons, it was also around this time (dated to approx. 700,000 years ago) that a significant increase in large land animals occurred in Europe (elephants, hoofed animals, hippopotamuses, and predators of the big-cat family) as these animals spread from their African home. It is unlikely to have been an accident that the spread of the erectines to the European and Asian continents during and after this timeframe coincides with this increase in game, as they probably followed the herds.

Because of the considerably harsher conditions and seasonal variation in food supply, hunting became more important to bridge the seasonal gaps, as well as the ability to store nonperishable items such as nuts, bulbs, and tubers for the winter when the edible plants withered in the autumn. All of these factors, along with clothing (and also perhaps fire), helped enable colonization of the less hospitable environment. There were also physical changes in response to the colder and darker areas that were inhabited, such as the development of lighter skin color that allowed the sun to penetrate the skin and produce vitamin D, as well as the adaptation of the fat layer and sweat glands to the new climate.*[30]

Erectus finds from northern China, dating to 400,000 years ago, have indicated an omnivorous diet of meats, wild fruit and berries (including hackberries), plus shoots and tubers, and various other animal foods such as birds and their eggs, insects, reptiles, rats, and large mammals.



Edited by Skamall - 15/2/2012, 13:53
 
Skamall
Posted on 15/2/2012, 13:51

QUOTE
500,000 to 200,000 B.C.: Archaic Homo sapiens; the later forms are sometimes not treated separately from the earliest modern forms of true Homo sapiens.[32]

150,000 to 120,000 B.C.: Homo sapiens neanderthalensis--or the Neanderthals. It is now well accepted that the Neanderthals and Homo sapiens were sister species descended from a prior common archaic sapiens ancestor.[33]

140,000 to 110,000 B.C.: First appearance of anatomically modern humans (Homo sapiens).
Fire, though discovered earlier, came into widespread use around this same time,[37] corresponding with the advent of modern human beings.

130,000 to 120,000 B.C.: Some of the earliest evidence for seafoods (primarily molluscs) in the diet.

40,000 to 35,000 B.C.: The first "behaviorally modern" human beings appear. The impetus or origin for this watershed event is still a mystery.[44]

40,000 B.C. to 10-8,000 B.C.: Last period prior to the advent of agriculture in which human beings universally subsisted by hunting and gathering (also known as the "Late Paleolithic"--or "Stone Age"--period). Paleolithic peoples did process some of their foods, but these were simple methods that would have been confined to pounding, grinding, scraping, roasting, and baking.[45]

35,000 B.C. to 15-10,000 B.C.: The Cro-Magnons (fully modern pre-Europeans) thrive in the cold climate of Europe via big-game hunting, with meat consumption rising to as much as 50%* of the diet.[46]

25,000 to 15,000 B.C.: Coldest period of the last Ice Age, during which global temperatures averaged 14°F cooler than they do today[47] (with local variations as much as 59°F lower[48]), with an increasingly arid environment and much more difficult conditions of survival to which plants, animals, and humans all had to adapt.[49] The Eurasian steppes just before and during this time had a maximum annual summer temperature of only 59°F.[50]

Humans in Europe and northern Asia, and later in North America, adapted by increasing their hunting of the large mammals such as mammoths, horses, bison and caribou which flourished on the open grasslands, tundra, and steppes which spread during this period.[51] Storage of vegetable foods that could be consumed during the harsh winters was also exploited. Clothing methods were improved (including needles with eyes) and sturdier shelters developed--the most common being animal hides wrapped around wooden posts, some of which had sunken floors and hearths.[52] In the tropics, large areas became arid. (In South Africa, for instance, the vegetation consisted mostly of shrubs and grass with few fruits.[53])

20,000 B.C. to 9,000 B.C.: Transitional period known as the "Mesolithic," during which the bow-and-arrow appeared,[54] and gazelle, antelope, and deer were being intensively hunted,[55] while at the same time precursor forms of wild plant and game management began to be more intensively practiced. At this time, wild grains, including wheat and barley by 17,000 B.C.--before their domestication--were being gathered and ground into flour as evidenced by the use of mortars-and-pestles in what is now modern-day Israel. By 13,000 B.C. the descendants of these peoples were harvesting wild grains intensely and it was only a small step from there to the development of agriculture.[56] Game management through the burning-off of land to encourage grasslands and the increase of herds became widely practiced during this time as well. In North America, for instance, the western high plains are the only area of the current United States that did not see intensive changes to the land through extensive use of fire.[57]

Also during this time, and probably also for some millennia prior to the Mesolithic (perhaps as early as 45,000 B.C.), ritual and magico-religious sanctions protecting certain wild plants developed, initiating a new symbiotic relationship between people and their food sources that became encoded culturally and constituted the first phase of domestication well prior to actual cultivation. Protections were accorded to certain wild food species (yams being a well-known example) to prevent disruption of their life cycle at periods critical to their growth, so that they could be profitably harvested later.[58] Digging sticks for yams have also been found dating to at least 40,000 B.C.,[59] so these tubers considerably antedated the use of grains in the diet.

Foods known to be gathered during the Mesolithic period in the Middle East were root vegetables, wild pulses (peas, beans, etc.), nuts such as almonds, pistachios, and hazelnuts, as well as fruits such as apples. Seafoods such as fish, crabs, molluscs, and snails also became common during this time.[60]

 
Skamall
Posted on 15/2/2012, 15:19

QUOTE
Approx. 10,000 B.C.: The beginning of the "Neolithic" period, or "Agricultural Revolution," i.e., farming and animal husbandry. The transition to agriculture was made necessary by gradually increasing population pressures due to the success of Homo sapiens' prior hunting and gathering way of life. (Hunting and gathering can support perhaps one person per 10 square miles; Neolithic agriculture can support 100 times or more that many.[61] See the sketch after this quote.) Also, at about the time population pressures were increasing, the last Ice Age ended, and many species of large game became extinct, probably due to a combination of intensive hunting and the disappearance of their habitats when the Ice Age ended.[62] Wild grasses and cereals began flourishing,* making them prime candidates for the staple foods to be domesticated, given our previous familiarity with them.[63] By 9,000 B.C. sheep and goats were being domesticated in the Near East, and cattle and pigs shortly after, while wheat, barley, and legumes were being cultivated somewhat before 7,000 B.C., as were fruits and nuts, while meat consumption fell enormously.[64] By 5,000 B.C. agriculture had spread to all inhabited continents except Australia.[65] In the time since the beginning of the Neolithic, the ratio of plant-to-animal foods in the diet has sharply increased, from an average of probably 65%/35%* during Paleolithic times[66] to as high as 90%/10% since the advent of agriculture.[67]

Remains of fossil humans indicate a decrease in health status after the Neolithic. In most respects, the changes in diet from hunter-gatherer times to agricultural times have been almost all detrimental, although there is some evidence we'll discuss later indicating that at least some genetic adaptation to the Neolithic has begun taking place in the approximately 10,000 years since it began. With the much heavier reliance on starchy foods that became the staples of the diet, tooth decay, malnutrition, and rates of infectious disease increased dramatically over Paleolithic times, further exacerbated by crowding, which led to even higher rates of communicable infections.

Skeletal remains show that height decreased by four inches* from the Late Paleolithic to the early Neolithic, brought about by poorer nutrition, perhaps also by increased infectious disease causing growth stress, and possibly by some inbreeding in isolated communities. Signs of osteoporosis and anemia--the latter almost non-existent in pre-Neolithic times--have been frequently noted in the skeletal pathologies of the Neolithic peoples of the Middle East. Certain kinds of osteoporosis found in these skeletal remains are known to be caused by anemia, and although the causes have not yet been determined exactly, the primary suspect is reduced iron levels, thought to have been caused by the stress of infectious disease rather than dietary deficiency, although the latter remains a possibility.[68]
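
Just to make the quoted carrying-capacity figures concrete, here is a minimal arithmetic sketch (the density and the "100 times or more" multiplier come from the quote; the land area is my own illustrative assumption):

CODE
# Back-of-the-envelope check of the carrying-capacity claim quoted above:
# hunting and gathering supports ~1 person per 10 square miles, and
# Neolithic agriculture ~100x that. The land area is an assumption.

HG_DENSITY = 1 / 10        # persons per square mile, hunter-gatherers
AG_MULTIPLIER = 100        # "100 times or more", per the quote

area_sq_miles = 116_000    # roughly the area of Italy (assumption)

hg_population = HG_DENSITY * area_sq_miles
ag_population = hg_population * AG_MULTIPLIER

print(f"Hunter-gatherers: ~{hg_population:,.0f} people")   # ~11,600
print(f"Neolithic farmers: ~{ag_population:,.0f} people")  # ~1,160,000

Even with such coarse numbers, the two orders of magnitude make it clear why a population that had grown under agriculture could never simply go back to foraging.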

 
Skamall
Posted on 15/2/2012, 15:44

QUOTE
Animals that eat meat have large canines, rough rasping tongues, sharp claws, and short digestive tracts to eliminate the poisons in the meat before it putrefies, etc.

Anthropomorphism is the psychological tendency to unconsciously make human behavior the standard for comparison, or to project human characteristics and motivations onto the things we observe. Reverse anthropomorphism in this case would be saying humans should take specific behaviors of other animals as our own model where food is concerned.

Those who maintain that the only "natural" food for us is that which we can catch or process with our bare hands are, by any realistic evolutionary definition of what is natural, grossly in error, since stone tools for obtaining animals and cutting flesh have been with us for almost 2 million years now.

You either agree with the "animal analogy" for raw-food eating and vegetarianism, or you have reservations about it--but either way, you are not offering scientific evidence.

What is "natural" is simply what we are adapted to by evolution, and a central axiom of evolution is that what we are adapted to is the behavior our species engaged in over a long enough period of evolutionary time for it to have become selected for in the species' collective gene pool.

QUOTE
Type of fat may be more important than amount of fat. As a related point, while it is likely to be controversial for some time to come, an increasing amount of recent research on fats in the diet suggests there may actually be little if any correlation between the overall level of fat consumption and heart disease, atherosclerosis, cancer, etc., contrary to previous conclusions. (See The World's Biggest Fad Diet, offsite, for an overview of the recent studies that have called into question the conventional wisdom's low-fat gospel.) Instead, the evidence seems to point more specifically to the trans fats (hydrogenated vegetable oils used to emulsify and preserve foods) which permeate much of the modern food supply in supermarkets, and highly saturated fats (including dairy products and probably saturated commercial meats, but not lean game meats) along with, perhaps, a high consumption of starches and refined carbohydrates driving the hyperinsulinism syndrome. (See the postscript to Part 2 for a brief discussion of the recent findings on hyperinsulinism.)

The evolutionary increase in brain size could not have been supported physiologically without an increased intake of preformed long-chain fatty acids, which are an essential component in the formation of brain tissue.

Recent work is showing that the brain (20-25% of the human metabolic budget) and the intestinal system are both so metabolically energy-expensive that in mammals generally (and this holds particularly in primates), an increase in the size of one comes at the expense of the size of the other in order not to exceed the organism's limited "energy budget" that is dictated by its basal metabolic rate. The suggestion here is not that the shrinkage in gut size caused the increase in brain size, but rather that it was a necessary accompaniment. In other words, gut size is a constraining factor on potential brain size, and vice versa. [Aiello and Wheeler 1995] [See the sketch after this quote.]

Compared to the other primates, the design of the more compact human gut is less efficient at extracting sufficient energy and nutrition from fibrous foods and considerably more dependent on higher-density, higher-bioavailability foods, which require less energy for their digestion per unit of energy/nutrition released. Again, while it is not clear that the increasing levels of animal flesh in the human diet were a directly causative factor in the growth of the evolving human brain, their absence would have been a limiting factor regardless, without which the change likely could not have occurred. Other supporting data suggest that in other animals there is a pattern whereby those with larger brain-to-body-size ratios are carnivores and omnivores, with smaller, less complex guts, dependent on diets of denser nutrients of higher bioavailability.

The human pattern of an overall smaller gut with a proportionately longer small intestine dedicated more to absorptive functions, combined with a simple stomach, fits the same pattern seen in carnivores. [Aiello and Wheeler 1995, p. 206]
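
To put the "20-25% of the human metabolic budget" figure quoted above in concrete terms, here is a minimal sketch; only the percentage range comes from the quote, and the basal metabolic rate value is my own illustrative assumption:

CODE
# Rough daily cost of the human brain given the quoted 20-25% share of
# the metabolic budget. The BMR value is an assumption, not a source figure.

bmr_kcal_per_day = 1_500                         # assumed adult basal metabolic rate
brain_share_low, brain_share_high = 0.20, 0.25   # range from the quote

low = bmr_kcal_per_day * brain_share_low
high = bmr_kcal_per_day * brain_share_high
print(f"Brain energy cost: ~{low:.0f}-{high:.0f} kcal/day")  # ~300-375

On a fixed energy budget, a tissue that expensive has to be paid for somewhere, which is exactly the quote's point about the gut shrinking as the brain grew.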

 
Skamall
Posted on 22/2/2012, 16:07

QUOTE
The question is not simply whether fire and cooking are "natural" by some subjective definition. It's whether they have been used long enough and consistently enough by humans during evolutionary time for our bodies to have adapted genetically to the effects their use in preparing foods may have on us. Again, this is the definition for "natural" that you have to adopt if you want a functional justification that defines "natural" based on scientific validation rather than subjectivity.

Earliest dates for control of fire accepted by skeptical critics: at the other end of the timescale, these same critics, who are only willing to consider the most unequivocal evidence, will still admit that by at least 230,000 years ago[102] there is enough good evidence at at least one site to establish that fire was under human control. At this site, called Terra Amata, an ancient beach location on the French Riviera, stone hearths are found at the center of what may have been huts; more recent sources may put the site's age at possibly 300,000 years rather than 230,000.[103]
Somewhat further back--from around 300,000 to 500,000 years ago--more evidence has been accumulating recently at sites in Spain and France[104] that looks as if it may force the ultraconservative paleontologists to concede that their 230,000-year date is too stingy, but we'll see.


The most recent excavation with evidence for early use of fire has been within just the last couple of years in France at the Menez-Dregan site, where a hearth and evidence of fire has been preliminarily dated to approximately 380,000 to 465,000 years. If early interpretations of the evidence withstand criticism and further analysis, the fact that a hearth composed of stone blocks inside a small cave was found with burnt rhinoceros bones close by has provoked speculation that the rhino may have been cooked at the site.

125,000 years ago is the earliest reasonable estimate for widespread control.


The first fires on earth occurred approximately 350 million years ago--the geological evidence for fire in remains of forest vegetation is as old as the forests themselves.[110] It is usual to focus only on fire's immediately destructive effects on plants and wildlife, but there are also benefits:
birds of prey (such as falcons and kites)--the first types of predators to appear at fires--are attracted to the flames to hunt fleeing animals and insects. Later, land-animal predators appear, when the ashes are smoldering and dying out, to pick out the burnt victims for consumption. Others, such as deer and bovine animals, appear after that to lick the ashes for their salt content. Notable as well is that most mammals appear to enjoy the heat radiated at night at sites of recently burned-out fires.[112]

It would have been inconceivable, therefore, that human beings, being similarly observant and opportunistic creatures, would not also have partaken of the dietary windfall provided by wildfires they came across. And thus, even before humans had learned to control fire purposefully--and without here getting into the later stages of control over fire--their early passive exposures to it would have already introduced them, like the other animals, to the role fire could play in obtaining edible food and providing warmth.


Niles Eldredge, along with Stephen Jay Gould, two of the most well-known modern evolutionary theorists, estimated the time span required for "speciation events" (the time required for a new species to arise in response to evolutionary selection pressures) to be somewhere within the range of "five to 50,000 years."


This time span is for changes large enough to result in a new species classification. Since we are talking here about digestive changes that may or may not be large enough to result in a new species (though changes in diet often are in fact behind the origin of new species), it's difficult to say from this particular estimate whether we may be talking about a somewhat shorter or longer time span than that for adaptation to changes in food.

Those population groups that do retain the ability to produce lactase and digest milk into adulthood are those descended from the very people who first began domesticating animals for milking during the Neolithic period several thousand years ago.[119] (The earliest milking populations in Europe, Asia, and Africa began the practice probably around 4,000 B.C.[120]) Even more interestingly, in population groups where cultural changes have created "selection pressure" for adapting to certain behavior--such as drinking milk in this case--the rate of genetic adaptation to such changes significantly increases. In this case, the time span for widespread prevalence of the gene for lactose tolerance within milking population groups has been estimated at approximately 1,150 years[121]--a very short span of time in evolutionary terms. [See the sketch after this quote.]
The example powerfully illustrates that genetic adaptations for digestive changes can take place much more rapidly than was perhaps previously thought.

There are two specific genes whose prevalence has been found to correlate with the amount of time populations in different geographical regions have been eating the grain-based, high-carbohydrate diets common since the transition from hunting and gathering to Neolithic agriculture began 10,000 years ago. (These two genes are the gene for angiotensin-converting enzyme--or ACE--and the one for apolipoprotein B; if the proper forms are not present, one's chances of getting cardiovascular disease may increase.)

Those in Mediterranean countries who have been eating high-carbohydrate, grain-based diets the longest (for example, since approximately 6,000 B.C. in France and Italy) have the lowest rates of heart disease, while those in areas where dietary changes due to agriculture were last to take hold, such as Finland (perhaps only since 2,000 B.C.), have the highest rates of death due to heart attack. Breast cancer rates in Europe are also higher in countries that have been practicing agriculture for the shortest time.
Whether grain-based diets eaten by people whose ancestors only began eating them recently (and who therefore lack the appropriate gene) are actually causing these health problems (and are not simply correlated by coincidence) is at this point a hypothesis under study. (One study with chickens, however--which in their natural environment eat little grain--has shown much less atherosclerosis on a high-fat, high-protein diet than on a low-fat, high-carbohydrate diet.)

Cavalli-Sforza's estimate here is that the current variants of these genes were selected for within the last 50,000-100,000 years.
However, the significant exception they mention--and this relates especially to our discussion here--is where there are cultural pressures for certain behaviors that affect survival rates.[128] The two examples cited above--the gene for lactose tolerance (milk-drinking) and the genes associated with high-carbohydrate grain consumption--both involve cultural selection pressures that came with the change from hunting and gathering to Neolithic agriculture. Again, cultural selection pressures for genetic changes operate more rapidly than any other kind.
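
As a rough feel for how fast the quoted ~1,150-year lactose-tolerance sweep is in population-genetic terms, here is a minimal sketch using the standard haploid selection model; the start/end allele frequencies and the 25-year generation time are my own assumptions, and only the 1,150-year figure comes from the quote:

CODE
# How strong must selection be for an allele to go from rare (1%) to
# common (90%) in ~1,150 years? Standard haploid selection model:
#   odds(p_t) = odds(p_0) * (1 + s)**t,  with odds(p) = p / (1 - p).
# Frequencies and generation time are illustrative assumptions.

years, gen_time = 1_150, 25
t = years / gen_time                       # ~46 generations
p0, pt = 0.01, 0.90                        # assumed start/end frequencies

def odds(p):
    return p / (1 - p)

s = (odds(pt) / odds(p0)) ** (1 / t) - 1   # solve (1+s)**t = odds ratio
print(f"{t:.0f} generations -> required s = {s:.2f} per generation")  # ~0.16

A selection advantage on the order of 15% per generation is enormous by evolutionary standards, which is precisely the quote's point: a cultural pressure like dairying can drive genetic change far faster than the 50,000-100,000-year baseline mentioned for other genes.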

 
Skamall
Posted on 22/2/2012, 16:43

QUOTE
Nature's pesticides appear to be present in all plants, and though only a few are found in any one plant, 5-10% of a plant's total dry weight is made up of them.

We have a liver and kidneys for a reason, which is that there have always been toxins in natural foods that the body has had to deal with.
Other defenses include the constant shedding of surface-layer cells of the digestive system, many defenses against oxygen free-radical damage, and DNA excision repair, among others.


Also--and I know raw-foodists generally don't like to hear this--there has long been evidence that cooking does in fact make foods of certain types more digestible. For example, trypsin inhibitors (themselves a type of protease inhibitor), which are widely distributed in the plant kingdom, particularly in rich sources of protein, inhibit the ability of digestive enzymes to break down protein. (Probably the best-known plants containing trypsin inhibitors are legumes and grains.)
Research has shown that the effect of most such protease inhibitors on digestion is reduced by cooking.[138] And it is this advantage in expanding the range of utilizable foods in an uncertain environment that helped bring cooking about and enhanced survival.

 
Skamall
Posted on 22/2/2012, 17:08

QUOTE
there is provision in the evolutionary picture for reasonable amounts of cooked foods of certain types, such as at the very least, yams, probably some other root vegetables, the legumes, some meat, and so forth. (With meat, the likelihood is that it was eaten raw when freshly killed, but what could not be eaten would likely have been dried or cooked to preserve it for later consumption, rather than wasting it.) Whether or not some foods like these can be eaten raw if one has no choice or is determined enough to do so is not the real question. The question is what was more expedient or practical to survival and which prevailed over evolutionary time.


In general, we can probably say there just hasn't been enough time for full adaptation yet--or if there has, only for people descended from certain ancestral groups with the longest involvement with agriculture.

My guess (and it is just a guess) would be that we are still mostly adapted to a Paleolithic diet, but that for any particular individual with a given ancestral background, certain Neolithic foods--grains, perhaps even modest amounts of certain cultured milk products such as cheese or yogurt (ones more easily digested than straight milk) for even fewer people--might be not only tolerated but helpful. Especially where people are avoiding flesh products, which are our primary animal-food adaptation, these animal by-products may be helpful.


So my advice is: don't be afraid to experiment. Unless you have specific allergies or strong food intolerances and whatnot, the body is flexible enough by evolution to handle short-term variations from whatever an optimal diet might be anyway. If you start within the general parameters we've outlined here and allow yourself to experiment, you have a much better chance of finding the particular balance among these factors that will work best for you. If you already have something that works well for you, that's great. If, however, you are looking for improvements, given the uncertainties we've talked about above, it's important to look at any rigid assumptions you may have about the "ideal" diet and be willing to challenge them through experimentation. In the long run, you have only yourself to benefit by doing so.

Campbell's conclusions about cholesterol and animal protein are contradicted by evidence from studies of modern hunter-gatherers. As rigorous as the study is proclaimed to be, I have to tell you that Campbell's claim that animal protein by itself is the biggest culprit in raising blood cholesterol is contradicted by studies of modern-day hunter-gatherers who eat considerable amounts of wild game yet have very low cholesterol levels, comparable to those of the China study. One review of different tribes studied showed low cholesterol levels for the Hadza at 110 mg/dl (eating 20% animal food), San Bushmen at 120 (20-37% animal), Aborigines at 139 (10-75% animal), and Pygmies at 106--considerably lower than the now-recommended safe level of below 150.[145] Clearly there are unaccounted-for factors at work here yet to be studied sufficiently.
The proportion of saturated fat in domesticated meat is also about five times higher than in wild game.

Another difference between these two meat sources is that wild game contains significant amounts of EPA (an omega-3 fatty acid) that domesticated meat largely lacks.
Since omega-6 fatty acids may have a cancer-promoting effect, some investigators are recommending lower ratios of omega-6 to omega-3 in the diet, which would, coincidentally, be much closer to the evolutionary norm.

Analyses of numerous skeletons from our Paleolithic ancestors have shown development of high peak bone mass and low rates of bone loss in elderly specimens compared to their Neolithic agricultural successors whose rates of bone loss increased considerably even though they ate much lower-protein diets.[153] Why, nobody knows for sure, though it is thought that the levels of phosphorus in meat reduce excretion of calcium, and people in Paleolithic times also ate large amounts of fruits and vegetables[154] with an extremely high calcium intake (perhaps 1,800 mg/day compared to an average of 500-800 for Americans today[155]) and led extremely rigorous physical lives, all of which would have encouraged increased bone mass.[156]

 
Skamall
Posted on 22/2/2012, 17:39

QUOTE
Not all "hunter-gatherer" tribes of modern times eat diets in line with Paleolithic norms. Aspects of their diets and/or lifestyle can be harmful just as modern-day industrial diets can be. When using these people as comparative models, it's important to remember they are not carbon copies of Paleolithic-era hunter-gatherers.[157] They can be suggestive (the best living examples we have), but they are a mixed bag as "models" for behavior, and it is up to us to keep our thinking caps on.

A case in point is the Masai tribe of Africa, who are really more pastoralists (animal herders) than hunter-gatherers. They have low cholesterol levels ranging from 115 to 145,[158] yet autopsies have shown considerable atherosclerosis.[159] Why? Maybe because they deviate from the Paleolithic norm of 20-25% fat intake: their pastoralist lifestyle has them eating a 73%-fat diet that includes large amounts of milk from animals in addition to meat and blood.*[160] Our bodies do have certain limits.

Encroaching civilization has also pushed many modern hunter-gatherers into marginal environments. "Growth arrest lines" in human bone can be seen under conditions of nutritional deprivation, and such nutritional stress is most likely for hunter-gatherers in environments where either the number of food sources is low (exposing them to the risk of an undependable supply) or where food is abundant only seasonally (as in Polynesia).

Going without food--or fasting while under conditions of total rest as hygienists do as a regenerative/recuperative measure--is one thing, but nutritional stress or deprivation while under continued physical stress is unhealthy and leaves one more susceptible to pathologies including infection.
Wild conditions also mean less artificially controlled sanitary conditions (one of the areas where modern civilization is conducive rather than destructive to health): creatures in the wild are in frequent contact with feces and other breeding grounds for microorganisms, such as rotting fruit and/or carcasses, to which they are exposed through skin breaks and injuries, and so forth.[164]

Large infectious viral and bacterial plagues are known to have occurred among wild animal populations, both past and present.
Signs of infectious disease have also been detected in the fossil record in remains as far back as the dinosaur age, as have signs of immune-system mechanisms to combat them.

 
Skamall
Posted on 22/2/2012, 18:06

QUOTE
Pure "naturalism" often overlooks the large positive impact of modern environmental health advantages. One of the conclusions to be drawn from this is that artificial modern conditions are not all bad where health is concerned. Such conditions as "sanitation" due to hygienic measures, shelter and protection from harsh climatic extremes and physical trauma, professional emergency care after potentially disabling or life-threatening accidents, elimination of the stresses of nomadism, plus protection from seasonal nutritional deprivation due to the modern food system that Westerners like ourselves enjoy today all play larger roles in health and longevity than we realize.

It seems to me you can easily have "enervation" (lowered energy and resistance) without toxemia.
Indeed, I have personally become ill once or twice during the rebuilding period after lengthy fasts when overworked--a situation in which it would be difficult to blame toxemia as the cause.

The examples of modern-day hunter-gatherers as well as those of chimps should show us that you can eat a healthy natural diet and still suffer from health problems, including infectious disease, due to excessive stresses--what we would call "enervation" in Natural Hygiene.

QUOTE
In general, while I do take the evolutionary picture heavily into account, I also believe it is important to listen to our own bodies and experiment, given the uncertainties that remain.

Diet only gets you so far. I usually sleep about 8-10 hours a night, and I very much enjoy vigorous exercise, which I find is necessary to help control my blood-sugar levels, which are still a weak spot for me. The optimum amount is important, though. A few years ago I was running every day, totaling 35-40 miles/week and concentrating on hard training for age-group competition, and more prone to respiratory problems like colds, etc. (not an infrequent complaint of runners). In the last couple of years, I've cut back to every-other-day running totaling roughly 20 miles per week. I still exercise fairly hard, but a bit less intensely than before, I give myself a day of rest in between, and the frequency of colds and so forth is now much lower.


I still have an ongoing tussle with sugar-sensitivity due to the huge amounts of soft drinks I used to consume, and have to eat fruits conservatively. I also notice that I still do not hold up under stress and the occasional long hours of work as well as I think I ought to, even though it's better than before. Reducing stress and trying not to do so much in today's world is an area I really pay attention to and try to stay on top of. My own personal experience has been that no matter what kind of variation of the Hygienic or evolutionary "Paleolithic" diet I have tried so far, excessive prolonged stress of one sort or another (whether physical or mental) is a more powerful factor than lapses in diet in bringing on symptoms of illness, assuming of course that one is consistently eating well most of the time.


And Chet, this brings up something I also want to emphasize: Just as you've freely mentioned about yourself here in H&B on numerous occasions, I'm not perfect in my food or health habits, and I don't intend to set myself up as some sort of example for anyone. Like anyone else, I'm a fallible human being. I still have the occasional extra-cheese pizza or frozen yogurt or creamy lasagna or whatever as treats, for example. Not that I consider having those on occasion to be huge sins or anything. I stick to my intended diet most of the time, but I don't beat myself up for the indiscretions. I would hope other people don't beat themselves up for it either, and that we can all be more forgiving of each other than that.

 
Skamall
Posted on 22/2/2012, 18:53

QUOTE
How can things end up bad when they started out so good?
As we mentioned earlier, the key conversion mechanism for convincing people of the Natural Hygiene system--or other "clean-diet" approaches--is that when they go on a cleaner diet, almost everyone improves initially because of the detoxification that takes place. People often feel so incredibly good initially on a clean diet that they cannot believe--they do not want to believe--it could somehow go sour.
However, over the longer term, many people's bodies simply may not be extracting sufficient nutrition from the diet and they begin to decline, or develop various symptoms even if they still feel okay, presumably due to deficiencies after their reserves are exhausted. Except in the case of B-12, these are not usually your typical "deficiency-disease" symptoms--like the case of rickets that was mentioned earlier. They are more subtle, but they have their effect, and they can be slowly debilitating over long-enough periods of time.

The lulling effect of imperceptibly slow declines in health. From a psychological standpoint, the most interesting thing about this syndrome is that it occurs slowly enough that people often mentally adjust to the lowered state of health and do not perceive it as such, particularly since they so strongly do not want to believe it is happening. Instead, they continue to feel they are doing fine--sometimes even when they describe some of the above symptoms, which they may interpret as improvements indicating "continued detox"--while people who know them may tell them otherwise. Yet because of the extreme anti-authoritarianism, they don't listen to what others are telling them, until finally the problems reach a point where they can no longer be denied.


The dynamic, of course, is a little different for those who are getting good results. They, too, are plenty interested in being right about Hygiene, but since it has worked for them, they have a more solid sense of "certainty" about it, which of course they project to others. And because they have this greater sense of certainty, these individuals tend to become the repositories of traditional wisdom which is dispensed to the less fortunate.

That's one of the biggest things holding all this in place: those for whom Hygiene has worked cannot afford to consider failures as failures of Hygiene either, because, just like everyone, they uphold the ideal that it must work for all; if it does not, that is also a threat to their own beliefs, in spite of their success.

Significant parallels with religious behavior:
- The role of unseen and unverifiable "toxemia" as evidence of one's "sin."
- The "relativity" of absolute purification transforms it into an ever-receding goal and goad.
- Self-restriction becomes its own virtue as absolute purity recedes.
- Endgame: fundamentalist obsessive-compulsiveness.

Successful Natural Hygiene diets are often less strict and more diverse than traditional/"official" recommendations.

Focus first on results rather than theoretical certainty.
Don't ignore feedback about how you are doing from people who know you well.
Open-mindedness is an ongoing process, not something one achieves and then "has." In fact, the tendency if we do nothing is for our minds to crystallize and become more closed as we age. We all die eventually no matter what diet we eat. I would hope we can all stay forever young at heart even as our bodies inevitably age. That's the most important thing.

QUOTE
Thank you for talking with us, Ward.

And thank you, Chet, for the opportunity to share these views with a wider audience.

 