
Evolutionary origin and exogenous cues of ~28 day infradian rhythm?




The most obvious example of an approximately monthly biological cycle is the human menstrual cycle. My questions are the following:

  • Is it known when and where this cycle or one like it arose?

  • What exogenous cue(s) (if any) is/are this cycle based on? The moon's orbit around the earth is the seemingly obvious candidate; however, the only cue I can think of that would be easily sensed by organisms is the variance in night-time illumination, which seems like a very weak phenomenon when compared to day/night and seasonal cycles. Spring (king) and neap tides may be a suitable cue; however, they have a half-monthly modulation.

  • The moon has been steadily retreating from the earth over the course of its history, which means that months have been slowly getting longer. Is there any fossil (e.g. biogeochemical) evidence of organisms with biological cycles in synchrony with a 'short month'?


A double-blind, prospective study conducted in the fall of 1979 investigated the association between the lunar cycle and the menstrual cycles of 305 Brooklyn College undergraduates and their associates.

… Approximately 1/3 of the subjects had lunar period cycles, i.e. a mean cycle length of 29.5 ± 1 day. Almost 2/3 of the subjects started their October cycle in the light 1/2 of the lunar cycle, significantly more than would be expected by random distribution. The author concludes that there is a lunar influence on ovulation.

(Menstrual and Lunar Cycles, Friedmann E., American Journal of Obstetrics and Gynecology, 1981)

Another source supports this conclusion, finding that "a large proportion of menstruations occurred around the new moon."

Somewhat related, this study found that light exposure shortened menstruation cycles.

In summary, there seems to be a good amount of data suggesting that lunar cycles do in fact calibrate the length of human menstrual cycles to some degree.


Talk:Infradian rhythm/Archive 1

It would probably be good to also reference Ultradian and Circadian rhythms. And link to the article on Menstrual Cycles rather than describe them here.

This definition is correct according to the OED's entry on ultradian rhythms, the Stedman's Medical Dictionary (infradian. (n.d.). The American Heritage® Stedman's Medical Dictionary. Retrieved September 13, 2007, from Dictionary.com website:), primary sources like http://archpsyc.ama-assn.org/cgi/content/abstract/42/3/295, and just plain etymology (infra = below in the context of waves, like light, i.e. "slower than": infrared has a longer period than red).

However, be aware that the online Merriam-Webster's medical dictionary (infradian. (n.d.). Merriam-Webster's Medical Dictionary. Retrieved September 13, 2007, from Dictionary.com website: http://dictionary.reference.com/browse/infradian) returns a contrary definition -- i.e. periods shorter than a day.

Shaav 20:34, 13 September 2007 (UTC)

Dictionary.com now has no entry for infradian, however its "entry for ultradian" gives a definition from Random House that is consistent with this and the ultradian page. --ddickison (talk) 19:41, 3 December 2008 (UTC)

Infradian is definitely cycles shorter than a day, this page is wrong. —Preceding unsigned comment added by 159.92.101.25 (talk) 15:38, 5 April 2010 (UTC)

In "infrared", "red" refers to a frequency, not a period, and "infrared" means "less than red". But in "infradian", "-dian" refers to a period (a day) which suggests "less than a day" (?). lifeform (talk) 23:31, 22 April 2014 (UTC)

Googling "infradian" 23 April 2014

  : Adjective. (Of a rhythm or cycle) having a period of recurrence longer than a day; occurring less than once a day. Origin: mid 20th century: from infra- 'below' (i.e. expressing a lower frequency).
  : being, characterized by, or occurring in periods or cycles (as of biological activity) of less than 24 hours
  : Cyclical with a periodicity significantly longer than 24 hours, that is, less frequent than circadian.
  : Relating to biological variations or rhythms occurring in cycles less frequent than every 24 hours
  : referring to cycles longer than 24 hours (for example monthly menstruation)
  : A biorhythm whose periodicity is less than a day in length

This site quotes definitions from several sources:
  1. Relating to biological variations or rhythms occurring in cycles less frequent than every 24 hours (The American Heritage® Medical Dictionary)
  2. pertaining to a period longer than 24 hours applied to the cyclic behavior of certain phenomena in living organisms (Saunders Comprehensive Veterinary Dictionary, 3 ed. © 2007 Elsevier, Inc.)
  3. Relating to biologic variations or rhythms occurring in cycles less frequent than every 24 hours (Medical Dictionary for the Health Professions and Nursing © Farlex) [Aside: I would have thought that nursing was a health profession.]
  4. And quotes from various publications: "Rhythms shorter than 24 hours are called ultradian, while those longer than 24 hours are called infradian", "Three main cycles, or rhythms, have been identified: ultradian (20 hours or less), circadian (20 to 28 hours), and infradian (28 hours or more)", "infradian rhythms [. ] range in period from several days, months, or years", "regular recurrence in cycles of more than 24 hours"

And all that is from Google's first page of hits! So what do we have?

longer than a day
significantly longer than 24 hours
cycles less frequent than every 24 hours
less frequent than every 24 hours
longer than 24 hours
a period longer than 24 hours
less frequent than every 24 hours
longer than 24 hours
28 hours or more
several days, months, or years
of more than 24 hours
of less than 24 hours (Merriam-Webster)
less than a day (Encyclopedia.com)

Sources don't agree, but the majority say significantly longer than 24 hours. So our article here, as well as Wiktionary, agrees with the majority. --Hordaland (talk) 08:06, 23 April 2014 (UTC)

Should this be listed in the "rhythym" category? Everything else there seems to be musical, this isn't. —Preceding unsigned comment added by 59.101.238.220 (talk) 10:22, 15 March 2008 (UTC)

Are pheromones an exogenous cue? They seem internal in the effect described here.72.66.130.166 (talk) 22:55, 25 January 2009 (UTC)

Hmmm. That's not the way I read it. They are produced by one person (A) and affect the rhythm of another (B). From B's point of view, then, they are an exogenous cue. - Hordaland (talk) 08:44, 26 January 2009 (UTC)

Removing unencyclopedic text from the article and placing it here. User 24.8.36.102 added the text below with this edit summary: (heads up to incorrect definitions of infradian rhythms added, source http://www.biology-online.org/dictionary/Infradian)

  • "Contradicting sources state infradian rhythms to have a period of less than 24 hours, while ultradian rhythms having periods of more than one day but less than one year (old text book editions). Suggest further verification. Original definition for infradian rhythm here was stated less than 24 hours, which seems to be the CORRECT definition."

Please see Coleman, A. (2001). Dictionary of Psychology. Oxford University Press which says that ultradian rhythms are LONGER than 24 hours. ACEOREVIVED (talk) 20:44, 16 May 2009 (UTC)


Evaluating Infradian Rhythms

Research suggests that the menstrual cycle is, to some extent, governed by exogenous zeitgebers (external factors). Reinberg (1967) examined a woman who spent three months in a cave with only a small lamp to provide light. Reinberg noted that her menstrual cycle shortened from the usual 28 days to 25.7 days. This suggests that the reduced light (an exogenous zeitgeber) in the cave affected her menstrual cycle, demonstrating the effect of external factors on infradian rhythms.

There is further evidence to suggest that exogenous zeitgebers can affect infradian rhythms. Russell et al. (1980) found that female menstrual cycles became synchronised with other females through odour exposure. In one study, sweat samples from one group of women were rubbed onto the upper lip of another group. Despite the fact that the two groups were separate, their menstrual cycles synchronised. This suggests that the synchronisation of menstrual cycles can be affected by pheromones, which have an effect on people nearby rather than on the person producing them. These findings indicate that external factors must be taken into consideration when investigating infradian rhythms and that perhaps a more holistic approach should be taken, as opposed to a reductionist approach that considers only endogenous influences.

Evolutionary psychologists claim that the synchronised menstrual cycle provides an evolutionary advantage for groups of women, as the synchronisation of pregnancies means that childcare can be shared among multiple mothers who have children at the same time.

There is research to suggest that infradian rhythms such as the menstrual cycle are also important regulators of behaviour. Penton-Voak et al. (1999) found that women expressed a preference for feminised faces at the least fertile stage of their menstrual cycle, and for a more masculine face at their most fertile point. These findings indicate that women's sexual behaviour is motivated by their infradian rhythms, highlighting the importance of studying infradian rhythms in relation to human behaviour.

Finally, evidence supports the role of melatonin in SAD. Terman (1988) found that the rate of SAD is more common in Northern countries where the winter nights are longer. For example, Terman found that SAD affects roughly 10% of people living in New Hampshire (a northern part of the US) and only 2% of residents in southern Florida. These results suggest that SAD is in part affected by light (exogenous zeitgeber) that results in increased levels of melatonin.


Mathematical Modeling in Experimental Nutrition

L.Preston Mercer , Danita Saxon Kelley , in Advances in Food and Nutrition Research , 1996

II Characterization of Biological Rhythms

According to the approach pioneered by Halberg, deterministic biological rhythms (i.e., chronobiologic rhythms) have four measurable parameters: the mean, amplitude, acrophase, and period (Pauly, 1980). These are shown graphically in Fig. 1.

Fig. 1. A cosinor curve showing the various parameters of response.

The mean of a rhythm is the average value of a continuous variable over a single cycle. When the rhythm is described by the fitting of a cosine curve, the halfway point between the peaks and the troughs is known as the MESOR. Only when the data are measured equidistantly, over an integral number of cycles, will the MESOR equal the arithmetic mean.

The amplitude refers to the magnitude of the response variable between its mean value and the (estimated) trough or peak. Such mathematical usage, however, is limited to rhythms which oscillate symmetrically about the mean value.

The phase refers to the value of a biological variable at a fixed time. The word phasing is often used to describe the shape of a curve that depicts the relationship of a biological function to time. Acrophase is a more restricted term: measured from a specified reference standard or zero time, it indicates the lag to the crest of the function used to describe the rhythm.

The period is the duration of one complete cycle in a rhythmic function and is equal to 1/frequency.
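To make these four parameters concrete, here is a minimal least-squares cosinor fit in Python. It is only an illustrative sketch, not code from the chapter: the fixed 24-hour period, the synthetic temperature-like data, and the use of scipy's curve_fit are assumptions chosen for the example.

```python
# Minimal single-component cosinor fit (illustrative sketch; assumed data and period).
# Model: y(t) = MESOR + amplitude * cos(2*pi*(t - acrophase) / period)
import numpy as np
from scipy.optimize import curve_fit

def cosinor(t, mesor, amplitude, acrophase, period=24.0):
    """Cosine model; the crest occurs at t = acrophase (hours past the reference time)."""
    return mesor + amplitude * np.cos(2 * np.pi * (t - acrophase) / period)

# Synthetic, equidistant measurements spanning one full cycle (an assumption for the demo).
rng = np.random.default_rng(0)
t = np.arange(0.0, 24.0, 1.0)                                   # hours past the reference time
y = cosinor(t, 37.0, 0.5, 16.0) + rng.normal(0.0, 0.1, t.size)  # noisy "body temperature"

(mesor, amplitude, acrophase), _ = curve_fit(cosinor, t, y, p0=[36.0, 1.0, 12.0])
print(f"MESOR={mesor:.2f}, amplitude={amplitude:.2f}, acrophase={acrophase:.1f} h")
```

Because the synthetic data here are equidistant over exactly one cycle, the fitted MESOR also coincides with the arithmetic mean, as noted above.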

Haus and Halberg (1980) have further categorized rhythms (by time frame) as infradian, circadian, and ultradian. Circadian rhythms are the rhythms that have been studied most extensively and have periods in the range 20–28 hr (therefore, frequencies are about 0.04 cycles per hour). There are many examples that can be cited, including rhythms in mitotic activity, metabolic processes, and susceptibility to drugs.

Infradian rhythms have periods longer than 28 hr and therefore their frequencies are correspondingly lower than circadian. Some of the well known infradian rhythms are the human menstrual cycle and the annual reproductive cycle of salmon. Infradian rhythms have been identified in nutrient intake and metabolism of foodstuffs (Reinberg, 1983). A more specific type of infradian rhythm is the circasemiseptan (period approximately 3.5 days) found by Schweiger et al. (1986).

Ultradian rhythms have periods shorter than 20 hr. Examples of these rhythms are the electrocardiogram, respiration, peristalsis in the intestine, etc.
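The three time frames above reduce to two numeric cut-offs; the small helper below (an illustration of mine, not the authors' code) simply restates them:

```python
def classify_rhythm(period_hr: float) -> str:
    """Classify a rhythm by period, using the 20-28 hr circadian band of Haus and Halberg (1980)."""
    if period_hr < 20:
        return "ultradian"    # e.g. electrocardiogram, respiration, peristalsis
    if period_hr <= 28:
        return "circadian"    # frequency about 1/24, i.e. ~0.04 cycles per hour
    return "infradian"        # e.g. ~3.5-day circasemiseptan, ~28-day menstrual cycle

print(classify_rhythm(24))        # circadian
print(classify_rhythm(28 * 24))   # infradian (approximately monthly)
print(classify_rhythm(0.001))     # ultradian (3.6 s, on the order of a heartbeat or breath)
```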

Rhythms may also be categorized as exogenous and endogenous (Pauly, 1980). The exogenous rhythm can be caused, driven, and/or coordinated by a force in the environment, but disappears when the driving force ceases. The endogenous rhythm has an intrinsic mechanism and its coordination lies at a cellular level, such as transcription of DNA. Rhythmicity of phospholipids, RNA, DNA, glycogen content, and mitosis has been demonstrated by Halberg et al. (1959). Endogenous rhythms have periods similar to, but statistically different from, their environmental counterparts. Those external influences (environmental factors) which are capable of entraining a rhythm are referred to as synchronizers (Minors and Waterhouse, 1981), and their manipulation can reset the phase of rhythms. Several environmental factors, such as light/dark cycles, sleep/wakefulness, timing of energy intake, and, presumably, qualitative dietary factors, may act simultaneously or separately on a given physiologic variable. One or the other of these external synchronizers may be dominant for the timing of the rhythm of a given function, but not for others. After a change in the synchronizer schedule, the adjustment of a rhythm to the changed environmental routine will occur with a different rate for different variables (Haus and Halberg, 1980). However, if the external synchronizer disappears, the endogenous rhythm will not disappear and will take on a characteristic called "free running." Our goal in this manuscript is to demonstrate the protocols necessary for time-based analysis of weight gain in rats. The techniques can then be applied to other responses.
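The distinction between an entrained rhythm and a free-running one can be sketched with a toy phase model. This is purely illustrative: the 25-hour intrinsic period and the simple proportional daily correction are assumptions for the demo, not values from the chapter.

```python
# Toy model of entrainment and free-running (illustrative only).
# An internal clock with an assumed 25 h intrinsic period is nudged toward a 24 h
# light/dark synchronizer each day; without the synchronizer it "free-runs" and
# drifts by roughly an hour per cycle.
def simulate(days: int, zeitgeber_present: bool, intrinsic_period: float = 25.0):
    phase = 0.0                                   # clock time of the rhythm's peak, in hours
    history = []
    for _ in range(days):
        phase += intrinsic_period - 24.0          # daily drift relative to the 24 h day
        if zeitgeber_present:
            phase -= 0.8 * phase                  # simple proportional reset by the synchronizer
        history.append(round(phase % 24, 2))
    return history

print(simulate(5, zeitgeber_present=True))    # peak stays close to a fixed clock time
print(simulate(5, zeitgeber_present=False))   # peak drifts ~1 h later each day (free-running)
```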


KEY TERMS

BIOLOGICAL CLOCK:

A mechanism within an organism (for example, the pineal gland in the human brain) that governs biological rhythms.

BIOLOGICAL RHYTHMS:

Processes that occur periodically in an organism in conjunction with and often in response to periodic changes in environmental conditions.

CHRONOBIOLOGY:

A subdiscipline of biology devoted to the study of biological rhythms.

CIRCADIAN RHYTHM:

A biological cycle that takes place over the course of approximately a day. In humans circadian rhythms run on a cycle of approximately 25 hours and govern states of sleep and wakefulness as well as core body temperature and other biological functions.

HORMONE:

A molecule produced by living cells that sends signals to sites remote from its point of origin and induces specific effects on the activities of other cells.

INFRADIAN RHYTHM:

A biological cycle with a period longer than a day, such as the approximately monthly menstrual cycle.

JET LAG:

A physiological and psychological condition in humans that typically includes fatigue and irritability; it usually follows a long flight through several time zones and probably results from disruption of circadian rhythms.

MENOPAUSE:

The point at which menstrual cycles cease, a time that typically corresponds to the cessation of the female's reproductive abilities.

MENSTRUATION:

Sloughing off of the lining of the uterus, which occurs monthly in non-pregnant females who have not reached menopause (the point at which menstrual cycles cease) and which manifests as a discharge of blood.

PINEAL GLAND:

A small, usually cone-shaped portion of the brain, often located between the two lobes, that plays a principal role in governing the release of certain hormones, including those associated with human circadian rhythms.

ULTRADIAN RHYTHM:

A biological cycle that takes place over the course of less than a day. Compare with circadian rhythm.


Thursday, 5 November 2015

Schizophrenia - cognitive therapies

One more schizophrenia post after this one - diagnosis, reliability & validity. This post will cover Cognitive Behavioural Therapy and Cognitive Behavioural Family Therapy, two psychological treatments for schizophrenia. There's probably more here than you can expect to write in half an hour, so pick your favourite studies and relevant evaluative points and use those. The only one I would recommend definitely using is Falloon et al (1985), as it provides strong supporting evidence for the efficacy of CBFT compared to CBT.

Black: AO1 - Description
Blue: AO2 - Evaluation - studies
Red: AO2 - Evaluation - evaluative points

Cognitive Behavioural Therapy

CBT is not a "cure" for schizophrenia, as the cognitive distortions and disorganised thinking associated with schizophrenia are a result of biological processes that will not right themselves when the correct interpretation of reality is explained to the patient. The patient is not in control of their thought processes. The goal of CBT is to help the patient use information from the world to make adaptive coping decisions - improving their ability to manage problems, to function independently and to be free of extreme distress and other psychological symptoms. CBT teaches them the social skills that they never learned, as well as how to learn from experience and better assess cause and effect. Skills taught often address negative symptoms, such as alogia, social withdrawal and avolition, and can include social communication skills, the importance of taking antipsychotics routinely, and managing paranoia and delusions of persecution by challenging the evidence for these irrational beliefs.

Cognitive Behavioural Family Therapy (CBFT) is designed to delay relapse by helping the family of the schizophrenic to support the patient, by methods such as stress management training, relaxation techniques, communication and social skills, emphasis on the importance of antipsychotic drugs, and assessment of expressed emotion. High levels of expressed emotion on scales of hostility, emotional over-involvement and critical comments have been linked to rehospitalisation, so CBFT uses cognitive and behavioural methods to lower the emotional intensity of the patient’s home life. It has two general goals: To educate family members about schizophrenia, and to restructure family relationships to facilitate a healthier emotional environment.

Laing suggested the most important factor in the progression of schizophrenia is the family and how they treat the patient. A study by Brown (1972) supports this - he studied family communication patterns in schizophrenics returning home after hospitalisation. Results showed that communication was a critical variable in whether patients would relapse into a psychotic state – patients returning to homes with a high level of expressed emotion were much more likely to relapse than those returning to homes with a low level. This supports the role of expressed emotion in determining long-term outcomes for schizophrenics.

Vaughn + Leff (1976) studied 128 schizophrenics discharged from hospital and returned to their families. Communication patterns between family members were rated for EE. The crucial finding was that families showing high levels of negative expressed emotion (Hostility, over-involvement, criticism) were more likely to have their patient relapse than families showing low levels of negative EE. Relatives with high levels of negative EE responded fearfully to the patient, characterised by lacking insight into and understanding of the condition.

Leff + Vaughn (1985) found that a high level of positive EE with communication patterns showing warmth and positive comments is associated with prevention of relapse. They concluded that not all expressed emotion is detrimental to the relapse prospects of the patient.

Sarason + Sarason (1998) summarised key findings from research into EE and schizophrenia:

  • Rates of EE in a family may change over time – during periods of lower symptom severity, rates of negative EE drop, and vice versa. High rates of EE may only reflect periods of high symptom severity, and not be an overall reflection of the family dynamic.
  • Cultural factors may play a role in EE. The association between high EE rates and relapse has been replicated in many cultures, but cultural factors may influence rate of EE and the way it is communicated. Cross-cultural studies have shown that Indian and Mexican-American families show lower levels of negative EE than Anglo-American families.
  • EE is not limited to families. The association between EE and relapse has been demonstrated with patients living in community care - the significant factor could be communication patterns between patient and those they live with, rather than with family.


The term ethology derives from the Greek language: ἦθος, ethos meaning "character" and -λογία , -logia meaning "the study of". The term was first popularized by American myrmecologist (a person who studies ants) William Morton Wheeler in 1902. [6]

The beginnings of ethology

Because ethology is considered a topic of biology, ethologists have been concerned particularly with the evolution of behaviour and its understanding in terms of natural selection. In one sense, the first modern ethologist was Charles Darwin, whose 1872 book The Expression of the Emotions in Man and Animals influenced many ethologists. He pursued his interest in behaviour by encouraging his protégé George Romanes, who investigated animal learning and intelligence using an anthropomorphic method, anecdotal cognitivism, that did not gain scientific support. [7]

Other early ethologists, such as Charles O. Whitman, Oskar Heinroth, Wallace Craig and Julian Huxley, instead concentrated on behaviours that can be called instinctive, or natural, in that they occur in all members of a species under specified circumstances. Their starting point for studying the behaviour of a new species was to construct an ethogram (a description of the main types of behaviour with their frequencies of occurrence). This provided an objective, cumulative database of behaviour, which subsequent researchers could check and supplement. [6]

Growth of the field

Due to the work of Konrad Lorenz and Niko Tinbergen, ethology developed strongly in continental Europe during the years prior to World War II. [6] After the war, Tinbergen moved to the University of Oxford, and ethology became stronger in the UK, with the additional influence of William Thorpe, Robert Hinde, and Patrick Bateson at the Sub-department of Animal Behaviour of the University of Cambridge. [8] In this period, too, ethology began to develop strongly in North America.

Lorenz, Tinbergen, and von Frisch were jointly awarded the Nobel Prize in Physiology or Medicine in 1973 for their work of developing ethology. [9]

Ethology is now a well-recognized scientific discipline, and has a number of journals covering developments in the subject, such as Animal Behaviour, Animal Welfare, Applied Animal Behaviour Science, Animal Cognition, Behaviour, Behavioral Ecology and Journal of Ethology, Ethology. In 1972, the International Society for Human Ethology was founded to promote exchange of knowledge and opinions concerning human behaviour gained by applying ethological principles and methods and published their journal, The Human Ethology Bulletin. In 2008, in a paper published in the journal Behaviour, ethologist Peter Verbeek introduced the term "Peace Ethology" as a sub-discipline of Human Ethology that is concerned with issues of human conflict, conflict resolution, reconciliation, war, peacemaking, and peacekeeping behaviour. [10]

Social ethology and recent developments

In 1972, the English ethologist John H. Crook distinguished comparative ethology from social ethology, and argued that much of the ethology that had existed so far was really comparative ethology—examining animals as individuals—whereas, in the future, ethologists would need to concentrate on the behaviour of social groups of animals and the social structure within them. [11]

E. O. Wilson's book Sociobiology: The New Synthesis appeared in 1975, [12] and since that time, the study of behaviour has been much more concerned with social aspects. It has also been driven by the stronger, but more sophisticated, Darwinism associated with Wilson, Robert Trivers, and W. D. Hamilton. The related development of behavioural ecology has also helped transform ethology. [13] Furthermore, a substantial rapprochement with comparative psychology has occurred, so the modern scientific study of behaviour offers a more or less seamless spectrum of approaches: from animal cognition to more traditional comparative psychology, ethology, sociobiology, and behavioural ecology. In 2020, Dr. Tobias Starzak and Professor Albert Newen from the Institute of Philosophy II at the Ruhr University Bochum postulated that animals may have beliefs. [14]

Comparative psychology also studies animal behaviour, but, as opposed to ethology, is construed as a sub-topic of psychology rather than as one of biology. Historically, where comparative psychology has included research on animal behaviour in the context of what is known about human psychology, ethology involves research on animal behaviour in the context of what is known about animal anatomy, physiology, neurobiology, and phylogenetic history. Furthermore, early comparative psychologists concentrated on the study of learning and tended to research behaviour in artificial situations, whereas early ethologists concentrated on behaviour in natural situations, tending to describe it as instinctive.

The two approaches are complementary rather than competitive, but they do result in different perspectives, and occasionally conflicts of opinion about matters of substance. In addition, for most of the twentieth century, comparative psychology developed most strongly in North America, while ethology was stronger in Europe. From a practical standpoint, early comparative psychologists concentrated on gaining extensive knowledge of the behaviour of very few species. Ethologists were more interested in understanding behaviour across a wide range of species to facilitate principled comparisons across taxonomic groups. Ethologists have made much more use of such cross-species comparisons than comparative psychologists have.

The Merriam-Webster dictionary defines instinct as "A largely inheritable and unalterable tendency of an organism to make a complex and specific response to environmental stimuli without involving reason". [15]

Fixed action patterns

An important development, associated with the name of Konrad Lorenz though probably due more to his teacher, Oskar Heinroth, was the identification of fixed action patterns. Lorenz popularized these as instinctive responses that would occur reliably in the presence of identifiable stimuli called sign stimuli or "releasing stimuli". Fixed action patterns are now considered to be instinctive behavioural sequences that are relatively invariant within the species and that almost inevitably run to completion. [16]

One example of a releaser is the beak movements of many bird species performed by newly hatched chicks, which stimulates the mother to regurgitate food for her offspring. [17] Other examples are the classic studies by Tinbergen on the egg-retrieval behaviour and the effects of a "supernormal stimulus" on the behaviour of graylag geese. [18] [19]

One investigation of this kind was the study of the waggle dance ("dance language") in bee communication by Karl von Frisch. [20]

Habituation

Habituation is a simple form of learning and occurs in many animal taxa. It is the process whereby an animal ceases responding to a stimulus. Often, the response is an innate behaviour. Essentially, the animal learns not to respond to irrelevant stimuli. For example, prairie dogs (Cynomys ludovicianus) give alarm calls when predators approach, causing all individuals in the group to quickly scramble down burrows. When prairie dog towns are located near trails used by humans, giving alarm calls every time a person walks by is expensive in terms of time and energy. Habituation to humans is therefore an important adaptation in this context. [21] [22] [23]

Associative learning

Associative learning in animal behaviour is any learning process in which a new response becomes associated with a particular stimulus. [24] The first studies of associative learning were made by Russian physiologist Ivan Pavlov, who observed that dogs trained to associate food with the ringing of a bell would salivate on hearing the bell. [25]

Imprinting

Imprinting enables the young to discriminate the members of their own species, vital for reproductive success. This important type of learning only takes place in a very limited period of time. Lorenz observed that the young of birds such as geese and chickens followed their mothers spontaneously from almost the first day after they were hatched, and he discovered that this response could be imitated by an arbitrary stimulus if the eggs were incubated artificially and the stimulus were presented during a critical period that continued for a few days after hatching. [26]

Cultural learning

Observational learning

Imitation

Imitation is an advanced behaviour whereby an animal observes and exactly replicates the behaviour of another. The National Institutes of Health reported that capuchin monkeys preferred the company of researchers who imitated them to that of researchers who did not. The monkeys not only spent more time with their imitators but also preferred to engage in a simple task with them even when provided with the option of performing the same task with a non-imitator. [27] Imitation has also been observed in recent research on chimpanzees: not only did these chimps copy the actions of another individual, but, when given a choice, they preferred to imitate the actions of the higher-ranking elder chimpanzee rather than those of the lower-ranking young chimpanzee. [28]

Stimulus and local enhancement

There are various ways animals can learn using observational learning but without the process of imitation. One of these is stimulus enhancement, in which individuals become interested in an object as the result of observing others interacting with the object. [29] Increased interest in an object can result in object manipulation, which allows for new object-related behaviours by trial-and-error learning. Haggerty (1909) devised an experiment in which a monkey climbed up the side of a cage, placed its arm into a wooden chute, and pulled a rope in the chute to release food. A second monkey was given an opportunity to obtain the food after watching the first go through this process on four occasions; it used a different method and finally succeeded after trial-and-error. [30] Another example familiar to some cat and dog owners is the ability of their animals to open doors. The action of humans operating the handle to open the door results in the animals becoming interested in the handle and then, by trial-and-error, they learn to operate the handle and open the door.

In local enhancement, a demonstrator attracts an observer's attention to a particular location. [31] Local enhancement has been observed to transmit foraging information among birds, rats and pigs. [32] The stingless bee (Trigona corvina) uses local enhancement to locate other members of their colony and food resources. [33]

Social transmission

A well-documented example of social transmission of a behaviour occurred in a group of macaques on Hachijojima Island, Japan. The macaques lived in the inland forest until the 1960s, when a group of researchers started giving them potatoes on the beach: soon, they started venturing onto the beach, picking the potatoes from the sand, and cleaning and eating them. [12] About one year later, an individual was observed bringing a potato to the sea, putting it into the water with one hand, and cleaning it with the other. This behaviour was soon expressed by the individuals living in contact with her; when they gave birth, this behaviour was also expressed by their young - a form of social transmission. [34]

Teaching

Teaching is a highly specialized aspect of learning in which the "teacher" (demonstrator) adjusts their behaviour to increase the probability of the "pupil" (observer) achieving the desired end-result of the behaviour. For example, killer whales are known to intentionally beach themselves to catch pinniped prey. [35] Mother killer whales teach their young to catch pinnipeds by pushing them onto the shore and encouraging them to attack the prey. Because the mother killer whale is altering her behaviour to help her offspring learn to catch prey, this is evidence of teaching. [35] Teaching is not limited to mammals. Many insects, for example, have been observed demonstrating various forms of teaching to obtain food. Ants, for example, will guide each other to food sources through a process called "tandem running," in which an ant will guide a companion ant to a source of food. [36] It has been suggested that the pupil ant is able to learn this route to obtain food in the future or teach the route to other ants. This behaviour of teaching is also exemplified by crows, specifically New Caledonian crows. The adults (whether individual or in families) teach their young adolescent offspring how to construct and utilize tools. For example, Pandanus branches are used to extract insects and other larvae from holes within trees. [37]

Individual reproduction is the most important phase in the proliferation of individuals or genes within a species: for this reason, elaborate mating rituals exist, which can be very complex even though they are often regarded as fixed action patterns. The stickleback's complex mating ritual, studied by Tinbergen, is regarded as a notable example. [38]

Often in social life, animals fight for the right to reproduce, as well as for social supremacy. A common example of fighting for social and sexual supremacy is the so-called pecking order among poultry. Whenever a group of poultry cohabits for a certain length of time, it establishes a pecking order. In these groups, one chicken dominates the others and can peck without being pecked. A second chicken can peck all the others except the first, and so on. Chickens higher in the pecking order may at times be distinguished by their healthier appearance when compared to lower-level chickens. [ citation needed ] While the pecking order is being established, frequent and violent fights can happen, but once established, it is broken only when other individuals enter the group, in which case the pecking order re-establishes from scratch. [39]

Several animal species, including humans, tend to live in groups. Group size is a major aspect of their social environment. Social life is probably a complex and effective survival strategy. It may be regarded as a sort of symbiosis among individuals of the same species: a society is composed of a group of individuals belonging to the same species living within well-defined rules on food management, role assignments and reciprocal dependence.

When biologists interested in evolution theory first started examining social behaviour, some apparently unanswerable questions arose, such as how the existence of sterile castes, as in bees, could be explained through an evolutionary mechanism that emphasizes the reproductive success of as many individuals as possible, or why, amongst animals living in small groups like squirrels, an individual would risk its own life to save the rest of the group. These behaviours may be examples of altruism. [40] Of course, not all behaviours are altruistic, as indicated by the table below. For example, revengeful behaviour was at one point claimed to have been observed exclusively in Homo sapiens. However, other species have been reported to be vengeful, including chimpanzees, [41] and there are anecdotal reports of vengeful camels. [42]

Classification of social behaviours

Type of behaviour | Effect on the donor | Effect on the receiver
Egoistic          | Increases fitness   | Decreases fitness
Cooperative       | Increases fitness   | Increases fitness
Altruistic        | Decreases fitness   | Increases fitness
Revengeful        | Decreases fitness   | Decreases fitness

Benefits and costs of group living

One advantage of group living can be decreased predation. If the number of predator attacks stays the same despite increasing prey group size, each prey may have a reduced risk of predator attacks through the dilution effect. [13] [page needed] Further, according to the selfish herd theory, the fitness benefits associated with group living vary depending on the location of an individual within the group. The theory suggests that conspecifics positioned at the centre of a group will have a reduced likelihood of predation, while those at the periphery will become more vulnerable to attack. [45] Additionally, a predator that is confused by a mass of individuals can find it more difficult to single out one target. For this reason, the zebra's stripes offer not only camouflage in a habitat of tall grasses, but also the advantage of blending into a herd of other zebras. [46] In groups, prey can also actively reduce their predation risk through more effective defence tactics, or through earlier detection of predators through increased vigilance. [13]

Another advantage of group living can be an increased ability to forage for food. Group members may exchange information about food sources between one another, facilitating the process of resource location. [13] [ page needed ] Honeybees are a notable example of this, using the waggle dance to communicate the location of flowers to the rest of their hive. [47] Predators also receive benefits from hunting in groups, through using better strategies and being able to take down larger prey. [13] [ page needed ]

Some disadvantages accompany living in groups. Living in close proximity to other animals can facilitate the transmission of parasites and disease, and groups that are too large may also experience greater competition for resources and mates. [48]

Group size

Theoretically, social animals should have optimal group sizes that maximize the benefits and minimize the costs of group living. However, in nature, most groups are stable at slightly larger than optimal sizes. [13] [ page needed ] Because it generally benefits an individual to join an optimally-sized group, despite slightly decreasing the advantage for all members, groups may continue to increase in size until it is more advantageous to remain alone than to join an overly full group. [49]
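The gap between optimal and stable group size can be made concrete with a toy per-capita fitness curve; the quadratic form and its coefficients below are invented for illustration and are not taken from the cited sources.

```python
# Toy per-capita fitness curve for group living (illustrative assumption):
# benefits (dilution, shared foraging) rise with group size, then competition and
# disease costs dominate.
def fitness(n: int) -> float:
    return 1.0 + 0.9 * n - 0.05 * n * n    # arbitrary coefficients chosen for the example

optimal = max(range(1, 40), key=fitness)                            # size maximizing per-capita fitness
stable = max(n for n in range(1, 40) if fitness(n) >= fitness(1))   # joining still beats staying alone
print(optimal, stable)    # 9 and 17: the stable group is larger than the optimal one
```

Under this assumed curve, individuals keep joining until group fitness falls back to the solitary level, so groups settle well above the size that would maximize everyone's fitness, which is the pattern described above.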

Niko Tinbergen argued that ethology always needed to include four kinds of explanation in any instance of behaviour: [50] [51]

  • Function – How does the behaviour affect the animal's chances of survival and reproduction? Why does the animal respond that way instead of some other way?
  • Causation – What are the stimuli that elicit the response, and how has it been modified by recent learning?
  • Development – How does the behaviour change with age, and what early experiences are necessary for the animal to display the behaviour?
  • Evolutionary history – How does the behaviour compare with similar behaviour in related species, and how might it have begun through the process of phylogeny?

These explanations are complementary rather than mutually exclusive—all instances of behaviour require an explanation at each of these four levels. For example, the function of eating is to acquire nutrients (which ultimately aids survival and reproduction), but the immediate cause of eating is hunger (causation). Hunger and eating are evolutionarily ancient and are found in many species (evolutionary history), and develop early within an organism's lifespan (development). It is easy to confuse such questions—for example, to argue that people eat because they're hungry and not to acquire nutrients—without realizing that the reason people experience hunger is that it causes them to acquire nutrients. [52]


Friday, 25 February 2011

The moon - Smallest lunar probe that can be made using today's technology

A starting point is the smallest sample return probe ever built, the Luna 24.

Massing 5,300 kg in lunar orbit, it was pretty much bare bones. To see where we can improve on this, we must first split the mission into its separate parts. This follows the standard procedure of doing the mission backwards. First, I would not choose the ISS as the target, as braking into orbit and then rendezvousing and docking with it is a complicated task. Simply hitting the Earth is much easier.

Of the 514 kg for the return stage of the Luna 24, around 300 kg was propellant. Perhaps a slightly more efficient engine exists today, but the technology used now for landing and ascent would still be hypergolic propellants. That is the most important limit for scaling.

The re-entry capsule was only 34 kg, but you might still be able to shave off a few kilograms. The main savings are in the remaining 180 kg though, including electrical systems, control systems, the engine and the propellant tanks. With the miniaturization of electronic equipment since the seventies, and some new lighter materials, you may be able to squeeze everything required into a dry mass of 100 kg. That is about 220 kg at the lunar surface, roughly halved. A similar miniaturization of the descent stage and drilling equipment yields a spacecraft of about 2 metric tonnes in lunar orbit.
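This scaling can be sanity-checked with the Tsiolkovsky rocket equation. The sketch below assumes a specific impulse of about 300 s for storable hypergolic propellants and derives the delta-v from Luna 24's own wet and dry masses; the figures are illustrative, not a reconstruction of the actual mass budget.

```python
# Rough rocket-equation check of the return-stage scaling (assumed figures, illustrative only).
import math

g0 = 9.81                     # m/s^2
isp = 300.0                   # s, assumed for storable hypergolic propellants
ve = isp * g0                 # effective exhaust velocity, ~2.94 km/s

# Luna 24 return stage: ~514 kg wet with ~300 kg propellant -> implied delta-v
dv = ve * math.log(514.0 / (514.0 - 300.0))
print(f"implied delta-v ~ {dv:.0f} m/s")          # ~2.6 km/s

# Same delta-v with a miniaturized ~100 kg dry mass (capsule included)
wet = 100.0 * math.exp(dv / ve)
print(f"scaled wet mass ~ {wet:.0f} kg")          # ~240 kg, in line with the rough figure above
```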

To transport the spacecraft to the Moon, the minimal solution is a spiralling ion craft. That is within current technology; take for instance the engine powering Dawn. At the cost of a long transfer time, the total mass in Earth orbit need only be around 3 metric tonnes. However, considering the relatively high drag at the altitude of the ISS, the craft must start from a higher orbit.
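A similar back-of-the-envelope check works for the ion transfer. The ~7.5 km/s low-thrust spiral delta-v from low Earth orbit to low lunar orbit and the ~3100 s specific impulse of a Dawn/NSTAR-class thruster are assumed values, used only to show that a departure mass of roughly 3 tonnes is plausible.

```python
# Back-of-the-envelope sizing of the ion tug (assumed delta-v and Isp, illustrative only).
import math

g0 = 9.81
isp_ion = 3100.0                 # s, assumed for an NSTAR-class ion thruster
ve = isp_ion * g0

dv_spiral = 7500.0               # m/s, assumed low-thrust spiral to low lunar orbit
lander_mass = 2000.0             # kg, the miniaturized spacecraft delivered to lunar orbit

departure_mass = lander_mass * math.exp(dv_spiral / ve)
print(f"xenon ~ {departure_mass - lander_mass:.0f} kg, "
      f"mass in Earth orbit ~ {departure_mass:.0f} kg")
# ~560 kg of xenon; adding tanks, solar arrays and thrusters puts the total near 3 tonnes
```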

Additionally, it is more effective to combine multiple related goals, like a lunar rover, and on-site experiments, than to launch multiple minimal missions with high risk of failure.



The 20th century saw greatly increased interest in and research on all questions about sleep. Tremendous strides have been made in molecular, neural and medical aspects of biological rhythmicity. Physiology professor Nathaniel Kleitman's 1939 book Sleep and Wakefulness, revised 1963, [7] summarized the existing knowledge of sleep, and it was he who proposed the existence of a basic rest-activity cycle. Kleitman, with his students including William C. Dement and Eugene Aserinsky, continued his research throughout the 1900s. O. Öquist's 1970 thesis at the Department of Psychology, University of Göteborg, Sweden, marks the beginning of modern research into chronotypes, and is entitled Kartläggning av individuella dygnsrytmer, or "Charting Individual Circadian Rhythms". [8]

Morningness–eveningness questionnaire

Olov Östberg modified Öquist's questionnaire and in 1976, together with J.A. (Jim) Horne, he published the 19-item morningness–eveningness questionnaire, MEQ, [9] which is still used and referred to in virtually all research on this topic.

Researchers in many countries have worked on validating the MEQ with regard to their local cultures. A revision of the scoring of the MEQ as well as a component analysis was done by Jacques Taillard et al. in 2004, [10] working in France with employed people over the age of 50. Previously the MEQ had been validated only for subjects of university age.

Circadian Type Inventory

The Circadian Type Inventory, developed by Folkard (1987), is an improved version of the 20-item Circadian Type Questionnaire (CTQ).

The CTI was initially developed to identify individuals capable of adapting to shift work. Thus, the scale assesses two factors that influence a person’s ability to alter his or her sleeping rhythms: rigidity/flexibility of sleeping habits and ability/inability to overcome drowsiness. Since its creation, the scale has undergone a number of revisions to improve its psychometric properties. An 18-item version was used as part of the larger Standard Shiftwork Index (SSI) in a study conducted by Barton and colleagues. This shorter scale was then reduced and altered to make an 11 item scale by De Milia et al. [11]

Composite Scale of Morningness

Smith et al. (1989) [12] analyzed items from the MEQ, the Diurnal Type Scale (DTS), [13] and the CTQ and chose the best ones to develop an improved instrument, the 13-item Composite Scale of Morningness (CSM or CS). The CSM consists of 9 items from the MEQ and 4 items from the Diurnal Type Scale and is regarded [by whom?] as an improved version of the MEQ. It currently exists in 14 language versions; [citation needed] the most recently developed are Polish, [14] Russian [15] and Hindi. [16]

Others

Roberts, in 1999, designed the Lark-Owl Chronotype Indicator, LOCI. [17] Till Roenneberg's Munich Chronotype Questionnaire (MCTQ) from 2003 uses a quantitative approach; his many thousands of subjects have answered questions about their sleep behavior. [18] [19]

Most people are neither evening nor morning types but lie somewhere in between. Estimates vary, but a 2007 survey of over 55,000 people by Roenneberg et al. showed that morningness–eveningness tends to follow a normal distribution. [18] People who share a chronotype, morningness or eveningness, have similar activity-pattern timing: sleep, appetite, exercise, study etc. Researchers in the field of chronobiology look for objective markers by which to measure the chronotype spectrum. Paine et al. [20] conclude that "morningness/eveningness preference is largely independent of ethnicity, gender, and socioeconomic position, indicating that it is a stable characteristic that may be better explained by endogenous factors".

Sleep

Horne and Östberg found that morning types had a higher daytime temperature with an earlier peak time than evening types and that they went to sleep and awoke earlier, but no differences in sleep lengths were found. They also note that age should be considered in assessments of morningness and eveningness, noting how a "bed time of 23:30 may be indicative of a morning type within a student population, but might be more related to an evening type in the 40–60 years age group". [9]: 109 Clodoré et al. found differences in alertness between morning and evening types after a two-hour sleep reduction. [21] Duffy et al. investigated "changes in the phase relationship between endogenous circadian rhythms and the sleep-wake cycle", and found that although evening types woke at a later clock hour than morning types, morning types woke at a later circadian phase. [22] Zavada et al. show that the exact hour of mid-sleep on free (non-work) days may be the best marker for sleep-based assessments of chronotype; it correlates well with such physiological markers as dim-light melatonin onset (DLMO) and the minimum of the daily cortisol rhythm. [23] They also state that each chronotype category "contains a similar portion of short and long sleepers". Chung et al. studied sleep quality in shift-working nurses and found that "the strongest predictor of sleep quality was morningness–eveningness, not the shift schedule or shift pattern", as "evening types working on changing shifts had higher risk of poor sleep quality compared to morning types". [24]

Diurnal rhythms

Gibertini et al. [25] assessed blood levels of the hormone melatonin, finding that the melatonin acrophase (the time at which the peak of a rhythm occurs [26] ) was strongly related to circadian type, whereas amplitude was not. They note that morning types evidence a more rapid decline in melatonin levels after the peak than do evening types. Baehr et al. [27] found that, in young adults, the daily body temperature minimum occurred at about 4 a.m. for morning types but at about 6 a.m. for evening types. This minimum occurred at approximately the middle of the eight-hour sleep period for morning types, but closer to waking in evening types. Evening types had a lower nocturnal temperature. The temperature minimum occurred about a half-hour earlier in women than in men. Similar results were found by Mongrain et al. in Canada, 2004. [28] Morning types had lower pain sensitivity throughout a day than evening types, but the two chronotype groups did not differ in the shape of diurnal variations in pain. [29] There are some differences between chronotypes in sexual activity, with evening chronotypes preferring later hours for sex as compared to other chronotypes. [30]

Personality

Chronotypes differ in many aspects of personality, such as Grit (personality trait), [31] but also in intellectual domains, like creative thinking. [32]

Intelligence

A meta-analysis found a small positive association between an evening chronotype and intelligence; [33] similar results were subsequently found in a large sample using a standardized battery. [34]

Genetic variants associated with chronotype

Studies show [a] that there are 22 genetic variants associated with chronotype. These variants occur near genes known to be important in photoreception and circadian rhythms. [36] The variant most strongly associated with chronotype occurs near RGS16, which is a regulator of G-protein signalling and has a known role in circadian rhythms. In mice, gene ablation of Rgs16 lengthens the circadian period of behavioural rhythm. By temporally regulating cAMP signalling, Rgs16 has been shown to be a key factor in synchronising intercellular communication between pacemaker neurons in the suprachiasmatic nucleus (SCN), the centre for circadian rhythm control in humans. [36] [37]

PER2 is a well-known regulator of circadian rhythms and contains a variant recently shown to be associated with iris formation. This suggests a link between iris function and chronotype. Per2 knockout mice show arrhythmic locomotor activity. [36] [38] [39] The gene ASB1, associated with eveningness and a tendency to day-napping, is the result of interbreeding between archaic and modern humans and is originally a Neanderthal trait, possibly linked to a more crepuscular lifestyle in that species. [40]

Chronotype and disease

Disrupted circadian rhythms are associated with several human diseases, for example, chronotype is genetically correlated with BMI (body mass index). [36] [42] [43] However, cause-and-effect is not yet determined. [36]


Thursday, 12 November 2015

Disruption of biological rhythms

Black: AO1 - Description
Blue: AO2 - Evaluation - studies
Red: AO2 - Evaluation - evaluative points/IDAs

Shift work

Normally, exogenous zeitgebers change gradually, such as the changing light levels around the year. However, with shift work and jet lag, this change is rapid, and exogenous zeitgebers become desynchronised with endogenous pacemakers. For animals, this could lead to dangerous situations, such as an animal leaving its den at night when dangerous predators are around. In humans, the lack of synchrony may lead to health problems such as gastrointestinal disorders.

Shift workers are required to be alert at night and must sleep in the day, contrary to our natural diurnal lifestyle, and out of synchronisation with available cues from zeitgebers. Night workers experience a "circadian trough" - a period of decreased alertness and body temperature between 12 a.m. and 4 a.m. during their shifts, triggered by a decrease in the stress hormone cortisol. They may also experience sleep deprivation due to being unable to sleep during the day, as daytime sleep is shorter than natural night-time sleep, and more likely to be interrupted.

Czeisler (1982) studied workers at a Utah chemical plant as they adjusted from the traditional backwards shift rotation to a forwards shift rotation. Workers reported feeling less stressed, with fewer health problems and sleeping difficulties, along with higher productivity. This was due to the workers undergoing "phase delay", where sleep was delayed to adjust to new EZs, rather than the traditional "phase advance", where sleep time was advanced by sleeping earlier than usual. These results suggest that phase delay is healthier than phase advance, as it is significantly easier to adjust to and so carries less risk of circadian rhythm disruption.

Czeisler's findings have valuable real-world applications. For businesses employing shift workers, using a forwards rather than backward shift rotation will increase productivity and reduce the risk of employees making mistakes, as well as improve health due to phase delay being easier for the body's circadian clock to adjust to than phase advance.

Gordon et al (1986) found similar results to Czeisler that support the superiority of forward rotation over backwards rotation. Moving police officers from a backwards to a forwards rotation led to a 30% reduction in sleeping on the job, and a 40% reduction in accidents. Officers reported better sleep and less stress.

Studies suggest that there is a significant relationship between chronic circadian disruption resulting from shift work, and organ disease. Knuttson (1996) found that individuals who worked shifts for more than 15 years were 3 times more likely to develop heart disease than non-shift workers. Martino et al (2008) found a link between shift work and kidney disease, and suggested that kidney disease is a potential hazard for long-term shift workers. However, the use of correlations in these studies means that a direct cause and effect cannot be established, and there is not enough evidence to conclude that organ disease is a direct result of shift work - third, intervening variables cannot be ruled out.

The Chernobyl nuclear power plant and the Challenger space shuttle disasters both occurred during night shifts, when performance of workers was most impaired by the circadian trough. The catastrophic nature of these events emphasises the importance that should be placed on healthy shift rotations and the minimising of circadian disruption for workers in order to avoid further disasters.

  • Permanent non-rotating shift work allows the body clock to synchronise with the new exogenous zeitgebers and adapt to a specific rhythm. However, this is unpopular because not many people want permanent night work.
  • Planned napping during shifts has been shown to reduce tiredness and improve employee performance - but this is unpopular with both employees and employers.
  • Improved daysleep for night shift workers - keeping bedrooms quiet and dark, avoiding bright light and stimulants such as caffeine. However, this method can be disruptive of family life and lead to its own pressures.
  • Rapid rotation: rotating shift work patterns every two or three days avoids even trying to adjust to new exogenous zeitgebers. However, it also means that most of the time, rhythms are out of synchronisation, and there is controversy over the suggested effectiveness of this tactic.

Jet Lag

Jet lag is the disruption in circadian rhythms caused by travelling through multiple time zones very quickly by aeroplane, causing endogenous pacemakers to become desynchronised with local exogenous zeitgebers. This can result in a number of problems including fatigue, insomnia, anxiety, immune weakness and gastrointestinal disruption.

Flying west to east causes worse symptoms and a greater degree of circadian disruption than flying east to west, because flying east requires a phase advance to adjust to the change in exogenous zeitgebers, whereas flying west requires a phase delay. Studies of shift work demonstrate that phase delay is easier for the body's circadian clock than phase advance, so it causes less disruption and impairment.
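
This direction effect is simple clock arithmetic: eastward travel puts local time ahead of the body clock, forcing sleep earlier (a phase advance), while westward travel puts local time behind, allowing sleep later (a phase delay). As a minimal illustrative sketch only - required_shift is a hypothetical helper, not something taken from the studies above, and it assumes the required shift simply equals the number of time zones crossed - the logic can be written out as:

```python
def required_shift(time_zones_crossed: int, direction: str) -> str:
    """Describe the circadian adjustment needed after crossing time zones.

    direction is 'east' or 'west'; the number of hours to shift is assumed
    to equal the number of time zones crossed (a simplification).
    """
    if direction == "east":
        # Local clocks run ahead of the body clock, so sleep must come earlier:
        # a phase advance, which shift-work research suggests is harder to achieve.
        return f"phase advance of {time_zones_crossed} hours (harder adjustment)"
    if direction == "west":
        # Local clocks run behind the body clock, so sleep can come later:
        # a phase delay, which is generally easier to achieve.
        return f"phase delay of {time_zones_crossed} hours (easier adjustment)"
    raise ValueError("direction must be 'east' or 'west'")


print(required_shift(7, "east"))  # crossing 7 time zones eastward
print(required_shift(7, "west"))  # the return journey westward
```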

Three ways of coping with jet lag have been suggested. Melatonin supplements are widely used in the US to restore melatonin levels when jet lag has greatly disrupted circadian rhythms, helping to re-synchronise the internal clock (EPs) with local EZs. Planning sleep patterns beforehand has been shown to help adjustment: if arriving in the daytime, stay awake on the plane; if arriving at night, sleep on the plane. Splitting the journey over two days can also help, as each disruption is less severe and a less significant adjustment has to be made on the day of arrival.

Cho (2001) found that airline staff who regularly travelled across 7 time zones showed reduced temporal lobe volume and impaired memory function, providing evidence that chronic disruption of circadian rhythms due to jet lag can cause long-term cognitive impairment and neurological damage.

Theories on the function of sleep

In the exam, you can be asked a 24-marker specifically on either restoration or evolutionary theories, so it is important to know both of these in equal depth and breadth.



Evolutionary theories of sleep

Evolutionary theories explain sleep as an adaptive behaviour - one that increases an organism's chances of survival and reproduction, providing a selective advantage. Sleep has evolved as an essential behaviour because of this selective advantage over the course of evolutionary history - animals that did not sleep were more likely to fall victim to predation, so could not go on to reproduce.

Meddis proposed the predator-prey status theory, claiming that sleep evolved to keep prey hidden and safe from predators when normal adaptive activities such as foraging are impossible - at night for diurnal animals, and during the day for nocturnal animals. The hours of sleep an animal requires are therefore related to its need for and method of obtaining food, its exposure to predators, and other factors such as its sleeping environment and foraging requirements. Because sleep evolved to keep animals still and out of the way of predators when productive activity is impossible, the greater an animal's vulnerability to predation, the safer its sleep site, and the less time it needs to spend foraging, the more time it should spend sleeping.

This explanation is supported by the fact that animals are often inconspicuous when sleeping - taking the time beforehand to find adequate shelter to keep them hidden from predators. It also explains the early stages of the sleep cycle, "light sleep", as a transitional phase from wakefulness to sleep, allowing the animal to check its immediate environment for threats before completely losing alertness.

A study by DeCoursey also supports this explanation. Thirty chipmunks had their suprachiasmatic nuclei (a part of the brain involved in regulating the sleep/wake cycle) damaged and were released into the wild. Within 80 days, significantly more of these chipmunks had been killed by predators than chipmunks in the control groups, suggesting that normal sleep patterns are vital to an animal's safety in its natural habitat.

A strength of DeCoursey's study was the scientific rigour provided by the use of control groups, treating psychology with rigorous scientific methodology. Three groups of chipmunks were used: one with SCN damage, one that had brain surgery but no SCN damage (to control for the stress of surgery) and a healthy control group. The use of these controls means that cause and effect can be determined more easily - it can be reliably established that circadian disruption due to SCN damage increases the risk of death from predation.

However, a study by Allison and Cicchetti challenges this explanation, finding that on average, prey sleep for fewer hours a night than predators - Meddis suggested the opposite trend, so his theory conflicts with these results.

The predator-prey status theory of sleep is more holistic than Webb's hibernation theory. Rather than focusing on a single factor (status), Meddis suggested that several other factors can influence sleep behaviour, such as the sleep site (whether it is enclosed in a nest or a cave, or exposed on prairies or plains) and foraging requirements (whether the animal needs many hours of grazing on nutrient-poor food sources, or relatively few hours gathering nutrient-rich foods such as nuts or insects). A holistic theory that takes multiple factors into account is more likely to provide a good explanation for a behaviour as complex as sleep.

A problem with explaining sleep as a means of safety from predation is that many species may actually be far more vulnerable while asleep, and it would be safer to remain quiet and still yet alert. However, some species have adapted to this need for vigilance: porpoises sleep with only one brain hemisphere at a time, while mallards sleep with one eye open to watch for potential threats. The phenomenon of snoring also challenges this explanation, as it is likely to draw attention to an otherwise inconspicuous sleeping animal and increase its risk of predation.

Webb proposed the hibernation theory, claiming that sleep evolved as a way of conserving energy when hunting or foraging were impossible. This theory suggests that animals should sleep for longer if they have a higher metabolic rate, as they burn up energy more quickly, so are in greater need of energy conservation. Conservation of energy is best carried out by limiting the brain's sensory inputs, i.e. sleep.

Berger and Phillips found that sleep deprivation causes increased energy expenditure, especially under bed-rest conditions. This suggests that sleep does conserve energy, and is especially useful when normal activity is impossible.

Studies have found a positive correlation between metabolic rate and required sleep duration - small animals such as mice generally sleep for longer than larger animals, supporting the idea that sleep is adaptive as a form of energy conservation.

In times of hardship, such as when food is scarce or the weather too cold, animals sleep for longer, suggesting that sleep helps them conserve all the energy they can when resources are scarce and every calorie is critical for survival.

However, not all organisms follow this general trend, and there are some extreme outliers that challenge the theory. The sloth, a relatively large animal with a slow metabolic rate, sleeps for approximately 20 hours a day, contradicting the general trend predicted by Webb's theory.

REM sleep, characterised by high levels of brain activity, actually uses the same amount of energy as waking. If REM sleep did not serve some other purpose, it would be maladaptive, as it does not help conserve energy due to the high levels of brain activity.


Overall evaluation of evolutionary theories of sleep

Restoration theories of sleep

Restoration theories explain the physiological patterns associated with sleep as produced by the body's natural recovery processes. Oswald explained NREM sleep as responsible for the body's regeneration, restoring skin cells due to the release of the body's growth hormone during deep sleep. He suggested that REM sleep restores the brain.

Oswald's theory is supported by the finding that newborn babies spend large amounts of time in proto-REM sleep (around a third of every day). This is a time of massive brain growth, with the development of new synaptic connections requiring neuronal growth and neurotransmitter production. REM is a very active phase of sleep, with brain energy consumption similar to waking, so Oswald's theory can explain this phase and why it is so dominant in newborns.

Oswald also found that patients recovering from severe brain insults such as drug overdoses spend much more time in REM sleep. It was also known that new skin cells regenerate faster during sleep - Oswald used these results to conclude that REM sleep is for restoration of the brain, and NREM sleep is for restoration of the body.

Jouvet (1967) placed cats on upturned flowerpots surrounded by water, into which they would fall upon entering REM sleep. Over time, the cats became conditioned to wake up when entering REM sleep, depriving them of the vital fifth stage of sleep. The cats became mentally disturbed very quickly, and died after an average of 35 days. This supports Oswald's theory: the cats still had NREM sleep and suffered no obvious physical ailments, but died from organ failure brought on by brain fatigue resulting from the lack of REM sleep.

Jouvet's use of non-human animals raises an important issue. As well as being potentially unethical, given the extreme cruelty inflicted on the animals for relatively little in the way of socially important results, the use of cats is a problem because of physiological differences in the mechanisms controlling sleep in humans and cats, meaning it may not be valid to generalise the results to humans.

Horne's restoration theory suggests that REM and deep NREM sleep are essential for normal brain function, as the brain restores itself during these stages of "core sleep". Light NREM has no obvious function - Horne refers to it as optional sleep, which might have had a role in keeping the animal inconspicuous by ensuring safety before its progression to deep sleep. Entering NREM causes a surge in growth hormone release, but this is unlikely to be used for tissue growth and repair, as the nutrients required will already have been used. He therefore theorises that bodily restoration takes place during hours of relaxed wakefulness in the day, when energy expenditure is low and nutrients are readily available.

Supporting evidence for Horne's theory comes from sleep-deprived participants given cognitive tasks to carry out. They can only maintain reasonable performance through significantly increased effort, suggesting that sleep deprivation causes cognitive impairment because the brain has not had the core sleep necessary to maintain normal cognitive function.

Radio DJ Peter Tripp stayed awake for around 200 hours (just over eight days). During this time he suffered delusions and hallucinations so severe that it was impossible to test his psychological functioning. It is thought that sleep deprivation caused these effects because the brain was unable to restore itself. This supports Horne's theory, as total sleep deprivation led to cognitive disturbances rather than any obvious physical impairment.

Randy Gardner remained awake for 11 days (264 hours), suffering from slurred speech, blurred vision and paranoia. He had fewer symptoms than Tripp despite being awake for longer, and soon adjusted back to his usual sleep pattern afterwards. This again supports Horne's theory - slurred speech, paranoia and blurred vision are likely to result from neurological rather than physical impairment due to lack of core sleep.

Both Tripp's and Gardner's accounts are case studies, meaning they lack generalisability to the wider population. The marked differences between just two cases suggest that individual differences play a large role in how a person experiences sleep and how much sleep they need, making it difficult to draw valid general conclusions from case studies.

Also, Tripp and Gardner were both male. Research has shown that hormonal differences can play a large role in determining how an individual experiences sleep, so, given the hormonal differences between the sexes, it would be beta bias to generalise their results to females.

Finally, a methodological issue in Gardner's study comes from the observation of symptoms such as blurred vision. It is difficult to establish whether these have a psychological or physiological cause: blurred vision could be the result of bodily impairment, such as a malfunction of the optic nerve, or of brain impairment, such as a malfunction of the occipital lobe, the part of the brain responsible for visual processing. This makes it difficult to establish what damage the sleep deprivation did - physical and mental, as Oswald would suggest, or purely mental, as Horne would suggest?


How To Maintain a Healthy Circadian Rhythm

While we don’t have full control over our circadian rhythm, there are healthy sleep habits we can adopt to better entrain our 24-hour sleep-wake cycles.

  • Seek out sun: Exposure to natural light, especially early in the day, helps reinforce the strongest circadian cue.
  • Follow a consistent sleep schedule: Varying your bedtime or morning wake-up time can hinder your body’s ability to adjust to a stable circadian rhythm.
  • Get daily exercise: Activity during the day can support your internal clock and help make it easier to fall asleep at night.
  • Avoid caffeine: Stimulants like caffeine can keep you awake and throw off the natural balance between sleep and wakefulness. Everyone is different, but if you’re having trouble sleeping, you should avoid caffeine after noon.
  • Limit light before bed: Artificial light exposure at night can interfere with circadian rhythm. Experts advise dimming the lights and putting down electronic devices in the lead-up to bedtime and keeping electronics out of the bedroom and away from your sleeping surface.
  • Keep naps short and early in the afternoon: Late and long naps can push back your bedtime and throw your sleep schedule off-kilter.

These steps to improve sleep hygiene can be an important part of supporting a healthy circadian rhythm, but other steps may be necessary depending on the situation. If you have persistent or severe sleeping problems, daytime drowsiness, and/or a problematic sleep schedule, it’s important to talk with a doctor who can best diagnose the cause and offer the most appropriate treatment.


