Horse tack
Tack is equipment or accessories equipped on horses and other equines in the course of their use as domesticated animals. Saddles, stirrups, bridles, halters, reins, bits, harnesses, martingales, and breastplates are all forms of horse tack. Equipping a horse is often referred to as tacking up. A room to store such equipment, usually near or in a stable, is a tack room.
Saddles are seats for the rider, fastened to the horse's back by means of a "girth" (English-style riding), known as a "cinch" in the Western US, a wide strap that goes around the horse at a point about four inches behind the forelegs. Some western saddles will also have a second strap known as a "flank" or "back cinch" that fastens at the rear of the saddle and goes around the widest part of the horse's belly.
It is important that the saddle be comfortable for both the rider and the horse, as an improperly fitting saddle may create pressure points on the horse's back muscle (latissimus dorsi), causing the horse pain and potentially injuring the horse, the rider, or both.
There are many types of saddle, each specially designed for its given task.
Saddles are usually divided into two major categories: "English saddles" and "Western saddles" according to the riding discipline they are used in. Other types of saddles, such as racing saddles, Australian saddles, sidesaddles and endurance saddles do not necessarily fit neatly in either category.
Stirrups are supports for the rider's feet that hang down on either side of the saddle. They provide greater stability for the rider but can raise safety concerns due to the potential for a rider's feet to get stuck in them. If a rider is thrown from a horse but has a foot caught in the stirrup, they could be dragged if the horse runs away. To minimize this risk, a number of safety precautions are taken. First, most riders wear riding boots with a heel and a smooth sole. Next, some saddles, particularly English saddles, have safety bars that allow a stirrup leather to fall off the saddle if pulled backwards by a falling rider. Other precautions involve the design of the stirrup itself. Western saddles have wide stirrup treads that make it more difficult for the foot to become trapped. A number of saddle styles incorporate a tapadero, a covering over the front of the stirrup that keeps the foot from sliding all the way through. The English stirrup (or "iron") has several design variations that are either shaped to allow the rider's foot to slip out easily or are closed with a very heavy rubber band. The invention of stirrups was of great historic significance in mounted combat, giving the rider secure foot support while on horseback.
"Bridles", hackamores, "halters" or "headcollars", and similar equipment consist of various arrangements of straps around the horse's head, and are used for control and communication with the animal.
A "halter" (US) or "headcollar" (UK) (occasionally "headstall") consists of a noseband and headstall that buckles around the horse's head and allows the horse to be led or tied. The lead rope is separate; it may be short (six to ten feet, two to three meters) for everyday leading and tying, or much longer (up to eight meters, about 26 feet) for tasks such as leading packhorses or picketing a horse out to graze.
Some horses, particularly stallions, may have a chain attached to the lead rope and placed over the nose or under the jaw to increase the control provided by a halter while being led. Most of the time, horses are not ridden with a halter, as it offers insufficient precision and control. Halters have no bit.
In Australian and British English, a "halter" is a rope with a spliced running loop around the nose and another over the poll, used mainly for unbroken horses or for cattle. The lead rope cannot be removed from the halter. A show halter is made from rolled leather and the lead attaches to form the chinpiece of the noseband. These halters are not suitable for paddock usage or in loose stalls. An "underhalter" is a lightweight halter or headcollar which is made with only one small buckle, and can be worn under a bridle for tethering a horse without untacking.
Bridles usually have a "bit" attached to "reins" and are used for riding and driving horses.
"English Bridles" have a "cavesson" style noseband and are seen in English riding. Their reins are buckled to one another, and they have little adornment or flashy hardware.
"Western Bridles" used in Western riding usually have no noseband and are made of thin bridle leather. They may have long, separated "split" reins or shorter closed reins, which sometimes include an attached "romal". Western bridles are often adorned with silver or other decorative features.
"Double bridles" are a type of English bridle that use two bits in the mouth at once, a snaffle and a curb. The two bits allow the rider to have very precise control of the horse. As a rule, only very advanced horses and riders use double bridles. Double bridles are usually seen in the top levels of dressage, but also are seen in certain types of show hack and Saddle seat competition.
A "hackamore" is a headgear that utilizes a heavy noseband of some sort, rather than a bit, most often used to train young horses or to go easy on an older horse's mouth. Hackamores are more often seen in western riding. Some related styles of headgear that control a horse with a noseband rather than a bit are known as bitless bridles.
The word "hackamore" is derived from the Spanish word "jáquima." Hackamores are seen in western riding disciplines, as well as in endurance riding and English riding disciplines such as show jumping and the stadium phase of eventing. While the classic bosal-style hackamore is usually used to start young horses, other designs, such as various bitless bridles and the mechanical hackamore are often seen on mature horses with dental issues that make bit use painful, horses with certain training problems, and on horses with mouth or tongue injuries. Some riders also like to use them in the winter to avoid putting a frozen metal bit into a horse's mouth.
Like bitted bridles, noseband-based designs can be gentle or harsh, depending on the hands of the rider. It is a myth that a bit is cruel and a hackamore is gentler. The horse's face is very soft and sensitive with many nerve endings. Misuse of a hackamore can cause swelling on the nose, scraping on the nose and jawbone, and extreme misuse may cause damage to the bones and cartilage of the horse's head.
A "longeing cavesson" (UK: "lungeing") is a special type of halter or noseband used for longeing a horse. Longeing is the activity of having a horse walk, trot and/or canter in a large circle around the handler at the end of a long rope, typically 25 feet (about 7.6 m) or more. It is used for training and exercise.
Reins consist of leather straps or rope attached to the outer ends of a "bit" and extend to the rider's or driver's hands. Reins are the means by which a horse rider or driver communicates directional commands to the horse's head. Pulling on the reins can be used to steer or stop the horse. The sides of a horse's mouth are sensitive, so pulling on the reins pulls the bit, which moves the horse's head from side to side; this is how the horse is controlled.
On some types of harness there may be supporting rings to carry the reins over the horse's back. When pairs of horses are used to draw a wagon or coach, it is usual for the outer side of each pair to be connected to the reins and the insides of the bits to be connected by a short bridging strap or rope. The driver is said to drive "four-in-hand" or "six-in-hand", the number referring to the reins connecting to the pairs of horses.
A rein may be attached to a halter to lead or guide the horse in a circle for training purposes or to lead a packhorse, but a simple lead rope is more often used for these purposes. A longe line is sometimes called a "longe rein," but it is actually a flat line, usually made of nylon or cotton web about one inch wide, and thus longer and wider than even a driving rein.
Horses should never be tied by the reins. Not only do reins break easily, but, because they are attached to a bit in the horse's sensitive mouth, a great deal of pain can be inflicted if a bridled horse sets back against being tied.
A bit is a device placed in a horse's mouth, kept on a horse's head by means of a headstall. There are many types, each useful for specific types of riding and training.
The mouthpiece of the bit does not rest on the teeth of the horse, but rather rests on the gums or "bars" of the horse's mouth in an interdental space behind the front incisors and in front of the back molars. It is important that the style of bit is appropriate to the horse's needs and is fitted properly for it to function properly and be as comfortable as possible for the horse.
The basic "classic" styles of bits are:
While there are literally hundreds of types of bit mouthpieces, bit rings and bit shanks, there are essentially only two broad categories: direct pressure bits, broadly termed snaffle bits, and leverage bits, usually termed curbs.
Bits that act with direct pressure on the tongue and lips of the horse are in the general category of "snaffle" bits. Snaffle bits commonly have a single jointed mouthpiece and act with a nutcracker effect on the bars, tongue and occasionally roof of the mouth. However, regardless of mouthpiece, any bit that operates only on direct pressure is a "snaffle" bit.
Bits that have shanks coming off the mouthpiece to create leverage, applying pressure to the poll, chin groove and mouth of the horse, are in the category of "curb" bits. Any bit with shanks that works off leverage is a "curb" bit, regardless of whether the mouthpiece is solid or jointed.
Some combination or hybrid bits combine direct pressure and leverage, such as the Kimblewick or Kimberwicke, which adds slight leverage to a two-rein design that resembles a snaffle; and the four rein designs such as the single mouthpiece Pelham bit and the double bridle, which places a curb and a snaffle bit simultaneously in the horse's mouth.
In the wrong hands even the mildest bit can hurt the horse. Conversely, a very severe bit, in the right hands, can transmit subtle commands that cause no pain to the horse. Bit commands should be given with only the quietest movements of the hands, and much steering and stopping should be done with the legs and seat.
A horse harness is a set of devices and straps that attaches a horse to a cart, carriage, sledge or any other load. There are two main styles of harness: breaststrap, and collar-and-hames. These differ in how the weight of the load is carried. Most harnesses are made from leather, the traditional material for harnesses, though some designs are now made of nylon webbing or synthetic biothane.
A breaststrap harness has a wide leather strap going horizontally across the horse's breast, attached to the traces and then to the load. This is used only for lighter loads. A collar-and-hames harness has a collar around the horse's neck with wood or metal hames in the collar. The traces attach from the hames to the load. This type of harness is needed for heavy draft work.
Both types will also have a bridle and reins. A harness that is used to support shafts, such as on a cart pulled by a single horse, will also have a "saddle" attached to the harness to help the horse support the shafts and "breeching" to brake the forward motion of the vehicle, especially when stopping or moving downhill. Horses guiding vehicles by means of a pole, such as two-horse teams pulling a wagon, a hay-mower, or a dray, will have "pole-straps" attached to the lower part of the horse collar.
Breastplates, breastcollars or breastgirths attach to the front of the saddle, cross the horse's chest, and usually have a strap that runs between the horse's front legs and attaches to the girth. They keep the saddle from sliding back or sideways. They are usually seen in demanding, fast-paced sports. They are crucial pieces of safety equipment for English riding activities requiring jumping, such as eventing, show jumping, polo, and fox hunting. They are also seen in Western riding events, particularly in rodeo, reining and cutting, where it is particularly important to prevent a saddle from shifting. They may also be worn in other horse show classes for decorative purposes.
A martingale is a piece of equipment that keeps a horse from raising its head too high. Various styles can be used as a control measure, to prevent the horse from avoiding rider commands by raising its head out of position; or as a safety measure to keep the horse from tossing its head high or hard enough to smack its rider in the face.
They are allowed in many types of competition, especially those where speed or jumping may be required, but are not allowed in most "flat" classes at horse shows, though an exception is made in a few classes limited exclusively to young or "green" horses who may not yet be fully trained.
Martingales are usually attached to the horse one of two ways. They are either attached to the center chest ring of a breastplate or, if no breastplate is worn, they are attached by two straps, one that goes around the horse's neck, and the other that attaches to the girth, with the martingale itself beginning at the point in the center of the chest where the neck and girth straps intersect.
Martingale types include:
There are other training devices that fall loosely in the martingale category, in that they use straps attached to the reins or bit which limit the movement of the horse's head or add leverage to the rider's hands in order to control the horse's head. Common devices of this nature include the overcheck, the chambon, de Gogue, grazing reins, draw reins and the "bitting harness" or "bitting rig". However, most of this equipment is used for training purposes and is not legal in any competition. In some disciplines, use of leverage devices, even in training, is controversial.
https://en.wikipedia.org/wiki?curid=14215
Hausa language
Hausa is a Chadic language spoken by the Hausa people, the largest ethnic group in Sub-Saharan Africa, mainly within the territories of Southern Niger and Northern Nigeria. Historically, it has also developed into a lingua franca across much of Western Africa for purposes of trade.
Hausa is a member of the Afroasiatic language family and is the most widely spoken language within the Chadic branch of that family. Ethnologue estimated that it was spoken as a first language by some 47 million people and as a second language by another 25 million, bringing the total number of Hausa speakers to an estimated 72 million. By more recent estimations, Hausa is estimated to be spoken by 100–150 million people, making it the most spoken indigenous African language.
Hausa belongs to the West Chadic languages subgroup of the Chadic languages group, which in turn is part of the Afroasiatic language family. Other Afroasiatic languages are Semitic languages including Arabic, Aramaic languages, Hebrew, extinct Phoenician, the extinct Akkadian, as well as the Ethio-Semitic Amharic, Tigre, Tigrinya, and Gurage; Berber languages; Cushitic languages including Somali and Oromo; other Chadic languages including Glavda, Babur, Mwaghavul, Tera, Tangale, Karekare, Bole, Sayawa, Bwatiye, Ngas, Bade, Gwandara, Galambu, Pali, Higi, Ron, Duhwa, Margi, Kilba, Duwai and many others.
Roger Blench (2011) has found that Hausa in fact has many words that are not found in other Chadic languages, and so is not actually representative of the Chadic group. This is because Hausa borrowed from Adamawa, Plateau, Kainji, Nupoid, and other Benue-Congo languages during its expansion across the Nigerian Middle Belt. While those languages were assimilated, many of their words changed the lexicon of Hausa. Blench also proposed some likely influence on Hausa from vanished Nilo-Saharan languages, and lists a number of Hausa words of likely Benue-Congo origin.
Native speakers of Hausa, the Hausa people, are mostly found in Niger, in Northern Nigeria, and in Chad. Furthermore, the language is used as a "lingua franca" by non-native speakers in most of Northern Nigeria and Southern Niger, and as a trade language across a much larger swathe of West Africa (Benin, Ghana, Cameroon, Togo, Ivory Coast) and parts of Sudan.
Eastern Hausa dialects include "Dauranci" in Daura, "Kananci" in Kano, "Bausanci" in Bauchi, "Gudduranci" in Katagum Misau and part of Borno, and "Hadejanci" in Hadejiya.
Western Hausa dialects include "Sakkwatanci" in Sokoto, "Katsinanci" in Katsina, "Arewanci" in Gobir, Adar, Kebbi, and Zamfara, and "Kurhwayanci" in Kurfey in Niger. Katsina is transitional between Eastern and Western dialects. Sokoto is used in a variety of classical Hausa literature, and is often known as "Classical Hausa".
Northern Hausa dialects include "Arewa" (meaning 'North') and "Arewaci".
"Zazzaganci" in Zazzau is the major Southern dialect.
The Daura ("Dauranchi") and Kano ("Kananci") dialects are the standard. The BBC, Deutsche Welle, Radio France Internationale and Voice of America offer Hausa services on their international news web sites using Dauranci and Kananci. More recently, Zazzaganci has become the source of innovation in how Hausa is written and spoken.
The western to eastern Hausa dialects of "Kurhwayanci", "Daragaram" and "Aderawa", represent the traditional northernmost limit of native Hausa communities. These are spoken in the northernmost sahel and mid-Saharan regions in west and central Niger in the Tillaberi, Tahoua, Dosso, Maradi, Agadez and Zinder regions. While mutually comprehensible with other dialects (especially "Sakkwatanci", and to a lesser extent "Gaananci"), the northernmost dialects have slight grammatical and lexical differences owing to frequent contact with the Zarma, Fula, and Tuareg groups and cultural changes owing to the geographical differences between the grassland and desert zones. These dialects also have the quality of bordering on non-tonal pitch accent dialects.
This link between non-tonality and geographic location is not limited to Hausa alone, but is exhibited in other northern dialects of neighbouring languages; such as differences within the Songhay language (between the non-tonal northernmost dialects of Koyra Chiini in Timbuktu and Koyraboro Senni in Gao; and the tonal southern Zarma dialect, spoken from western Niger to northern Ghana), and within the Soninke language (between the non-tonal northernmost dialects of Imraguen and Nemadi spoken in east-central Mauritania; and the tonal southern dialects of Senegal, Mali and the Sahel).
The Ghanaian Hausa dialect ("Gaananci"), spoken in Ghana, Togo, and western Ivory Coast, is a distinct western native Hausa dialect-bloc with adequate linguistic and media resources available. Separate smaller Hausa dialects are spoken by an unknown number of Hausa further west in parts of Burkina Faso, and in the Haoussa Foulane, Badji Haoussa, Guezou Haoussa, and Ansongo districts of northeastern Mali (where it is designated as a minority language by the Malian government), but there are very few linguistic resources for, and little research on, these particular dialects at this time.
Gaananci forms a separate group from other Western Hausa dialects, as it now falls outside the contiguous Hausa-dominant area, and is usually identified by the use of "c" for "ky", and "j" for "gy". This is attributed to the fact that Ghana's Hausa population descend from Hausa-Fulani traders settled in the zongo districts of major trade-towns up and down the previous Asante, Gonja and Dagomba kingdoms stretching from the sahel to coastal regions, in particular the cities of Tamale, Salaga, Bawku, Bolgatanga, Achimota, Nima and Kumasi.
Gaananci exhibits noted inflected influences from Zarma, Gur, Jula-Bambara, Akan, and Soninke, as Ghana is the westernmost area in which the Hausa language is a major lingua-franca among sahelian/Muslim West Africans, including both Ghanaian and non-Ghanaian zango migrants primarily from the northern regions, or Mali and Burkina Faso. Ghana also marks the westernmost boundary in which the Hausa people inhabit in any considerable number. Immediately west and north of Ghana (in Cote d'Ivoire, and Burkina Faso), Hausa is abruptly replaced with Dioula–Bambara as the main sahelian/Muslim lingua-franca of what become predominantly Manding areas, and native Hausa-speakers plummet to a very small urban minority.
Because of this, and the presence of surrounding Akan, Gbe, Gur and Mande languages, Gaananci was historically isolated from the other Hausa dialects. Despite this difference, grammatical similarities between "Sakkwatanci" and Ghanaian Hausa determine that the dialect, and the origin of the Ghanaian Hausa people themselves, are derived from the northwestern Hausa area surrounding Sokoto.
Hausa is also widely spoken by non-native Gur, and Mandé Ghanaian Muslims, but differs from Gaananci, and rather has features consistent with non-native Hausa dialects.
Hausa is also spoken in various parts of Cameroon and Chad, where it combines elements of the dialects of Northern Nigeria and Niger. In addition, Arabic has had a great influence on the way Hausa is spoken by the native Hausa speakers in these areas.
In West Africa, Hausa's use as a lingua franca has given rise to a non-native pronunciation that differs vastly from native pronunciation by way of key omissions of implosive and ejective consonants present in native Hausa dialects, such as "ɗ", "ɓ" and "kʼ/ƙ", which are pronounced by non-native speakers as "d", "b" and "k" respectively. This creates confusion among non-native and native Hausa speakers, as non-native pronunciation does not distinguish words like ' ("correct") and ' ("one-by-one"). Another difference between native and non-native Hausa is the omission of vowel length in words and change in the standard tone of native Hausa dialects (ranging from native Fulani and Tuareg Hausa-speakers omitting tone altogether, to Hausa speakers with Gur or Yoruba mother tongues using additional tonal structures similar to those used in their native languages). Use of masculine and feminine gender nouns and sentence structure are usually omitted or interchanged, and many native Hausa nouns and verbs are substituted with non-native terms from local languages.
Non-native speakers of Hausa number more than 25 million and, in some areas, live close to native Hausa speakers. Hausa has replaced many other languages, especially in the north-central and north-eastern parts of Nigeria, and continues to gain popularity in other parts of Africa as a result of Hausa movies and music, which have spread throughout the region.
There are several pidgin forms of Hausa. Barikanchi was formerly used in the colonial army of Nigeria. Gibanawa is currently in widespread use in Jega in northwestern Nigeria, south of the native Hausa area.
Hausa has between 23 and 25 consonant phonemes depending on the speaker.
The three-way contrast between palatalized velars , plain velars , and labialized velars is found only before long and short , e.g. ('grass'), ('to increase'), ('shea-nuts'). Before front vowels, only palatalized and labialized velars occur, e.g. ('jealousy') vs. ('side of body'). Before rounded vowels, only labialized velars occur, e.g. ('ringworm').
Other analyses suggest the presence of three additional phonemes.
Hausa has glottalic consonants (implosives and ejectives) at four or five places of articulation (depending on the dialect). They require movement of the glottis during pronunciation and have a staccato sound.
They are written with modified versions of Latin letters. They can also be denoted with an apostrophe, either before or after depending on the letter, as shown below.
Hausa has five phonetic vowel sounds, which can be either short or long, giving a total of 10 monophthongs. In addition, there are four diphthongs, for a total of 14 vowel phonemes.
In comparison with the long vowels, the short can be similar in quality to the long vowels, mid-centralized to or centralized to .
Medial can be neutralized to , with the rounding depending on the environment.
Medial are neutralized with .
The short can be either similar in quality to the long , or it can be as high as , with possible intermediate pronunciations ().
Hausa is a tonal language. Each of its five vowels may have low tone, high tone or falling tone. In standard written Hausa, tone is not marked. In recent linguistic and pedagogical materials, tone is marked by means of diacritics.
An acute accent () may be used for high tone, but the usual practice is to leave high tone unmarked.
Except for the Zaria and Bauchi dialects spoken south of Kano, Hausa distinguishes between masculine and feminine genders.
Hausa, like the rest of the Chadic languages, is known for its complex, irregular pluralization of nouns. Noun plurals in Hausa are derived using a variety of morphological processes, such as suffixation, infixation, reduplication, or a combination of any of these processes. There are 20 plural classes proposed by Newman (2000).
Hausa's modern official orthography is a Latin-based alphabet called "boko", which was introduced in the 1930s by the British colonial administration.
The letter "ƴ" (y with a right hook) is used only in Niger; in Nigeria it is written "ʼy".
Tone and vowel length are not marked in writing. So, for example, "from" and "battle" are both written "daga". The distinction between and (which does not exist for all speakers) is not always marked.
Hausa has also been written in "ajami", an Arabic alphabet, since the early 17th century. The first known work to be written in Hausa is Riwayar Nabi Musa by Abdullahi Suka in the 17th century. There is no standard system of using "ajami", and different writers may use letters with different values. Short vowels are written regularly with the help of vowel marks, which are seldom used in Arabic texts other than the Quran. Many medieval Hausa manuscripts in "ajami", similar to the Timbuktu Manuscripts, have been discovered recently; some of them even describe constellations and calendars.
In the following table, short and long "e" are shown along with the Arabic letter for "t".
Hausa is one of three indigenous languages of Nigeria that have been rendered in braille.
At least three other writing systems for Hausa have been proposed or "discovered". None of these are in active use beyond perhaps some individuals.
https://en.wikipedia.org/wiki?curid=14216
History of mathematics
The area of study known as the history of mathematics is primarily an investigation into the origin of discoveries in mathematics and, to a lesser extent, an investigation into the mathematical methods and notation of the past. Before the modern age and the worldwide spread of knowledge, written examples of new mathematical developments have come to light only in a few locales. From 3000 BC the Mesopotamian states of Sumer, Akkad and Assyria, together with Ancient Egypt and Ebla began using arithmetic, algebra and geometry for purposes of taxation, commerce, trade and also in the field of astronomy and to formulate calendars and record time.
The most ancient mathematical texts available are from Mesopotamia and Egypt – "Plimpton 322" (Babylonian c. 1900 BC), the "Rhind Mathematical Papyrus" (Egyptian c. 2000–1800 BC) and the "Moscow Mathematical Papyrus" (Egyptian c. 1890 BC). All of these texts mention the so-called Pythagorean triples, so, by inference, the Pythagorean theorem seems to be the most ancient and widespread mathematical development after basic arithmetic and geometry.
The study of mathematics as a "demonstrative discipline" begins in the 6th century BC with the Pythagoreans, who coined the term "mathematics" from the ancient Greek "μάθημα" ("mathema"), meaning "subject of instruction". Greek mathematics greatly refined the methods (especially through the introduction of deductive reasoning and mathematical rigor in proofs) and expanded the subject matter of mathematics. Although they made virtually no contributions to theoretical mathematics, the ancient Romans used applied mathematics in surveying, structural engineering, mechanical engineering, bookkeeping, creation of lunar and solar calendars, and even arts and crafts. Chinese mathematics made early contributions, including a place value system and the first use of negative numbers. The Hindu–Arabic numeral system and the rules for the use of its operations, in use throughout the world today evolved over the course of the first millennium AD in India and were transmitted to the Western world via Islamic mathematics through the work of Muḥammad ibn Mūsā al-Khwārizmī. Islamic mathematics, in turn, developed and expanded the mathematics known to these civilizations. Contemporaneous with but independent of these traditions were the mathematics developed by the Maya civilization of Mexico and Central America, where the concept of zero was given a standard symbol in Maya numerals.
Many Greek and Arabic texts on mathematics were translated into Latin from the 12th century onward, leading to further development of mathematics in Medieval Europe. From ancient times through the Middle Ages, periods of mathematical discovery were often followed by centuries of stagnation. Beginning in Renaissance Italy in the 15th century, new mathematical developments, interacting with new scientific discoveries, were made at an increasing pace that continues through the present day. This includes the groundbreaking work of both Isaac Newton and Gottfried Wilhelm Leibniz in the development of infinitesimal calculus during the course of the 17th century. At the end of the 19th century the International Congress of Mathematicians was founded and continues to spearhead advances in the field.
The origins of mathematical thought lie in the concepts of number, magnitude, and form. Modern studies of animal cognition have shown that these concepts are not unique to humans. Such concepts would have been part of everyday life in hunter-gatherer societies. The idea of the "number" concept evolving gradually over time is supported by the existence of languages which preserve the distinction between "one", "two", and "many", but not of numbers larger than two.
Prehistoric artifacts discovered in Africa, dated to 20,000 years old or more, suggest early attempts to quantify time. The Ishango bone, found near the headwaters of the Nile river (northeastern Congo), may be more than 20,000 years old and consists of a series of marks carved in three columns running the length of the bone. Common interpretations are that the Ishango bone shows either the earliest known demonstration of sequences of prime numbers or a six-month lunar calendar. Peter Rudman argues that the development of the concept of prime numbers could only have come about after the concept of division, which he dates to after 10,000 BC, with prime numbers probably not being understood until about 500 BC. He also writes that "no attempt has been made to explain why a tally of something should exhibit multiples of two, prime numbers between 10 and 20, and some numbers that are almost multiples of 10." The Ishango bone, according to scholar Alexander Marshack, may have influenced the later development of mathematics in Egypt because, like some entries on the Ishango bone, Egyptian arithmetic also made use of multiplication by 2; this, however, is disputed.
Predynastic Egyptians of the 5th millennium BC pictorially represented geometric designs. It has been claimed that megalithic monuments in England and Scotland, dating from the 3rd millennium BC, incorporate geometric ideas such as circles, ellipses, and Pythagorean triples in their design. All of the above are disputed however, and the currently oldest undisputed mathematical documents are from Babylonian and dynastic Egyptian sources.
Babylonian mathematics refers to any mathematics of the peoples of Mesopotamia (modern Iraq) from the days of the early Sumerians through the Hellenistic period almost to the dawn of Christianity. The majority of Babylonian mathematical work comes from two widely separated periods: The first few hundred years of the second millennium BC (Old Babylonian period), and the last few centuries of the first millennium BC (Seleucid period). It is named Babylonian mathematics due to the central role of Babylon as a place of study. Later under the Arab Empire, Mesopotamia, especially Baghdad, once again became an important center of study for Islamic mathematics.
In contrast to the scarcity of sources for Egyptian mathematics, our knowledge of Babylonian mathematics is derived from more than 400 clay tablets unearthed since the 1850s. Written in cuneiform script, the tablets were inscribed while the clay was moist, then baked hard in an oven or by the heat of the sun. Some of these appear to be graded homework.
The earliest evidence of written mathematics dates back to the ancient Sumerians, who built the earliest civilization in Mesopotamia. They developed a complex system of metrology from 3000 BC. From around 2500 BC onwards, the Sumerians wrote multiplication tables on clay tablets and dealt with geometrical exercises and division problems. The earliest traces of the Babylonian numerals also date back to this period.
Babylonian mathematics was written using a sexagesimal (base-60) numeral system. From this derives the modern-day usage of 60 seconds in a minute, 60 minutes in an hour, and 360 (60 × 6) degrees in a circle, as well as the use of seconds and minutes of arc to denote fractions of a degree. It is likely the sexagesimal system was chosen because 60 can be evenly divided by 2, 3, 4, 5, 6, 10, 12, 15, 20 and 30. Also, unlike the Egyptians, Greeks, and Romans, the Babylonians had a true place-value system, where digits written in the left column represented larger values, much as in the decimal system. The power of the Babylonian notational system lay in that it could be used to represent fractions as easily as whole numbers; thus multiplying two numbers that contained fractions was no different from multiplying integers, similar to our modern notation. The notational system of the Babylonians was the best of any civilization until the Renaissance, and its power allowed it to achieve remarkable computational accuracy; for example, the Babylonian tablet YBC 7289 gives an approximation of the square root of 2 accurate to five decimal places. The Babylonians lacked, however, an equivalent of the decimal point, and so the place value of a symbol often had to be inferred from the context. By the Seleucid period, the Babylonians had developed a zero symbol as a placeholder for empty positions; however it was only used for intermediate positions. This zero sign does not appear in terminal positions; thus the Babylonians came close to, but never developed, a fully unambiguous positional notation.
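The accuracy of the YBC 7289 value is easy to verify by converting its sexagesimal digits to a decimal number. A minimal Python sketch (the digit list 1;24,51,10 is the reading commonly given for the tablet):

```python
def from_sexagesimal(digits):
    """Interpret a list of base-60 digits, integer part first and then
    fractional places: [1, 24, 51, 10] -> 1 + 24/60 + 51/60**2 + 10/60**3."""
    return sum(d / 60**i for i, d in enumerate(digits))

# YBC 7289 records the square root of 2 as 1;24,51,10 in sexagesimal.
approx = from_sexagesimal([1, 24, 51, 10])
print(approx)                   # 1.41421296...
print(abs(approx - 2 ** 0.5))   # error below one millionth
```

The single round of digits 24, 51, 10 already pins the value down to about six parts in ten million, which is why the tablet is so often cited as evidence of Babylonian computational skill.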
Other topics covered by Babylonian mathematics include fractions, algebra, quadratic and cubic equations, and the calculation of regular reciprocal pairs. The tablets also include multiplication tables and methods for solving linear equations, quadratic equations, and cubic equations, a remarkable achievement for the time. Tablets from the Old Babylonian period also contain the earliest known statement of the Pythagorean theorem. However, as with Egyptian mathematics, Babylonian mathematics shows no awareness of the difference between exact and approximate solutions, or of the solvability of a problem, and most importantly, no explicit statement of the need for proofs or logical principles.
Egyptian mathematics refers to mathematics written in the Egyptian language. From the Hellenistic period, Greek replaced Egyptian as the written language of Egyptian scholars. Mathematical study in Egypt later continued under the Arab Empire as part of Islamic mathematics, when Arabic became the written language of Egyptian scholars.
The most extensive Egyptian mathematical text is the Rhind papyrus (sometimes also called the Ahmes Papyrus after its author), dated to c. 1650 BC but likely a copy of an older document from the Middle Kingdom of about 2000–1800 BC. It is an instruction manual for students in arithmetic and geometry. In addition to giving area formulas and methods for multiplication, division and working with unit fractions, it also contains evidence of other mathematical knowledge, including composite and prime numbers; arithmetic, geometric and harmonic means; and simplistic understandings of both the Sieve of Eratosthenes and perfect number theory (namely, that of the number 6). It also shows how to solve first order linear equations as well as arithmetic and geometric series.
Another significant Egyptian mathematical text is the Moscow papyrus, also from the Middle Kingdom period, dated to c. 1890 BC. It consists of what are today called "word problems" or "story problems", which were apparently intended as entertainment. One problem is considered to be of particular importance because it gives a method for finding the volume of a frustum (truncated pyramid).
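The frustum procedure in the Moscow papyrus agrees with the modern formula V = (h/3)(a^2 + ab + b^2) for a truncated square pyramid with base side a, top side b, and height h. The numbers below (a = 4, b = 2, h = 6, giving volume 56) are those usually reported for the papyrus' frustum problem:

```python
def frustum_volume(a, b, h):
    """Volume of a truncated square pyramid (frustum) with bottom side a,
    top side b and height h, matching the rule in the Moscow papyrus."""
    return h * (a * a + a * b + b * b) / 3

print(frustum_volume(4, 2, 6))  # 56.0
```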
Finally, the Berlin Papyrus 6619 (c. 1800 BC) shows that ancient Egyptians could solve a second-order algebraic equation.
Greek mathematics refers to the mathematics written in the Greek language from the time of Thales of Miletus (~600 BC) to the closure of the Academy of Athens in 529 AD. Greek mathematicians lived in cities spread over the entire Eastern Mediterranean, from Italy to North Africa, but were united by culture and language. Greek mathematics of the period following Alexander the Great is sometimes called Hellenistic mathematics.
Greek mathematics was much more sophisticated than the mathematics that had been developed by earlier cultures. All surviving records of pre-Greek mathematics show the use of inductive reasoning, that is, repeated observations used to establish rules of thumb. Greek mathematicians, by contrast, used deductive reasoning. The Greeks used logic to derive conclusions from definitions and axioms, and used mathematical rigor to prove them.
Greek mathematics is thought to have begun with Thales of Miletus (c. 624–c.546 BC) and Pythagoras of Samos (c. 582–c. 507 BC). Although the extent of the influence is disputed, they were probably inspired by Egyptian and Babylonian mathematics. According to legend, Pythagoras traveled to Egypt to learn mathematics, geometry, and astronomy from Egyptian priests.
Thales used geometry to solve problems such as calculating the height of pyramids and the distance of ships from the shore. He is credited with the first use of deductive reasoning applied to geometry, by deriving four corollaries to Thales' Theorem. As a result, he has been hailed as the first true mathematician and the first known individual to whom a mathematical discovery has been attributed. Pythagoras established the Pythagorean School, whose doctrine it was that mathematics ruled the universe and whose motto was "All is number". It was the Pythagoreans who coined the term "mathematics", and with whom the study of mathematics for its own sake begins. The Pythagoreans are credited with the first proof of the Pythagorean theorem, though the statement of the theorem has a long history, and with the proof of the existence of irrational numbers. Although he was preceded by the Babylonians and the Chinese, the Neopythagorean mathematician Nicomachus (60–120 AD) provided one of the earliest Greco-Roman multiplication tables, whereas the oldest extant Greek multiplication table is found on a wax tablet dated to the 1st century AD (now found in the British Museum). The association of the Neopythagoreans with the Western invention of the multiplication table is evident in its later Medieval name: the "mensa Pythagorica".
Plato (428/427 BC – 348/347 BC) is important in the history of mathematics for inspiring and guiding others. His Platonic Academy, in Athens, became the mathematical center of the world in the 4th century BC, and it was from this school that the leading mathematicians of the day, such as Eudoxus of Cnidus, came. Plato also discussed the foundations of mathematics, clarified some of the definitions (e.g. that of a line as "breadthless length"), and reorganized the assumptions. The analytic method is ascribed to Plato, while a formula for obtaining Pythagorean triples bears his name.
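The triple-generating rule traditionally attributed to Plato takes an integer n > 1 to the triple (2n, n^2 - 1, n^2 + 1); a quick sketch verifying the identity (2n)^2 + (n^2 - 1)^2 = (n^2 + 1)^2:

```python
def plato_triple(n):
    """Pythagorean triple traditionally attributed to Plato: for n > 1,
    the numbers (2n, n^2 - 1, n^2 + 1) satisfy a^2 + b^2 = c^2."""
    return 2 * n, n * n - 1, n * n + 1

for n in range(2, 6):
    a, b, c = plato_triple(n)
    assert a * a + b * b == c * c
    print(a, b, c)   # 4 3 5, 6 8 10, 8 15 17, 10 24 26
```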
Eudoxus (408–c. 355 BC) developed the method of exhaustion, a precursor of modern integration and a theory of ratios that avoided the problem of incommensurable magnitudes. The former allowed the calculations of areas and volumes of curvilinear figures, while the latter enabled subsequent geometers to make significant advances in geometry. Though he made no specific technical mathematical discoveries, Aristotle (384–c. 322 BC) contributed significantly to the development of mathematics by laying the foundations of logic.
In the 3rd century BC, the premier center of mathematical education and research was the Musaeum of Alexandria. It was there that Euclid (c. 300 BC) taught, and wrote the "Elements", widely considered the most successful and influential textbook of all time. The "Elements" introduced mathematical rigor through the axiomatic method and is the earliest example of the format still used in mathematics today, that of definition, axiom, theorem, and proof. Although most of the contents of the "Elements" were already known, Euclid arranged them into a single, coherent logical framework. The "Elements" was known to all educated people in the West up through the middle of the 20th century and its contents are still taught in geometry classes today. In addition to the familiar theorems of Euclidean geometry, the "Elements" was meant as an introductory textbook to all mathematical subjects of the time, such as number theory, algebra and solid geometry, including proofs that the square root of two is irrational and that there are infinitely many prime numbers. Euclid also wrote extensively on other subjects, such as conic sections, optics, spherical geometry, and mechanics, but only half of his writings survive.
Archimedes (c. 287–212 BC) of Syracuse, widely considered the greatest mathematician of antiquity, used the method of exhaustion to calculate the area under the arc of a parabola with the summation of an infinite series, in a manner not too dissimilar from modern calculus. He also showed one could use the method of exhaustion to calculate the value of π with as much precision as desired, and obtained the most accurate value of π then known, showing that it lies between 223/71 and 22/7. He also studied the spiral bearing his name, obtained formulas for the volumes of surfaces of revolution (paraboloid, ellipsoid, hyperboloid), and an ingenious method of exponentiation for expressing very large numbers. While he is also known for his contributions to physics and several advanced mechanical devices, Archimedes himself placed far greater value on the products of his thought and general mathematical principles. He regarded as his greatest achievement his finding of the surface area and volume of a sphere, which he obtained by proving these are 2/3 the surface area and volume of a cylinder circumscribing the sphere.
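Archimedes' polygon approach can be sketched numerically: starting from a hexagon and repeatedly doubling the number of sides (his own computation stopped at 96 sides), the inscribed and circumscribed perimeters squeeze π from below and above. A minimal modern sketch in floating point, not his rational arithmetic:

```python
def archimedes_pi_bounds(doublings):
    """Bound pi between the semiperimeters of inscribed and circumscribed
    regular polygons around a unit circle, starting from a hexagon and
    doubling the side count, in the spirit of Archimedes' method."""
    n, s = 6, 1.0   # regular hexagon inscribed in a unit circle: side = 1
    for _ in range(doublings):
        s = (2 - (4 - s * s) ** 0.5) ** 0.5   # chord of half the arc
        n *= 2
    lower = n * s / 2                          # inscribed semiperimeter
    upper = lower / (1 - s * s / 4) ** 0.5     # circumscribed semiperimeter
    return lower, upper

print(archimedes_pi_bounds(4))  # 96-gon: roughly (3.14103, 3.14271)
```

Four doublings give the 96-gon; the resulting interval is consistent with Archimedes' published bounds of 223/71 and 22/7, which are slightly looser because he rounded intermediate square roots to convenient fractions.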
Apollonius of Perga (c. 262–190 BC) made significant advances to the study of conic sections, showing that one can obtain all three varieties of conic section by varying the angle of the plane that cuts a double-napped cone. He also coined the terminology in use today for conic sections, namely parabola ("place beside" or "comparison"), "ellipse" ("deficiency"), and "hyperbola" ("a throw beyond"). His work "Conics" is one of the best known and preserved mathematical works from antiquity, and in it he derives many theorems concerning conic sections that would prove invaluable to later mathematicians and astronomers studying planetary motion, such as Isaac Newton. While neither Apollonius nor any other Greek mathematicians made the leap to coordinate geometry, Apollonius' treatment of curves is in some ways similar to the modern treatment, and some of his work seems to anticipate the development of analytical geometry by Descartes some 1800 years later.
Around the same time, Eratosthenes of Cyrene (c. 276–194 BC) devised the Sieve of Eratosthenes for finding prime numbers. The 3rd century BC is generally regarded as the "Golden Age" of Greek mathematics, with advances in pure mathematics henceforth in relative decline. Nevertheless, in the centuries that followed significant advances were made in applied mathematics, most notably trigonometry, largely to address the needs of astronomers. Hipparchus of Nicaea (c. 190–120 BC) is considered the founder of trigonometry for compiling the first known trigonometric table, and to him is also due the systematic use of the 360 degree circle. Heron of Alexandria (c. 10–70 AD) is credited with Heron's formula for finding the area of a scalene triangle and with being the first to recognize the possibility of negative numbers possessing square roots. Menelaus of Alexandria (c. 100 AD) pioneered spherical trigonometry through Menelaus' theorem. The most complete and influential trigonometric work of antiquity is the "Almagest" of Ptolemy (c. AD 90–168), a landmark astronomical treatise whose trigonometric tables would be used by astronomers for the next thousand years. Ptolemy is also credited with Ptolemy's theorem for deriving trigonometric quantities, and the most accurate value of π outside of China until the medieval period, 3.1416.
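The sieve itself is simple to state: list the numbers up to a bound, repeatedly cross out the multiples of each surviving number, and whatever remains is prime. A compact sketch:

```python
def sieve(limit):
    """Sieve of Eratosthenes: strike out multiples of each prime,
    leaving exactly the primes up to `limit`."""
    is_prime = [True] * (limit + 1)
    is_prime[0:2] = [False, False]
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            for multiple in range(p * p, limit + 1, p):
                is_prime[multiple] = False
    return [n for n, flag in enumerate(is_prime) if flag]

print(sieve(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```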
Following a period of stagnation after Ptolemy, the period between 250 and 350 AD is sometimes referred to as the "Silver Age" of Greek mathematics. During this period, Diophantus made significant advances in algebra, particularly indeterminate analysis, which is also known as "Diophantine analysis". The study of Diophantine equations and Diophantine approximations is a significant area of research to this day. His main work was the "Arithmetica", a collection of 150 algebraic problems dealing with exact solutions to determinate and indeterminate equations. The "Arithmetica" had a significant influence on later mathematicians, such as Pierre de Fermat, who arrived at his famous Last Theorem after trying to generalize a problem he had read in the "Arithmetica" (that of dividing a square into two squares). Diophantus also made significant advances in notation, the "Arithmetica" being the first instance of algebraic symbolism and syncopation.
Among the last great Greek mathematicians is Pappus of Alexandria (4th century AD). He is known for his hexagon theorem and centroid theorem, as well as the Pappus configuration and Pappus graph. His "Collection" is a major source of knowledge on Greek mathematics as most of it has survived. Pappus is considered the last major innovator in Greek mathematics, with subsequent work consisting mostly of commentaries on earlier work.
The first woman mathematician recorded by history was Hypatia of Alexandria (AD 350–415). She succeeded her father (Theon of Alexandria) as Librarian at the Great Library and wrote many works on applied mathematics. Because of a political dispute, the Christian community in Alexandria had her stripped publicly and executed. Her death is sometimes taken as the end of the era of Alexandrian Greek mathematics, although work did continue in Athens for another century with figures such as Proclus, Simplicius and Eutocius. Although Proclus and Simplicius were more philosophers than mathematicians, their commentaries on earlier works are valuable sources on Greek mathematics. The closure of the neo-Platonic Academy of Athens by the emperor Justinian in 529 AD is traditionally held as marking the end of the era of Greek mathematics, although the Greek tradition continued unbroken in the Byzantine empire with mathematicians such as Anthemius of Tralles and Isidore of Miletus, the architects of the Hagia Sophia. Nevertheless, Byzantine mathematics consisted mostly of commentaries, with little in the way of innovation, and the centers of mathematical innovation were to be found elsewhere by this time.
Although ethnic Greek mathematicians continued under the rule of the late Roman Republic and subsequent Roman Empire, there were no noteworthy native Latin mathematicians in comparison. Ancient Romans such as Cicero (106–43 BC), an influential Roman statesman who studied mathematics in Greece, believed that Roman surveyors and calculators were far more interested in applied mathematics than the theoretical mathematics and geometry that were prized by the Greeks. It is unclear if the Romans first derived their numerical system directly from the Greek precedent or from Etruscan numerals used by the Etruscan civilization centered in what is now Tuscany, central Italy.
Using calculation, Romans were adept at both instigating and detecting financial fraud, as well as managing taxes for the treasury. Siculus Flaccus, one of the Roman "gromatici" (i.e. land surveyor), wrote the "Categories of Fields", which aided Roman surveyors in measuring the surface areas of allotted lands and territories. Aside from managing trade and taxes, the Romans also regularly applied mathematics to solve problems in engineering, including the erection of architecture such as bridges, road-building, and preparation for military campaigns. Arts and crafts such as Roman mosaics, inspired by previous Greek designs, created illusionist geometric patterns and rich, detailed scenes that required precise measurements for each tessera tile, the opus tessellatum pieces on average measuring eight millimeters square and the finer opus vermiculatum pieces having an average surface of four millimeters square.
The creation of the Roman calendar also necessitated basic mathematics. The first calendar allegedly dates back to 8th century BC during the Roman Kingdom and included 356 days plus a leap year every other year. In contrast, the lunar calendar of the Republican era contained 355 days, roughly ten-and-one-fourth days shorter than the solar year, a discrepancy that was solved by adding an extra month into the calendar after the 23rd of February. This calendar was supplanted by the Julian calendar, a solar calendar organized by Julius Caesar (100–44 BC) and devised by Sosigenes of Alexandria to include a leap day every four years in a 365-day cycle. This calendar, which contained an error of 11 minutes and 14 seconds, was later corrected by the Gregorian calendar organized by Pope Gregory XIII, virtually the same solar calendar used in modern times as the international standard calendar.
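The quoted Julian error can be recovered from the difference between the Julian year of 365.25 days and the mean tropical year of roughly 365.2422 days (a modern value, used here only to check the arithmetic):

```python
JULIAN_YEAR = 365.25      # days: 365 plus one leap day every four years
TROPICAL_YEAR = 365.2422  # days: approximate mean tropical year

drift_seconds = (JULIAN_YEAR - TROPICAL_YEAR) * 24 * 60 * 60
minutes, seconds = divmod(drift_seconds, 60)
print(int(minutes), round(seconds))  # about 11 minutes 14 seconds per year
```

A drift of this size accumulates to roughly three days every four centuries, which is what the Gregorian reform's century-year leap rule corrects.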
At roughly the same time, the Han Chinese and the Romans both invented the wheeled odometer device for measuring distances traveled, the Roman model first described by the Roman civil engineer and architect Vitruvius (c. 80 BC – c. 15 BC). The device was used at least until the reign of emperor Commodus, but its design seems to have been lost until experiments were made during the 15th century in Western Europe. Perhaps relying on similar gear-work and technology found in the Antikythera mechanism, the odometer of Vitruvius featured chariot wheels measuring 4 feet (1.2 m) in diameter turning four hundred times in one Roman mile (roughly 4590 ft/1400 m). With each revolution, a pin-and-axle device engaged a 400-tooth cogwheel that turned a second gear responsible for dropping pebbles into a box, each pebble representing one mile traversed.
An analysis of early Chinese mathematics has demonstrated its unique development compared to other parts of the world, leading scholars to assume an entirely independent development. The oldest extant mathematical text from China is the "Zhoubi Suanjing", variously dated to between 1200 BC and 100 BC, though a date of about 300 BC during the Warring States Period appears reasonable. However, the Tsinghua Bamboo Slips, containing the earliest known decimal multiplication table (although ancient Babylonians had tables with a base of 60), are dated to around 305 BC and are perhaps the oldest surviving mathematical text of China.
Of particular note is the use in Chinese mathematics of a decimal positional notation system, the so-called "rod numerals" in which distinct ciphers were used for numbers between 1 and 10, and additional ciphers for powers of ten. Thus, the number 123 would be written using the symbol for "1", followed by the symbol for "100", then the symbol for "2" followed by the symbol for "10", followed by the symbol for "3". This was the most advanced number system in the world at the time, apparently in use several centuries before the common era and well before the development of the Indian numeral system. Rod numerals allowed the representation of numbers as large as desired and allowed calculations to be carried out on the "suan pan", or Chinese abacus. The date of the invention of the "suan pan" is not certain, but the earliest written mention dates from AD 190, in Xu Yue's "Supplementary Notes on the Art of Figures".
The oldest existent work on geometry in China comes from the philosophical Mohist canon c. 330 BC, compiled by the followers of Mozi (470–390 BC). The "Mo Jing" described various aspects of many fields associated with physical science, and provided a small number of geometrical theorems as well. It also defined the concepts of circumference, diameter, radius, and volume.
In 212 BC, the Emperor Qin Shi Huang commanded all books in the Qin Empire other than officially sanctioned ones be burned. This decree was not universally obeyed, but as a consequence of this order little is known about ancient Chinese mathematics before this date. After the book burning of 212 BC, the Han dynasty (202 BC–220 AD) produced works of mathematics which presumably expanded on works that are now lost. The most important of these is "The Nine Chapters on the Mathematical Art", the full title of which appeared by AD 179, but existed in part under other titles beforehand. It consists of 246 word problems involving agriculture, business, employment of geometry to figure height spans and dimension ratios for Chinese pagoda towers, engineering, surveying, and includes material on right triangles. It includes a mathematical proof of the Pythagorean theorem and a method equivalent to Gaussian elimination. The treatise also provides values of π, which Chinese mathematicians originally approximated as 3 until Liu Xin (d. 23 AD) provided a figure of 3.1457 and subsequently Zhang Heng (78–139) approximated pi as 3.1724, as well as 3.162 by taking the square root of 10. Liu Hui commented on the "Nine Chapters" in the 3rd century AD and gave a value of π accurate to 5 decimal places (i.e. 3.14159). Though more of a matter of computational stamina than theoretical insight, in the 5th century AD Zu Chongzhi computed the value of π to seven decimal places (showing that it lies between 3.1415926 and 3.1415927), which remained the most accurate value of π for almost the next 1000 years. He also established a method which would later be called Cavalieri's principle to find the volume of a sphere.
The high-water mark of Chinese mathematics occurred in the 13th century during the latter half of the Song dynasty (960–1279), with the development of Chinese algebra. The most important text from that period is the "Precious Mirror of the Four Elements" by Zhu Shijie (1249–1314), dealing with the solution of simultaneous higher order algebraic equations using a method similar to Horner's method. The "Precious Mirror" also contains a diagram of Pascal's triangle with coefficients of binomial expansions through the eighth power, though both appear in Chinese works as early as 1100. The Chinese also made use of the complex combinatorial diagram known as the magic square and magic circles, described in ancient times and perfected by Yang Hui (AD 1238–1298).
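The scheme underlying Horner's method evaluates a polynomial with one multiplication and one addition per coefficient by nesting the powers, e.g. 2x^3 - 6x^2 + 2x - 1 = ((2x - 6)x + 2)x - 1. A short sketch:

```python
def horner(coeffs, x):
    """Evaluate a polynomial at x by Horner's method.
    `coeffs` lists coefficients from the highest power down,
    e.g. [2, -6, 2, -1] means 2x^3 - 6x^2 + 2x - 1."""
    result = 0
    for c in coeffs:
        result = result * x + c
    return result

print(horner([2, -6, 2, -1], 3))  # 5  (= 2*27 - 6*9 + 2*3 - 1)
```

The same nesting also drives the root-extraction procedures in the Chinese algebra texts, where the polynomial is repeatedly re-centered digit by digit.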
Even after European mathematics began to flourish during the Renaissance, European and Chinese mathematics were separate traditions, with significant Chinese mathematical output in decline from the 13th century onwards. Jesuit missionaries such as Matteo Ricci carried mathematical ideas back and forth between the two cultures from the 16th to 18th centuries, though at this point far more mathematical ideas were entering China than leaving.
Japanese mathematics, Korean mathematics, and Vietnamese mathematics are traditionally viewed as stemming from Chinese mathematics and belonging to the Confucian-based East Asian cultural sphere. Korean and Japanese mathematics were heavily influenced by the algebraic works produced during China's Song dynasty, whereas Vietnamese mathematics was heavily indebted to popular works of China's Ming dynasty (1368–1644). For instance, although Vietnamese mathematical treatises were written in either Chinese or the native Vietnamese Chữ Nôm script, all of them followed the Chinese format of presenting a collection of problems with algorithms for solving them, followed by numerical answers. Mathematics in Vietnam and Korea were mostly associated with the professional court bureaucracy of mathematicians and astronomers, whereas in Japan it was more prevalent in the realm of private schools.
The earliest civilization on the Indian subcontinent is the Indus Valley Civilization (mature phase: 2600 to 1900 BC) that flourished in the Indus river basin. Their cities were laid out with geometric regularity, but no known mathematical documents survive from this civilization.
The oldest extant mathematical records from India are the Sulba Sutras (dated variously between the 8th century BC and the 2nd century AD), appendices to religious texts which give simple rules for constructing altars of various shapes, such as squares, rectangles, parallelograms, and others. As with Egypt, the preoccupation with temple functions points to an origin of mathematics in religious ritual. The Sulba Sutras give methods for constructing a circle with approximately the same area as a given square, which imply several different approximations of the value of π. In addition, they compute the square root of 2 to several decimal places, list Pythagorean triples, and give a statement of the Pythagorean theorem. All of these results are present in Babylonian mathematics, indicating Mesopotamian influence. It is not known to what extent the Sulba Sutras influenced later Indian mathematicians. As in China, there is a lack of continuity in Indian mathematics; significant advances are separated by long periods of inactivity.
Pāṇini (c. 5th century BC) formulated the rules for Sanskrit grammar. His notation was similar to modern mathematical notation, and used metarules, transformations, and recursion. Pingala (roughly 3rd–1st centuries BC) in his treatise of prosody uses a device corresponding to a binary numeral system. His discussion of the combinatorics of meters corresponds to an elementary version of the binomial theorem. Pingala's work also contains the basic ideas of Fibonacci numbers (called "mātrāmeru").
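The Fibonacci connection in Pingala's prosody comes from counting rhythmic patterns: a meter of total duration n beats is built from short syllables of 1 beat and long syllables of 2 beats, and the number of such patterns satisfies the Fibonacci recurrence. A small sketch of that counting:

```python
def matra_count(n):
    """Number of rhythmic patterns of total duration n beats built from
    short (1 beat) and long (2 beat) syllables, the Fibonacci-style
    counting underlying Pingala's matrameru."""
    a, b = 1, 1          # durations 0 and 1 each admit one pattern
    for _ in range(n - 1):
        a, b = b, a + b  # patterns(n) = patterns(n-1) + patterns(n-2)
    return b

print([matra_count(n) for n in range(1, 8)])  # [1, 2, 3, 5, 8, 13, 21]
```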
The next significant mathematical documents from India after the "Sulba Sutras" are the "Siddhantas", astronomical treatises from the 4th and 5th centuries AD (Gupta period) showing strong Hellenistic influence. They are significant in that they contain the first instance of trigonometric relations based on the half-chord, as is the case in modern trigonometry, rather than the full chord, as was the case in Ptolemaic trigonometry. Through a series of translation errors, the words "sine" and "cosine" derive from the Sanskrit "jiya" and "kojiya".
Around 500 AD, Aryabhata wrote the "Aryabhatiya", a slim volume, written in verse, intended to supplement the rules of calculation used in astronomy and mathematical mensuration, though with no feeling for logic or deductive methodology. Though about half of the entries are wrong, it is in the "Aryabhatiya" that the decimal place-value system first appears. Several centuries later, the Muslim mathematician Abu Rayhan Biruni described the "Aryabhatiya" as a "mix of common pebbles and costly crystals".
In the 7th century, Brahmagupta identified the Brahmagupta theorem, Brahmagupta's identity and Brahmagupta's formula, and for the first time, in "Brahma-sphuta-siddhanta", he lucidly explained the use of zero as both a placeholder and decimal digit, and explained the Hindu–Arabic numeral system. It was from a translation of this Indian text on mathematics (c. 770) that Islamic mathematicians were introduced to this numeral system, which they adapted as Arabic numerals. Islamic scholars carried knowledge of this number system to Europe by the 12th century, and it has now displaced all older number systems throughout the world. Various symbol sets are used to represent numbers in the Hindu–Arabic numeral system, all of which evolved from the Brahmi numerals. Each of the roughly dozen major scripts of India has its own numeral glyphs. In the 10th century, Halayudha's commentary on Pingala's work contains a study of the Fibonacci sequence and Pascal's triangle, and describes the formation of a matrix.
In the 12th century, Bhāskara II lived in southern India and wrote extensively on all then known branches of mathematics. His work contains mathematical objects equivalent or approximately equivalent to infinitesimals, derivatives, the mean value theorem and the derivative of the sine function. To what extent he anticipated the invention of calculus is a controversial subject among historians of mathematics.
In the 14th century, Madhava of Sangamagrama, the founder of the so-called Kerala School of Mathematics, found the Madhava–Leibniz series and obtained from it a transformed series, whose first 21 terms he used to compute the value of π as 3.14159265359. Madhava also found the Madhava-Gregory series to determine the arctangent, the Madhava-Newton power series to determine sine and cosine and the Taylor approximation for sine and cosine functions. In the 16th century, Jyesthadeva consolidated many of the Kerala School's developments and theorems in the "Yukti-bhāṣā".
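Madhava's transformed series is equivalent to pi = sqrt(12) * sum over k >= 0 of (-1/3)^k / (2k + 1), and its first 21 terms do reproduce the value quoted above; a minimal sketch:

```python
def madhava_pi(terms):
    """Approximate pi with Madhava's transformed series:
    pi = sqrt(12) * sum_{k>=0} (-1/3)^k / (2k + 1)."""
    total = sum((-1 / 3) ** k / (2 * k + 1) for k in range(terms))
    return 12 ** 0.5 * total

print(madhava_pi(21))  # 3.14159265359...
```

The factor (-1/3)^k shrinks geometrically, which is why 21 terms suffice for eleven decimal places, whereas the untransformed Madhava-Leibniz series 4(1 - 1/3 + 1/5 - ...) would need billions of terms for the same accuracy.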
It has been argued that the advances of the Kerala school, which laid the foundations of calculus, were transmitted to Europe in the 16th century via Jesuit missionaries and traders who were active around the ancient port of Muziris at the time and, as a result, directly influenced later European developments in analysis and calculus. However, other scholars argue that the Kerala School did not formulate a systematic theory of differentiation and integration, and that there is no direct evidence of their results being transmitted outside Kerala.
The Islamic Empire established across Persia, the Middle East, Central Asia, North Africa, Iberia, and parts of India in the 8th century made significant contributions to mathematics. Although most Islamic texts on mathematics were written in Arabic, most of them were not written by Arabs, since, much like the status of Greek in the Hellenistic world, Arabic was used as the written language of non-Arab scholars throughout the Islamic world at the time. Persians contributed to the world of mathematics alongside Arabs.
In the 9th century, the Persian mathematician Muḥammad ibn Mūsā al-Khwārizmī wrote several important books on the Hindu–Arabic numerals and on methods for solving equations. His book "On the Calculation with Hindu Numerals", written about 825, along with the work of Al-Kindi, were instrumental in spreading Indian mathematics and Indian numerals to the West. The word "algorithm" is derived from the Latinization of his name, Algoritmi, and the word "algebra" from the title of one of his works, "Al-Kitāb al-mukhtaṣar fī hīsāb al-ğabr wa’l-muqābala" ("The Compendious Book on Calculation by Completion and Balancing"). He gave an exhaustive explanation for the algebraic solution of quadratic equations with positive roots, and he was the first to teach algebra in an elementary form and for its own sake. He also discussed the fundamental method of "reduction" and "balancing", referring to the transposition of subtracted terms to the other side of an equation, that is, the cancellation of like terms on opposite sides of the equation. This is the operation which al-Khwārizmī originally described as "al-jabr". His algebra was also no longer concerned "with a series of problems to be resolved, but an exposition which starts with primitive terms in which the combinations must give all possible prototypes for equations, which henceforward explicitly constitute the true object of study." He also studied an equation for its own sake and "in a generic manner, insofar as it does not simply emerge in the course of solving a problem, but is specifically called on to define an infinite class of problems."
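For equations of the type "squares and roots equal numbers" (x^2 + bx = c), al-Khwārizmī's reduction amounts to completing the square: halve the coefficient of the roots, square it, add it to both sides, take the square root, and subtract the half. The worked numbers below use x^2 + 10x = 39, a problem of exactly this form that appears in the "Al-jabr":

```python
def solve_square_plus_roots(b, c):
    """Positive root of x^2 + b*x = c by completing the square, as in
    al-Khwarizmi's procedure: halve b, square it, add it to both sides,
    take the root, then subtract the half."""
    half = b / 2
    return (half * half + c) ** 0.5 - half

print(solve_square_plus_roots(10, 39))  # 3.0
```

Al-Khwārizmī stated the procedure rhetorically, without symbols, and justified it with a geometric diagram of a literal square completed by rectangles; the code above is only a modern paraphrase of that recipe.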
In Egypt, Abu Kamil extended algebra to the set of irrational numbers, accepting square roots and fourth roots as solutions and coefficients to quadratic equations. He also developed techniques used to solve three non-linear simultaneous equations with three unknown variables. One unique feature of his works was trying to find all the possible solutions to some of his problems, including one where he found 2676 solutions. His works formed an important foundation for the development of algebra and influenced later mathematicians, such as al-Karaji and Fibonacci.
Further developments in algebra were made by Al-Karaji in his treatise "al-Fakhri", where he extends the methodology to incorporate integer powers and integer roots of unknown quantities. Something close to a proof by mathematical induction appears in a book written by Al-Karaji around 1000 AD, who used it to prove the binomial theorem, Pascal's triangle, and the sum of integral cubes. The historian of mathematics, F. Woepcke, praised Al-Karaji for being "the first who introduced the theory of algebraic calculus." Also in the 10th century, Abul Wafa translated the works of Diophantus into Arabic. Ibn al-Haytham was the first mathematician to derive the formula for the sum of the fourth powers, using a method that is readily generalizable for determining the general formula for the sum of any integral powers. He performed an integration in order to find the volume of a paraboloid, and was able to generalize his result for the integrals of polynomials up to the fourth degree. He thus came close to finding a general formula for the integrals of polynomials, but he was not concerned with any polynomials higher than the fourth degree.
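Ibn al-Haytham's result on sums of fourth powers can be checked directly against brute-force summation; the sketch below uses the modern closed-form statement of his formula:

```python
def sum_fourth_powers(n):
    # Closed form for 1^4 + 2^4 + ... + n^4, the sum Ibn al-Haytham derived
    return n * (n + 1) * (2 * n + 1) * (3 * n * n + 3 * n - 1) // 30

# Verify the closed form against direct summation for a range of n
for n in range(1, 50):
    assert sum_fourth_powers(n) == sum(k ** 4 for k in range(1, n + 1))
print(sum_fourth_powers(10))  # -> 25333
```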
In the late 11th century, Omar Khayyam wrote "Discussions of the Difficulties in Euclid", a book about what he perceived as flaws in Euclid's "Elements", especially the parallel postulate. He was also the first to find the general geometric solution to cubic equations. He was also very influential in calendar reform.
In the 13th century, Nasir al-Din Tusi (Nasireddin) made advances in spherical trigonometry. He also wrote influential work on Euclid's parallel postulate. In the 15th century, Ghiyath al-Kashi computed the value of π to the 16th decimal place. Kashi also had an algorithm for calculating "n"th roots, which was a special case of the methods given many centuries later by Ruffini and Horner.
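As a modern stand-in for al-Kashi's digit-by-digit scheme (a special case of Ruffini–Horner root extraction), the sketch below extracts nth roots with Newton's iteration; it reproduces the values his method computed, not the method itself:

```python
def nth_root(a, n):
    """Real n-th root of a > 0 by Newton's iteration -- a modern stand-in
    for al-Kashi's digit-by-digit Ruffini-Horner-style extraction."""
    x = max(a, 1.0)
    for _ in range(200):  # plenty of iterations for double precision
        x_new = ((n - 1) * x + a / x ** (n - 1)) / n
        if abs(x_new - x) <= 1e-15 * x_new:
            break
        x = x_new
    return x_new

print(nth_root(2, 3))  # cube root of 2
```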
Other achievements of Muslim mathematicians during this period include the addition of the decimal point notation to the Arabic numerals, the discovery of all the modern trigonometric functions besides the sine, al-Kindi's introduction of cryptanalysis and frequency analysis, the development of analytic geometry by Ibn al-Haytham, the beginning of algebraic geometry by Omar Khayyam and the development of an algebraic notation by al-Qalasādī.
During the time of the Ottoman Empire and Safavid Empire from the 15th century, the development of Islamic mathematics became stagnant.
In the Pre-Columbian Americas, the Maya civilization that flourished in Mexico and Central America during the 1st millennium AD developed a unique tradition of mathematics that, due to its geographic isolation, was entirely independent of existing European, Egyptian, and Asian mathematics. Maya numerals utilized a base of 20, the vigesimal system, instead of a base of ten that forms the basis of the decimal system used by most modern cultures. The Mayas used mathematics to create the Maya calendar as well as to predict astronomical phenomena in their native Maya astronomy. While the concept of zero had to be inferred in the mathematics of many contemporary cultures, the Mayas developed a standard symbol for it.
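The vigesimal place-value idea can be illustrated with a small base-20 conversion; the digit list below illustrates positional notation only, not the Maya bar-and-dot glyphs or the calendrical variant that used 18×20 for the third place:

```python
def to_vigesimal(n):
    """Express a non-negative integer in base 20, as in the Maya positional system
    (ignoring the calendrical 18*20 variant of the third place)."""
    if n == 0:
        return [0]  # the Maya had an explicit symbol for zero
    digits = []
    while n > 0:
        digits.append(n % 20)
        n //= 20
    return digits[::-1]  # most significant place first

print(to_vigesimal(8000))  # -> [1, 0, 0, 0], since 8000 = 20^3
```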
Medieval European interest in mathematics was driven by concerns quite different from those of modern mathematicians. One driving element was the belief that mathematics provided the key to understanding the created order of nature, frequently justified by Plato's "Timaeus" and the biblical passage (in the "Book of Wisdom") that God had "ordered all things in measure, and number, and weight".
Boethius provided a place for mathematics in the curriculum in the 6th century when he coined the term "quadrivium" to describe the study of arithmetic, geometry, astronomy, and music. He wrote "De institutione arithmetica", a free translation from the Greek of Nicomachus's "Introduction to Arithmetic"; "De institutione musica", also derived from Greek sources; and a series of excerpts from Euclid's "Elements". His works were theoretical, rather than practical, and were the basis of mathematical study until the recovery of Greek and Arabic mathematical works.
In the 12th century, European scholars traveled to Spain and Sicily seeking scientific Arabic texts, including al-Khwārizmī's "The Compendious Book on Calculation by Completion and Balancing", translated into Latin by Robert of Chester, and the complete text of Euclid's "Elements", translated in various versions by Adelard of Bath, Herman of Carinthia, and Gerard of Cremona. These and other new sources sparked a renewal of mathematics.
Leonardo of Pisa, now known as Fibonacci, serendipitously learned about the Hindu–Arabic numerals on a trip to what is now Béjaïa, Algeria with his merchant father. (Europe was still using Roman numerals.) There, he observed a system of arithmetic (specifically algorism) which due to the positional notation of Hindu–Arabic numerals was much more efficient and greatly facilitated commerce. Leonardo wrote "Liber Abaci" in 1202 (updated in 1254) introducing the technique to Europe and beginning a long period of popularizing it. The book also brought to Europe what is now known as the Fibonacci sequence (known to Indian mathematicians for hundreds of years before that) which was used as an unremarkable example within the text.
The 14th century saw the development of new mathematical concepts to investigate a wide range of problems. One important contribution was development of mathematics of local motion.
Thomas Bradwardine proposed that speed (V) increases in arithmetic proportion as the ratio of force (F) to resistance (R) increases in geometric proportion. Bradwardine expressed this by a series of specific examples, but although the logarithm had not yet been conceived, we can express his conclusion anachronistically by writing:

V = log(F/R).

Bradwardine's analysis is an example of transferring a mathematical technique used by al-Kindi and Arnald of Villanova to quantify the nature of compound medicines to a different physical problem.
One of the 14th-century Oxford Calculators, William Heytesbury, lacking differential calculus and the concept of limits, proposed to measure instantaneous speed "by the path that would be described by [a body] if... it were moved uniformly at the same degree of speed with which it is moved in that given instant".
Heytesbury and others mathematically determined the distance covered by a body undergoing uniformly accelerated motion (today solved by integration), stating that "a moving body uniformly acquiring or losing that increment [of speed] will traverse in some given time a [distance] completely equal to that which it would traverse if it were moving continuously through the same time with the mean degree [of speed]".
Nicole Oresme at the University of Paris and the Italian Giovanni di Casali independently provided graphical demonstrations of this relationship, asserting that the area under the line depicting the constant acceleration, represented the total distance traveled. In a later mathematical commentary on Euclid's "Elements", Oresme made a more detailed general analysis in which he demonstrated that a body will acquire in each successive increment of time an increment of any quality that increases as the odd numbers. Since Euclid had demonstrated the sum of the odd numbers are the square numbers, the total quality acquired by the body increases as the square of the time.
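Both claims are easy to verify numerically: the mean speed theorem for uniformly accelerated motion, and Oresme's odd-numbers/squares observation. A minimal check:

```python
# Mean speed theorem: under uniform acceleration from v0 to v1 over time T,
# the distance equals T * (v0 + v1) / 2. Check with a midpoint Riemann sum,
# which is exact for a linearly varying speed.
v0, v1, T, N = 2.0, 10.0, 3.0, 100_000
a = (v1 - v0) / T
dt = T / N
dist = sum((v0 + a * (i + 0.5) * dt) * dt for i in range(N))
assert abs(dist - T * (v0 + v1) / 2) < 1e-6

# Euclid's fact used by Oresme: the sum of the first n odd numbers is n^2.
for n in range(1, 100):
    assert sum(2 * k + 1 for k in range(n)) == n * n
print("both checks pass")
```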
During the Renaissance, the development of mathematics and of accounting were intertwined. While there is no direct relationship between algebra and accounting, the teaching of the subjects and the books published were often intended for the children of merchants who were sent to reckoning schools (in Flanders and Germany) or abacus schools (known as "abbaco" in Italy), where they learned the skills useful for trade and commerce. There is probably no need for algebra in performing bookkeeping operations, but for complex bartering operations or the calculation of compound interest, a basic knowledge of arithmetic was mandatory and knowledge of algebra was very useful.
Piero della Francesca (c. 1415–1492) wrote books on solid geometry and linear perspective, including "De Prospectiva Pingendi (On Perspective for Painting)", "Trattato d’Abaco (Abacus Treatise)", and "De corporibus regularibus (Regular Solids)".
Luca Pacioli's "Summa de Arithmetica, Geometria, Proportioni et Proportionalità" (Italian: "Review of Arithmetic, Geometry, Ratio and Proportion") was first printed and published in Venice in 1494. It included a 27-page treatise on bookkeeping, ""Particularis de Computis et Scripturis"" (Italian: "Details of Calculation and Recording"). It was written primarily for, and sold mainly to, merchants who used the book as a reference text, as a source of pleasure from the mathematical puzzles it contained, and to aid the education of their sons. In "Summa Arithmetica", Pacioli introduced symbols for plus and minus for the first time in a printed book, symbols that became standard notation in Italian Renaissance mathematics. "Summa Arithmetica" was also the first known book printed in Italy to contain algebra. Pacioli obtained many of his ideas from Piero della Francesca, whom he plagiarized.
In Italy, during the first half of the 16th century, Scipione del Ferro and Niccolò Fontana Tartaglia discovered solutions for cubic equations. Gerolamo Cardano published them in his 1545 book "Ars Magna", together with a solution for the quartic equations, discovered by his student Lodovico Ferrari. In 1572 Rafael Bombelli published his "L'Algebra" in which he showed how to deal with the imaginary quantities that could appear in Cardano's formula for solving cubic equations.
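Bombelli's classic case x³ − 15x − 4 = 0 shows why his treatment of imaginary quantities mattered: Cardano's formula passes through complex intermediates even though the root is real. A sketch using Python's cmath:

```python
import cmath

def cardano_root(p, q):
    """One root of the depressed cubic x^3 + p*x + q = 0 via Cardano's formula.
    cmath handles the 'imaginary quantities' Bombelli learned to manipulate."""
    d = cmath.sqrt((q / 2) ** 2 + (p / 3) ** 3)  # imaginary in the casus irreducibilis
    u = (-q / 2 + d) ** (1 / 3)                  # principal complex cube root
    v = -p / (3 * u)                             # paired so that u * v = -p / 3
    return u + v

# Bombelli's example x^3 - 15x - 4 = 0: the real root 4 emerges
# from the complex intermediates 2 + 11i and its cube root 2 + i.
print(cardano_root(-15, -4))  # approximately (4+0j)
```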
Simon Stevin's book "De Thiende" ('the art of tenths'), first published in Dutch in 1585, contained the first systematic treatment of decimal notation, which influenced all later work on the real number system.
Driven by the demands of navigation and the growing need for accurate maps of large areas, trigonometry grew to be a major branch of mathematics. Bartholomaeus Pitiscus was the first to use the word, publishing his "Trigonometria" in 1595. Regiomontanus's table of sines and cosines was published in 1533.
During the Renaissance the desire of artists to represent the natural world realistically, together with the rediscovered philosophy of the Greeks, led artists to study mathematics. They were also the engineers and architects of that time, and so had need of mathematics in any case. The art of painting in perspective, and the developments in geometry that involved, were studied intensely.
The 17th century saw an unprecedented increase of mathematical and scientific ideas across Europe. Galileo observed the moons of Jupiter in orbit about that planet, using a telescope based on a toy imported from Holland. Tycho Brahe had gathered an enormous quantity of mathematical data describing the positions of the planets in the sky. Through his position as Brahe's assistant, Johannes Kepler was first exposed to and seriously interacted with the topic of planetary motion. Kepler's calculations were made simpler by the contemporaneous invention of logarithms by John Napier and Jost Bürgi. Kepler succeeded in formulating mathematical laws of planetary motion.
The analytic geometry developed by René Descartes (1596–1650) allowed those orbits to be plotted on a graph, in Cartesian coordinates.
Building on earlier work by many predecessors, Isaac Newton discovered the laws of physics explaining Kepler's Laws, and brought together the concepts now known as calculus. Independently, Gottfried Wilhelm Leibniz, who is arguably one of the most important mathematicians of the 17th century, developed calculus and much of the calculus notation still in use today. Science and mathematics had become an international endeavor, which would soon spread over the entire world.
In addition to the application of mathematics to the studies of the heavens, applied mathematics began to expand into new areas, with the correspondence of Pierre de Fermat and Blaise Pascal. Pascal and Fermat set the groundwork for the investigations of probability theory and the corresponding rules of combinatorics in their discussions over a game of gambling. Pascal, with his wager, attempted to use the newly developing probability theory to argue for a life devoted to religion, on the grounds that even if the probability of success was small, the rewards were infinite. In some sense, this foreshadowed the development of utility theory in the 18th–19th century.
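The canonical question in the Pascal–Fermat letters was the "problem of points": how to divide the stakes of an interrupted game fairly. Their expectation argument can be sketched recursively (fair 50/50 rounds assumed):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def p_wins(a, b):
    """Probability that a player needing `a` more points beats one needing `b`,
    with fair 50/50 rounds -- the 'problem of points' of the Pascal-Fermat letters."""
    if a == 0:
        return 1.0  # already won
    if b == 0:
        return 0.0  # opponent already won
    return 0.5 * p_wins(a - 1, b) + 0.5 * p_wins(a, b - 1)

# Interrupted game: first to 3 points, score 2-1. The leader's fair share:
print(p_wins(1, 2))  # -> 0.75
```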
The most influential mathematician of the 18th century was arguably Leonhard Euler. His contributions range from founding the study of graph theory with the Seven Bridges of Königsberg problem to standardizing many modern mathematical terms and notations. For example, he named the square root of minus 1 with the symbol "i", and he popularized the use of the Greek letter π to stand for the ratio of a circle's circumference to its diameter. He made numerous contributions to the study of topology, graph theory, calculus, combinatorics, and complex analysis, as evidenced by the multitude of theorems and notations named for him.
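Euler's Königsberg argument reduces to counting vertices of odd degree: a walk crossing every edge exactly once exists only if that count is 0 (closed walk) or 2 (open walk). A minimal check on the seven bridges:

```python
from collections import Counter

def odd_degree_vertices(edges):
    """Vertices of odd degree in a multigraph; Euler's condition for a walk
    traversing every edge once is that there be 0 or 2 of them."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return [x for x in deg if deg[x] % 2 == 1]

# The seven bridges of Königsberg on land masses A, B, C, D:
bridges = [("A", "B"), ("A", "B"), ("A", "C"), ("A", "C"),
           ("A", "D"), ("B", "D"), ("C", "D")]
print(len(odd_degree_vertices(bridges)))  # -> 4, so no such walk exists
```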
Other important European mathematicians of the 18th century included Joseph Louis Lagrange, who did pioneering work in number theory, algebra, differential calculus, and the calculus of variations, and Laplace who, in the age of Napoleon, did important work on the foundations of celestial mechanics and on statistics.
Throughout the 19th century mathematics became increasingly abstract. Carl Friedrich Gauss (1777–1855) epitomizes this trend. He did revolutionary work on functions of complex variables, in geometry, and on the convergence of series, leaving aside his many contributions to science. He also gave the first satisfactory proofs of the fundamental theorem of algebra and of the quadratic reciprocity law.
This century saw the development of the two forms of non-Euclidean geometry, where the parallel postulate of Euclidean geometry no longer holds.
The Russian mathematician Nikolai Ivanovich Lobachevsky and his rival, the Hungarian mathematician János Bolyai, independently defined and studied hyperbolic geometry, where uniqueness of parallels no longer holds. In this geometry the angles of a triangle add up to less than 180°. Elliptic geometry was developed later in the 19th century by the German mathematician Bernhard Riemann; here no parallel can be found and the angles in a triangle add up to more than 180°. Riemann also developed Riemannian geometry, which unifies and vastly generalizes the three types of geometry, and he defined the concept of a manifold, which generalizes the ideas of curves and surfaces.
The 19th century saw the beginning of a great deal of abstract algebra. Hermann Grassmann in Germany gave a first version of vector spaces, William Rowan Hamilton in Ireland developed noncommutative algebra. The British mathematician George Boole devised an algebra that soon evolved into what is now called Boolean algebra, in which the only numbers were 0 and 1. Boolean algebra is the starting point of mathematical logic and has important applications in electrical engineering and computer science.
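Boole's two-valued algebra can be verified exhaustively, since there are only finitely many assignments; for example, De Morgan's laws and distributivity:

```python
from itertools import product

NOT = lambda x: 1 - x  # complement on {0, 1}

# Boole's algebra on {0, 1}: verify De Morgan's laws and distributivity
# over every assignment of the variables -- finitely many, so check all.
for a, b, c in product([0, 1], repeat=3):
    assert NOT(a & b) == (NOT(a) | NOT(b))        # De Morgan
    assert NOT(a | b) == (NOT(a) & NOT(b))
    assert (a & (b | c)) == ((a & b) | (a & c))   # AND distributes over OR
print("Boolean identities hold on {0, 1}")
```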
Augustin-Louis Cauchy, Bernhard Riemann, and Karl Weierstrass reformulated the calculus in a more rigorous fashion.
Also, for the first time, the limits of mathematics were explored. Niels Henrik Abel, a Norwegian, and Évariste Galois, a Frenchman, proved that there is no general algebraic method for solving polynomial equations of degree greater than four (Abel–Ruffini theorem). Other 19th-century mathematicians utilized this in their proofs that straightedge and compass alone are not sufficient to trisect an arbitrary angle, to construct the side of a cube twice the volume of a given cube, nor to construct a square equal in area to a given circle. Mathematicians had vainly attempted to solve all of these problems since the time of the ancient Greeks. On the other hand, the limitation of three dimensions in geometry was surpassed in the 19th century through considerations of parameter space and hypercomplex numbers.
Abel and Galois's investigations into the solutions of various polynomial equations laid the groundwork for further developments of group theory, and the associated fields of abstract algebra. In the 20th century physicists and other scientists have seen group theory as the ideal way to study symmetry.
In the later 19th century, Georg Cantor established the first foundations of set theory, which enabled the rigorous treatment of the notion of infinity and has become the common language of nearly all mathematics. Cantor's set theory, and the rise of mathematical logic in the hands of Peano, L.E.J. Brouwer, David Hilbert, Bertrand Russell, and A.N. Whitehead, initiated a long running debate on the foundations of mathematics.
The 19th century saw the founding of a number of national mathematical societies: the London Mathematical Society in 1865, the Société Mathématique de France in 1872, the Circolo Matematico di Palermo in 1884, the Edinburgh Mathematical Society in 1883, and the American Mathematical Society in 1888. The first international, special-interest society, the Quaternion Society, was formed in 1899, in the context of a vector controversy.
In 1897, Hensel introduced p-adic numbers.
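A p-adic number is determined by its base-p digit expansion; for non-negative integers this is just ordinary base-p conversion, sketched here (the full field of Hensel's p-adic numbers also allows infinitely many digits to the left):

```python
def p_adic_digits(n, p, k=8):
    """First k digits (least significant first) of the p-adic expansion
    of a non-negative integer n, in the sense of Hensel's p-adic numbers."""
    digits = []
    for _ in range(k):
        digits.append(n % p)
        n //= p
    return digits

# 7-adic expansion of 2019: 2019 = 3 + 1*7 + 6*7^2 + 5*7^3
print(p_adic_digits(2019, 7, 4))  # -> [3, 1, 6, 5]
```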
The 20th century saw mathematics become a major profession. Every year, thousands of new Ph.D.s in mathematics were awarded, and jobs were available in both teaching and industry. An effort to catalogue the areas and applications of mathematics was undertaken in Klein's encyclopedia.
In a 1900 speech to the International Congress of Mathematicians, David Hilbert set out a list of 23 unsolved problems in mathematics. These problems, spanning many areas of mathematics, formed a central focus for much of 20th-century mathematics. Today, 10 have been solved, 7 are partially solved, and 2 are still open. The remaining 4 are too loosely formulated to be stated as solved or not.
Notable historical conjectures were finally proven. In 1976, Wolfgang Haken and Kenneth Appel proved the four color theorem, controversial at the time for the use of a computer to do so. Andrew Wiles, building on the work of others, proved Fermat's Last Theorem in 1995. Paul Cohen and Kurt Gödel proved that the continuum hypothesis is independent of (could neither be proved nor disproved from) the standard axioms of set theory. In 1998 Thomas Callister Hales proved the Kepler conjecture.
Mathematical collaborations of unprecedented size and scope took place. An example is the classification of finite simple groups (also called the "enormous theorem"), whose proof between 1955 and 2004 required 500-odd journal articles by about 100 authors, filling tens of thousands of pages. A group of French mathematicians, including Jean Dieudonné and André Weil, publishing under the pseudonym "Nicolas Bourbaki", attempted to exposit all of known mathematics as a coherent rigorous whole. The resulting several dozen volumes have had a controversial influence on mathematical education.
Differential geometry came into its own when Albert Einstein used it in general relativity. Entirely new areas of mathematics such as mathematical logic, topology, and John von Neumann's game theory changed the kinds of questions that could be answered by mathematical methods. All kinds of structures were abstracted using axioms and given names like metric spaces, topological spaces etc. As mathematicians do, the concept of an abstract structure was itself abstracted and led to category theory. Grothendieck and Serre recast algebraic geometry using sheaf theory. Large advances were made in the qualitative study of dynamical systems that Poincaré had begun in the 1890s.
Measure theory was developed in the late 19th and early 20th centuries. Applications of measures include the Lebesgue integral, Kolmogorov's axiomatisation of probability theory, and ergodic theory. Knot theory greatly expanded. Quantum mechanics led to the development of functional analysis. Other new areas include Laurent Schwartz's distribution theory, fixed point theory, singularity theory and René Thom's catastrophe theory, model theory, and Mandelbrot's fractals. Lie theory with its Lie groups and Lie algebras became one of the major areas of study.
Non-standard analysis, introduced by Abraham Robinson, rehabilitated the infinitesimal approach to calculus, which had fallen into disrepute in favour of the theory of limits, by extending the field of real numbers to the Hyperreal numbers which include infinitesimal and infinite quantities. An even larger number system, the surreal numbers were discovered by John Horton Conway in connection with combinatorial games.
The development and continual improvement of computers, at first mechanical analog machines and then digital electronic machines, allowed industry to deal with larger and larger amounts of data to facilitate mass production and distribution and communication, and new areas of mathematics were developed to deal with this: Alan Turing's computability theory; complexity theory; Derrick Henry Lehmer's use of ENIAC to further number theory and the Lucas-Lehmer test; Rózsa Péter's recursive function theory; Claude Shannon's information theory; signal processing; data analysis; optimization and other areas of operations research. In the preceding centuries much mathematical focus was on calculus and continuous functions, but the rise of computing and communication networks led to an increasing importance of discrete concepts and the expansion of combinatorics including graph theory. The speed and data processing abilities of computers also enabled the handling of mathematical problems that were too time-consuming to deal with by pencil and paper calculations, leading to areas such as numerical analysis and symbolic computation. Some of the most important methods and algorithms of the 20th century are: the simplex algorithm, the Fast Fourier Transform, error-correcting codes, the Kalman filter from control theory and the RSA algorithm of public-key cryptography.
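One of the named methods, the Lucas–Lehmer test for Mersenne primes, is short enough to sketch; the variant below uses Python integers rather than anything Lehmer's ENIAC-era hardware actually ran:

```python
def lucas_lehmer(p):
    """Lucas-Lehmer primality test for the Mersenne number 2^p - 1,
    with p an odd prime."""
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m  # the Lucas-Lehmer recurrence, reduced mod 2^p - 1
    return s == 0

# 2^13 - 1 = 8191 is prime; 2^11 - 1 = 2047 = 23 * 89 is not.
print(lucas_lehmer(13), lucas_lehmer(11))  # -> True False
```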
At the same time, deep insights were made about the limitations to mathematics. In 1929 and 1930, it was proved that the truth or falsity of all statements formulated about the natural numbers together with one of either addition or multiplication (but not both) was decidable, i.e. could be determined by some algorithm. In 1931, Kurt Gödel found that this was not the case for the natural numbers plus both addition and multiplication; this system, known as Peano arithmetic, was in fact incompletable. (Peano arithmetic is adequate for a good deal of number theory, including the notion of prime number.) A consequence of Gödel's two incompleteness theorems is that in any mathematical system that includes Peano arithmetic (including all of analysis and geometry), truth necessarily outruns proof, i.e. there are true statements that cannot be proved within the system. Hence mathematics cannot be reduced to mathematical logic, and David Hilbert's dream of making all of mathematics complete and consistent needed to be reformulated.
One of the more colorful figures in 20th-century mathematics was Srinivasa Aiyangar Ramanujan (1887–1920), an Indian autodidact who conjectured or proved over 3000 theorems, including properties of highly composite numbers, the partition function and its asymptotics, and mock theta functions. He also made major investigations in the areas of gamma functions, modular forms, divergent series, hypergeometric series and prime number theory.
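Ramanujan's partition function p(n) can be computed exactly with Euler's pentagonal number recurrence, which predates him but generates the values whose asymptotics he and Hardy analyzed:

```python
def partitions(n):
    """Number of partitions p(n), via Euler's pentagonal number recurrence --
    the function whose asymptotics Hardy and Ramanujan derived."""
    p = [1] + [0] * n
    for i in range(1, n + 1):
        k, sign = 1, 1
        while True:
            # generalized pentagonal numbers k(3k-1)/2 and k(3k+1)/2
            for g in (k * (3 * k - 1) // 2, k * (3 * k + 1) // 2):
                if g > i:
                    break
                p[i] += sign * p[i - g]
            if g > i:
                break
            k += 1
            sign = -sign
    return p[n]

print(partitions(100))  # -> 190569292
```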
Paul Erdős published more papers than any other mathematician in history, working with hundreds of collaborators. Mathematicians have a game equivalent to the Kevin Bacon Game, which leads to the Erdős number of a mathematician. This describes the "collaborative distance" between a person and Paul Erdős, as measured by joint authorship of mathematical papers.
Emmy Noether has been described by many as the most important woman in the history of mathematics. She studied the theories of rings, fields, and algebras.
As in most areas of study, the explosion of knowledge in the scientific age has led to specialization: by the end of the century there were hundreds of specialized areas in mathematics and the Mathematics Subject Classification was dozens of pages long. More and more mathematical journals were published and, by the end of the century, the development of the World Wide Web led to online publishing.
In 2000, the Clay Mathematics Institute announced the seven Millennium Prize Problems, and in 2003 the Poincaré conjecture was solved by Grigori Perelman (who declined to accept an award, as he was critical of the mathematics establishment).
Most mathematical journals now have online versions as well as print versions, and many online-only journals have been launched. There is an increasing drive towards open access publishing, first popularized by the arXiv.
There are many observable trends in mathematics, the most notable being that the subject is growing ever larger, computers are ever more important and powerful, the application of mathematics to bioinformatics is rapidly expanding, and the volume of data being produced by science and industry, facilitated by computers, is explosively expanding.
Hydrogen atom
A hydrogen atom is an atom of the chemical element hydrogen. The electrically neutral atom contains a single positively charged proton and a single negatively charged electron bound to the nucleus by the Coulomb force. Atomic hydrogen constitutes about 75% of the baryonic mass of the universe.
In everyday life on Earth, isolated hydrogen atoms (called "atomic hydrogen") are extremely rare. Instead, a hydrogen atom tends to combine with other atoms in compounds, or with another hydrogen atom to form ordinary (diatomic) hydrogen gas, H2. "Atomic hydrogen" and "hydrogen atom" in ordinary English use have overlapping, yet distinct, meanings. For example, a water molecule contains two hydrogen atoms, but does not contain atomic hydrogen (which would refer to isolated hydrogen atoms).
Atomic spectroscopy shows that there is a discrete infinite set of states in which a hydrogen (or any) atom can exist, contrary to the predictions of classical physics. Attempts to develop a theoretical understanding of the states of the hydrogen atom have been important to the history of quantum mechanics, since all other atoms can be roughly understood by knowing in detail about this simplest atomic structure.
The most abundant isotope, hydrogen-1, protium, or light hydrogen, contains no neutrons and is simply a proton and an electron. Protium is stable and makes up 99.985% of naturally occurring hydrogen atoms.
Deuterium contains one neutron and one proton. Deuterium is stable and makes up 0.0156% of naturally occurring hydrogen and is used in industrial processes like nuclear reactors and Nuclear Magnetic Resonance.
Tritium contains two neutrons and one proton and is not stable, decaying with a half-life of 12.32 years. Because of its short half-life, tritium does not exist in nature except in trace amounts.
Heavier isotopes of hydrogen are only created artificially in particle accelerators and have half-lives on the order of 10⁻²² seconds. They are unbound resonances located beyond the neutron drip line; this results in prompt emission of a neutron.
The formulas below are valid for all three isotopes of hydrogen, but slightly different values of the Rydberg constant (correction formula given below) must be used for each hydrogen isotope.
Lone neutral hydrogen atoms are rare under normal conditions. However, neutral hydrogen is common when it is covalently bound to another atom, and hydrogen atoms can also exist in cationic and anionic forms.
If a neutral hydrogen atom loses its electron, it becomes a cation. The resulting ion, which consists solely of a proton for the usual isotope, is written as "H+" and sometimes called "hydron". Free protons are common in the interstellar medium and the solar wind. In the context of aqueous solutions of classical Brønsted–Lowry acids, such as hydrochloric acid, it is actually hydronium, H3O+, that is meant. Instead of a literal ionized single hydrogen atom being formed, the acid transfers the hydrogen to H2O, forming H3O+.
If instead a hydrogen atom gains a second electron, it becomes an anion. The hydrogen anion is written as "H⁻" and called "hydride".
The hydrogen atom has special significance in quantum mechanics and quantum field theory as a simple two-body problem physical system which has yielded many simple analytical solutions in closed-form.
Experiments by Ernest Rutherford in 1909 showed the structure of the atom to be a dense, positive nucleus with a tenuous negative charge cloud around it. This immediately raised questions about how such a system could be stable. Classical electromagnetism had shown that any accelerating charge radiates energy, as shown by the Larmor formula. If the electron is assumed to orbit in a perfect circle and radiates energy continuously, the electron would rapidly spiral into the nucleus with a fall time of:

t ≈ a₀³ / (4 r₀² c) ≈ 1.6 × 10⁻¹¹ s,

where a₀ is the Bohr radius and r₀ is the classical electron radius. If this were true, all atoms would instantly collapse; however, atoms seem to be stable. Furthermore, the spiral inward would release a smear of electromagnetic frequencies as the orbit got smaller. Instead, atoms were observed to emit only discrete frequencies of radiation. The resolution would lie in the development of quantum mechanics.
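The classical fall time t = a₀³/(4r₀²c) evaluates to roughly 1.6 × 10⁻¹¹ s; a quick numerical check, using standard modern values for the constants:

```python
# Classical collapse time for an electron spiraling into the nucleus
# while radiating according to the Larmor formula.
a0 = 5.29177210903e-11   # Bohr radius, m
r0 = 2.8179403262e-15    # classical electron radius, m
c = 2.99792458e8         # speed of light, m/s

t = a0 ** 3 / (4 * r0 ** 2 * c)
print(f"{t:.2e} s")      # ~1.6e-11 s: classical atoms would collapse almost instantly
```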
In 1913, Niels Bohr obtained the energy levels and spectral frequencies of the hydrogen atom after making a number of simple assumptions in order to correct the failed classical model. The assumptions included:
Bohr supposed that the electron's angular momentum is quantized with possible values:

L = nħ, where n = 1, 2, 3, …

and ħ is the Planck constant over 2π. He also supposed that the centripetal force which keeps the electron in its orbit is provided by the Coulomb force, and that energy is conserved. Bohr derived the energy of each orbit of the hydrogen atom to be:

Eₙ = −mₑe⁴ / (8 ε₀² h² n²),

where mₑ is the electron mass, e is the electron charge, ε₀ is the vacuum permittivity, and n is the quantum number (now known as the principal quantum number). Bohr's predictions matched experiments measuring the hydrogen spectral series to the first order, giving more confidence to a theory that used quantized values.
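The Bohr levels Eₙ = −13.6 eV/n² immediately reproduce the observed spectral series; for example, the Balmer lines (transitions down to n = 2), sketched with standard constants:

```python
# Bohr energy levels and the resulting photon wavelengths
# for the Balmer series (transitions down to n = 2).
RY_EV = 13.605693        # Rydberg unit of energy, eV
HC_EV_NM = 1239.841984   # h*c in eV*nm

def energy(n):
    return -RY_EV / n ** 2

for n in (3, 4, 5):
    wavelength = HC_EV_NM / (energy(n) - energy(2))
    print(f"n={n} -> 2: {wavelength:.1f} nm")
# H-alpha comes out near 656 nm and H-beta near 486 nm, matching observation
```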
For n = 1, the value

E₁ = −13.6 eV

is called the Rydberg unit of energy. It is related to the Rydberg constant R∞ of atomic physics by 1 Ry ≡ hcR∞.
The exact value of the Rydberg constant assumes that the nucleus is infinitely massive with respect to the electron. For hydrogen-1, hydrogen-2 (deuterium), and hydrogen-3 (tritium), which have finite mass, the constant must be slightly modified to use the reduced mass of the system, rather than simply the mass of the electron. This includes the kinetic energy of the nucleus in the problem, because the total (electron plus nuclear) kinetic energy is equivalent to the kinetic energy of the reduced mass moving with a velocity equal to the electron velocity relative to the nucleus. However, since the nucleus is much heavier than the electron, the electron mass and reduced mass are nearly the same. The Rydberg constant R_M for a hydrogen atom (one electron) is given by R_M = R_∞ / (1 + m_e/M), where M is the mass of the atomic nucleus. For hydrogen-1, the quantity m_e/M is about 1/1836 (i.e. the electron-to-proton mass ratio). For deuterium and tritium, the ratios are about 1/3670 and 1/5497 respectively. These figures, when added to 1 in the denominator, represent very small corrections in the value of R_M, and thus only small corrections to all energy levels in corresponding hydrogen isotopes.
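The reduced-mass correction, R_M = R_∞ / (1 + m_e/M), is easy to evaluate numerically; a sketch (the mass ratios below are the rounded values quoted above, not exact figures):

```python
# Reduced-mass correction to the Rydberg constant for a nucleus of mass M:
# R_M = R_inf / (1 + m_e / M).

R_INF = 10973731.568  # Rydberg constant, m^-1 (rounded)

def rydberg_corrected(electron_to_nucleus_ratio: float) -> float:
    """Rydberg constant for a nucleus of finite mass."""
    return R_INF / (1.0 + electron_to_nucleus_ratio)

# H-1, deuterium, tritium: ratios ~1/1836, ~1/3670, ~1/5497
for name, ratio in [("H-1", 1 / 1836.0), ("H-2", 1 / 3670.0), ("H-3", 1 / 5497.0)]:
    print(name, rydberg_corrected(ratio))  # each slightly below R_INF
```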
There were still problems with Bohr's model: it failed to predict the spectra of larger atoms, it could not account for the relative intensities of spectral lines, and it could not explain the fine structure of the spectrum or the Zeeman effect.
Most of these shortcomings were resolved by Arnold Sommerfeld's modification of the Bohr model. Sommerfeld introduced two additional degrees of freedom, allowing an electron to move on an elliptical orbit characterized by its eccentricity and declination with respect to a chosen axis. This introduced two additional quantum numbers, which correspond to the orbital angular momentum and its projection on the chosen axis. Thus the correct multiplicity of states (except for the factor 2 accounting for the yet unknown electron spin) was found. Further, by applying special relativity to the elliptic orbits, Sommerfeld succeeded in deriving the correct expression for the fine structure of hydrogen spectra (which happens to be exactly the same as in the most elaborate Dirac theory). However, some observed phenomena, such as the anomalous Zeeman effect, remained unexplained. These issues were resolved with the full development of quantum mechanics and the Dirac equation. It is often alleged that the Schrödinger equation is superior to the Bohr–Sommerfeld theory in describing the hydrogen atom. This is not the case, as most of the results of both approaches coincide or are very close (a remarkable exception is the problem of the hydrogen atom in crossed electric and magnetic fields, which cannot be self-consistently solved in the framework of the Bohr–Sommerfeld theory), and in both theories the main shortcomings result from the absence of the electron spin. It was the complete failure of the Bohr–Sommerfeld theory to explain many-electron systems (such as the helium atom or the hydrogen molecule) which demonstrated its inadequacy in describing quantum phenomena.
The Schrödinger equation allows one to calculate the stationary states and also the time evolution of quantum systems. Exact analytical answers are available for the nonrelativistic hydrogen atom. Before presenting a formal account, we give an elementary overview.
Given that the hydrogen atom contains a nucleus and an electron, quantum mechanics allows one to predict the probability of finding the electron at any given radial distance r. It is given by the square of a mathematical function known as the "wavefunction," which is a solution of the Schrödinger equation. The lowest energy equilibrium state of the hydrogen atom is known as the ground state. The ground state wave function is known as the 1s wavefunction. It is written as: ψ_1s(r) = e^(-r/a_0) / (√π a_0^(3/2)).
Here, a_0 is the numerical value of the Bohr radius. The probability of finding the electron at a distance r in any radial direction is the squared value of the wavefunction: |ψ_1s(r)|^2 = e^(-2r/a_0) / (π a_0^3).
The 1s wavefunction is spherically symmetric, and the surface area of a shell at distance r is 4πr^2, so the total probability P(r) dr of the electron being in a shell at a distance r and thickness dr is P(r) dr = 4πr^2 |ψ_1s(r)|^2 dr = (4r^2/a_0^3) e^(-2r/a_0) dr.
It turns out that this is a maximum at r = a_0. That is, the Bohr picture of an electron orbiting the nucleus at radius a_0 is recovered as a statistically valid result. However, although the electron is most likely to be found in a Bohr orbit, there is a finite probability that the electron may be at any other distance r, with the probability indicated by the square of the wavefunction. Since the probability of finding the electron somewhere in the whole volume is unity, the integral of P(r) is unity. Then we say that the wavefunction is properly normalized.
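A short numerical check of these two statements, working in units of the Bohr radius: the radial probability density of the 1s state, P(r) = (4r²/a₀³) e^(-2r/a₀), peaks at r = a₀ and integrates to one. This is a crude sketch using a grid search and trapezoidal integration:

```python
# Verify numerically that the 1s radial probability density peaks at the Bohr
# radius and is normalized. Units: a0 = 1.
import math

def radial_prob(r: float) -> float:
    """P(r) = 4 r^2 exp(-2 r) in units where a0 = 1."""
    return 4.0 * r**2 * math.exp(-2.0 * r)

# crude grid search for the maximum on (0, 20 a0]
rs = [i * 0.001 for i in range(1, 20001)]
r_peak = max(rs, key=radial_prob)

# trapezoidal integral of P(r) dr over [0.001, 20] (tails are negligible)
total = sum(0.001 * 0.5 * (radial_prob(r) + radial_prob(r + 0.001)) for r in rs[:-1])

print(r_peak)  # close to 1.0, i.e. the Bohr radius
print(total)   # close to 1.0, i.e. normalized
```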
As discussed below, the ground state 1s is also indicated by the quantum numbers (n = 1, ℓ = 0, m = 0). The second lowest energy states, just above the ground state, are given by the quantum numbers (2, 0, 0), (2, 1, 0), and (2, 1, ±1). These n = 2 states all have the same energy and are known as the 2s and 2p states. There is one 2s state:
and there are three 2p states:
An electron in the 2s or 2p state is most likely to be found in the second Bohr orbit with energy given by the Bohr formula.
The Hamiltonian of the hydrogen atom is the sum of the radial kinetic energy operator and the Coulomb attraction between the positive proton and the negative electron. Using the time-independent Schrödinger equation, ignoring all spin-coupling interactions and using the reduced mass μ, the equation is written as:
Expanding the Laplacian in spherical coordinates:
This is a separable partial differential equation which can be solved in terms of special functions. The normalized position wavefunctions, given in spherical coordinates, are:
where:
The quantum numbers can take the following values: n = 1, 2, 3, …; ℓ = 0, 1, 2, …, n - 1; m = -ℓ, …, -1, 0, 1, …, ℓ.
Additionally, these wavefunctions are "normalized" (i.e., the integral of their modulus square equals 1) and orthogonal:
where |nℓm⟩ is the state represented by the wavefunction ψ_nℓm in Dirac notation, and δ is the Kronecker delta function.
The wavefunctions in momentum space are related to the wavefunctions in position space through a Fourier transform
which, for the bound states, results in
where C denotes a Gegenbauer polynomial and p is in units of ħ/a_0.
The solutions to the Schrödinger equation for hydrogen are analytical, giving a simple expression for the hydrogen energy levels and thus the frequencies of the hydrogen spectral lines; they fully reproduce the Bohr model and go beyond it. The solutions also yield two other quantum numbers and the shape of the electron's wave function ("orbital") for the various possible quantum-mechanical states, thus explaining the anisotropic character of atomic bonds.
The Schrödinger equation also applies to more complicated atoms and molecules. When there is more than one electron or nucleus, the solution is not analytical, and either computer calculations are necessary or simplifying assumptions must be made.
Since the Schrödinger equation is only valid for non-relativistic quantum mechanics, the solutions it yields for the hydrogen atom are not entirely correct. The Dirac equation of relativistic quantum theory improves these solutions (see below).
The solution of the Schrödinger equation (wave equation) for the hydrogen atom uses the fact that the Coulomb potential produced by the nucleus is isotropic (it is radially symmetric in space and only depends on the distance to the nucleus). Although the resulting energy eigenfunctions (the "orbitals") are not necessarily isotropic themselves, their dependence on the angular coordinates follows completely generally from this isotropy of the underlying potential: the eigenstates of the Hamiltonian (that is, the energy eigenstates) can be chosen as simultaneous eigenstates of the angular momentum operator. This corresponds to the fact that angular momentum is conserved in the orbital motion of the electron around the nucleus. Therefore, the energy eigenstates may be classified by two angular momentum quantum numbers, ℓ and m (both are integers). The angular momentum quantum number ℓ determines the magnitude of the angular momentum. The magnetic quantum number m determines the projection of the angular momentum on the (arbitrarily chosen) z-axis.
In addition to mathematical expressions for total angular momentum and angular momentum projection of wavefunctions, an expression for the radial dependence of the wave functions must be found. It is only here that the details of the 1/r Coulomb potential enter (leading to Laguerre polynomials in r). This leads to a third quantum number, the principal quantum number n. The principal quantum number in hydrogen is related to the atom's total energy.
Note that the maximum value of the angular momentum quantum number is limited by the principal quantum number: it can run only up to n - 1, i.e., ℓ = 0, 1, …, n - 1.
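This constraint, together with |m| ≤ ℓ, fixes the number of orbitals in each shell. A minimal sketch enumerating the allowed combinations:

```python
# Enumerate the allowed (n, l, m) combinations for a given principal quantum
# number n: l runs from 0 to n-1 and m from -l to +l, giving n^2 orbitals
# (2 n^2 states once the two spin projections are included).

def hydrogen_states(n: int):
    return [(n, l, m) for l in range(n) for m in range(-l, l + 1)]

for n in (1, 2, 3):
    print(n, len(hydrogen_states(n)))  # degeneracy grows as n^2
```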
Due to angular momentum conservation, states of the same ℓ but different m have the same energy (this holds for all problems with rotational symmetry). In addition, for the hydrogen atom, states of the same n but different ℓ are also degenerate (i.e., they have the same energy). However, this is a specific property of hydrogen and is no longer true for more complicated atoms which have an (effective) potential differing from the form 1/r (due to the presence of the inner electrons shielding the nucleus potential).
Taking into account the spin of the electron adds a last quantum number, the projection of the electron's spin angular momentum along the z-axis, which can take on two values. Therefore, any eigenstate of the electron in the hydrogen atom is described fully by four quantum numbers. According to the usual rules of quantum mechanics, the actual state of the electron may be any superposition of these states. This explains also why the choice of z-axis for the directional quantization of the angular momentum vector is immaterial: an orbital of given ℓ and m obtained for another preferred axis z′ can always be represented as a suitable superposition of the various states of different m (but same ℓ) that have been obtained for z.
In 1928, Paul Dirac found an equation that was fully compatible with special relativity, and (as a consequence) made the wave function a 4-component "Dirac spinor" including "up" and "down" spin components, with both positive and "negative" energy (or matter and antimatter). The solution to this equation gave the following results, more accurate than the Schrödinger solution.
The energy levels of hydrogen, including fine structure (excluding Lamb shift and hyperfine structure), are given by the Sommerfeld fine structure expression:
where α is the fine-structure constant and j is the total angular momentum quantum number, which is equal to ℓ ± 1/2 depending on the orientation of the electron spin relative to the orbital angular momentum. This formula represents a small correction to the energy obtained by Bohr and Schrödinger as given above. The factor in square brackets in the last expression is nearly one; the extra term arises from relativistic effects (see below for details). It is worth noting that this expression was first obtained by A. Sommerfeld in 1916 based on the relativistic version of the old Bohr theory. Sommerfeld, however, used a different notation for the quantum numbers.
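To first order in α², the Sommerfeld expression reduces to the standard expansion E_nj = -(13.6 eV/n²) [1 + (α²/n²)(n/(j + 1/2) - 3/4)]. A sketch evaluating this expansion (constants rounded; this is the first-order form, not the full closed expression):

```python
# First-order fine-structure energies of hydrogen:
# E_nj = -(Ry / n^2) * [1 + (alpha^2 / n^2) * (n / (j + 1/2) - 3/4)].

ALPHA = 1 / 137.035999   # fine-structure constant (rounded)
RYDBERG_EV = 13.605693   # Rydberg unit of energy in eV

def energy_fine_structure(n: int, j: float) -> float:
    bracket = 1.0 + (ALPHA**2 / n**2) * (n / (j + 0.5) - 0.75)
    return -(RYDBERG_EV / n**2) * bracket

# The n = 2 level splits by j: the j = 1/2 states (2S1/2, 2P1/2) lie slightly
# below the j = 3/2 state (2P3/2).
print(energy_fine_structure(2, 0.5))
print(energy_fine_structure(2, 1.5))
```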
The coherent states have been proposed as
which satisfies formula_108 and takes the form
The image to the right shows the first few hydrogen atom orbitals (energy eigenfunctions). These are cross-sections of the probability density that are color-coded (black represents zero density and white represents the highest density). The angular momentum (orbital) quantum number "ℓ" is denoted in each column, using the usual spectroscopic letter code ("s" means "ℓ" = 0, "p" means "ℓ" = 1, "d" means "ℓ" = 2). The main (principal) quantum number "n" (= 1, 2, 3, ...) is marked to the right of each row. For all pictures the magnetic quantum number "m" has been set to 0, and the cross-sectional plane is the "xz"-plane ("z" is the vertical axis). The probability density in three-dimensional space is obtained by rotating the one shown here around the "z"-axis.
The "ground state", i.e. the state of lowest energy, in which the electron is usually found, is the first one, the 1"s" state (principal quantum level "n" = 1, "ℓ" = 0).
Black lines occur in each but the first orbital: these are the nodes of the wavefunction, i.e. where the probability density is zero. (More precisely, the nodes are spherical harmonics that appear as a result of solving Schrödinger equation in spherical coordinates.)
The quantum numbers determine the layout of these nodes: an orbital with quantum numbers n and ℓ has n - ℓ - 1 radial nodes and ℓ angular nodes, for n - 1 nodes in total.
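The node-counting rule (n - ℓ - 1 radial nodes, ℓ angular nodes) can be written as a minimal sketch:

```python
# Node count for a hydrogen orbital (n, l): n - l - 1 radial nodes and
# l angular nodes, so n - 1 nodes in total.

def node_counts(n: int, l: int):
    assert 0 <= l < n, "l must satisfy 0 <= l <= n - 1"
    radial = n - l - 1
    angular = l
    return radial, angular

print(node_counts(1, 0))  # (0, 0): the 1s orbital has no nodes
print(node_counts(3, 1))  # (1, 1): a 3p orbital has one of each
```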
There are several important effects that are neglected by the Schrödinger equation and which are responsible for certain small but measurable deviations of the real spectral lines from the predicted ones:
Both of these features (and more) are incorporated in the relativistic Dirac equation, with predictions that come still closer to experiment. Again the Dirac equation may be solved analytically in the special case of a two-body system, such as the hydrogen atom. The resulting solution quantum states now must be classified by the total angular momentum number j (arising through the coupling between electron spin and orbital angular momentum). States of the same j and the same n are still degenerate. Thus, direct analytical solution of the Dirac equation predicts the 2S1/2 and 2P1/2 levels of hydrogen to have exactly the same energy, which is in contradiction with observation (the Lamb–Retherford experiment).
For these developments, it was essential that the solution of the Dirac equation for the hydrogen atom could be worked out exactly, such that any experimentally observed deviation had to be taken seriously as a signal of failure of the theory.
In the language of Heisenberg's matrix mechanics, the hydrogen atom was first solved by Wolfgang Pauli using a rotational symmetry in four dimensions [O(4)-symmetry] generated by the angular momentum
and the Laplace–Runge–Lenz vector. By extending the symmetry group O(4) to the dynamical group O(4,2),
the entire spectrum and all transitions were embedded in a single irreducible group representation.
In 1979 the (non-relativistic) hydrogen atom was solved for the first time within Feynman's path integral formulation
of quantum mechanics by Duru and Kleinert. This work greatly extended the range of applicability of Feynman's method.
Elagabalus
Elagabalus or Heliogabalus (204 – 11 March 222) was Roman emperor from 218 to 222. A member of the Severan dynasty, he was the second son of Julia Soaemias and Sextus Varius Marcellus. In his early youth he served as a priest of the god Elagabal in Emesa, Syria, the hometown of his mother's family, the Emesan dynasty of Arab priest-kings. As a private citizen, he was probably named Sextus Varius Avitus Bassianus. Upon becoming emperor he took the name Marcus Aurelius Antoninus Augustus. He was called Elagabalus only after his death.
In 217, the emperor Caracalla was assassinated and replaced by his Praetorian prefect, Marcus Opellius Macrinus. Caracalla's maternal aunt, Julia Maesa, successfully instigated a revolt among the Third Legion to have her eldest grandson (and Caracalla's cousin), Elagabalus, declared emperor in his place. Macrinus was defeated on 8 June 218 at the Battle of Antioch. Elagabalus, barely 14 years old, became emperor, initiating a reign remembered mainly for sex scandals and religious controversy.
Later historians suggest Elagabalus showed a disregard for Roman religious traditions and sexual taboos. He replaced the traditional head of the Roman pantheon, Jupiter, with the deity Elagabal, of whom he had been high priest. He forced leading members of Rome's government to participate in religious rites celebrating this deity, over which he personally presided. Elagabalus was supposedly "married" as many as five times, and lavished favours on male courtiers popularly thought to have been his lovers. He was also reported to have prostituted himself in the imperial palace. His behavior estranged the Praetorian Guard, the Senate, and the common people alike. Amidst growing opposition, Elagabalus, just 18 years old, was assassinated on 11 March 222 and replaced by his cousin Severus Alexander, who ruled for 13 years before his own assassination, an event that marked the onset of the Crisis of the Third Century. The assassination plot against Elagabalus was devised by his grandmother, Julia Maesa, and carried out by disaffected members of the Praetorian Guard.
Elagabalus developed a reputation among his contemporaries for extreme eccentricity, decadence, and zealotry. This tradition has persisted, and with writers of the early modern age he suffers one of the worst reputations among Roman emperors. Edward Gibbon, for example, wrote that Elagabalus "abandoned himself to the grossest pleasures with ungoverned fury". According to Barthold Georg Niebuhr, "The name Elagabalus is branded in history above all others" because of his "unspeakably disgusting life". An example of a modern historian's assessment would be Adrian Goldsworthy's: "Elagabalus was not a tyrant, but he was an incompetent, probably the least able emperor Rome had ever had."
Elagabalus is considered by some to be an early transgender figure and one of the first on record as seeking sex reassignment surgery.
Elagabalus was born around the year 204 to Sextus Varius Marcellus and Julia Soaemias Bassiana. His father was initially a member of the Equites class, but was later elevated to the rank of senator. His grandmother, Julia Maesa, was the widow of the consul Julius Avitus, the sister of Julia Domna, and the sister-in-law of the emperor Septimius Severus. He had at least one sibling: an unnamed elder brother. His mother, Julia Soaemias, was a cousin of the emperor Caracalla. His other relatives included his aunt Julia Avita Mamaea and uncle Marcus Julius Gessius Marcianus and among their children, their son Severus Alexander. Elagabalus's family held hereditary rights to the priesthood of the sun god Elagabal, of whom Elagabalus was the high priest at Emesa (modern Homs) in Roman Syria as part of the Arab Emesan dynasty.
The deity Elagabalus was initially venerated at Emesa. This form of the god's name is a Latinized version of the Arabic "Ilāh hag-Gabal", which derives from "Ilāh" (an Arabic word for "god") and "gabal" (an Arabic word for "mountain"), resulting in "the God of the Mountain", the Emesene manifestation of the deity.
The cult of the deity spread to other parts of the Roman Empire in the 2nd century; a dedication has been found as far away as Woerden (Netherlands), near the Roman "limes". The god was later imported and assimilated with the Roman sun god known as Sol Indiges in republican times and as Sol Invictus during the second and third centuries CE. In Greek the sun god is Helios, hence "Heliogabalus", a hybrid conjunction of "Helios" and "Elagabalus".
When the Emperor Macrinus came to power, he suppressed the threat against his reign from the family of his assassinated predecessor, Caracalla, by exiling them—Julia Maesa, her two daughters, and her eldest grandson Elagabalus—to their estate at Emesa in Syria. Almost upon arrival in Syria, Maesa began a plot with her advisor and Elagabalus' tutor, Gannys, to overthrow Macrinus and elevate the fourteen-year-old Elagabalus to the imperial throne.
His mother publicly declared that he was the illegitimate son of Caracalla, and therefore deserving the loyalty of Roman soldiers and senators who had sworn allegiance to Caracalla. After Julia Maesa displayed her wealth to the Third Legion at Raphana they swore allegiance to Elagabalus. At sunrise on 16 May 218, Publius Valerius Comazon, commander of the legion, declared him emperor. To strengthen his legitimacy Elagabalus assumed Caracalla's names, "Marcus Aurelius Antoninus".
In response Macrinus dispatched his Praetorian prefect Ulpius Julianus to the region with a contingent of troops he considered strong enough to crush the rebellion. However, this force soon joined the faction of Elagabalus when, during the battle, they turned on their own commanders. The officers were killed and Julianus' head was sent back to the emperor.
Macrinus now sent letters to the Senate denouncing Elagabalus as the "False Antoninus" and claiming he was insane. Both consuls and other high-ranking members of Rome's leadership condemned Elagabalus, and the Senate subsequently declared war on both Elagabalus and Julia Maesa.
Macrinus and his son, weakened by the desertion of the Second Legion due to bribes and promises circulated by Julia Maesa, were defeated on 8 June 218 at the Battle of Antioch by troops commanded by Gannys. Macrinus fled to Italy, disguised as a courier, but was intercepted near Chalcedon and executed in Cappadocia. His son Diadumenian, sent as a friendly hostage to the Parthian court as a guarantee of peace between the states, was captured at Zeugma and also put to death.
Elagabalus declared the date of the victory at Antioch to be the beginning of his reign and assumed the imperial titles without prior senatorial approval. This violated tradition but was a common practice among 3rd-century emperors nonetheless. Letters of reconciliation were dispatched to Rome extending amnesty to the Senate and recognizing the laws, while also condemning the administration of Macrinus and his son.
The senators responded by acknowledging Elagabalus as emperor and accepting his claim to be the son of Caracalla. Caracalla and Julia Domna were both deified by the Senate, both Julia Maesa and Julia Soaemias were elevated to the rank of Augustae, and the memory of both Macrinus and Diadumenian was expunged by the Senate. The former commander of the Third Legion, Comazon, was appointed commander of the Praetorian Guard.
Elagabalus and his entourage spent the winter of 218 in Bithynia at Nicomedia, where the emperor's religious beliefs first presented themselves as a problem. The contemporary historian Cassius Dio suggests that Gannys was in fact killed by the new emperor because he pressured Elagabalus to live "temperately and prudently". To help Romans adjust to having an oriental priest as emperor, Julia Maesa had a painting of Elagabalus in priestly robes sent to Rome and hung over a statue of the goddess Victoria in the Senate House. This placed senators in the awkward position of having to make offerings to Elagabalus whenever they made offerings to Victoria.
The legions were dismayed by his behaviour and quickly came to regret having supported his accession. While Elagabalus was still on his way to Rome, brief revolts broke out by the Fourth Legion at the instigation of Gellius Maximus, and by the Third Legion, which itself had been responsible for the elevation of Elagabalus to the throne, under the command of Senator Verus. The rebellion was quickly put down, and the Third Legion disbanded.
When the entourage reached Rome in the autumn of 219, Comazon and other allies of Julia Maesa and Elagabalus were given powerful and lucrative positions, to the chagrin of many senators who did not consider them worthy of such privileges. After his tenure as Praetorian prefect, Comazon served as the city prefect of Rome three times, and as consul twice. Elagabalus soon devalued the Roman currency. He decreased the silver purity of the "denarius" from 58% to 46.5% — the actual silver weight dropping from 1.82 grams to 1.41 grams. He also demonetized the "antoninianus" during this period in Rome.
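A back-of-the-envelope check of the debasement figures quoted above (purity 58% to 46.5%, silver weight 1.82 g to 1.41 g per denarius); the implied total coin masses are inferred from these numbers, not directly attested:

```python
# Sanity-check the denarius debasement figures: from the silver weight and the
# purity we can infer the total coin mass and the cut in silver content.

before = {"purity": 0.58, "silver_g": 1.82}
after = {"purity": 0.465, "silver_g": 1.41}

coin_mass_before = before["silver_g"] / before["purity"]  # implied mass, ~3.14 g
coin_mass_after = after["silver_g"] / after["purity"]     # implied mass, ~3.03 g
silver_cut = 1 - after["silver_g"] / before["silver_g"]   # ~22.5% less silver

print(round(coin_mass_before, 2), round(coin_mass_after, 2), round(silver_cut, 3))
```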
Elagabalus tried to have his presumed lover, the charioteer Hierocles, declared Caesar, while another alleged lover, the athlete Aurelius Zoticus, was appointed to the non-administrative but influential position of Master of the Chamber, or "Cubicularius". His offer of amnesty for the Roman upper class was largely honoured, though the jurist Ulpian was exiled.
The relationships between Julia Maesa, Julia Soaemias, and Elagabalus were strong at first. His mother and grandmother became the first women to be allowed into the Senate, and both received senatorial titles: Soaemias the established title of "Clarissima," and Maesa the more unorthodox "Mater Castrorum et Senatus" ("Mother of the army camp and of the Senate"). They held the title of "Augusta" as well, suggesting that they may have been the power behind the throne. Indeed, they exercised great influence over the young emperor throughout his reign, and can be found on many coins and inscriptions—a rare honor for Roman women.
Since the reign of Septimius Severus, sun worship had increased throughout the Empire. Elagabalus saw this as an opportunity to install Elagabal as the chief deity of the Roman pantheon. The god was renamed "Deus Sol Invictus", meaning "God the Undefeated Sun", and honored above Jupiter.
As a token of respect for Roman religion, however, Elagabalus joined either Astarte, Minerva, Urania, or some combination of the three to Elagabal as consort. A union between Elagabal and a traditional goddess would have served to strengthen ties between the new religion and the imperial cult. In fact, there may have been an effort to introduce Elagabal, Urania, and Athena as the new Capitoline triad of Rome—replacing Jupiter, Juno, and Minerva.
He aroused further discontent when he married the Vestal Virgin Aquilia Severa, claiming the marriage would produce "godlike children". This was a flagrant breach of Roman law and tradition, which held that any Vestal found to have engaged in sexual intercourse was to be buried alive.
A lavish temple called the Elagabalium was built on the east face of the Palatine Hill to house Elagabal, who was represented by a black conical meteorite from Emesa. Herodian wrote "this stone is worshipped as though it were sent from heaven; on it there are some small projecting pieces and markings that are pointed out, which the people would like to believe are a rough picture of the sun, because this is how they see them".
In order to become the high priest of his new religion, Elagabalus had himself circumcised. He forced senators to watch while he danced around the altar of Deus Sol Invictus to the accompaniment of drums and cymbals. Each summer solstice he held a festival dedicated to the god, which became popular with the masses because of the free food distributed on these occasions. During this festival, Elagabalus placed the Emesa stone on a chariot adorned with gold and jewels, which he paraded through the city:
The most sacred relics from the Roman religion were transferred from their respective shrines to the Elagabalium, including the emblem of the Great Mother, the fire of Vesta, the Shields of the Salii, and the Palladium, so that no other god could be worshipped except in association with Elagabal.
The question of Elagabalus' sexual orientation is confused, owing to salacious and unreliable sources. Elagabalus married and divorced five women, three of whom are known. His first wife was Julia Cornelia Paula; the second was the Vestal Virgin Julia Aquilia Severa.
Within a year, he abandoned her and married Annia Aurelia Faustina, a descendant of Marcus Aurelius and the widow of a man he had recently had executed. He had returned to his second wife Severa by the end of the year. According to Cassius Dio, his most stable relationship seems to have been with his chariot driver, a blond slave from Caria named Hierocles, whom he referred to as his husband.
The "Augustan History" claims that he also married a man named Zoticus, an athlete from Smyrna, in a public ceremony at Rome. Cassius Dio reported that Elagabalus would paint his eyes, depilate his body hair and wear wigs before prostituting himself in taverns, brothels, and even in the imperial palace:
Herodian commented that Elagabalus enhanced his natural good looks by the regular application of cosmetics. Cassius Dio says Elagabalus delighted in being called Hierocles' mistress, wife, and queen. The emperor reportedly wore makeup and wigs, preferred to be called a lady and not a lord, and offered vast sums to any physician who could provide him with a vagina; for this reason, the emperor is seen by some writers as an early transgender figure and one of the first on record as seeking sex reassignment surgery.
By 221 Elagabalus' eccentricities, particularly his relationship with Hierocles, increasingly provoked the soldiers of the Praetorian Guard. When Elagabalus' grandmother Julia Maesa perceived that popular support for the emperor was waning, she decided that he and his mother, who had encouraged his religious practices, had to be replaced. As alternatives, she turned to her other daughter, Julia Avita Mamaea, and her daughter's son, the fifteen-year-old Severus Alexander.
Prevailing on Elagabalus, she arranged that he appoint his cousin Alexander as his heir and that the boy be given the title of "Caesar". Alexander shared the consulship with the emperor that year. However, Elagabalus reconsidered this arrangement when he began to suspect that the Praetorian Guard preferred his cousin to himself.
Following the failure of various attempts on Alexander's life, Elagabalus stripped his cousin of his titles, revoked his consulship, and invented the rumor that Alexander was near death, in order to see how the Praetorians would react. A riot ensued, and the Guard demanded to see Elagabalus and Alexander in the Praetorian camp.
The Emperor complied and on 11 March 222 he publicly presented his cousin along with his own mother, Julia Soaemias. On their arrival the soldiers started cheering Alexander while ignoring Elagabalus, who ordered the summary arrest and execution of anyone who had taken part in this display of insubordination. In response, members of the Praetorian Guard attacked Elagabalus and his mother:
Following his assassination, many associates of Elagabalus were killed or deposed, including his lover Hierocles. His religious edicts were reversed and the stone of Elagabal was sent back to Emesa. Women were again barred from attending meetings of the Senate. The practice of "damnatio memoriae"—erasing from the public record a disgraced personage formerly of note—was systematically applied in his case. Several images, including an over-life-size statue of him as Hercules that is now in Naples, were re-carved with the face of Alexander Severus.
The source of many of these stories of Elagabalus's depravity is the "Augustan History" ("Historia Augusta"), which includes controversial claims. It is most likely that the "Historia Augusta" was written towards the end of the 4th century, during the reign of Emperor Theodosius I. The life of Elagabalus as described in the "Augustan History" is of uncertain historical merit. Sections 13 to 17, relating to the fall of Elagabalus, are less controversial among historians.
Sources often considered more credible than the "Augustan History" include the contemporary historians Cassius Dio and Herodian. Cassius Dio lived from the second half of the 2nd century until sometime after 229. Born into a patrician family, he spent the greater part of his life in public service. He was a senator under emperor Commodus and governor of Smyrna after the death of Septimius Severus. Afterwards, he served as suffect consul around 205, and as proconsul in Africa and Pannonia.
Severus Alexander held him in high esteem and made him his consul again. His "Roman History" spans nearly a millennium, from the arrival of Aeneas in Italy until the year 229. As a contemporary of Elagabalus, Cassius Dio's account of his reign is generally considered more reliable than the "Augustan History", although by his own admission Dio spent the greater part of the relevant period outside of Rome and had to rely on second-hand information.
Furthermore, the political climate in the aftermath of Elagabalus' reign, as well as Dio's own position within the government of Alexander, likely influenced the accuracy of this part of his history for the worse. Dio regularly refers to Elagabalus as Sardanapalus, partly to distinguish him from his divine namesake, but chiefly to do his part in maintaining the "damnatio memoriae" and to associate him with another autocrat notorious for a dissolute life.
Another contemporary of Elagabalus' was Herodian, a minor Roman civil servant who lived from c. 170 until 240. His work, "History of the Roman Empire since Marcus Aurelius", commonly abbreviated as "Roman History", is an eyewitness account of the reign of Commodus until the beginning of the reign of Gordian III. His work largely overlaps with Dio's own "Roman History", but the texts, written independently of each other, agree more often than not about the emperor and his short but eventful reign.
Although Herodian is not deemed as reliable as Cassius Dio, his lack of literary and scholarly pretensions makes him less biased than senatorial historians. Herodian is considered the most important source for the religious reforms which took place during the reign of Elagabalus, which have been confirmed by numismatic and archaeological evidence.
For readers of the modern age, "The History of the Decline and Fall of the Roman Empire" by Edward Gibbon (1737–1794) further cemented the scandalous reputation of Elagabalus. Gibbon not only accepted and expressed outrage at the allegations of the ancient historians, but he might have added some details of his own; he is the first historian known to claim that Gannys was a eunuch, for example.
The 20th-century anthropologist James George Frazer (author of "The Golden Bough") took seriously the monotheistic aspirations of the emperor, but also ridiculed him: "The dainty priest of the Sun [was] the most abandoned reprobate who ever sat upon a throne ... It was the intention of this eminently religious but crack-brained despot to supersede the worship of all the gods, not only at Rome but throughout the world, by the single worship of Elagabalus or the Sun."
The first book-length biography was "The Amazing Emperor Heliogabalus" (1911) by J. Stuart Hay, "a serious and systematic study" more sympathetic than that of previous historians, which nonetheless stressed the exoticism of Elagabalus, calling his reign one of "enormous wealth and excessive prodigality, luxury and aestheticism, carried to their ultimate extreme, and sensuality in all the refinements of its Eastern habit".
Some recent historians paint a more favorable picture of the emperor's rule. Martijn Icks, in "Images of Elagabalus" (2008; republished as "The Crimes of Elagabalus" in 2012), doubts the reliability of the ancient sources and argues that it was the emperor's unorthodox religious policies that alienated the power elite of Rome, to the point that his grandmother saw fit to eliminate him and replace him with his cousin. Leonardo de Arrizabalaga y Prado, in "The Emperor Elagabalus: Fact or Fiction?" (2008), is also critical of the ancient historians and speculates that neither religion nor sexuality played a role in the fall of the young emperor. He was simply the loser in a power struggle within the imperial family; the loyalty of the Praetorian Guards was up for sale, and Julia Maesa had the resources to outmaneuver and outbribe her grandson. In this version of events, once Elagabalus, his mother, and his immediate circle had been murdered, a campaign of character assassination began, resulting in a grotesque caricature that has persisted to the present day.
Due to the ancient tradition about him, Elagabalus became something of an (anti-)hero in the Decadent movement of the late 19th century. He often appears in literature and other creative media as the epitome of a young, amoral aesthete. His life and character have informed or at least inspired many famous works of art by Decadents and even contemporary artists.
https://en.wikipedia.org/wiki?curid=14227
Homeopathy
Homeopathy or homoeopathy is a pseudoscientific system of alternative medicine. It was created in 1796 by Samuel Hahnemann. Its practitioners, called homeopaths, believe that a substance that causes symptoms of a disease in healthy people would cure similar symptoms in sick people; this doctrine is called "similia similibus curentur", or "like cures like". Homeopathic preparations are termed "remedies" and are made using homeopathic dilution. In this process, a chosen substance is repeatedly and thoroughly diluted. The final product is chemically indistinguishable from the diluent, which is usually either distilled water, ethanol or sugar; often, not even a single molecule of the original substance can be expected to remain in the product. Between dilution iterations, homeopaths strike and/or vigorously shake the product, claiming that this makes the diluent "remember" the original substance after its removal. Practitioners claim that such preparations, upon oral intake, can treat or cure disease.
All relevant scientific knowledge about physics, chemistry, biochemistry and biology gained since at least the mid-19th century confirms that homeopathic remedies have no active content. They are biochemically inert, and have no effect on any known disease. Hahnemann's theory of disease, centered around principles he termed miasms, is inconsistent with subsequent identification of viruses and bacteria as causes of disease. Clinical trials have been conducted, and generally demonstrated no objective effect from homeopathic preparations. The fundamental implausibility of homeopathy as well as a lack of demonstrable effectiveness has led to it being characterized within the scientific and medical communities as quackery and nonsense.
After assessments of homeopathy, national and international bodies have recommended the withdrawal of government funding. Australia, the United Kingdom, Switzerland and France, as well as the European Academies' Science Advisory Council, and the Commission on Pseudoscience and Research Fraud of Russian Academy of Sciences each concluded that homeopathy is ineffective, and recommended against the practice receiving any further funding. Notably, France and England, where homeopathy was formerly prevalent, are in the process of removing all public funding. The National Health Service in England ceased funding homeopathic remedies in November 2017 and asked the Department of Health in the UK to add homeopathic remedies to the blacklist of forbidden prescription items, and France will remove funding by 2021. In November 2018, Spain also announced moves to ban homeopathy and other pseudotherapies.
Hahnemann rejected the mainstream medicine of the late 18th century as irrational and inadvisable because it was largely ineffective and often harmful. He advocated the use of single drugs at lower doses and promoted an immaterial, vitalistic view of how living organisms function. The outcome from no treatment and adequate rest was usually superior to mainstream medicine as practiced at the time of homeopathy's inception.
The term "homeopathy" was coined by Hahnemann and first appeared in print in 1807.
Hahnemann conceived of homeopathy while translating a medical treatise by the Scottish physician and chemist William Cullen into German. Sceptical of Cullen's theory concerning cinchona's use for curing malaria, Hahnemann ingested some bark specifically to investigate what would happen. He experienced fever, shivering and joint pain: symptoms similar to those of malaria itself. From this, Hahnemann came to believe that all effective drugs produce symptoms in healthy individuals similar to those of the diseases that they treat, in accord with the "law of similars" that had been proposed by ancient physicians. This principle gave the practice its name: "homeopathy" comes from the Greek "hómoios" ("like") and "páthos" ("suffering"). An account of the effects of eating cinchona bark noted by Oliver Wendell Holmes, published in 1861, failed to reproduce the symptoms Hahnemann reported, and Hahnemann's law of similars is an "ipse dixit" that does not derive from the scientific method.
Subsequent scientific work showed that cinchona cures malaria because it contains quinine, which kills the "Plasmodium falciparum" parasite that causes the disease; the mechanism of action is unrelated to Hahnemann's ideas.
Hahnemann began to test what effects substances may have produced in humans, a procedure later called "homeopathic proving". These tests required subjects to test the effects of ingesting substances by clearly recording all of their symptoms as well as the ancillary conditions under which they appeared. He published a collection of provings in 1805, and a second collection of 65 preparations appeared in his book, "Materia Medica Pura" (1810).
Because Hahnemann believed that large doses of drugs that caused similar symptoms would only aggravate illness, he advocated extreme dilutions of the substances; he devised a technique for making dilutions that he believed would preserve a substance's therapeutic properties while removing its harmful effects. Hahnemann believed that this process aroused and enhanced "the spirit-like medicinal powers of the crude substances". He gathered and published a complete overview of his new medical system in his book, "The Organon of the Healing Art" (1810), whose 6th edition, published in 1921, is still used by homeopaths today.
In the "Organon", Hahnemann introduced the concept of "miasms" as "infectious principles" underlying chronic disease. Hahnemann associated each miasm with specific diseases, and thought that initial exposure to miasms causes local symptoms, such as skin or venereal diseases. If, however, these symptoms were suppressed by medication, the cause went deeper and began to manifest itself as diseases of the internal organs. Homeopathy maintains that treating diseases by directly alleviating their symptoms, as is sometimes done in conventional medicine, is ineffective because all "disease can generally be traced to some latent, deep-seated, underlying chronic, or inherited tendency". The underlying imputed miasm still remains, and deep-seated ailments can be corrected only by removing the deeper disturbance of the vital force.
Hahnemann's hypotheses for the direct or remote cause of all chronic diseases (miasms) originally presented only three: psora (the itch), syphilis (venereal disease) or sycosis (fig-wart disease). Of these three the most important was "psora" (Greek for "itch"), described as being related to any itching diseases of the skin, supposed to be derived from suppressed scabies, and claimed to be the foundation of many further disease conditions. Hahnemann believed psora to be the cause of such diseases as epilepsy, cancer, jaundice, deafness, and cataracts.
Since Hahnemann's time, other miasms have been proposed, some replacing one or more of psora's proposed functions, including tuberculosis and cancer miasms.
The law of susceptibility implies that a negative state of mind can attract hypothetical disease entities called "miasms" to invade the body and produce symptoms of diseases. Hahnemann, however, rejected the notion of a disease as a separate thing or invading entity, and insisted that it was always part of the "living whole". In assessing a patient, the homeopath attempts to translate the reported symptoms into a complex formula of mental and physical symptoms, including likes, dislikes, innate predispositions and even body type.
From these symptoms, the homeopath chooses how to treat the patient using "materia medica" and repertories. In classical homeopathy, the practitioner attempts to match a single preparation to the totality of symptoms (the "simillimum"), while "clinical homeopathy" involves combinations of preparations based on the various symptoms of an illness.
Homeopathic pills are made from an inert substance (often sugars, typically lactose), upon which a drop of liquid homeopathic preparation is placed and allowed to evaporate.
The process of homeopathic dilution results in no objectively detectable active ingredient in most cases, but some preparations (e.g. calendula and arnica creams) do contain pharmacologically active doses. One product, Zicam Cold Remedy, which was marketed as an "unapproved homeopathic" product, contains two ingredients that are only "slightly" diluted: zinc acetate (2X = 1/100 dilution) and zinc gluconate (1X = 1/10 dilution), which means both are present in a biologically active concentration strong enough to have caused some people to lose their sense of smell, a condition termed anosmia. Zicam also listed several normal homeopathic potencies as "inactive ingredients", including "galphimia glauca", histamine dihydrochloride (homeopathic name, "histaminum hydrochloricum"), "luffa operculata", and sulfur.
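The X and C potency notations used in the Zicam example above are simple powers of ten. A minimal sketch (illustrative only, not part of the original article) makes the arithmetic explicit:

```python
# Homeopathic potency notation: each X step is a 1:10 dilution,
# each C step is a 1:100 dilution.
def dilution_fraction(potency: int, scale: str) -> float:
    """Return the fraction of the original substance remaining."""
    if scale == "X":
        return 10.0 ** -potency   # 1X = 1/10, 2X = 1/100, ...
    if scale == "C":
        return 100.0 ** -potency  # 1C = 1/100, 2C = 1/10,000, ...
    raise ValueError("scale must be 'X' or 'C'")

print(dilution_fraction(1, "X"))   # 0.1  (zinc gluconate at 1X)
print(dilution_fraction(2, "X"))   # 0.01 (zinc acetate at 2X)
print(dilution_fraction(30, "C"))  # 1e-60, a typical "high potency" remedy
```

This is why the 1X and 2X Zicam ingredients remained biologically active while a 30C preparation cannot be expected to contain a single molecule of the starting substance.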
Isopathy is a therapy derived from homeopathy, invented by Johann Joseph Wilhelm Lux in the 1830s. Isopathy differs from homeopathy in general in that the preparations, known as "nosodes", are made up either from things that cause the disease or from products of the disease, such as pus. Many so-called "homeopathic vaccines" are a form of isopathy. Tautopathy is a form of isopathy where the preparations are composed of drugs or vaccines that a person has consumed in the past, in the belief that this can reverse lingering damage caused by the initial use. There is no convincing scientific evidence for isopathy as an effective method of treatment.
Flower preparations can be produced by placing flowers in water and exposing them to sunlight. The most famous of these are the Bach flower remedies, which were developed by the physician and homeopath Edward Bach. Although the proponents of these preparations share homeopathy's vitalist world-view and the preparations are claimed to act through the same hypothetical "vital force" as homeopathy, the method of preparation is different. Bach flower preparations are manufactured in allegedly "gentler" ways such as placing flowers in bowls of sunlit water, and the preparations are not succussed. There is no convincing scientific or clinical evidence for flower preparations being effective.
The idea of using homeopathy as a treatment for other animals, termed "veterinary homeopathy", dates back to the inception of homeopathy; Hahnemann himself wrote and spoke of the use of homeopathy in animals other than humans. The FDA has not approved homeopathic products as veterinary medicine in the U.S. In the UK, veterinary surgeons who use homeopathy may belong to the Faculty of Homeopathy and/or to the British Association of Homeopathic Veterinary Surgeons. Animals may be treated only by qualified veterinary surgeons in the UK and some other countries. Internationally, the body that supports and represents homeopathic veterinarians is the International Association for Veterinary Homeopathy.
The use of homeopathy in veterinary medicine is controversial; the little existing research on the subject is not of a high enough scientific standard to provide reliable data on efficacy. Given that homeopathy's effects in humans appear to be mainly due to the placebo effect and the counseling aspects of the consultation, it is unlikely that homeopathic treatments would be effective in animals. Other studies have also found that giving animals placebos can play active roles in influencing pet owners to believe in the effectiveness of the treatment when none exists. The British Veterinary Association's position statement on alternative medicines says that it "cannot endorse" homeopathy, and the Australian Veterinary Association includes it on its list of "ineffective therapies". A 2016 review of peer-reviewed articles from 1981 to 2014 by scientists from the University of Kassel, Germany, concluded that there was insufficient evidence to support the use of homeopathy in livestock as a way to prevent or treat infectious diseases.
The UK's Department for Environment, Food and Rural Affairs (Defra) has adopted a robust position against use of "alternative" pet preparations including homeopathy.
Electrohomoeopathy is a derivative of homeopathy. The name refers to an electric bio-energy supposedly extracted from plants and of therapeutic value, rather than electricity in its conventional sense, combined with homeopathy. Popular in the late nineteenth century, electrohomeopathy is considered pseudoscientific and has been described as "utter idiocy".
In 2012, the Allahabad High Court in Uttar Pradesh, India, handed down a decree which stated that electrohomeopathy was an unrecognized system of medicine which was quackery.
The use of homeopathy as a preventive for serious infectious diseases is especially controversial, in the context of ill-founded public alarm over the safety of vaccines stoked by the anti-vaccination movement. Promotion of homeopathic alternatives to vaccines has been characterized as dangerous, inappropriate and irresponsible. In December 2014, the Australian homeopathy supplier Homeopathy Plus! was found to have acted deceptively in promoting homeopathic alternatives to vaccines. In 2019, an investigative journalism piece by the Telegraph revealed that homeopathy practitioners were actively discouraging patients from vaccinating their children.
The "very" low concentration of homeopathic preparations, which often lack even a single molecule of the diluted substance, has been the basis of questions about the effects of the preparations since the 19th century. Modern advocates of homeopathy have proposed a concept of "water memory", according to which water "remembers" the substances mixed in it, and transmits the effect of those substances when consumed. This concept is inconsistent with the current understanding of matter, and water memory has never been demonstrated to have any detectable effect, biological or otherwise.
James Randi and sceptical campaign groups have highlighted the lack of active ingredients in most homeopathic products by taking large 'overdoses'. None of the hundreds of demonstrators in the UK, Australia, New Zealand, Canada and the US were injured and "no one was cured of anything, either".
Outside of the alternative medicine community, scientists have long considered homeopathy a sham or a pseudoscience, and the mainstream medical community regards it as quackery. There is an overall absence of sound statistical evidence of therapeutic efficacy, which is consistent with the lack of any biologically plausible pharmacological agent or mechanism.
Abstract concepts within theoretical physics have been invoked to suggest explanations of how or why preparations might work, including quantum entanglement, quantum nonlocality, the theory of relativity and chaos theory. Contrariwise, quantum superposition has been invoked to explain why homeopathy does "not" work in double-blind trials. However, the explanations are offered by nonspecialists within the field, and often include speculations that are incorrect in their application of the concepts and not supported by actual experiments. Several of the key concepts of homeopathy conflict with fundamental concepts of physics and chemistry. The use of quantum entanglement to explain homeopathy's purported effects is "patent nonsense", as entanglement is a delicate state that rarely lasts longer than a fraction of a second. While entanglement may result in certain aspects of individual subatomic particles acquiring linked quantum states, this does not mean the particles will mirror or duplicate each other, nor cause health-improving transformations.
The proposed mechanisms for homeopathy are precluded from having any effect by the laws of physics and physical chemistry. The extreme dilutions used in homeopathic preparations usually leave not one molecule of the original substance in the final product.
A number of speculative mechanisms have been advanced to counter this, the most widely discussed being water memory, though this is now considered erroneous since short-range order in water only persists for about 1 picosecond. No evidence of stable clusters of water molecules was found when homeopathic preparations were studied using nuclear magnetic resonance, and many other physical experiments in homeopathy have been found to be of low methodological quality, which precludes any meaningful conclusion. Existence of a pharmacological effect in the absence of any true active ingredient is inconsistent with the law of mass action and the observed dose-response relationships characteristic of therapeutic drugs (whereas placebo effects are non-specific and unrelated to pharmacological activity).
Homeopaths contend that their methods produce a therapeutically active preparation, selectively including only the intended substance, though critics note that any water will have been in contact with millions of different substances throughout its history, and homeopaths have not been able to account for a reason why only the selected homeopathic substance would be a special case in their process. For comparison, ISO 3696:1987 defines a standard for water used in laboratory analysis; this allows for a contaminant level of ten parts per billion, 4C in homeopathic notation. This water may not be kept in glass as contaminants will leach out into the water.
Practitioners of homeopathy hold that higher dilutions―described as being of higher "potency"―produce stronger medicinal effects. This idea is also inconsistent with observed dose-response relationships, where effects are dependent on the concentration of the active ingredient in the body. This dose-response relationship has been confirmed in myriad experiments on organisms as diverse as nematodes, rats, and humans. Some homeopaths contend that the phenomenon of hormesis may support the idea of dilution increasing potency, but the dose-response relationship outside the zone of hormesis declines with dilution as normal, and nonlinear pharmacological effects do not provide any credible support for homeopathy.
Physicist Robert L. Park, former executive director of the American Physical Society, is quoted as saying: "since the least amount of a substance in a solution is one molecule, a 30C solution would have to have at least one molecule of the original substance dissolved in a minimum of 1,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 [or 10^60] molecules of water. This would require a container more than 30,000,000,000 times the size of the Earth." Park is also quoted as saying that, "to expect to get even one molecule of the 'medicinal' substance allegedly present in 30X pills, it would be necessary to take some two billion of them, which would total about a thousand tons of lactose plus whatever impurities the lactose contained".
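Park's figures can be checked with a few lines of arithmetic. The sketch below, using rounded physical constants (an assumption of this example, not taken from Park's own working), reproduces both the 10^60 molecule count and a container on the order of his thirty-billion-Earths figure:

```python
# Check Robert Park's 30C arithmetic using rounded physical constants.
AVOGADRO = 6.022e23       # molecules per mole
WATER_MOLAR_MASS = 18.0   # g/mol
WATER_DENSITY = 1.0e6     # g per cubic metre (1 g/cm^3)
EARTH_VOLUME = 1.08e21    # cubic metres

molecules = 100 ** 30     # a 30C dilution is 1 part in 100^30 = 10^60
mass_g = molecules / AVOGADRO * WATER_MOLAR_MASS
volume_m3 = mass_g / WATER_DENSITY
earths = volume_m3 / EARTH_VOLUME

print(f"molecules needed: {molecules:.0e}")           # 1e+60
print(f"water volume: {volume_m3:.2e} cubic metres")
print(f"container size: {earths:.2e} Earth volumes")  # roughly 3e10
```

The result, a container tens of billions of times the volume of the Earth, matches the scale of Park's statement.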
The laws of chemistry state that there is a limit to the dilution that can be made without losing the original substance altogether. This limit, which is related to Avogadro's number, is roughly equal to homeopathic dilutions of 12C or 24X (1 part in 10^24).
Scientific tests run by both the BBC's "Horizon" and ABC's "20/20" programmes were unable to differentiate homeopathic dilutions from water, even when using tests suggested by homeopaths themselves.
In May 2018, the German skeptical organization GWUP issued an invitation to individuals and groups to respond to its challenge "to identify homeopathic preparations in high potency and to give a detailed description on how this can be achieved reproducibly." The first participant to correctly identify selected homeopathic preparations under an agreed-upon protocol will receive €50,000.
No individual homeopathic preparation has been unambiguously shown by research to be different from placebo. The methodological quality of the primary research was generally low, with such problems as weaknesses in study design and reporting, small sample size, and selection bias. Since better quality trials have become available, the evidence for efficacy of homeopathy preparations has diminished; the highest-quality trials indicate that the preparations themselves exert no intrinsic effect. A review conducted in 2010 of all the pertinent studies of "best evidence" produced by the Cochrane Collaboration concluded that "the most reliable evidence – that produced by Cochrane reviews – fails to demonstrate that homeopathic medicines have effects beyond placebo."
Government-level reviews have been conducted in recent years by Switzerland (2005), the United Kingdom (2009), Australia (2015) and the European Academies' Science Advisory Council (2017).
The Swiss "programme for the evaluation of complementary medicine" (PEK) resulted in the peer-reviewed Shang publication (see "Systematic reviews and meta-analyses of efficacy") and a controversial competing analysis by homeopaths and advocates led by Gudrun Bornhöft and Peter Matthiessen, which has misleadingly been presented as a Swiss government report by homeopathy proponents, a claim that has been repudiated by the Swiss Federal Office of Public Health. The Swiss Government terminated reimbursement, though it was subsequently reinstated after a political campaign and referendum for a further six-year trial period.
The United Kingdom's House of Commons Science and Technology Committee sought written evidence and submissions from concerned parties and, following a review of all submissions, concluded that there was no compelling evidence of effect other than placebo and recommended that the Medicines and Healthcare products Regulatory Agency (MHRA) should not allow homeopathic product labels to make medical claims, that homeopathic products should no longer be licensed by the MHRA, as they are not medicines, and that further clinical trials of homeopathy could not be justified. They recommended that funding of homeopathic hospitals should not continue, and NHS doctors should not refer patients to homeopaths. By February 2011 only one-third of primary care trusts still funded homeopathy and by 2012 no British universities offered homeopathy courses. In July 2017, as part of a plan to save £200m a year by preventing the "misuse of scarce" funding, the NHS announced that it would no longer provide homeopathic medicines. A legal appeal by the British Homeopathic Association against the decision was rejected in June 2018.
The Australian National Health and Medical Research Council completed a comprehensive review of the effectiveness of homeopathic preparations in 2015, in which it concluded that "there were no health conditions for which there was reliable evidence that homeopathy was effective. No good-quality, well-designed studies with enough participants for a meaningful result reported either that homeopathy caused greater health improvements than placebo, or caused health improvements equal to those of another treatment."
On September 20, 2017, the European Academies' Science Advisory Council (EASAC) published its official analysis and conclusion on the use of homeopathic products, finding a lack of evidence that homeopathic products are effective, and raising concerns about quality control.
The fact that individual randomized controlled trials have given positive results is not in contradiction with an overall lack of statistical evidence of efficacy. A small proportion of randomized controlled trials inevitably provide false-positive outcomes due to the play of chance: a statistically significant positive outcome is commonly adjudicated when the probability of it being due to chance rather than a real effect is no more than 5%―a level at which about 1 in 20 tests can be expected to show a positive result in the absence of any therapeutic effect. Furthermore, trials of low methodological quality (i.e. ones that have been inappropriately designed, conducted or reported) are prone to give misleading results. In a systematic review of the methodological quality of randomized trials in three branches of alternative medicine, Linde "et al." highlighted major weaknesses in the homeopathy sector, including poor randomization. A separate 2001 systematic review that assessed the quality of clinical trials of homeopathy found that such trials were generally of lower quality than trials of conventional medicine.
A related issue is publication bias: researchers are more likely to submit trials that report a positive finding for publication, and journals prefer to publish positive results. Publication bias has been particularly marked in alternative medicine journals, where few of the published articles (just 5% during the year 2000) tend to report null results. Regarding the way in which homeopathy is represented in the medical literature, a systematic review found signs of bias in the publications of clinical trials (towards negative representation in mainstream medical journals, and "vice versa" in alternative medicine journals), but not in reviews.
Positive results are much more likely to be false if the prior probability of the claim under test is low.
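The effect of prior plausibility on what a "significant" result actually means can be illustrated with Bayes' theorem. The sketch below uses illustrative numbers (the priors and 80% statistical power are assumptions of this example, not figures from the article):

```python
# Probability that a "positive" trial reflects a real effect, via Bayes' theorem.
def prob_real_given_positive(prior: float, power: float = 0.8,
                             alpha: float = 0.05) -> float:
    """P(effect is real | trial is positive) for a given prior plausibility."""
    true_pos = power * prior            # real effect, correctly detected
    false_pos = alpha * (1.0 - prior)   # no effect, "significant" by chance
    return true_pos / (true_pos + false_pos)

# A conventional treatment with, say, a 50% prior chance of working:
print(prob_real_given_positive(0.5))    # ~0.94

# A mechanism precluded by physics, prior generously set at 1 in 1,000:
print(prob_real_given_positive(0.001))  # ~0.016: positives are almost all false
```

With a low prior, even a properly conducted trial that clears the conventional 5% significance threshold is overwhelmingly likely to be a false positive, which is the point Ioannidis and others have made about implausible treatments.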
Both meta-analyses, which statistically combine the results of several randomized controlled trials, and other systematic reviews of the literature are essential tools to summarize evidence of therapeutic efficacy. Early systematic reviews and meta-analyses of trials evaluating the efficacy of homeopathic preparations in comparison with placebo more often tended to generate positive results, but appeared unconvincing overall. In particular, reports of three large meta-analyses warned readers that firm conclusions could not be reached, largely due to methodological flaws in the primary studies and the difficulty in controlling for publication bias. The positive finding of one of the most prominent of the early meta-analyses, published in "The Lancet" in 1997 by Linde et al., was later reframed by the same research team, who wrote:
The evidence of bias [in the primary studies] weakens the findings of our original meta-analysis. Since we completed our literature search in 1995, a considerable number of new homeopathy trials have been published. The fact that a number of the new high-quality trials ... have negative results, and a recent update of our review for the most "original" subtype of homeopathy (classical or individualized homeopathy), seem to confirm the finding that more rigorous trials have less-promising results. It seems, therefore, likely that our meta-analysis at least overestimated the effects of homeopathic treatments.
Subsequent work by John Ioannidis and others has shown that for treatments with no prior plausibility, the chances of a positive result being a false positive are much higher, and that any result not consistent with the null hypothesis should be assumed to be a false positive.
A systematic review of the available systematic reviews confirmed in 2002 that higher-quality trials tended to have less positive results, and found no convincing evidence that any homeopathic preparation exerts clinical effects different from placebo.
In 2005, "The Lancet" medical journal published a meta-analysis of 110 placebo-controlled homeopathy trials and 110 matched medical trials based upon the Swiss government's Programme for Evaluating Complementary Medicine, or PEK. The study concluded that its findings were "compatible with the notion that the clinical effects of homeopathy are placebo effects". This was accompanied by an editorial pronouncing "The end of homoeopathy". A 2017 systematic review and meta-analysis found that the most reliable evidence did not support the effectiveness of non-individualized homeopathy. The authors noted that "the quality of the body of evidence is low."
Other meta-analyses include homeopathic treatments to reduce cancer therapy side-effects following radiotherapy and chemotherapy, allergic rhinitis, attention-deficit hyperactivity disorder and childhood diarrhoea, adenoid vegetation, asthma, upper respiratory tract infection in children, insomnia, fibromyalgia, psychiatric conditions and Cochrane Library systematic reviews of homeopathic treatments for asthma, dementia, attention-deficit hyperactivity disorder, induction of labour, upper respiratory tract infections in children, and irritable bowel syndrome. Other reviews covered osteoarthritis, migraines, postoperative ecchymosis and edema, delayed-onset muscle soreness, preventing postpartum haemorrhage, or eczema and other dermatological conditions.
Some clinical trials have tested individualized homeopathy, and there have been reviews of this, specifically. A 1998 review found 32 trials that met their inclusion criteria, 19 of which were placebo-controlled and provided enough data for meta-analysis. These 19 studies showed a pooled odds ratio of 1.17 to 2.23 in favour of individualized homeopathy over the placebo, but no difference was seen when the analysis was restricted to the methodologically best trials. The authors concluded that "the results of the available randomized trials suggest that individualized homeopathy has an effect over placebo. The evidence, however, is not convincing because of methodological shortcomings and inconsistencies." Jay Shelton, author of a book on homeopathy, has stated that the claim assumes without evidence that classical, individualized homeopathy works better than nonclassical variations. A 2014 systematic review and meta-analysis found that individualized homeopathic remedies may be slightly more effective than placebos, though the authors noted that their findings were based on low- or unclear-quality evidence. The same research team later reported that taking into account model validity did not significantly affect this conclusion.
The results of reviews are generally negative or only weakly positive, and reviewers consistently report the poor quality of trials. The finding of Linde "et al." that more rigorous studies produce less positive results is supported by several reviews and contradicted by none.
Health organizations such as the UK's National Health Service, the American Medical Association, the FASEB, and the National Health and Medical Research Council of Australia, have issued statements of their conclusion that there is "no good-quality evidence that homeopathy is effective as a treatment for any health condition". In 2009, World Health Organization official Mario Raviglione criticized the use of homeopathy to treat tuberculosis; similarly, another WHO spokesperson argued there was no evidence homeopathy would be an effective treatment for diarrhoea. They warned against the use of homeopathy for serious conditions such as depression, HIV and malaria.
The American College of Medical Toxicology and the American Academy of Clinical Toxicology recommend that no one use homeopathic treatment for disease or as a preventive health measure. These organizations report that no evidence exists that homeopathic treatment is effective, but that there is evidence that using these treatments produces harm and can bring indirect health risks by delaying conventional treatment.
Science offers a variety of explanations for how homeopathy may appear to cure diseases or alleviate symptoms even though the preparations themselves are inert:
While some articles have suggested that homeopathic solutions of high dilution can have statistically significant effects on organic processes including the growth of grain, histamine release by leukocytes, and enzyme reactions, such evidence is disputed since attempts to replicate them have failed. A 2007 systematic review of high-dilution experiments found that none of the experiments with positive results could be reproduced by all investigators.
In 1987, French immunologist Jacques Benveniste submitted a paper to the journal "Nature" while working at INSERM. The paper purported to have discovered that basophils, a type of white blood cell, released histamine when exposed to a homeopathic dilution of anti-immunoglobulin E antibody. The journal editors, sceptical of the results, requested that the study be replicated in a separate laboratory. Upon replication in four separate laboratories the study was published. Still sceptical of the findings, "Nature" assembled an independent investigative team to determine the accuracy of the research, consisting of "Nature" editor and physicist Sir John Maddox, American scientific fraud investigator and chemist Walter Stewart, and sceptic James Randi. After investigating the findings and methodology of the experiment, the team found that the experiments were "statistically ill-controlled", "interpretation has been clouded by the exclusion of measurements in conflict with the claim", and concluded, "We believe that experimental data have been uncritically assessed and their imperfections inadequately reported." James Randi stated that he doubted that there had been any conscious fraud, but that the researchers had allowed "wishful thinking" to influence their interpretation of the data.
In 2001 and 2004, Madeleine Ennis published a number of studies that reported that homeopathic dilutions of histamine exerted an effect on the activity of basophils. In response to the first of these studies, "Horizon" aired a programme in which British scientists attempted to replicate Ennis' results; they were unable to do so.
The provision of homeopathic preparations has been described as unethical. Michael Baum, Professor Emeritus of Surgery and visiting Professor of Medical Humanities at University College London (UCL), has described homoeopathy as a "cruel deception".
Edzard Ernst, the first Professor of Complementary Medicine in the United Kingdom and a former homeopathic practitioner, has expressed his concerns about pharmacists who violate their ethical code by failing to provide customers with "necessary and relevant information" about the true nature of the homeopathic products they advertise and sell:
Patients who choose to use homeopathy rather than evidence-based medicine risk missing timely diagnosis and effective treatment of serious conditions such as cancer.
In 2013 the UK Advertising Standards Authority concluded that the Society of Homeopaths were targeting vulnerable ill people and discouraging the use of essential medical treatment while making misleading claims of efficacy for homeopathic products.
In 2015 the Federal Court of Australia imposed penalties on a homeopathic company, Homeopathy Plus! Pty Ltd and its director, for making false or misleading statements about the efficacy of the whooping cough vaccine and homeopathic remedies as an alternative to the whooping cough vaccine, in breach of the Australian Consumer Law.
Some homeopathic preparations involve poisons such as Belladonna, arsenic, and poison ivy, which are highly diluted in the homeopathic preparation. In rare cases, the original ingredients are present at detectable levels. This may be due to improper preparation or intentional low dilution. Serious adverse effects such as seizures and death have been reported or associated with some homeopathic preparations.
On September 30, 2016, the FDA issued a safety alert to consumers warning against the use of homeopathic teething gels and tablets following reports of adverse events after their use. The agency recommended that parents discard these products and "seek advice from their health care professional for safe alternatives" to homeopathy for teething. The pharmacy CVS announced, also on September 30, that it was voluntarily withdrawing the products from sale, and on October 11 Hyland's (the manufacturer) announced that it was discontinuing its teething medicine in the United States, though the products remain on sale in Canada. On October 12, Buzzfeed reported that the regulator had "examined more than 400 reports of seizures, fever and vomiting, as well as 10 deaths" over a six-year period. The investigation (including analyses of the products) is still ongoing and the FDA does not yet know whether the deaths and illnesses were caused by the products. However, a previous FDA investigation in 2010, following adverse effects reported then, found that these same products were improperly diluted and contained "unsafe levels of belladonna, also known as deadly nightshade", and that the reports of serious adverse events in children using this product were "consistent with belladonna toxicity".
Instances of arsenic poisoning have occurred after use of arsenic-containing homeopathic preparations. Zicam Cold Remedy Nasal Gel, which contains 2X (1:100) zinc gluconate, reportedly caused a small percentage of users to lose their sense of smell; 340 cases were settled out of court in 2006. In 2009, the FDA advised consumers to stop using three discontinued Zicam cold remedy products because they could cause permanent damage to users' sense of smell.
Hate speech
Hate speech is defined by Cambridge Dictionary as "public speech that expresses hate or encourages violence towards a person or group based on something such as race, religion, sex, or sexual orientation". Hate speech is "usually thought to include communications of animosity or disparagement of an individual or a group on account of a group characteristic such as race, color, national origin, sex, disability, religion, or sexual orientation".
There has been much debate over freedom of speech, hate speech and hate speech legislation. The laws of some countries describe hate speech as speech, gestures, conduct, writing, or displays that incite violence or prejudicial actions against a group or individuals on the basis of their membership in the group, or which disparage or intimidate a group or individuals on the basis of their membership in the group. The law may identify a group based on certain characteristics. In some countries, hate speech is not a legal term. Additionally in some countries, including the United States, much of what falls under the category of "hate speech" is constitutionally protected. In other countries, a victim of hate speech may seek redress under civil law, criminal law, or both.
Laws against hate speech can be divided into two types: those intended to preserve public order and those intended to protect human dignity.
The laws designed to protect public order require that a higher threshold be violated, so they are not often enforced. For example, in Northern Ireland, as of 1992 only one person had been prosecuted for violating the regulation in twenty-one years.
The laws meant to protect human dignity have a much lower threshold for violation, so those in Canada, Denmark, France, Germany and the Netherlands tend to be more frequently enforced.
The global nature of the internet makes it extremely difficult to set limits or boundaries to cyberspace.
The International Covenant on Civil and Political Rights (ICCPR) states that "any advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence shall be prohibited by law". The Convention on the Elimination of All Forms of Racial Discrimination (ICERD) prohibits all incitement to racism. Concerning the debate over how freedom of speech applies to the Internet, conferences concerning such sites have been sponsored by the United Nations High Commissioner for Refugees. "Direct and public incitement to commit genocide" is prohibited by the 1948 Genocide Convention.
Australia's hate speech laws vary by jurisdiction, and seek especially to prevent victimisation on account of race.
The Belgian Anti-Racism Law, in full, the "Law of 30 July 1981 on the Punishment of Certain Acts inspired by Racism or Xenophobia", is a law against hate speech and discrimination that the Federal Parliament of Belgium passed in 1981. It made certain acts motivated by racism or xenophobia illegal. It is also known as the "Moureaux Law".
The Belgian Holocaust denial law, passed on 23 March 1995, bans public Holocaust denial. Specifically, the law makes it illegal to publicly "deny, play down, justify or approve of the genocide committed by the Nazi German regime during the Second World War." Prosecution is led by the Belgian Centre for Equal Opportunities. The offense is punishable by imprisonment of up to one year and fines of up to €2,500.
In Brazil, according to the 1988 Brazilian Constitution, racism is an "Offense with no statute of limitations and no right to bail for the defendant." In 2019, Brazil's Supreme Court (STF) ruled that the racism crime law should be applied to homophobia and transphobia as well.
In Canada, advocating genocide against any "identifiable group" is an indictable offence under the "Criminal Code" and it carries a maximum sentence of five years' imprisonment. There is no minimum sentence.
Publicly inciting hatred against any identifiable group is also an offence. It can be prosecuted either as an indictable offence with a maximum sentence of two years' imprisonment, or as a summary conviction offence with a maximum sentence of six months' imprisonment. There are no minimum sentences in either case. The offence of publicly inciting hatred makes exceptions for cases of statements of truth, and subjects of public debate and religious doctrine. The landmark judicial decision on the constitutionality of this law was "R v Keegstra" (1990).
An "identifiable group" is defined for both offences as "any section of the public distinguished by colour, race, religion, national or ethnic origin, age, sex, sexual orientation, gender identity or expression or mental or physical disability".
Article 31 of the "Ley sobre Libertades de Opinión e Información y Ejercicio del Periodismo" (statute on freedom of opinion and information and the performance of journalism) punishes with a large fine those who "through any means of social communication make publications or transmissions intended to promote hatred or hostility towards persons or a group of persons due to their race, sex, religion or nationality". This law has been applied to expressions transmitted via the internet. There is also a rule increasing the penalties for crimes motivated by discriminatory hatred.
The Croatian Constitution guarantees freedom of speech, but the Croatian penal code prohibits discrimination and punishes anyone "who based on differences of race, religion, language, political or other belief, wealth, birth, education, social status or other properties, gender, skin color, nationality or ethnicity violates basic human rights and freedoms recognized by the international community."
Denmark prohibits hate speech, and defines it as publicly making statements by which a group is threatened, insulted or degraded due to race, skin colour, national or ethnic origin, faith or sexual orientation.
The Council of Europe sponsored "No Hate Speech" movement actively raises awareness about hate speech, in order to help combat the problem. While Article 10 of the European Convention on Human Rights does not prohibit criminal laws against revisionism such as denial or minimization of genocides or crimes against humanity, as interpreted by the European Court of Human Rights (ECtHR), the Committee of Ministers of the Council of Europe went further and recommended in 1997 that member governments "take appropriate steps to combat hate speech" under its Recommendation R (97) 20. The ECtHR does not offer an accepted definition for "hate speech" but instead offers only parameters by which prosecutors can decide if the "hate speech" is entitled to the protection of freedom of speech.
A growing awareness of this topic has resulted from educational programs in schools, which has enhanced the reporting of hate speech incidents. The Council of Europe also created the European Commission against Racism and Intolerance, which has produced country reports and several general policy recommendations, for instance against anti-Semitism and intolerance against Muslims.
There has been considerable debate over the definition of "hate speech" ("vihapuhe") in the Finnish language.
If "hate speech" is taken to mean ethnic agitation, it is prohibited in Finland and defined in the section 11 of the penal code, "War crimes and crimes against humanity", as published information or as an opinion or other statement that threatens or insults a group because of race, nationality, ethnicity, religion or conviction, sexual orientation, disability, or a comparable basis. Ethnic agitation is punishable with a fine or up to 2 years in prison, or 4 months to 4 years if aggravated (such as incitement to genocide).
Critics claim that, in political contexts, labeling certain opinions and statements "hate speech" can be used to silence unfavorable or critical opinions and suppress debate. Certain politicians, including Member of Parliament and the leader of the Finns Party Jussi Halla-aho, consider the term "hate speech" problematic because of the disagreement over its definition.
France's penal code and press laws prohibit public and private communication that is defamatory or insulting, or that incites discrimination, hatred, or violence against a person or group on account of place of origin, ethnicity or lack thereof, nationality, race, specific religion, sex, sexual orientation, or handicap. The law prohibits declarations that justify or deny crimes against humanity—for example, the Holocaust (Gayssot Act).
In July 2019, Laetitia Avia proposed a bill to fight hate speech on social media. The Avia law was passed on May 13, 2020. It requires websites to remove content that contains hate speech within 24 hours after publication. Failure to comply is punishable by one year of imprisonment and a fine of up to €15,000.
In Germany, "Volksverhetzung" ("incitement to hatred") is a punishable offense under Section 130 of the "Strafgesetzbuch" (Germany's criminal code) and can lead to up to five years' imprisonment. Section 130 makes it a crime to publicly incite hatred against parts of the population or to call for violent or arbitrary measures against them or to insult, maliciously slur or defame them in a manner violating their (constitutionally protected) human dignity. Thus for instance it is illegal to publicly call certain ethnic groups "maggots" or "freeloaders". "Volksverhetzung" is punishable in Germany even if committed abroad and even if committed by non-German citizens, if only the incitement of hatred takes effect within German territory, e.g., the seditious sentiment was expressed in German writing or speech and made accessible in Germany (German criminal code's Principle of Ubiquity, Section 9 §1 Alt. 3 and 4 of the Strafgesetzbuch).
On June 30, 2017, Germany approved a bill criminalizing hate speech on social media sites. In addition to criminalizing hate speech, the law states that social networking sites may be fined up to €50 million (US$56 million) if they persistently fail to remove illegal content within a week, including defamatory "fake news".
Panos Kammenos, the leader of the national-conservative Independent Greeks party, encouraged his supporters to lynch the mayor of Aristotelis in September 2013 during the Thessaloniki International Fair, leading to a criminal prosecution against Kammenos.
In Iceland, the hate speech law is not confined to inciting hatred, as one can see from Article 233 a. in the Icelandic Penal Code, but includes public denigration:
Freedom of speech and expression is protected by article 19 (1) of the constitution of India, but under article 19(2) "reasonable restrictions" can be imposed on freedom of speech and expression in the interest of "the sovereignty and integrity of India, the security of the State, friendly relations with foreign States, public order, decency or morality, or in relation to contempt of court, defamation or incitement to an offence".
Indonesia has been a signatory to the International Covenant on Civil and Political Rights since 2006, but has not promulgated comprehensive legislation against hate-speech crimes. Calls for a comprehensive anti-hate speech law and an associated educational program have followed statements by a leader of a hard-line Islamic organization that Balinese Hindus were mustering forces to protect the "lascivious Miss World pageant" in "a war against Islam" and that "those who fight on the path of Allah are promised heaven". The statements are said to be an example of similar messages of intolerance being preached throughout the country by radical clerics. The National Police ordered all of their personnel to anticipate any potential conflicts in society caused by hate speech. The order is stipulated in the circular signed by National Police chief General Badrodin Haiti on October 8, 2015.
The Constitution of Ireland guarantees Irish citizens the right "to express freely their convictions and opinions"; however, this right is "subject to public order and morality", mass media "shall not be used to undermine public order or morality or the authority of the State", and "publication or utterance of blasphemous, seditious, or indecent matter is an offence". The Prohibition of Incitement to Hatred Act 1989 made it an offence to make, distribute, or broadcast "threatening, abusive or insulting" words, images, or sounds with intent or likelihood to "stir up hatred", where "hatred" is "against a group of persons in the State or elsewhere on account of their race, colour, nationality, religion, ethnic or national origins, membership of the travelling community or sexual orientation". The first conviction was in 2000, of a bus driver who told a Gambian passenger "You should go back to where you came from". This, however, was overturned on appeal due to the strict interpretation of intent to stir up hatred; the judge explained that the bus driver had no intention of "stirring up hate, however racist the comments were". Frustration at the low number of prosecutions (18 by 2011) was attributed to a misconception that the law addressed hate crimes more generally as opposed to incitement in particular. In 2013 the Constitutional Convention considered the constitutional prohibition of blasphemy, and recommended replacing it with a ban on incitement to religious hatred. This was endorsed by the Oireachtas, and in 2017 the Fine Gael-led government planned a referendum for October 2018. The referendum passed, with 64.85% of voters in favour of removing the law, a result which the Irish Times described as "uniquely unanimous in recent years".
Japanese law covers threats and slander, but it "does not apply to hate speech against general groups of people". Japan became a member of the United Nations International Convention on the Elimination of All Forms of Racial Discrimination in 1995. Article 4 of the convention sets forth provisions calling for the criminalization of hate speech. But the Japanese government has suspended the provisions, saying actions to spread or promote the idea of racial discrimination have not been taken in Japan to such an extent that legal action is necessary. The Foreign Ministry says that this assessment remains unchanged.
In May 2013, the United Nations Committee on Economic, Social and Cultural Rights (CESCR) warned the Japanese government that it needs to take measures to curb hate speech against so-called comfort women. The committee's recommendation called for the Japanese government to better educate Japanese society on the plight of women who were forced into sexual slavery to prevent stigmatization, and to take necessary measures to repair the lasting effects of exploitation, including addressing their right to compensation.
In 2013, following demonstrations, parades, and comments posted on the Internet threatening violence against foreign residents of Japan, especially Koreans, concerns arose that hate speech was a growing problem in Japan. Prime Minister Shinzō Abe and Justice Minister Sadakazu Tanigaki expressed concerns about the increase in hate speech, saying that it "goes completely against the nation's dignity", but they have so far stopped short of proposing any legal action against protesters.
On 22 September 2013 around 2,000 people participated in the "March on Tokyo for Freedom" campaigning against recent hate speech marches. Participants called on the Japanese government to "sincerely adhere" to the International Convention on the Elimination of All Forms of Racial Discrimination. Sexual minorities and the disabled also participated in the march.
On 25 September 2013 a new organization, "An international network overcoming hate speech and racism" (Norikoenet), that is opposed to hate speech against ethnic Koreans and other minorities in Japan was launched.
On 7 October 2013, in a rare ruling on racial discrimination against ethnic Koreans, a Japanese court ordered an anti-Korean group, Zaitokukai, to stop "hate speech" protests against a Korean school in Kyoto and pay the school 12.26 million yen ($126,400 U.S.) in compensation for protests that took place in 2009 and 2010.
A United Nations panel urged Japan to ban hate speech.
In May 2016 Japan passed a law dealing with hate speech. However, it does not ban hate speech and sets no penalty for committing it.
Several Jordanian laws seek to prevent the publication or dissemination of material that could provoke strife or hatred:
In Kenya, hate speech is regulated, but not strictly defined by law, including article 33 of the constitution "and three enabling Acts, such as the National Integration and Cohesion Act, 2008 and Media Act 2007".
The Maltese criminal code, through Articles 82A-82D, comprehensively prohibits hate speech, in substance as follows:
The Dutch penal code prohibits both insulting a group (article 137c) and inciting hatred, discrimination or violence (article 137d). The definition of the offences as outlined in the penal code is as follows:
In January 2009, a court in Amsterdam ordered the prosecution of Geert Wilders, a Dutch Member of Parliament, for breaching articles 137c and 137d. On 23 June 2011, Wilders was acquitted of all charges. In 2016, in a separate case, Wilders was found guilty of both insulting a group and inciting discrimination for promising an audience that he would deliver on their demand for "fewer Moroccans."
New Zealand prohibits hate speech under the Human Rights Act 1993. Section 61 (Racial Disharmony) makes it unlawful to publish or distribute "threatening, abusive, or insulting ... matter or words likely to excite hostility against or bring into contempt any group of persons ... on the ground of the colour, race, or ethnic or national origins of that group of persons". Section 131 (Inciting Racial Disharmony) lists offences for which "racial disharmony" creates liability.
Norway prohibits hate speech, and defines it as publicly making statements that threaten or ridicule someone or that incite hatred, persecution or contempt for someone due to their skin colour, ethnic origin, homosexual orientation, religion or philosophy of life. At the same time, the Norwegian Constitution guarantees the right to free speech, and there has been an ongoing public and judicial debate over where the right balance between the ban against hate speech and the right to free speech lies. Norwegian courts have been restrictive in the use of the hate speech law and only a few persons have been sentenced for violating the law since its implementation in 1970. A public Free Speech committee (1996–1999) recommended to abolish the hate speech law but the Norwegian Parliament instead voted to slightly strengthen it.
The hate speech laws in Poland punish those who offend the feelings of the religious by e.g. disturbing a religious ceremony or creating public calumny. They also prohibit public expression that insults a person or a group on account of national, ethnic, racial, or religious affiliation or the lack of a religious affiliation.
Article 369 of the Criminal Code, titled 'Incitement to hatred or discrimination', prohibits hate speech directed against a group of persons. The offense carries a punishment of 6 months to 3 years' imprisonment, or a fine.
According to Article 282 of the Criminal Code, "Incitement of hatred or enmity, as well as humiliation of human dignity":
The Serbian constitution guarantees freedom of speech, but restricts it in certain cases to protect the rights of others. The criminal charge of "Provoking ethnic, racial and religion based animosity and intolerance" carries a minimum six months prison term and a maximum of ten years.
Singapore has passed numerous laws that prohibit speech that causes disharmony among various religious groups. The Maintenance of Religious Harmony Act is an example of such legislation. The Penal Code criminalizes the deliberate promotion by someone of enmity, hatred or ill-will between different racial and religious groups on grounds of race or religion. It also makes it an offence for anyone to deliberately wound the religious or racial feelings of any person.
In South Africa, hate speech (along with incitement to violence and propaganda for war) is specifically excluded from protection of free speech in the Constitution. The Promotion of Equality and Prevention of Unfair Discrimination Act, 2000 contains the following clause:
The "prohibited grounds" include race, gender, sex, pregnancy, marital status, ethnic or social origin, colour, sexual orientation, age, disability, religion, conscience, belief, culture, language and birth.
The crime of "crimen injuria" ("unlawfully, intentionally and seriously impairing the dignity of another") may also be used to prosecute hate speech.
In 2011, a South African court banned "Dubula iBhunu (Shoot the Boer)", a derogatory song that degraded Afrikaners, on the basis that it violated a South African law prohibiting speech that demonstrates a clear intention to be hurtful, to incite harm, or to promote hatred.
In October 2016, "the draft Hate Crimes Bill was introduced. It aims to address racism, racial discrimination, xenophobia and discrimination based on gender, sex, sexual orientation and other issues, by providing an offence of hate crime. It includes controversial provisions that criminalize hate speech in ways that could be used to impermissibly restrict the right to freedom of expression". The Foundation for Economic Education views this bill as a repetition of a mistake of the apartheid era, with some maintaining that it constitutes "the gravest threat to freedom of expression which South Africans have ever faced."
The Spanish "Código Penal" has article 510, which forbids ill-intended speech against individuals but has been criticized for its vague interpretation. In addition to this specific offence included in the Special Part of the Criminal Code, there exists a generic aggravating circumstance that may be applied to all offences (including slander and defamation) when they are motivated by hatred or discriminatory bias (article 22.4ª of the Spanish Código Penal). Besides those hate speech crimes, Spain also tackles hate speech through non criminal laws, such as article 23 of the Law 19/2007, against violence, racism, xenophobia and intolerance in sports.
The organisation tasked with enforcing hate speech related crimes is the Committee on the Elimination of Racial Discrimination ("Comité para la Eliminación de la Discriminación Racial"). This committee is guided by the International Convention on the Elimination of All Forms of Racial Discrimination ("Convención Internacional sobre la Eliminación de todas las Formas de Discriminación Racial"). In a report published in 2011, the committee expressed concerns about the persistence of stereotypical and harmful racial attitudes towards Maghrebi and Latino communities living in Spain. The committee has urged the Government to take action by creating a national strategy to combat racism, xenophobia and their social consequences.
Sweden prohibits hate speech, and defines it as publicly making statements that threaten or express disrespect for an ethnic group or similar group regarding their race, skin colour, national or ethnic origin, faith, or sexual orientation. The crime does not prohibit a pertinent and responsible debate ("en saklig och vederhäftig diskussion"), nor statements made in a completely private sphere. There are constitutional restrictions pertaining to which acts are criminalized, as well as limits set by the European Convention on Human Rights. The crime is called "hets mot folkgrupp" in Swedish, which directly translates to "incitement (of hatred/violence) towards population groups".
The sexual orientation provision, added in 2002, was used to convict Pentecostalist pastor Åke Green of hate speech based on a 2003 sermon. His conviction was later overturned.
In Switzerland, public discrimination against, or incitement to hatred of, persons or a group of people on account of their race or ethnicity is punishable by imprisonment of up to three years or a fine. In 1934, the authorities of the Basel-Stadt canton criminalized anti-Jewish hate speech, e.g. the accusation of ritual murders, mostly in reaction against a pro-Nazi antisemitic group and its newspaper.
I. ""Constitution of Ukraine"" :
The most important law in Ukraine, the "Constitution of Ukraine", guarantees protection against hate crime:
Article 24: "There can be no privileges or restrictions on the grounds of race, color of the skin, political, religious or other beliefs, sex, ethnic or social origin, property status, place of residence, language or other grounds".
Article 37: "The establishment and activity of political parties and public associations are prohibited if their program goals or actions are aimed at ...the propaganda of war and of violence, the incitement of inter-ethnic, racial, or religious enmity, and the encroachment on human rights and freedoms and the health of the population".
II. "Criminal Code of Ukraine":
In Ukraine, all criminal offences and their punishments must be defined in a single statute: the "Criminal Code of Ukraine".
A hate motive aggravates the punishment under many articles of the criminal law. There are also separate articles punishing hate crimes specifically.
"Criminal Code of Ukraine":
Article 161: "Violations of equality of citizens based on their race, nationality, religious beliefs, disability and other grounds
1. Intentional acts aimed at incitement to national, racial or religious hatred and violence, at humiliating national honor and dignity, or at offending citizens' feelings due to their religious beliefs,
as well as direct or indirect restriction of rights, or the establishment of direct or indirect privileges for citizens, on the grounds of race, color, political, religious or other beliefs, sex, disability, ethnic or social origin, property status, place of residence, language or other grounds" (maximum criminal sentence of up to 8 years in prison)
Article 300: "Importation, manufacture or distribution of works promoting a cult of violence and cruelty, racial, national or religious intolerance and discrimination" (maximum criminal sentence of up to 5 years in prison)
In the UAE and many Middle Eastern countries, certain types of hate speech are not tolerated. Offenders have been jailed, deported, or had their nationality revoked. The media is also censored and does not publish news that might portray a negative picture.
In the United Kingdom, several statutes criminalize hate speech against several categories of people. The statutes forbid communication that is hateful, threatening, or abusive, and targets a person on account of disability, ethnic or national origin, nationality (including citizenship), race, religion, sexual orientation, or skin colour. The penalties for hate speech include fines, imprisonment, or both. Legislation against Sectarian hate in Scotland, which is aimed principally at football matches, does not criminalise jokes about people's beliefs, nor outlaw "harsh" comment about their religious faith.
The United States does not have hate speech laws, since American courts have repeatedly ruled that laws criminalizing hate speech violate the guarantee to freedom of speech contained in the First Amendment to the U.S. Constitution. There are several categories of speech that are not protected by the First Amendment, such as speech that calls for imminent violence upon a person or group. However, the U.S. Supreme Court has ruled that hate speech that does not fall into one of these categories is constitutionally protected.
Proponents of hate speech legislation in the United States have argued that freedom of speech undermines the 14th Amendment by bolstering an oppressive narrative which demeans equality and the Reconstruction Amendments' purpose of guaranteeing equal protection under the law.
A website that contains hate speech (online hate speech) may be called a "hate site".
On May 31, 2016, Facebook, Google, Microsoft, and Twitter jointly agreed to a European Union code of conduct obligating them to review "[the] majority of valid notifications for removal of illegal hate speech" posted on their services within 24 hours.
Prior to this in 2013, Facebook, with pressure from over 100 advocacy groups including the Everyday Sexism Project, agreed to change their hate speech policies after data released regarding content that promoted domestic and sexual violence against women led to the withdrawal of advertising by 15 large companies.
Many extremist organizations disseminate hate speech by publishing posts that act as prompts. Content analysis algorithms catch obvious hate speech but cannot pick up covert hate speech. Therefore, though an original post may technically adhere to hate speech policy, the ideas the post conveys prompt readers to use the comment section to spread overt hate speech. These sections are not monitored as heavily as main posts.
Nadine Strossen says that while efforts to censor hate speech have the goal of protecting the most vulnerable, these efforts are ineffective and may have the opposite effect. Disadvantaged and ethnic minorities are sometimes the ones charged with violating laws against hate speech.
Critic of hate speech theory Kim Holmes has argued that it "assumes bad faith on the part of people regardless of their stated intentions" and that it "obliterates the ethical responsibility of the individual".
Rebecca Ruth Gould argues that hate speech constitutes viewpoint discrimination, as the legal system punishes some viewpoints but not others.
Research indicates that when people support censoring hate speech, they are motivated more by concerns about the effects the speech has on others than about its effects on themselves.
https://en.wikipedia.org/wiki?curid=14233
Henrik Ibsen
Henrik Johan Ibsen (; ; 20 March 1828 – 23 May 1906) was a Norwegian playwright and theatre director. As one of the founders of modernism in theatre, Ibsen is often referred to as "the father of realism" and one of the most influential playwrights of his time. His major works include "Brand", "Peer Gynt", "An Enemy of the People", "Emperor and Galilean", "A Doll's House", "Hedda Gabler", "Ghosts", "The Wild Duck", "When We Dead Awaken", "Rosmersholm", and "The Master Builder". He is the most frequently performed dramatist in the world after Shakespeare, and "A Doll's House" was the world's most performed play in 2006.
Ibsen's early poetic and cinematic play "Peer Gynt" has strong surreal elements. After "Peer Gynt" Ibsen abandoned verse and wrote in realistic prose. Several of his later dramas were considered scandalous by many of his era, when European theatre was expected to model strict morals of family life and propriety. Ibsen's later work examined the realities that lay behind the facades, revealing much that was disquieting to a number of his contemporaries. He had a critical eye and conducted a free inquiry into the conditions of life and issues of morality. In many critics' estimates "The Wild Duck" and "Rosmersholm" are "vying with each other as rivals for the top place among Ibsen's works"; Ibsen himself regarded "Emperor and Galilean" as his masterpiece.
Ibsen is often ranked as one of the most distinguished playwrights in the European tradition. He is widely regarded as the foremost playwright of the nineteenth century. He influenced other playwrights and novelists such as George Bernard Shaw, Oscar Wilde, Arthur Miller, James Joyce, Eugene O'Neill, and Miroslav Krleža. Ibsen was nominated for the Nobel Prize in Literature in 1902, 1903, and 1904.
Ibsen wrote his plays in Danish (the common written language of Denmark and Norway during his lifetime) and they were published by the Danish publisher Gyldendal. Although most of his plays are set in Norway—often in places reminiscent of Skien, the port town where he grew up—Ibsen lived for 27 years in Italy and Germany, and rarely visited Norway during his most productive years. Born into a patrician merchant family, the intertwined Ibsen and Paus family, Ibsen shaped his dramas according to his family background and often modelled characters after family members. He was the father of Prime Minister Sigurd Ibsen. Ibsen's dramas had a strong influence upon contemporary culture.
Ibsen was born into an affluent merchant family in the wealthy port town of Skien in Bratsberg (Telemark). His parents were Knud Ibsen (1797–1877) and Marichen Altenburg (1799–1869). Henrik Ibsen wrote that “my parents were members on both sides of the most respected families in Skien,” explaining that he was closely related with “just about all the patrician families who then dominated the place and its surroundings.”
His parents, though not related by blood, had been raised as something that resembled social siblings. Knud Ibsen's biological father, ship's captain Henrich Ibsen, died at sea in 1797, when Knud was a newborn, and his mother married captain Ole Paus the following year. Ole Paus was the brother of Marichen's mother, Hedevig Paus, and the two families were very close; for example, Ole's oldest biological son, Knud's half-brother Henrik Johan Paus, was raised in Hedevig's home together with his cousin Marichen, and the biological and social children of the Paus siblings, including Knud and Marichen, spent much of their childhood together. Some Ibsen scholars have claimed that Henrik Ibsen was fascinated by his parents' "strange, almost incestuous marriage"; he would treat the subject of incestuous relationships in several plays, notably his masterpiece "Rosmersholm".
When Henrik Ibsen was around seven years old, his father's fortunes took a significant turn for the worse, and the family was eventually forced to sell the major Altenburg building in central Skien and move permanently to their large summer house, "Venstøp", outside of the city. Henrik's sister Hedvig would write about their mother: "She was a quiet, lovable woman, the soul of the house, everything to her husband and children. She sacrificed herself time and time again. There was no bitterness or reproach in her." The Ibsen family eventually moved to a city house, Snipetorp, owned by Knud Ibsen's half-brother, wealthy banker and ship-owner Christopher Blom Paus.
His father's financial ruin would have a strong influence on Ibsen's later work; the characters in his plays often mirror his parents, and his themes often deal with issues of financial difficulty as well as moral conflicts stemming from dark secrets hidden from society. Ibsen would both model and name characters in his plays after his own family. A central theme in Ibsen's plays is the portrayal of suffering women, echoing his mother Marichen Altenburg; Ibsen's sympathy with women would eventually find significant expression with their portrayal in dramas such as "A Doll's House" and "Rosmersholm".
At fifteen, Ibsen was forced to leave school. He moved to the small town of Grimstad to become an apprentice pharmacist and began writing plays. In 1846, when Ibsen was 18, he had a liaison with Else Sophie Jensdatter Birkedalen which produced a son, Hans Jacob Hendrichsen Birkdalen, whose upbringing Ibsen paid for until the boy was fourteen, though Ibsen never saw Hans Jacob. Ibsen went to Christiania (later renamed Kristiania and then Oslo) intending to matriculate at the university. He soon rejected the idea (his earlier attempts at entering university were blocked as he did not pass all his entrance exams), preferring to commit himself to writing. His first play, the tragedy "Catilina" (1850), was published under the pseudonym "Brynjolf Bjarme", when he was only 22, but it was not performed. His first play to be staged, "The Burial Mound" (1850), received little attention. Still, Ibsen was determined to be a playwright, although the numerous plays he wrote in the following years remained unsuccessful. Ibsen's main inspiration in the early period, right up to "Peer Gynt", was apparently the Norwegian author Henrik Wergeland and the Norwegian folk tales as collected by Peter Christen Asbjørnsen and Jørgen Moe. In Ibsen's youth, Wergeland was the most acclaimed, and by far the most read, Norwegian poet and playwright.
He spent the next several years employed at Det norske Theater (Bergen), where he was involved in the production of more than 145 plays as a writer, director, and producer. During this period, he published five new, though largely unremarkable, plays. Despite Ibsen's failure to achieve success as a playwright, he gained a great deal of practical experience at the Norwegian Theater, experience that was to prove valuable when he continued writing.
Ibsen returned to Christiania in 1858 to become the creative director of the Christiania Theatre. He married Suzannah Thoresen on 18 June 1858 and she gave birth to their only child Sigurd on 23 December 1859. The couple lived in very poor financial circumstances and Ibsen became very disenchanted with life in Norway. In 1864, he left Christiania and went to Sorrento in Italy in self-imposed exile. He did not return to his native land for the next 27 years, and when he finally did, he was a noted, but controversial, playwright.
His next play, "Brand" (1865), brought him the critical acclaim he sought, along with a measure of financial success, as did the following play, "Peer Gynt" (1867), for which Edvard Grieg famously composed incidental music and songs. Although Ibsen read excerpts of the Danish philosopher Søren Kierkegaard and traces of the latter's influence are evident in "Brand", it was not until after "Brand" that Ibsen came to take Kierkegaard seriously. Initially annoyed with his friend Georg Brandes for comparing Brand to Kierkegaard, Ibsen nevertheless read "Either/Or" and "Fear and Trembling". Ibsen's next play "Peer Gynt" was consciously informed by Kierkegaard.
With success, Ibsen became more confident and began to introduce more and more of his own beliefs and judgements into the drama, exploring what he termed the "drama of ideas". His next series of plays are often considered his Golden Age, when he entered the height of his power and influence, becoming the center of dramatic controversy across Europe.
Ibsen moved from Italy to Dresden, Germany, in 1868, where he spent years writing the play he regarded as his main work, "Emperor and Galilean" (1873), dramatizing the life and times of the Roman emperor Julian the Apostate. Although Ibsen himself always looked back on this play as the cornerstone of his entire works, very few shared his opinion, and his next works would be much more acclaimed. Ibsen moved to Munich in 1875 and began work on his first contemporary realist drama "The Pillars of Society", first published and performed in 1877. "A Doll's House" followed in 1879. This play is a scathing criticism of the marital roles accepted by men and women which characterized Ibsen's society.
Ibsen was already in his fifties when "A Doll’s House" was published. He himself saw his latter plays as a series. At the end of his career, he described them as “that series of dramas which began with "A Doll’s House" and which is now completed with "When We Dead Awaken"”. Furthermore, it was the reception of "A Doll’s House" which brought Ibsen international acclaim.
"Ghosts" followed in 1881, another scathing commentary on the morality of Ibsen's society, in which a widow reveals to her pastor that she had hidden the evils of her marriage for its duration. The pastor had advised her to marry her fiancé despite his philandering, and she did so in the belief that her love would reform him. But his philandering continued right up until his death, and his vices are passed on to their son in the form of syphilis. The mention of venereal disease alone was scandalous, but to show how it could poison a respectable family was considered intolerable.
In "An Enemy of the People" (1882), Ibsen went even further. In earlier plays, controversial elements were important and even pivotal components of the action, but they were on the small scale of individual households. In "An Enemy", controversy became the primary focus, and the antagonist was the entire community. One primary message of the play is that the individual, who stands alone, is more often "right" than the mass of people, who are portrayed as ignorant and sheeplike. Contemporary society's belief was that the community was a noble institution that could be trusted, a notion Ibsen challenged. In "An Enemy of the People", Ibsen chastised not only the conservatism of society, but also the liberalism of the time. He illustrated how people on both sides of the social spectrum could be equally self-serving. "An Enemy of the People" was written as a response to the people who had rejected his previous work, "Ghosts". The plot of the play is a veiled look at the way people reacted to the plot of "Ghosts". The protagonist is a physician in a vacation spot whose primary draw is a public bath. The doctor discovers that the water is contaminated by the local tannery. He expects to be acclaimed for saving the town from the nightmare of infecting visitors with disease, but instead he is declared an 'enemy of the people' by the locals, who band against him and even throw stones through his windows. The play ends with his complete ostracism. It is obvious to the reader that disaster is in store for the town as well as for the doctor.
As audiences by now expected, Ibsen's next play again attacked entrenched beliefs and assumptions; but this time, his attack was not against society's mores, but against overeager reformers and their idealism. Always an iconoclast, Ibsen saw himself as an objective observer of society, "like a lone franc tireur in the outposts", playing a lone hand, as he put it. Ibsen, perhaps more than any of his contemporaries, relied upon immediate sources such as newspapers and second-hand reports for his contact with intellectual thought. He claimed to be ignorant of books, leaving them to his wife and son, but, as Georg Brandes described, "he seemed to stand in some mysterious correspondence with the fermenting, germinating ideas of the day."
"The Wild Duck" (1884) is by many considered Ibsen's finest work, and it is certainly the most complex. It tells the story of Gregers Werle, a young man who returns to his hometown after an extended exile and is reunited with his boyhood friend Hjalmar Ekdal. Over the course of the play, the many secrets that lie behind the Ekdals' apparently happy home are revealed to Gregers, who insists on pursuing the absolute truth, or the "Summons of the Ideal". Among these truths: Gregers' father impregnated his servant Gina, then married her off to Hjalmar to legitimize the child. Another man has been disgraced and imprisoned for a crime the elder Werle committed. Furthermore, while Hjalmar spends his days working on a wholly imaginary "invention", his wife is earning the household income.
Ibsen displays masterful use of irony: despite his dogmatic insistence on truth, Gregers never says what he thinks but only insinuates, and is never understood until the play reaches its climax. Gregers hammers away at Hjalmar through innuendo and coded phrases until he realizes the truth: Gina's daughter, Hedvig, is not his child. Blinded by Gregers' insistence on absolute truth, he disavows the child. Seeing the damage he has wrought, Gregers determines to repair things, and suggests to Hedvig that she sacrifice the wild duck, her wounded pet, to prove her love for Hjalmar. Hedvig, alone among the characters, recognizes that Gregers always speaks in code, and looking for the deeper meaning in the first important statement Gregers makes which does not contain one, kills herself rather than the duck in order to prove her love for him in the ultimate act of self-sacrifice. Only too late do Hjalmar and Gregers realize that the absolute truth of the "ideal" is sometimes too much for the human heart to bear.
Late in his career, Ibsen turned to a more introspective drama that had much less to do with denunciations of society's moral values and more to do with the problems of individuals. In such later plays as "Hedda Gabler" (1890) and "The Master Builder" (1892), Ibsen explored psychological conflicts that transcended a simple rejection of current conventions. Many modern readers, who might regard anti-Victorian didacticism as dated, simplistic or hackneyed, have found these later works to be of absorbing interest for their hard-edged, objective consideration of interpersonal confrontation. "Hedda Gabler" and "A Doll’s House" are regularly cited as Ibsen's most popular and influential plays, with the title role of Hedda regarded as one of the most challenging and rewarding for an actress even in the present day.
Ibsen had completely rewritten the rules of drama with a realism which was to be adopted by Chekhov and others and which we see in the theatre to this day. From Ibsen forward, challenging assumptions and directly speaking about issues has been considered one of the factors that makes a play art rather than entertainment. His works were brought to an English-speaking audience largely thanks to the efforts of William Archer and Edmund Gosse. These in turn had a profound influence on the young James Joyce, who venerates him in his early autobiographical novel "Stephen Hero". Ibsen returned to Norway in 1891, but it was in many ways not the Norway he had left. Indeed, he had played a major role in the changes that had happened across society. Modernism was on the rise, not only in the theatre, but across public life. Michael Meyer's translations in the 1950s were welcomed by actors and directors as playable, rather than academic. As The Times newspaper put it, "This, one may think, is how Ibsen might have expressed himself in English."
Ibsen intentionally obscured his influences. However, asked later what he had read when he wrote Catiline, Ibsen replied that he had read only the Danish Norse saga-inspired Romantic tragedian Adam Oehlenschläger and Ludvig Holberg, "the Scandinavian Molière".
At the time when Ibsen was writing, literature was emerging as a formidable force in 19th century society. It was still a relatively new form of popular discussion and entertainment. With the vast increase in literacy towards the end of the century, the possibilities of literature being used for subversion struck horror into the heart of the Establishment. Ibsen's plays, from "A Doll's House" onwards, caused an uproar: not just in Norway, but throughout Europe, and even across the Atlantic in America. No other artist, apart from Richard Wagner, had such an effect internationally, inspiring almost blasphemous adoration and hysterical abuse.
After the publication of "Ghosts", he wrote: "while the storm lasted, I have made many studies and observations and I shall not hesitate to exploit them in my future writings." Indeed, his next play "An Enemy of the People" was initially regarded by the critics as simply his response to the violent criticism which had greeted "Ghosts". Ibsen expected criticism: as he wrote to his publisher: ""Ghosts" will probably cause alarm in some circles, but it can't be helped. If it did not, there would have been no necessity for me to have written it."
Ibsen didn't just read the critical reaction to his plays, he actively corresponded with critics, publishers, theatre directors and newspaper editors on the subject. The interpretation of his work, both by critics and directors, concerned him greatly. He often advised directors on which actor or actress would be suitable for a particular role. [An example of this is a letter he wrote to Hans Schroder in November 1884, with detailed instructions for the production of "The Wild Duck".]
Ibsen's plays initially reached a far wider audience as read plays rather than in performance. It was 20 years, for instance, before the authorities would allow "Ghosts" to be performed in Norway. Each new play that Ibsen wrote, from 1879 onwards, had an explosive effect on intellectual circles. This was greatest for "A Doll’s House" and "Ghosts", and it did lessen with the later plays, but the translation of Ibsen's works into German, French and English during the decade following the initial publication of each play and frequent new productions as and when permission was granted, meant that Ibsen remained a topic of lively conversation throughout the latter decades of the 19th century. When "A Doll’s House" was published, it had an explosive effect: it was the centre of every conversation at every social gathering in Christiana. One hostess even wrote on the invitations to her soirée, “You are politely requested not to mention Mr Ibsen’s new play”.
On 23 May 1906, Ibsen died in his home at Arbins gade 1 in Kristiania (now Oslo) after a series of strokes that began in March 1900. When, on 22 May, his nurse assured a visitor that he was a little better, Ibsen spluttered his last words "On the contrary" ("Tvertimod!"). He died the following day at 2:30 pm.
Ibsen was buried in Vår Frelsers gravlund ("The Graveyard of Our Savior") in central Oslo.
The 100th anniversary of Ibsen's death in 2006 was commemorated with an "Ibsen year" in Norway and other countries. In 2006, the homebuilding company Selvaag also opened "Peer Gynt" Sculpture Park in Oslo, Norway, in Henrik Ibsen's honour, making it possible to follow the dramatic play "Peer Gynt" scene by scene. Will Eno's adaptation of Ibsen's "Peer Gynt", titled "Gnit", had its world premiere at the 37th Humana Festival of New American Plays in March 2013.
On 23 May 2006, the Ibsen Museum in Oslo re-opened to the public the house where Ibsen had spent his last eleven years, completely restored with the original interior, colours, and decor.
The social questions which concerned Ibsen belonged unequivocally to the 19th century. From a modern perspective, the aspects of his writing that appeal most are the psychological issues which he explored. The social issues, taken up so prominently in his own day, have become dated, as has the late-Victorian middle-class setting of his plays. The fact that, whether read or staged, they still possess a compelling power is testament to his enduring quality as a thinker and a dramatist.
On the occasion of the 100th anniversary of Ibsen's death in 2006, the Norwegian government organised the Ibsen Year, which included celebrations around the world. The NRK produced a miniseries on Ibsen's childhood and youth in 2006, "An Immortal Man". Several prizes are awarded in the name of Henrik Ibsen, among them the International Ibsen Award, the Norwegian Ibsen Award and the Ibsen Centennial Commemoration Award.
Every year, since 2008, the annual "Delhi Ibsen Festival", is held in Delhi, India, organized by the Dramatic Art and Design Academy (DADA) in collaboration with The Royal Norwegian Embassy in India. It features plays by Ibsen, performed by artists from various parts of the world in varied languages and styles.
The Ibsen Society of America (ISA) was founded in 1978 at the close of the Ibsen Sesquicentennial Symposium held in New York City to mark the 150th anniversary of Henrik Ibsen's birth. Distinguished Ibsen translator and critic Rolf Fjelde, Professor of Literature at Pratt Institute and the chief organizer of the Symposium, was elected Founding President. In December 1979, the ISA was certified as a non-profit corporation under the laws of the State of New York. Its purpose is to foster through lectures, readings, performances, conferences, and publications an understanding of Ibsen's works as they are interpreted as texts and produced on stage and in film and other media. An annual newsletter "Ibsen News and Comment" is distributed to all members.
Ibsen's ancestry has been a much studied subject, due to his perceived foreignness and due to the influence of his biography and family on his plays. Ibsen often made references to his family in his plays, sometimes by name, or by modelling characters after them.
The oldest documented member of the Ibsen family was ship's captain Rasmus Ibsen (1632–1703) from Stege, Denmark. His son, ship's captain Peder Ibsen became a burgher of Bergen in Norway in 1726. Henrik Ibsen had Danish, German, Norwegian and some distant Scottish ancestry. Most of his ancestors belonged to the merchant class of original Danish and German extraction, and many of his ancestors were ship's captains.
Ibsen's biographer Henrik Jæger famously wrote in 1888 that Ibsen did not have a drop of Norwegian blood in his veins, stating that "the ancestral Ibsen was a Dane". This, however, is not completely accurate; notably through his grandmother Hedevig Paus, Ibsen was descended from one of the very few families of the patrician class of original Norwegian extraction, known since the 15th century. Ibsen's ancestors had mostly lived in Norway for several generations, even though many had foreign ancestry.
The name Ibsen is originally a patronymic, meaning "son of Ib" (Ib is a Danish variant of Jacob). The patronymic became "frozen", i.e. it became a permanent family name, in the 17th century. The phenomenon of patronymics becoming frozen started in the 17th century in bourgeois families in Denmark, and the practice was only widely adopted in Norway from around 1900.
From his marriage with Suzannah Thoresen, Ibsen had one son, lawyer and government minister Sigurd Ibsen. Sigurd Ibsen married Bergljot Bjørnson, the daughter of Bjørnstjerne Bjørnson. Their son was Tancred Ibsen, who became a film director and was married to Lillebil Ibsen; their only child was diplomat Tancred Ibsen, Jr. Sigurd Ibsen's daughter, Irene Ibsen, married Josias Bille, a member of the Danish ancient noble Bille family; their son was Danish actor Joen Bille.
Ibsen was decorated Knight in 1873, Commander in 1892, and with the Grand Cross of the Order of St. Olav in 1893. He received the Grand Cross of the Danish Order of the Dannebrog, and the Grand Cross of the Swedish Order of the Polar Star, and was Knight, First Class of the Order of Vasa.
Well-known stage directors in Austria and Germany such as Theodor Lobe (1833–1905), Paul Barnay (1884–1960), Max Burckhard (1854–1912), Otto Brahm (1856–1912), Carl Heine (1861–1927), Paul Albert Glaeser-Wilken (1874–1942), Victor Barnowsky (1875–1952), Eugen Robert (1877–1944), Leopold Jessner (1878–1945), Ludwig Barnay (1884–1960), Alfred Rotter (1886–1933), Fritz Rotter (1888–1939), (1900–1973) and Peter Zadek (1926–2009) performed the work of Ibsen.
In 1995, the asteroid 5696 Ibsen was named in his memory.
Plays entirely or partly in verse are marked v.
The authoritative translation in the English language for Ibsen remains the 1928 ten-volume version of the "Complete Works of Henrik Ibsen" from Oxford University Press. Many other translations of individual plays by Ibsen have appeared since 1928 though none have purported to be a new version of the complete works of Ibsen.
https://en.wikipedia.org/wiki?curid=14236
Hawaiian language
The Hawaiian language (Hawaiian: "ʻŌlelo Hawaiʻi") is a Polynesian language that takes its name from Hawaii, the largest island in the tropical North Pacific archipelago where it developed. Hawaiian, along with English, is an official language of the State of Hawaii. King Kamehameha III established the first Hawaiian-language constitution in 1839 and 1840.
For various reasons, including territorial legislation establishing English as the official language in schools, the number of native speakers of Hawaiian gradually decreased during the period from the 1830s to the 1950s. Hawaiian was essentially displaced by English on six of seven inhabited islands. In 2001, native speakers of Hawaiian amounted to less than 0.1% of the statewide population. Linguists were unsure if Hawaiian and other endangered languages would survive.
Nevertheless, from around 1949 to the present day, there has been a gradual increase in attention to and promotion of the language. Public Hawaiian-language immersion preschools called Pūnana Leo were established in 1984; other immersion schools followed soon after that. The first students to start in immersion preschool have now graduated from college and many are fluent Hawaiian speakers. The federal government has acknowledged this development. For example, the Hawaiian National Park Language Correction Act of 2000 changed the names of several national parks in Hawaii, observing the Hawaiian spelling. However, the language is still classified as critically endangered by UNESCO.
A creole language, Hawaiian Pidgin (or Hawaii Creole English, HCE), is more commonly spoken in Hawaii than Hawaiian. Some linguists, as well as many locals, argue that Hawaiian Pidgin is a dialect of American English.
The Hawaiian alphabet has 13 letters: five vowels, a e i o u (each with a long pronunciation and a short one), and eight consonants, he ke la mu nu pi we, plus a glottal stop called the ʻokina.
The Hawaiian language takes its name from the largest island in the Hawaiian state, Hawaii ("Hawaiʻi" in the Hawaiian language). The island name was first written in English in 1778 by British explorer James Cook and his crew members. They wrote it as "Owhyhee" or "Owhyee". Explorers Mortimer (1791) and Otto von Kotzebue (1821) used that spelling.
The initial "O" in the name is a reflection of the fact that Hawaiian predicates unique identity by using a copula form, "ʻo", immediately before a proper noun. Thus, in Hawaiian, the name of the island is expressed by saying "ʻO Hawaiʻi", which means "[This] is Hawaiʻi." The Cook expedition also wrote "Otaheite" rather than "Tahiti."
The spelling "why" in the name reflects the [hw] pronunciation of "wh" in 18th-century English (still used in parts of the English-speaking world). The spelling "hee" or "ee" in the name represents the final syllable of the island's name.
Putting the parts together, "O-why-(h)ee" is a reasonable English approximation of the native pronunciation of "ʻO Hawaiʻi".
American missionaries bound for Hawaii used the phrases "Owhihe Language" and "Owhyhee language" in Boston prior to their departure in October 1819 and during their five-month voyage to Hawaii. They still used such phrases as late as March 1822. However, by July 1823, they had begun using the phrase "Hawaiian Language."
In Hawaiian, "ʻōlelo Hawaiʻi" means literally "language Hawaiian," since adjectives follow nouns.
Hawaiian is a Polynesian member of the Austronesian language family. It is closely related to other Polynesian languages, such as Samoan, Marquesan, Tahitian, Māori, Rapa Nui (the language of Easter Island) and Tongan.
According to Schütz (1994), the Marquesans colonized the archipelago in roughly 300 CE followed by later waves of immigration from the Society Islands and Samoa-Tonga. Their languages, over time, became the Hawaiian language within the Hawaiian Islands. Kimura and Wilson (1983) also state: Linguists agree that Hawaiian is closely related to Eastern Polynesian, with a particularly strong link in the Southern Marquesas, and a secondary link in Tahiti, which may be explained by voyaging between the Hawaiian and Society Islands.
The genetic history of the Hawaiian language is demonstrated primarily through the application of lexicostatistics, which involves quantitative comparison of lexical cognates, and the comparative method. Both the number of cognates and the phonological similarity of cognates are measures of language relationship.
The following table provides a limited lexicostatistical data set for ten numbers. The asterisk (*) is used to show that these are hypothetical, reconstructed forms. In the table, the year date of the modern forms is rounded off to 2000 CE to emphasize the 6000-year time lapse since the PAN era.
Note: For the number "10", the Tongan form in the table is part of the word "hongofulu" ('ten'). The Hawaiian cognate is part of the word "anahulu" ('ten days'); however, the more common word for "10" used in counting and quantifying is "ʻumi", a different root.
Application of the lexicostatistical method to the data in the table will show the four languages to be related to one another, with Tagalog having 100% cognacy with PAN, while Hawaiian and Tongan have 100% cognacy with each other, but 90% with Tagalog and PAN. This is because the forms for each number are cognates, except the Hawaiian and Tongan words for the number "1", which are cognate with each other, but not with Tagalog and PAN. When the full set of 200 meanings is used, the percentages will be much lower. For example, Elbert found Hawaiian and Tongan to have 49% (98 ÷ 200) shared cognacy. This points out the importance of data-set size for this method, where less data leads to cruder results, while more data leads to better results.
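The cognacy arithmetic above is straightforward; a minimal sketch in Python (only the 98-of-200 Hawaiian–Tongan figure and the 9-of-10 number-list figure come from the text; the function name is illustrative):

```python
def cognacy_percentage(shared_cognates: int, total_meanings: int) -> float:
    """Percentage of meanings whose forms are cognate between two languages."""
    return 100.0 * shared_cognates / total_meanings

# Elbert's figure for Hawaiian and Tongan on a 200-meaning list:
print(cognacy_percentage(98, 200))  # 49.0

# On the ten-number list, 9 of the 10 forms are cognate with
# Tagalog and PAN (all except the word for "1"), hence 90%:
print(cognacy_percentage(9, 10))    # 90.0
```

As the text notes, the smaller the meaning list, the cruder the estimate: a single non-cognate item shifts the ten-item percentage by a full 10 points, but the 200-item percentage by only 0.5.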
Application of the comparative method will show partly different genetic relationships. It will point out sound changes, such as:
This method will recognize sound change #1 as a shared innovation of Hawaiian and Tongan. It will also take the Hawaiian and Tongan cognates for "1" as another shared innovation. Due to these exclusively shared features, Hawaiian and Tongan are found to be more closely related to one another than either is to Tagalog or Amis.
The forms in the table show that the Austronesian vowels tend to be relatively stable, while the consonants are relatively volatile. It is also apparent that the Hawaiian words for "3", "5", and "8" have remained essentially unchanged for 6000 years.
In 1778, British explorer James Cook made the first recorded European contact with Hawaiʻi, beginning a new phase in the development of Hawaiian. During the next forty years, the sounds of Spanish (1789), Russian (1804), French (1816), and German (1816) arrived in Hawaii via other explorers and businessmen. Hawaiian began to be written for the first time, largely restricted to isolated names and words, and word lists collected by explorers and travelers.
The early explorers and merchants who first brought European languages to the Hawaiian islands also took on a few native crew members, who brought the Hawaiian language into new territory. Hawaiians took these nautical jobs because their traditional way of life had changed due to plantations, and although there were not enough of these Hawaiian-speaking sailors to establish viable speech communities abroad, they still had a noticeable presence. One of them, a boy in his teens known as Obookiah (ʻŌpūkahaʻia), had a major impact on the future of the language. He sailed to New England, where he eventually became a student at the Foreign Mission School in Cornwall, Connecticut. He inspired New Englanders to support a Christian mission to Hawaii, and provided information on the Hawaiian language to the American missionaries there prior to their departure for Hawaii in 1819.
Like all natural spoken languages, Hawaiian was originally a purely oral language. Native speakers of Hawaiian relayed religion, traditions, history, and views of their world through stories that were handed down from generation to generation. One form of storytelling most commonly associated with the Hawaiian islands is hula. Nathaniel B. Emerson notes that "It kept the communal imagination in living touch with the nation's legendary past".
The islanders' connection with their stories is argued to be one reason why Captain James Cook received a pleasant welcome. Marshall Sahlins has observed that Hawaiian folktales began bearing similar content to those of the Western world in the eighteenth century. He argues this was caused by the timing of Captain Cook's arrival, which was coincidentally when the indigenous Hawaiians were celebrating the Makahiki festival. The islanders' story foretold of the god Lono's return at the time of the Makahiki festival.
In 1820, Protestant missionaries from New England arrived in Hawaii.
Adelbert von Chamisso might have consulted with a native speaker of Hawaiian in Berlin, Germany, before publishing his grammar of Hawaiian ("Über die Hawaiische Sprache") in 1837. When Hawaiian King David Kalākaua took a trip around the world, he brought his native language with him. When his wife, Queen Kapiolani, and his sister, Princess (later Queen) Liliuokalani, took a trip across North America and on to the British Isles in 1887, Liliuokalani's composition "Aloha Oe" was already a famous song in the U.S.
In 1834, the first Hawaiian-language newspapers were published by missionaries working with locals. The missionaries also played a significant role in publishing a vocabulary (1836), a grammar (1854), and a dictionary (1865) of Hawaiian. Literacy in Hawaiian was widespread among the local population, especially ethnic Hawaiians. Use of the language among the general population might have peaked around 1881. Even so, some people worried, as early as 1854, that the language was "soon destined to extinction."
The decline of the Hawaiian language dates back to a coup that overthrew the Hawaiian monarchy and dethroned the existing Hawaiian queen. Thereafter, a law was instituted that banned the Hawaiian language from being taught. The law cited as banning the Hawaiian language is identified as Act 57, sec. 30 of the 1896 Laws of the Republic of Hawaii:
This law established English as the medium of instruction for the government-recognized schools both "public and private". While it did not ban or make illegal the Hawaiian language in other contexts, its implementation in the schools had far-reaching effects. Those who had been pushing for English-only schools took this law as licence to extinguish the native language at the early education level. While the law stopped short of making Hawaiian illegal (it was still the dominant language spoken at the time), many children who spoke Hawaiian at school, including on the playground, were disciplined. This included corporal punishment and going to the home of the offending child to advise them strongly to stop speaking it in their home. Moreover, the law specifically provided for teaching languages "in addition to the English language," reducing Hawaiian to the status of a foreign language, subject to approval by the Department. Hawaiian was not taught initially in any school, including the all-Hawaiian Kamehameha Schools. This is largely because when these schools were founded, like Kamehameha Schools founded in 1887 (nine years before this law), Hawaiian was being spoken in the home. Once this law was enacted, individuals at these institutions took it upon themselves to enforce a ban on Hawaiian. Beginning in 1900, Mary Kawena Pukui, who was later the co-author of the Hawaiian–English Dictionary, was punished for speaking Hawaiian by being rapped on the forehead, allowed to eat only bread and water for lunch, and denied home visits on holidays. Winona Beamer was expelled from Kamehameha Schools in 1937 for chanting Hawaiian.
In 1949, the legislature of the Territory of Hawaii commissioned Mary Pukui and Samuel Elbert to write a new dictionary of Hawaiian, either revising the Andrews-Parker work or starting from scratch. Pukui and Elbert took a middle course, using what they could from the Andrews dictionary, but making certain improvements and additions that were more significant than a minor revision. The dictionary they produced, in 1957, introduced an era of gradual increase in attention to the language and culture.
Efforts to promote the language have increased in recent decades. Hawaiian-language "immersion" schools are now open to children whose families want to reintroduce the Hawaiian language for future generations. The ʻAha Pūnana Leo's Hawaiian language preschools in Hilo, Hawaii, have received international recognition. The local National Public Radio station features a short segment titled "Hawaiian word of the day" and a Hawaiian language news broadcast. Honolulu television station KGMB ran a weekly Hawaiian language program, "Āhai Ōlelo Ola", as recently as 2010. Additionally, the Sunday editions of the "Honolulu Star-Advertiser", the largest newspaper in Hawaii, feature a brief article called "Kauakukalahale" written entirely in Hawaiian by teachers, students, and community members.
Today, the number of native speakers of Hawaiian, which was under 0.1% of the statewide population in 1997, has risen to 2,000, out of 24,000 total who are fluent in the language, according to the US 2011 census. On six of the seven permanently inhabited islands, Hawaiian has been largely displaced by English, but on Niihau, native speakers of Hawaiian have remained fairly isolated and have continued to use Hawaiian almost exclusively.
The isolated island of Niʻihau, located off the southwest coast of Kauai, is the one island where Hawaiian (more specifically a local dialect of Hawaiian known as Niihau dialect) is still spoken as the language of daily life.
Hawaiians had no written language prior to Western contact, except for petroglyph symbols.
The modern Hawaiian alphabet, "ka pīʻāpā Hawaiʻi", is based on the Latin script. Hawaiian words end only in vowels, and every consonant must be followed by a vowel. The Hawaiian alphabetical order places all of the vowels before the consonants.
This writing system was developed by American Protestant missionaries during 1820–1826. It was the first thing they ever printed in Hawaii, on January 7, 1822, and it originally included the consonants "B, D, R, T," and "V," in addition to the current ones ("H, K, L, M, N, P, W"), and it had "F, G, S, Y" and "Z" for "spelling foreign words". The initial printing also showed the five vowel letters ("A, E, I, O, U") and seven of the short diphthongs ("AE, AI, AO, AU, EI, EU, OU").
In 1826, the developers voted to eliminate some of the letters which represented functionally redundant allophones (called "interchangeable letters"), enabling the Hawaiian alphabet to approach the ideal state of one-symbol-one-phoneme, and thereby optimizing the ease with which people could teach and learn the reading and writing of Hawaiian. For example, instead of spelling one and the same word as "pule, bule, pure," and "bure" (because of interchangeable "p/b" and "l/r"), the word is spelled only as "pule".
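The 1826 merger of "interchangeable letters" is, in effect, a normalization mapping. A minimal sketch (covering only the two pairs the text documents, "p/b" and "l/r"; the function name is illustrative):

```python
# The 1826 reform collapsed each pair of interchangeable letters
# onto a single spelling: b -> p and r -> l, per the "pule" example.
MERGERS = {"b": "p", "r": "l"}

def standardize(spelling: str) -> str:
    """Map a pre-1826 variant spelling onto the post-reform spelling."""
    return "".join(MERGERS.get(ch, ch) for ch in spelling)

# All four pre-reform variants reduce to the single spelling "pule":
print({standardize(w) for w in ["pule", "bule", "pure", "bure"]})  # {'pule'}
```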
However, hundreds of words were very rapidly borrowed into Hawaiian from English, Greek, Hebrew, Latin, and Syriac. Although these loan words were necessarily Hawaiianized, they often retained some of their "non-Hawaiian letters" in their published forms. For example, "Brazil" fully Hawaiianized is "Palakila", but retaining "foreign letters" it is "Barazila". Another example is "Gibraltar", written as "Kipalaleka" or "Gibaraleta". While and are not regarded as Hawaiian sounds, , , and were represented in the original alphabet, so the letters ("b", "r", and "t") for the latter are not truly "non-Hawaiian" or "foreign", even though their post-1826 use in published matter generally marked words of foreign origin.
"ʻOkina" ("oki" 'cut' + "-na" '-ing') is the modern Hawaiian name for the symbol (a letter) that represents the glottal stop. It was formerly known as "uina" ("snap").
For examples of the ʻokina, consider the Hawaiian words "Hawaiʻi" and "Oʻahu" (often simply "Hawaii" and "Oahu" in English orthography). In Hawaiian, both words contain a glottal stop, and each is written with an ʻokina where the glottal stop is pronounced.
Elbert & Pukui's "Hawaiian Grammar" says "The glottal stop, ‘, is made by closing the glottis or space between the vocal cords, the result being something like the hiatus in English "oh-oh"."
As early as 1823, the missionaries made some limited use of the apostrophe to represent the glottal stop, but they did not make it a letter of the alphabet. In publishing the Hawaiian Bible, they used it to distinguish "koʻu" ('my') from "kou" ('your'). In 1864, William DeWitt Alexander published a grammar of Hawaiian in which he made it clear that the glottal stop (calling it "guttural break") is definitely a true consonant of the Hawaiian language. He wrote it using an apostrophe. In 1922, the Andrews-Parker dictionary of Hawaiian made limited use of the opening single quote symbol, then called "reversed apostrophe" or "inverse comma", to represent the glottal stop. Subsequent dictionaries and written material associated with the Hawaiian language revitalization have preferred to use this symbol, the "okina", to better represent spoken Hawaiian. Nonetheless, excluding the "okina" may facilitate interface with English-oriented media, or even be preferred stylistically by some Hawaiian speakers, in homage to 19th century written texts. So there is variation today in the use of this symbol.
The okina is written in various ways for electronic uses:
Because many people who want to write the ʻokina are not familiar with these specific characters and/or do not have access to the appropriate fonts and input and display systems, it is sometimes written with more familiar and readily available characters:
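In Unicode, the preferred character for the ʻokina is U+02BB MODIFIER LETTER TURNED COMMA; the apostrophe and the left single quotation mark are among the more familiar stand-ins. A small Python sketch of normalizing the substitutes back to U+02BB (the function and the exact substitute set are illustrative assumptions, not a standard API):

```python
# U+02BB MODIFIER LETTER TURNED COMMA is the recommended character
# for the ʻokina; the apostrophe (U+0027), left single quotation
# mark (U+2018), and grave accent (U+0060) appear as substitutes.
OKINA = "\u02bb"
SUBSTITUTES = ("\u0027", "\u2018", "\u0060")  # ' ‘ `

def normalize_okina(text: str) -> str:
    """Replace common ʻokina stand-ins with U+02BB."""
    for sub in SUBSTITUTES:
        text = text.replace(sub, OKINA)
    return text

print(normalize_okina("Hawai'i"))  # Hawaiʻi, spelled with U+02BB
```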
A modern Hawaiian name for the macron symbol is "kahakō" ("kaha" 'mark' + "kō" 'long'). It was formerly known as "mekona" (Hawaiianization of "macron"). It can be written as a diacritical mark which looks like a hyphen or dash written above a vowel, i.e., "ā ē ī ō ū" and "Ā Ē Ī Ō Ū". It is used to show that the marked vowel is a "double", or "geminate", or "long" vowel, in phonological terms. (See: Vowel length)
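Each kahakō vowel also exists as a precomposed Unicode code point (for example, ā is U+0101), which decomposes under NFD normalization into the plain vowel plus U+0304 COMBINING MACRON. This can be checked with Python's standard `unicodedata` module:

```python
import unicodedata

# Each vowel with kahakō has a precomposed code point (e.g. ā is
# U+0101) that NFD splits into the base vowel + U+0304 COMBINING
# MACRON; NFC recomposes the pair into the single code point.
for v in "āēīōū":
    decomposed = unicodedata.normalize("NFD", v)
    print(v, hex(ord(v)), "->", [hex(ord(c)) for c in decomposed])

assert unicodedata.normalize("NFC", "a\u0304") == "\u0101"  # ā
```

Normalizing to one of these forms is useful when comparing Hawaiian text from different sources, since both encodings render identically.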
As early as 1821, at least one of the missionaries, Hiram Bingham, was using macrons (and breves) in making handwritten transcriptions of Hawaiian vowels. The missionaries specifically requested their sponsor in Boston to send them some type (fonts) with accented vowel characters, including vowels with macrons, but the sponsor made only one response and sent the wrong font size (pica instead of small pica). Thus, they could not print ā, ē, ī, ō, nor ū (at the right size), even though they wanted to.
Due to extensive allophony, Hawaiian has more than 13 phones. Although vowel length is phonemic, long vowels are not always pronounced as such, even though under the rules for assigning stress in Hawaiian, a long vowel will always receive stress.
Hawaiian is known for having very few consonant phonemes – eight: /p, k ~ t, ʔ, h, m, n, l, w ~ v/. It is notable that Hawaiian has allophonic variation of [t] with [k], [w] with [v], and (in some dialects) [l] with [n]. The [t]–[k] variation is quite unusual among the world's languages, and is likely a product both of the small number of consonants in Hawaiian, and the recent shift of historical *t to modern [t]–[k], after historical *k had shifted to [ʔ]. In some dialects, /k/ remains as [t] in some words. These variations are largely free, though there are conditioning factors. /l/ tends to [n] especially in words with both /l/ and /n/, such as in the island name "Lānaʻi" (pronounced with either [l] or [n]), though this is not always the case: "ʻeleʻele" or "ʻeneʻene" "black". The [k] allophone is almost universal at the beginnings of words, whereas [t] is most common before the vowel /i/. [v] is also the norm after /i/ and /e/, whereas [w] is usual after /u/ and /o/. After /a/ and initially, however, [w] and [v] are in free variation.
Hawaiian has five short and five long vowels, plus diphthongs.
Hawaiian has five pure vowels. The short vowels are /u, i, o, e, a/, and the long vowels, if they are considered separate phonemes rather than simply sequences of like vowels, are /uː, iː, oː, eː, aː/. When stressed, short /e/ and /a/ have been described as becoming [ɛ] and [ɐ], while when unstressed they are [e] and [ə]. Parker Jones (2017), however, did not find a reduction of /a/ to [ə] in the phonetic analysis of a young speaker from Hilo, Hawaiʻi; so there is at least some variation in how /a/ is realised. /e/ also tends to become [ɛ] next to /l/, /n/, and another [ɛ], as in "Pele" [pɛlɛ]. Some grammatical particles vary between short and long vowels. These include "a" and "o" "of", "ma" "at", "na" and "no" "for". Between a back vowel /o/ or /u/ and a following non-back vowel (/a e i/), there is an epenthetic [w], which is generally not written. Between a front vowel /e/ or /i/ and a following non-front vowel (/a o u/), there is an epenthetic [j] (a "y" sound), which is never written.
The short-vowel diphthongs are /iu, ou, oi, eu, ei, au, ai, ao, ae/. In all except perhaps /iu/, these are falling diphthongs. However, they are not as tightly bound as the diphthongs of English, and may be considered vowel sequences. (The second vowel in such sequences may receive the stress, but in such cases it is not counted as a diphthong.) In fast speech, /ai/ tends to [ei] and /au/ tends to [ou], conflating these diphthongs with /ei/ and /ou/.
There are only a limited number of vowels which may follow long vowels, and some authors treat these sequences as diphthongs as well.
Hawaiian syllable structure is (C)V. All CV syllables occur except for "wū"; "wu" occurs only in two words borrowed from English. As shown by Schütz, Hawaiian word-stress is predictable in words of one to four syllables, but not in words of five or more syllables. Hawaiian phonological processes include palatalization and deletion of consonants, as well as raising, diphthongization, deletion, and compensatory lengthening of vowels.
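A (C)V syllable canon means every syllable is a single vowel, optionally preceded by exactly one consonant, so a word can be syllabified with a simple left-to-right scan. A hypothetical sketch (it treats long vowels as single letters and splits diphthongs into separate syllables, which is a simplification):

```python
CONSONANTS = set("hklmnpw") | {"ʻ"}
VOWELS = set("aeiouāēīōū")

def syllabify(word: str) -> list[str]:
    """Split a Hawaiian word into (C)V syllables: each syllable is a
    single vowel, optionally preceded by exactly one consonant."""
    syllables, pending = [], ""
    for ch in word.lower():
        if ch in CONSONANTS:
            if pending:
                raise ValueError("consonant clusters do not occur in Hawaiian")
            pending = ch
        elif ch in VOWELS:
            syllables.append(pending + ch)
            pending = ""
        else:
            raise ValueError(f"not a Hawaiian letter: {ch!r}")
    if pending:
        raise ValueError("Hawaiian words end in vowels")
    return syllables

print(syllabify("aloha"))    # ['a', 'lo', 'ha']
print(syllabify("hawaiʻi"))  # ['ha', 'wa', 'i', 'ʻi']
```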
Historically, glottal stop developed from *k. Loss of intervocalic consonant phonemes has resulted in Hawaiian long vowels and diphthongs.
Hawaiian is an analytic language with verb–subject–object word order. Verbs are not inflected; instead, as in other Austronesian languages, declension appears in the personal pronouns, which distinguish a-class and o-class genitive forms in order to indicate inalienable possession in a binary possessive class system. Also like many Austronesian languages, Hawaiian pronouns employ separate words for inclusive and exclusive we (clusivity), and distinguish singular, dual, and plural. The grammatical function of verbs is marked by adjacent particles (short words) and by their relative positions, which indicate tense–aspect–mood.
Some examples of verb phrase patterns:
Nouns can be marked with articles:
"ka" and "ke" are singular definite articles. "ke" is used before words beginning with a-, e-, o- and k-, and with some words beginning - and p-. "ka" is used in all other cases. "nā" is the plural definite article.
To show part of a group, the word "kekahi" is used. To show a bigger part, "mau" is inserted to pluralize the subject.
Examples:
Second Polish Republic
The Second Polish Republic, commonly known as interwar Poland, refers to the country of Poland in the period between the two World Wars (1918–1939). Officially known as the Republic of Poland (Rzeczpospolita Polska), the Polish state was re-established in 1918, in the aftermath of World War I. The Second Republic ceased to exist in 1939, when Poland was invaded by Nazi Germany, the Soviet Union and the Slovak Republic, marking the beginning of the European theatre of World War II.
In 1938, the Second Republic was the sixth largest country in Europe. According to the 1921 census, the number of inhabitants was 27.2 million. By 1939, just before the outbreak of World War II, this had grown to an estimated 35.1 million. Almost a third of the population came from minority groups: 13.9% Ruthenians; 10% Ashkenazi Jews; 3.1% Belarusians; 2.3% Germans and 3.4% Czechs and Lithuanians. At the same time, a significant number of ethnic Poles lived outside the country's borders.
When, after several regional conflicts, the borders of the state were finalized in 1922, Poland's neighbors were Czechoslovakia, Germany, the Free City of Danzig, Lithuania, Latvia, Romania and the Soviet Union. It had access to the Baltic Sea via a short strip of coastline on either side of the city of Gdynia, known as the Polish Corridor. Between March and August 1939, Poland also shared a border with the then-Hungarian governorate of Subcarpathia. The political conditions of the Second Republic were heavily influenced by the aftermath of World War I and conflicts with neighbouring states as well as the emergence of Nazism in Germany.
The Second Republic maintained moderate economic development. The cultural hubs of interwar Poland (Warsaw, Kraków, Poznań, Wilno and Lwów) became major European cities and the sites of internationally acclaimed universities and other institutions of higher education.
The political borders of the Second Republic were distinctly different from the modern Third Republic, as it contained less territory in the west and more territory in the east.
After more than a century of Partitions between the Austrian, the Prussian, and the Russian imperial powers, Poland re-emerged as a sovereign state at the end of the First World War in Europe in 1917–1918. The victorious Allies of World War I confirmed the rebirth of Poland in the Treaty of Versailles of June 1919. It was one of the great stories of the 1919 Paris Peace Conference. Poland solidified its independence in a series of border wars fought by the newly formed Polish Army from 1918 to 1921. The extent of the eastern half of the interwar territory of Poland was settled diplomatically in 1922 and internationally recognized by the League of Nations.
In the course of World War I (1914-1918), Germany gradually gained overall dominance on the Eastern Front as the Imperial Russian Army fell back. German and Austro-Hungarian armies seized the Russian-ruled part of what became Poland. In a failed attempt to resolve the Polish question as quickly as possible, Berlin set up a German puppet state on 5 November 1916, with a governing Provisional Council of State and (from 15 October 1917) a Regency Council ("Rada Regencyjna Królestwa Polskiego"). The Council administered the country under German auspices (see also Mitteleuropa), pending the election of a king. A month before Germany surrendered on 11 November 1918 and the war ended, the Regency Council had dissolved the Council of State, and announced its intention to restore Polish independence (7 October 1918). With the notable exception of the Marxist-oriented Social Democratic Party of the Kingdom of Poland and Lithuania ("SDKPiL"), most Polish political parties supported this move. On 23 October the Regency Council appointed a new government under Józef Świeżyński and began conscription into the Polish Army.
In 1918–1919, over 100 workers' councils sprang up on Polish territories; on 5 November 1918, in Lublin, the first Soviet of Delegates was established. On 6 November socialists proclaimed the Republic of Tarnobrzeg at Tarnobrzeg in Austrian Galicia. The same day the Socialist, Ignacy Daszyński, set up a Provisional People's Government of the Republic of Poland ("Tymczasowy Rząd Ludowy Republiki Polskiej") in Lublin. On Sunday, 10 November at 7 a.m., Józef Piłsudski, newly freed from 16 months in a German prison in Magdeburg, returned by train to Warsaw. Piłsudski, together with Colonel Kazimierz Sosnkowski, was greeted at Warsaw's railway station by Regent Zdzisław Lubomirski and by Colonel Adam Koc. Next day, due to his popularity and support from most political parties, the Regency Council appointed Piłsudski as Commander in Chief of the Polish Armed Forces. On 14 November, the Council dissolved itself and transferred all its authority to Piłsudski as Chief of State ("Naczelnik Państwa"). After consultation with Piłsudski, Daszyński's government dissolved itself and a new government formed under Jędrzej Moraczewski. In 1918 Italy became the first country in Europe to recognize Poland's renewed sovereignty.
Centers of government that formed at that time in Galicia (formerly Austrian-ruled southern Poland) included the National Council of the Principality of Cieszyn (established in November 1918), the Republic of Zakopane and the Polish Liquidation Committee (28 October). Soon afterward, the Polish–Ukrainian War broke out in Lwów (1 November 1918) between forces of the Military Committee of Ukrainians and the Polish irregular units made up of students known as the Lwów Eaglets, who were later supported by the Polish Army (see Battle of Lwów (1918), Battle of Przemyśl (1918)). Meanwhile, in western Poland, another war of national liberation began under the banner of the Greater Poland uprising (1918–19). In January 1919 Czechoslovakian forces attacked Polish units in the area of Zaolzie (see Polish–Czechoslovak War). Soon afterwards the Polish–Lithuanian War (ca 1919-1920) began, and in August 1919 Polish-speaking residents of Upper Silesia initiated a series of three Silesian Uprisings. The most critical military conflict of that period, however, the Polish–Soviet War (1919-1921), ended in a decisive Polish victory. In 1919 the Warsaw government suppressed the Republic of Tarnobrzeg and the workers' councils.
The Second Polish Republic was a parliamentary democracy from 1919 (see Small Constitution of 1919) to 1926, with the President having limited powers. The Parliament elected him, and he could appoint the Prime Minister as well as the government with the Sejm's (lower house's) approval, but he could only dissolve the Sejm with the Senate's consent. Moreover, his power to pass decrees was limited by the requirement that the Prime Minister and the appropriate other Minister had to verify his decrees with their signatures. Poland was one of the first countries in the world to recognize women's suffrage. Women in Poland were granted the right to vote on 28 November 1918 by a decree of Józef Piłsudski.
The major political parties at this time were the Polish Socialist Party, National Democrats, various Peasant Parties, Christian Democrats, and political groups of ethnic minorities (German: German Social Democratic Party of Poland, Jewish: General Jewish Labour Bund in Poland, United Jewish Socialist Workers Party, and Ukrainian: Ukrainian National Democratic Alliance). Frequently changing governments (see 1919 Polish legislative election, 1922 Polish legislative election) and other negative publicity the politicians received (such as accusations of corruption or 1919 Polish coup attempt), made them increasingly unpopular. Major politicians at this time, in addition to Piłsudski, included peasant activist Wincenty Witos (Prime Minister three times) and right-wing leader Roman Dmowski. Ethnic minorities were represented in the Sejm; e.g. in 1928 – 1930 there was the Ukrainian-Belarusian Club, with 26 Ukrainian and 4 Belarusian members.
After the Polish–Soviet War, Marshal Piłsudski led an intentionally modest life, writing historical books for a living. After he took power in a military coup in May 1926, he emphasized that he wanted to heal Polish society and politics of excessive partisan politics. His regime, accordingly, was called Sanacja in Polish. The 1928 parliamentary elections were still considered free and fair, although the pro-Piłsudski Nonpartisan Bloc for Cooperation with the Government won them. The following three parliamentary elections (in 1930, 1935 and 1938) were manipulated, with opposition activists sent to Bereza Kartuska prison (see also Brest trials). As a result, the pro-government party Camp of National Unity won huge majorities in them. Piłsudski died just after an authoritarian constitution was approved in the spring of 1935. During the last four years of the Second Polish Republic, the major politicians included President Ignacy Mościcki, Foreign Minister Józef Beck and the Commander-in-Chief of the Polish Army, Edward Rydz-Śmigły. The country was divided into 104 electoral districts. Politicians who were forced to leave Poland founded the Front Morges in 1936. The government that ruled the Second Polish Republic in its final years is frequently referred to as Piłsudski's colonels.
Interwar Poland maintained a sizable army of 950,000 soldiers on active duty: 37 infantry divisions, 11 cavalry brigades, and two armored brigades, plus artillery units. Another 700,000 men served in the reserves. At the outbreak of the war, the Polish army was able to put in the field almost one million soldiers, 4,300 guns, 1,280 tanks and 745 aircraft.
The training of the Polish army was thorough. The N.C.O.s were a competent body of men with expert knowledge and high ideals. The officers, both senior and junior, constantly refreshed their training in the field and in the lecture-hall, where modern technical achievement and the lessons of contemporary wars were demonstrated and discussed. The equipment of the Polish army was less developed technically than that of Nazi Germany and its rearmament was slowed down by confidence in Western European military support and by budget difficulties.
After regaining its independence, Poland was faced with major economic difficulties. In addition to the devastation wrought by World War I, the exploitation of the Polish economy by the German and Russian occupying powers, and the sabotage performed by retreating armies, the new republic was faced with the task of economically unifying disparate economic regions, which had previously been part of different countries. Within the borders of the Republic were the remnants of three different economic systems, with five different currencies (the German mark, the Russian ruble, the Austrian crown, the Polish marka and the Ostrubel) and with little or no direct infrastructural links. The situation was so bad that neighboring industrial centers as well as major cities lacked direct railroad links, because they had been parts of different nations. For example, there was no direct railroad connection between Warsaw and Kraków until 1934. This situation was described by Melchior Wańkowicz in his book Sztafeta.
On top of this was the massive destruction left by both World War I and the Polish–Soviet War. There was also a great economic disparity between the eastern (commonly called "Poland B") and western (called "Poland A") parts of the country, with the western half, especially areas that had belonged to the German Empire, being much more developed and prosperous. Frequent border closures and a customs war with Germany also had negative economic impacts on Poland. In 1924, Prime Minister and Economic Minister Władysław Grabski introduced the złoty as the single common currency of Poland (replacing the Polish marka), and it remained stable. The currency reform helped Poland bring its massive hyperinflation under control; Poland was the only country in Europe able to do this without foreign loans or aid. The average annual growth rate (GDP per capita) was 5.24% in 1920–29 and 0.34% in 1929–38.
Hostile relations with its neighbors were a major problem for the economy of interbellum Poland. In 1937, foreign trade with all its neighbors amounted to only 21% of Poland's total. Trade with Germany, Poland's most important neighbor, accounted for 14.3% of Polish exchange. Foreign trade with the Soviet Union (0.8%) was virtually nonexistent. Czechoslovakia accounted for 3.9%, Latvia for 0.3%, and Romania for 0.8%. By mid-1938, after the Anschluss of Austria, Greater Germany was responsible for as much as 23% of Polish foreign trade.
The basis of Poland's gradual recovery after the Great Depression was its mass economic development plans (see Four Year Plan), which oversaw the building of three key pieces of infrastructure. The first was the establishment of the Gdynia seaport, which allowed Poland to completely bypass Gdańsk (which was under heavy German pressure to boycott Polish coal exports). The second was the construction of the 500-kilometer rail connection between Upper Silesia and Gdynia, called the Polish Coal Trunk-Line, which served coal freight trains. The third was the creation of a central industrial district, named "COP – Central Industrial Region" (Centralny Okręg Przemysłowy). These developments were interrupted and largely destroyed by the German and Soviet invasion and the start of World War II. Other achievements of interbellum Poland included Stalowa Wola (a brand new city, built in a forest around a steel mill), Mościce (now a district of Tarnów, with a large nitrate factory), and the creation of a central bank. There were several trade fairs, the most popular being the Poznań International Fair, Lwów's Targi Wschodnie, and Wilno's Targi Północne. Polish Radio had ten stations (see Radio stations in interwar Poland), with an eleventh planned to open in the autumn of 1939. Furthermore, in 1935 Polish engineers began working on television services, and by early 1939 experts at Polish Radio had built four TV sets. The first movie broadcast by the experimental Polish TV service was Barbara Radziwiłłówna, and regular TV service was scheduled to begin operation by 1940.
Interbellum Poland was also a country with numerous social problems. Unemployment was high, and poverty in the countryside was widespread, resulting in several cases of social unrest, such as the 1923 Kraków riot and the 1937 peasant strike in Poland. There were conflicts with national minorities, such as the Pacification of Ukrainians in Eastern Galicia (1930), and relations with Poland's neighbors were sometimes complicated (see the Soviet raid on Stołpce, Polish–Czechoslovak border conflicts, and the 1938 Polish ultimatum to Lithuania). On top of this, there were natural disasters, such as the 1934 flood in Poland.
Interbellum Poland was unofficially divided into two parts – the better developed "Poland A" in the west, and the underdeveloped "Poland B" in the east. Polish industry was concentrated in the west, mostly in Polish Upper Silesia and the adjacent Lesser Poland province of Zagłębie Dąbrowskie, where the bulk of the coal mines and steel plants were located. Furthermore, heavy industry plants were located in Częstochowa ("Huta Częstochowa", founded in 1896), Ostrowiec Świętokrzyski ("Huta Ostrowiec", founded in 1837–1839), Stalowa Wola (a brand new industrial city, built from scratch in 1937–1938), Chrzanów ("Fablok", founded in 1919), Jaworzno, Trzebinia (an oil refinery, opened in 1895), Łódź (the seat of the Polish textile industry), Poznań (H. Cegielski – Poznań), Kraków and Warsaw (Ursus Factory). Further east, in Kresy, industrial centers included the two major cities of the region – Lwów and Wilno (Elektrit).
Besides coal mining, Poland also had deposits of oil in Borysław, Drohobycz, Jasło and Gorlice (see Polmin), potassium salt (TESP), and basalt (Janowa Dolina). Beyond the already-existing industrial areas, in the mid-1930s an ambitious, state-sponsored project, the Central Industrial Region, was started under Minister Eugeniusz Kwiatkowski. One characteristic feature of the interbellum Polish economy was the gradual nationalization of major plants. This was the case with the Ursus Factory (see Państwowe Zakłady Inżynieryjne) and several steelworks, such as "Huta Pokój" in Ruda Śląska – Nowy Bytom, "Huta Królewska" in Chorzów – Królewska Huta, and "Huta Laura" in Siemianowice Śląskie, as well as the "Scheibler and Grohman Works" in Łódź.
According to the 1939 Statistical Yearbook of Poland, the total length of Poland's railways (as of 31 December 1937) was . Rail density was per . Railways were very dense in the western part of the country, while in the east, especially in Polesie, rail was non-existent in some counties. During the interbellum period, the Polish government constructed several new lines, mainly in the central part of the country (see also Polish State Railroads Summer 1939). Construction of the extensive Warszawa Główna railway station was never finished due to the war. Polish railroads were famous for their punctuality (see Luxtorpeda, Strzała Bałtyku, Latający Wilnianin).
In the interbellum, Poland's road network was dense, but the quality of the roads was very poor: only 7% of all roads were paved and ready for automobile use, and none of the major cities were connected to one another by a good-quality highway. By 1939, Poland had built only one highway, 28 km of straight concrete road connecting the villages of Warlubie and Osiek in mid-northern Poland. It was designed by the Italian engineer Piero Puricelli.
In the mid-1930s, Poland had of roads, but only 58,000 had a hard surface (gravel, cobblestone or sett), and only 2,500 were modern, with an asphalt or concrete surface. In different parts of the country there were sections of paved road that suddenly ended and gave way to dirt roads. The poor condition of the roads was the result of both long-lasting foreign dominance and inadequate funding. On 29 January 1931, the Polish Parliament created the State Road Fund, whose purpose was to collect money for the construction and maintenance of roads. The government drafted a 10-year plan with clear road priorities: a highway from Wilno through Warsaw and Kraków to Zakopane (called the Marshal Piłsudski Highway), asphalt highways from Warsaw to Poznań and Łódź, and a Warsaw ring road. However, the plan turned out to be too ambitious, as the national budget could not pay for it. In January 1938, the Polish Road Congress estimated that Poland would need to spend three times as much on roads to keep up with Western Europe.
In 1939, before the outbreak of the war, LOT Polish Airlines, which was established in 1929, had its hub at Warsaw Okęcie Airport. At that time, LOT maintained several services, both domestic and international. Warsaw had regular domestic connections with Gdynia-Rumia, Danzig-Langfuhr, Katowice-Muchowiec, Kraków-Rakowice-Czyżyny, Lwów-Skniłów, Poznań-Ławica, and Wilno-Porubanek. Furthermore, in cooperation with Air France, LARES, Lufthansa, and Malert, international connections were maintained with Athens, Beirut, Berlin, Bucharest, Budapest, Helsinki, Kaunas, London, Paris, Prague, Riga, Rome, Tallinn, and Zagreb.
Statistically, the majority of citizens lived in the countryside (75% in 1921). Farmers made up 65% of the population. In 1929, agricultural production accounted for 65% of Poland's GNP. After 123 years of partitions, the regions of the country were very unevenly developed. The lands of the former German Empire were the most advanced; in Greater Poland and Pomerelia, crop yields were at Western European levels. The situation was much worse in parts of Congress Poland, the Eastern Borderlands, and former Galicia, where agriculture was most backward and primitive, with a large number of small farms unable to succeed in either the domestic or the international market. Another problem was the overpopulation of the countryside, which resulted in chronic unemployment. Living conditions were so bad in several eastern regions, such as the counties inhabited by the Hutsul minority, that starvation was permanent. Farmers rebelled against the government (see: 1937 peasant strike in Poland), and the situation began to change only in the late 1930s, due to the construction of several factories in the Central Industrial Region, which gave employment to thousands of countryside residents.
Beginning in June 1925, there was a customs war with the revanchist Weimar Republic, which imposed a trade embargo against Poland for nearly a decade, involving tariffs and broad economic restrictions. After 1933 the trade war ended, and new agreements regulated and promoted trade. Germany became Poland's largest trading partner, followed by Britain. In October 1938, Germany granted Poland a credit of Rm 60,000,000 (120,000,000 zloty, or £4,800,000), which was never realized due to the outbreak of war. Germany would deliver factory equipment and machinery in return for Polish timber and agricultural produce. This new trade was to be "in addition" to the existing German-Polish trade agreements.
In 1919, the Polish government introduced compulsory education for all children aged 7 to 14, in an effort to limit illiteracy, which was widespread especially in the former Russian Partition and the Austrian Partition of eastern Poland. In 1921, one-third of citizens of Poland remained illiterate (38% in the countryside). The process was slow, but by 1931, the illiteracy level had dropped to 23% overall (27% in the countryside) and further down to 18% in 1937. By 1939, over 90% of children attended school. In 1932, Minister of Religion and Education Janusz Jędrzejewicz carried out a major reform which introduced two main levels of education: "common school" ("szkoła powszechna"), with three levels – 4 grades + 2 grades + 1 grade; and "middle school" ("szkoła średnia"), with two levels – 4 grades of comprehensive middle school and 2 grades of specified high school (classical, humanistic, natural and mathematical). A graduate of middle school received a "small matura", while a graduate of high school received a "big matura", which enabled them to seek university-level education.
Before 1918, Poland had three universities: Jagiellonian University, University of Warsaw and Lwów University. Catholic University of Lublin was established in 1918; Adam Mickiewicz University, Poznań, in 1919; and finally, in 1922, after the annexation of Republic of Central Lithuania, Wilno University became the Republic's sixth university. There were also three technical colleges: the Warsaw University of Technology, Lwów Polytechnic and the AGH University of Science and Technology in Kraków, established in 1919. Warsaw University of Life Sciences was an agricultural institute. By 1939, there were around 50,000 students enrolled in further education. Women made up 28% of university students, the second highest proportion in Europe.
Polish science in the interbellum was renowned for its mathematicians gathered around the Lwów School of Mathematics, the Kraków School of Mathematics, as well as Warsaw School of Mathematics. There were world-class philosophers in the Lwów–Warsaw school of logic and philosophy. Florian Znaniecki founded Polish sociological studies. Rudolf Weigl invented a vaccine against typhus. Bronisław Malinowski counted among the most important anthropologists of the 20th century.
In Polish literature, the 1920s were marked by the domination of poetry. Polish poets were divided into two groups – the Skamanderites (Jan Lechoń, Julian Tuwim, Antoni Słonimski and Jarosław Iwaszkiewicz) and the Futurists (Anatol Stern, Bruno Jasieński, Aleksander Wat, Julian Przyboś). Apart from well-established novelists (Stefan Żeromski, Władysław Reymont), new names appeared in the interbellum – Zofia Nałkowska, Maria Dąbrowska, Jarosław Iwaszkiewicz, Jan Parandowski, Bruno Schulz, Stanisław Ignacy Witkiewicz and Witold Gombrowicz. Other notable artists included the sculptor Xawery Dunikowski, the painters Julian Fałat, Wojciech Kossak and Jacek Malczewski, the composers Karol Szymanowski and Feliks Nowowiejski, the pianist Artur Rubinstein, and the singer Jan Kiepura.
Theatre was immensely popular in the interbellum, with three main centers in the cities of Warsaw, Wilno and Lwów. Altogether, there were 103 theaters in Poland and a number of other theatrical institutions (including 100 folk theaters). In 1936, various shows were seen by 5 million people, and the main figures of Polish theatre of the time were Juliusz Osterwa, Stefan Jaracz, and Leon Schiller. Also, before the outbreak of the war, there were approximately one million radios in the country (see Radio stations in interwar Poland).
The administrative division of the Republic was based on a three-tier system. On the lowest rung were the "gminy", local town and village governments akin to districts or parishes. These were then grouped together into "powiaty" (akin to counties), which, in turn, were grouped as "województwa" (voivodeships, akin to provinces).
Historically, Poland was a nation of many nationalities. This was especially true after independence was regained in the wake of World War I and the subsequent Polish–Soviet War, which ended with the Peace of Riga. The census of 1921 showed that 30.8% of the population consisted of ethnic minorities, compared with a share of 1.6% (solely identifying with a non-Polish ethnic group) or 3.8% (including those identifying with both the Polish ethnicity and with another ethnic group) in 2011. The first spontaneous flight of about 500,000 Poles from the Soviet Union occurred during the reconstitution of sovereign Poland. In the second wave, between November 1919 and June 1924, some 1,200,000 people left the territory of the USSR for Poland. It is estimated that some 460,000 of them spoke Polish as their first language. According to the 1931 Polish Census, 68.9% of the population was Polish, 13.9% Ukrainian, around 10% Jewish, 3.1% Belarusian, 2.3% German and 2.8% other, including Lithuanian, Czech, Armenian, Russian, and Romani. The situation of minorities was a complex subject and changed during the period.
Poland was also a nation of many religions. In 1921, 16,057,229 Poles (approx. 62.5%) were Roman (Latin) Catholics, 3,031,057 citizens of Poland (approx. 11.8%) were Eastern Rite Catholics (mostly Ukrainian Greek Catholics and Armenian Rite Catholics), 2,815,817 (approx. 10.95%) were Greek Orthodox, 2,771,949 (approx. 10.8%) were Jewish, and 940,232 (approx. 3.7%) were Protestants (mostly Lutheran).
By 1931, Poland had the second largest Jewish population in the world, with one-fifth of all the world's Jews residing within its borders (approx. 3,136,000). The urban population of interbellum Poland rose steadily: in 1921, only 24% of Poles lived in cities, while by the late 1930s that proportion had grown to 30%. In little more than a decade, the population of Warsaw grew by 200,000, Łódź by 150,000, and Poznań by 100,000. This was due not only to internal migration, but also to an extremely high birth rate.
From the 1920s, the Polish government excluded Jews from receiving government bank loans, public sector employment, and business licenses. From the 1930s, measures were taken against Jewish shops, Jewish export firms and Shechita, and limitations were placed on Jewish admission to the medical and legal professions, on Jews in business associations, and on the enrollment of Jews in universities. The far-right National Democracy (Endecja, from the abbreviation "ND") often organized anti-Jewish boycotts. Following the death of Józef Piłsudski in 1935, the Endecja intensified its efforts, which in extreme cases triggered violence in smaller towns across the country. In 1937, the National Democracy movement passed resolutions declaring that "its main aim and duty must be to remove the Jews from all spheres of social, economic, and cultural life in Poland". The government responded by organizing the Camp of National Unity (OZON), which in 1938 took control of the Polish Sejm and subsequently drafted anti-Semitic legislation similar to the anti-Jewish laws in Germany, Hungary, and Romania. OZON advocated mass emigration of Jews from Poland, a numerus clausus (see also Ghetto benches), and other limitations on Jewish rights. According to William W. Hagen, by 1939, prior to the war, Polish Jews were threatened with conditions similar to those in Nazi Germany.
The pre-war government also restricted the rights of people who declared Ukrainian nationality, belonged to the Eastern Orthodox Church and inhabited the Eastern Borderlands of the Second Polish Republic. The Ukrainian language was restricted in every field possible, especially in governmental institutions, and the term "Ruthenian" was enforced in an attempt to ban the use of the term "Ukrainian". Ukrainians were categorized as uneducated, second-class peasants or third-world people, and rarely settled outside the Eastern Borderlands region due to the prevailing Ukrainophobia and the restrictions imposed on them. Numerous attempts at restoring a Ukrainian state were suppressed, and any violence or terrorism initiated by the Organization of Ukrainian Nationalists was emphasized to create an image of a "brutal Eastern savage".
The Second Polish Republic was mainly flat, with an average elevation of above sea level, except for the southernmost Carpathian Mountains (after World War II and its border changes, the average elevation of Poland decreased to ). Only 13% of the territory, along the southern border, was higher than . The highest elevation in the country was Mount Rysy, which rises in the Tatra Range of the Carpathians, approximately south of Kraków. Between October 1938 and September 1939, the highest elevation was Lodowy Szczyt (known in Slovak as "Ľadový štít"), which rises above sea level. The largest lake was Lake Narach.
The country's total area, after the annexation of Zaolzie, was . It extended from north to south and from east to west. On 1 January 1938, the total length of its boundaries was , including: of coastline (of which were accounted for by the Hel Peninsula), of border with the Soviet Union, 948 kilometers with Czechoslovakia (until 1938), with Germany (together with East Prussia), and with other countries (Lithuania, Romania, Latvia, Danzig). Among the major cities of the Second Polish Republic, Kraków had the warmest yearly average temperature ( in 1938) and Wilno the coldest ( in 1938). Extreme geographical points of Poland included the Przeświata River in Somino to the north (located in the Braslaw county of the Wilno Voivodeship); the Manczin River to the south (located in the Kosów county of the Stanisławów Voivodeship); Spasibiorki, near the railway to Połock, to the east (located in the Dzisna county of the Wilno Voivodeship); and Mukocinek, near the Warta River and Meszyn Lake, to the west (located in the Międzychód county of the Poznań Voivodeship).
Almost 75% of the territory of interbellum Poland was drained northward into the Baltic Sea by the Vistula (whose total drainage basin within the boundaries of the Second Polish Republic was ), the Niemen (), the Odra () and the Daugava (). The remaining part of the country was drained southward into the Black Sea by the rivers that feed the Dnieper (the Pripyat, Horyn and Styr, together ), as well as by the Dniester ().
The Second World War in 1939 ended the sovereign Second Polish Republic. The German invasion of Poland began on 1 September 1939, one week after Nazi Germany and the Soviet Union signed the secret Molotov–Ribbentrop Pact. On that day, Germany and Slovakia attacked Poland, and on 17 September the Soviets attacked eastern Poland. Warsaw fell to the Nazis on 28 September after a twenty-day siege. Open organized Polish resistance ended on 6 October 1939 after the Battle of Kock, with Germany and the Soviet Union occupying most of the country. Lithuania annexed the area of Wilno, and Slovakia seized areas along Poland's southern border, including Górna Orawa and Tatranská Javorina, which Poland had annexed from Czechoslovakia in October 1938. Poland did not surrender to the invaders, but continued fighting under the auspices of the Polish government-in-exile and of the Polish Underground State. After the signing of the German–Soviet Treaty of Friendship, Cooperation and Demarcation on 28 September 1939, the Polish areas occupied by Nazi Germany were either directly annexed to the Third Reich or made part of the General Government. The Soviet Union, following the Elections to the People's Assemblies of Western Ukraine and Western Belarus (22 October 1939), annexed eastern Poland partly to the Byelorussian Soviet Socialist Republic and partly to the Ukrainian Soviet Socialist Republic (November 1939).
Polish war plans (Plan West and Plan East) failed as soon as Germany invaded in 1939. Polish losses in combat against the Germans (killed and missing in action) amounted to ca. 70,000 men, and some 420,000 were taken prisoner. Losses against the Red Army (which invaded Poland on 17 September) added up to 6,000 to 7,000 killed and missing, with 250,000 taken prisoner. Although the Polish army, given the inactivity of the Allies, was in an unfavorable position, it managed to inflict serious losses on the invaders: 20,000 German soldiers were killed or went missing, 674 tanks and 319 armored vehicles were destroyed or badly damaged, and 230 aircraft were shot down; the Red Army lost about 2,500 soldiers killed and missing, 150 combat vehicles and 20 aircraft. The Soviet invasion of Poland and the lack of promised aid from the Western Allies contributed to the defeat of the Polish forces by 6 October 1939.
A popular myth holds that Polish cavalry armed with lances charged German tanks during the September 1939 campaign. This often repeated account, first spread by Italian journalists reporting German propaganda, arose from the misreporting of a single clash on 1 September 1939 near Krojanty, during an action by the Polish 18th Lancer Regiment near Chojnice: two squadrons of the 18th Lancers, armed with sabers, surprised and wiped out a German infantry formation with a mounted saber charge. Shortly after midnight, the German 2nd (Motorized) Division was compelled to withdraw by Polish cavalry, before the Poles were caught in the open by German armored cars, which gunned down 20 troopers as the cavalry escaped; it was this incident that gave rise to the myth. Even so, not everyone reexamined their beliefs, and some continued to argue that Polish cavalry had been improperly employed in 1939.
Between 1939 and 1990, the Polish government-in-exile operated in Paris and later in London, presenting itself as the only legal and legitimate representative of the Polish nation. In 1990, the last president in exile, Ryszard Kaczorowski, handed the presidential insignia to the newly elected president, Lech Wałęsa, signifying continuity between the Second and Third republics.
Helen Keller
Helen Adams Keller (June 27, 1880 – June 1, 1968) was an American author, political activist, and lecturer. She was the first deaf-blind person to earn a Bachelor of Arts degree. The story of Keller and her teacher, Anne Sullivan, was made famous by Keller's autobiography, "The Story of My Life", and its adaptations for film and stage, "The Miracle Worker". Her birthplace in West Tuscumbia, Alabama, is now a museum and sponsors an annual "Helen Keller Day". Her June 27 birthday is commemorated as Helen Keller Day in Pennsylvania and, in the centenary year of her birth, was recognized by a presidential proclamation from US President Jimmy Carter.
A prolific author, Keller was well-traveled and outspoken in her convictions. A member of the Socialist Party of America and the Industrial Workers of the World, she campaigned for women's suffrage, labor rights, socialism, antimilitarism, and other similar causes. She was inducted into the Alabama Women's Hall of Fame in 1971 and was one of twelve inaugural inductees to the Alabama Writers Hall of Fame on June 8, 2015.
Helen Adams Keller was born on June 27, 1880 in Tuscumbia, Alabama. Her family lived on a homestead, Ivy Green, that Helen's grandfather had built decades earlier. She had four siblings: two full siblings, Mildred Campbell (Keller) Tyson and Phillip Brooks Keller, and two older half-brothers from her father's prior marriage, James McDonald Keller and William Simpson Keller.
Her father, Arthur Henley Keller (1836–1896), spent many years as an editor of the Tuscumbia "North Alabamian" and had served as a captain in the Confederate Army.
The family was part of the slaveholding elite before the war but lost that status afterward. Her mother, Catherine Everett (Adams) Keller (1856–1921), known as "Kate", was the daughter of Charles W. Adams, a Confederate general. Her father's lineage was traced to Casper Keller, a native of Switzerland, and one of Helen's Swiss ancestors was the first teacher for the deaf in Zurich. Keller reflected on this irony in her first autobiography, stating "that there is no king who has not had a slave among his ancestors, and no slave who has not had a king among his."
At 19 months old, Keller contracted an unknown illness described by doctors as "an acute congestion of the stomach and the brain", which might have been scarlet fever or meningitis. The illness left her both deaf and blind. She lived, as she recalled in her autobiography, "at sea in a dense fog".
At that time, Keller was able to communicate somewhat with Martha Washington, the two-years older daughter of the family cook, who understood her signs; by the age of seven, Keller had more than 60 home signs to communicate with her family, and could distinguish people by the vibration of their footsteps.
She also tyrannized the family's African-American servants.
In 1886, Keller's mother, inspired by an account in Charles Dickens' "American Notes" of the successful education of Laura Bridgman, another deaf and blind woman, sent the young Keller, accompanied by her father, to Baltimore to seek the advice of the physician J. Julian Chisolm, an eye, ear, nose, and throat specialist.
Chisolm referred the Kellers to Alexander Graham Bell, who was working with deaf children at the time. Bell advised them to contact the Perkins Institute for the Blind, the school where Bridgman had been educated, which was then located in South Boston. Michael Anagnos, the school's director, asked a 20-year-old alumna of the school, Anne Sullivan, herself visually impaired, to become Keller's instructor. It was the beginning of a nearly 50-year-long relationship during which Sullivan evolved into Keller's governess and eventually her companion.
Sullivan arrived at Keller's house on March 5, 1887, a day Keller would forever remember as "my soul's birthday." Sullivan immediately began to teach Helen to communicate by spelling words into her hand, beginning with "d-o-l-l" for the doll that she had brought Keller as a present. Keller was frustrated, at first, because she did not understand that every object had a word uniquely identifying it. In fact, when Sullivan was trying to teach Keller the word for "mug", Keller became so frustrated she broke the mug. But soon she began imitating Sullivan's hand gestures. "I did not know that I was spelling a word or even that words existed," Keller remembered. "I was simply making my fingers go in monkey-like imitation."
Keller's breakthrough in communication came the next month when she realized that the motions her teacher was making on the palm of her hand, while running cool water over her other hand, symbolized the idea of "water". Writing in her autobiography, "The Story of My Life," Keller recalled the moment: "I stood still, my whole attention fixed upon the motions of her fingers. Suddenly I felt a misty consciousness as of something forgotten — a thrill of returning thought; and somehow the mystery of language was revealed to me. I knew then that w-a-t-e-r meant the wonderful cool something that was flowing over my hand. The living word awakened my soul, gave it light, hope, set it free!" Keller then nearly exhausted Sullivan, demanding the names of all the other familiar objects in her world.
Although Helen Keller was often viewed as isolated, she was very much in touch with the outside world. She was able to enjoy music by feeling the beat, and she formed strong connections with animals through touch. Though she was delayed in acquiring language, that did not stop her from having a voice.
In May 1888, Keller started attending the Perkins Institute for the Blind. In 1894, Keller and Sullivan moved to New York to attend the Wright-Humason School for the Deaf, and to learn from Sarah Fuller at the Horace Mann School for the Deaf. In 1896, they returned to Massachusetts, and Keller entered The Cambridge School for Young Ladies before gaining admittance, in 1900, to Radcliffe College of Harvard University, where she lived in Briggs Hall, South House. Her admirer, Mark Twain, had introduced her to Standard Oil magnate Henry Huttleston Rogers, who, with his wife Abbie, paid for her education. In 1904, at the age of 24, Keller graduated as a member of Phi Beta Kappa from Radcliffe, becoming the first deaf-blind person to earn a Bachelor of Arts degree. She maintained a correspondence with the Austrian philosopher and pedagogue Wilhelm Jerusalem, who was one of the first to discover her literary talent.
Determined to communicate with others as conventionally as possible, Keller learned to speak and spent much of her life giving speeches and lectures on aspects of her life. She learned to "hear" people's speech by reading their lips with her hands—her sense of touch had heightened. She became proficient at using braille and reading sign language with her hands as well. Shortly before World War I, with the assistance of the Zoellner Quartet, she determined that by placing her fingertips on a resonant tabletop she could experience music played close by.
On January 22, 1916, Keller and Sullivan traveled to the small town of Menomonie in western Wisconsin to deliver a lecture at the Mabel Tainter Memorial Building. Details of her talk were provided in the weekly "Dunn County News" on January 22, 1916:
A message of optimism, of hope, of good cheer, and of loving service was brought to Menomonie Saturday—a message that will linger long with those fortunate enough to have received it. This message came with the visit of Helen Keller and her teacher, Mrs. John Macy, and both had a hand in imparting it Saturday evening to a splendid audience that filled The Memorial. The wonderful girl who has so brilliantly triumphed over the triple afflictions of blindness, dumbness and deafness, gave a talk with her own lips on "Happiness", and it will be remembered always as a piece of inspired teaching by those who heard it.
When part of the account was reprinted in the January 20, 2016 edition of the paper under the heading "From the Files", the column compiler added:
According to those who attended, Helen Keller spoke of the joy that life gave her. She was thankful for the faculties and abilities that she did possess and stated that the most productive pleasures she had were curiosity and imagination. Keller also spoke of the joy of service and the happiness that came from doing things for others ... Keller imparted that "helping your fellow men were one's only excuse for being in this world and in the doing of things to help one's fellows lay the secret of lasting happiness." She also told of the joys of loving work and accomplishment and the happiness of achievement. Although the entire lecture lasted only a little over an hour, the lecture had a profound impact on the audience.
Anne Sullivan stayed as a companion to Helen Keller long after she taught her. Sullivan married John Macy in 1905, and her health started failing around 1914. Polly Thomson (February 20, 1885 – March 21, 1960) was hired to keep house. She was a young woman from Scotland who had no experience with deaf or blind people. She progressed to working as a secretary as well, and eventually became a constant companion to Keller.
Keller moved to Forest Hills, Queens, together with Sullivan and Macy, and used the house as a base for her efforts on behalf of the American Foundation for the Blind. While in her thirties, Keller had a love affair and became secretly engaged, defying her teacher and family by attempting an elopement with the man she loved: Peter Fagan, a young finger-spelling socialist and "Boston Herald" reporter who had been sent to Keller's home to act as her private secretary when her lifelong companion Anne fell ill.
At the time, her father had died and Sullivan was recovering in Lake Placid and Puerto Rico, while Keller had moved with her mother to Montgomery, Alabama.
Anne Sullivan died in 1936, with Keller holding her hand, after falling into a coma as a result of coronary thrombosis. Keller and Thomson moved to Connecticut. They traveled worldwide and raised funds for the blind. Thomson had a stroke in 1957 from which she never fully recovered, and died in 1960. Winnie Corbally, a nurse originally hired to care for Thomson in 1957, stayed on after Thomson's death and was Keller's companion for the rest of her life.
Keller went on to become a world-famous speaker and author, remembered as an advocate for people with disabilities amid numerous other causes. She had a wide impact on the deaf community, traveling to twenty-five different countries to give motivational speeches about the conditions of deaf people. She was a suffragist, a pacifist, a radical socialist, a birth control supporter, and an opponent of Woodrow Wilson. In 1915, she and George A. Kessler founded the Helen Keller International (HKI) organization, which is devoted to research in vision, health, and nutrition.
In 1916, ashamed of the un-Christian treatment of "colored people" in the South, she sent money to the NAACP.
In 1920, she helped to found the American Civil Liberties Union (ACLU). Keller traveled to over 40 countries with Sullivan, making several trips to Japan and becoming a favorite of the Japanese people. Keller met every U.S. president from Grover Cleveland to Lyndon B. Johnson and was friends with many famous figures, including Alexander Graham Bell, Charlie Chaplin and Mark Twain. Keller and Twain were both considered political radicals allied with leftist politics and, as a consequence, their political views have been forgotten or glossed over in American high school textbooks and the popular mind.
Keller was a member of the Socialist Party and actively campaigned and wrote in support of the working class from 1909 to 1921. Many of her speeches and writings were about women's right to vote and the impacts of war; in addition, she supported causes that opposed military intervention. She undertook speech therapy so that her voice could be better understood by the public. When the Rockefeller-owned press refused to print her articles, she protested until her work was finally published. She supported Socialist Party candidate Eugene V. Debs in each of his campaigns for the presidency. Even before reading "Progress and Poverty", Helen Keller was already a socialist who believed that Georgism was a good step in the right direction. She later wrote of finding "in Henry George's philosophy a rare beauty and power of inspiration, and a splendid faith in the essential nobility of human nature".
Keller claimed that newspaper columnists who had praised her courage and intelligence before she expressed her socialist views now called attention to her disabilities. The editor of the "Brooklyn Eagle" wrote that her "mistakes sprung out of the manifest limitations of her development". Keller responded to that editor, referring to having met him before he knew of her political views:
Keller joined the Industrial Workers of the World (the IWW, known as the Wobblies) in 1912, saying that parliamentary socialism was "sinking in the political bog". She wrote for the IWW between 1916 and 1918. In "Why I Became an IWW", Keller explained that her motivation for activism came in part from her concern about blindness and other disabilities:
The last sentence refers to prostitution and syphilis, the former a frequent cause of the latter, and the latter a leading cause of blindness. In the same interview, Keller also cited the 1912 strike of textile workers in Lawrence, Massachusetts for instigating her support of socialism.
Keller supported eugenics. In 1915, she wrote in favor of refusing life-saving medical procedures to infants with severe mental impairments or physical deformities, stating that their lives were not worthwhile and they would likely become criminals. Keller also expressed concerns about human overpopulation.
Keller wrote a total of 12 published books and several articles.
One of her earliest pieces of writing, at age 11, was "The Frost King" (1891). There were allegations that this story had been plagiarized from "The Frost Fairies" by Margaret Canby. An investigation into the matter suggested that Keller may have experienced a case of cryptomnesia: she had had Canby's story read to her but forgotten it, while the memory remained in her subconscious.
At age 22, Keller published her autobiography, "The Story of My Life" (1903), with help from Sullivan and Sullivan's husband, John Macy. It recounts the story of her life up to age 21 and was written during her time in college.
Keller wrote "The World I Live In" in 1908, giving readers an insight into how she felt about the world. "Out of the Dark", a series of essays on socialism, was published in 1913.
When Keller was young, Anne Sullivan introduced her to Phillips Brooks, who introduced her to Christianity; Keller famously said: "I always knew He was there, but I didn't know His name!"
Her spiritual autobiography, "My Religion", was published in 1927 and then in 1994 extensively revised and re-issued under the title "Light in My Darkness". It advocates the teachings of Emanuel Swedenborg, the Christian theologian and mystic who gave a spiritual interpretation of the teachings of the Bible and who claimed that the Second Coming of Jesus Christ had already taken place.
Keller described the core of her belief in these words:
But in Swedenborg's teaching it [Divine Providence] is shown to be the government of God's Love and Wisdom and the creation of uses. Since His Life cannot be less in one being than another, or His Love manifested less fully in one thing than another, His Providence must needs be universal ... He has provided religion of some kind everywhere, and it does not matter to what race or creed anyone belongs if he is faithful to his ideals of right living.
Keller visited 35 countries from 1946 to 1957.
In 1948 she went to New Zealand and visited deaf schools in Christchurch and Auckland. She met Deaf Society of Canterbury Life Member Patty Still in Christchurch.
Keller suffered a series of strokes in 1961 and spent the last years of her life at her home.
On September 14, 1964, President Lyndon B. Johnson awarded her the Presidential Medal of Freedom, one of the United States' two highest civilian honors. In 1965 she was elected to the National Women's Hall of Fame at the New York World's Fair.
Keller devoted much of her later life to raising funds for the American Foundation for the Blind. She died in her sleep on June 1, 1968, at her home, Arcan Ridge, located in Easton, Connecticut, a few weeks short of her eighty-eighth birthday. A service was held in her honor at the Washington National Cathedral in Washington, D.C.; her body was cremated and her ashes were interred there next to those of her constant companions, Anne Sullivan and Polly Thomson.
Keller's life has been interpreted many times. She appeared in a silent film, "Deliverance" (1919), which told her story in a melodramatic, allegorical style.
She was also the subject of the documentaries "Helen Keller in Her Story", narrated by her friend and noted theatrical actress Katharine Cornell, and "The Story of Helen Keller", part of the Famous Americans series produced by Hearst Entertainment.
"The Miracle Worker" is a cycle of dramatic works ultimately derived from her autobiography, "The Story of My Life". The various dramas each describe the relationship between Keller and Sullivan, depicting how the teacher led her from a state of almost feral wildness into education, activism, and intellectual celebrity. The common title of the cycle echoes Mark Twain's description of Sullivan as a "miracle worker". Its first realization was the 1957 "Playhouse 90" teleplay of that title by William Gibson. He adapted it for a Broadway production in 1959 and an Oscar-winning feature film in 1962, starring Anne Bancroft and Patty Duke. It was remade for television in 1979 and 2000.
In 1984, Keller's life story was made into a TV movie called "The Miracle Continues". This film, a semi-sequel to "The Miracle Worker", recounts her college years and her early adult life. None of the early movies hint at the social activism that would become the hallmark of Keller's later life, although a Disney version produced in 2000 states in the credits that she became an activist for social equality.
The Bollywood movie "Black" (2005) was largely based on Keller's story, from her childhood to her graduation.
A documentary called "Shining Soul: Helen Keller's Spiritual Life and Legacy" was produced by the Swedenborg Foundation in the same year. The film focuses on the role played by Emanuel Swedenborg's spiritual theology in her life and how it inspired Keller's triumph over her triple disabilities of blindness, deafness and a severe speech impediment.
On March 6, 2008, the New England Historic Genealogical Society announced that a staff member had discovered a rare 1888 photograph showing Helen and Anne, which, although previously published, had escaped widespread attention. Depicting Helen holding one of her many dolls, it is believed to be the earliest surviving photograph of Anne Sullivan Macy.
Video footage showing Helen Keller learning to mimic speech sounds also exists.
A biography of Helen Keller was written by the German Jewish author Hildegard Johanna Kaeser.
A painting titled "The Advocate: Tribute to Helen Keller" was created by three artists from Kerala, India. The painting was created in association with the non-profit organization Art d'Hope Foundation and the artists' groups Palette People and XakBoX Design & Art Studio. It was created for a fundraising event to help blind students in India and was inaugurated by M. G. Rajamanikyam, IAS (District Collector, Ernakulam) on Helen Keller Day (June 27, 2016). The painting depicts the major events of Helen Keller's life and is one of the biggest paintings based on her life.
A preschool for the deaf and hard of hearing in Mysore, India, was originally named after Helen Keller by its founder, K. K. Srinivasan.
In 1999, Keller was listed in Gallup's Most Widely Admired People of the 20th century.
In 2003, Alabama honored its native daughter on its state quarter. The Alabama state quarter is the only circulating U.S. coin to feature braille.
The Helen Keller Hospital in Sheffield, Alabama, is dedicated to her.
Streets are named after Helen Keller in Zürich, Switzerland, in the US, in Getafe, Spain, in Lod, Israel, in Lisbon, Portugal, and in Caen, France.
In 1973, Helen Keller was inducted into the National Women's Hall of Fame.
A stamp was issued in 1980 by the United States Postal Service depicting Keller and Sullivan, to mark the centennial of Keller's birth.
On October 7, 2009, a bronze statue of Keller was added to the National Statuary Hall Collection, as a replacement for the State of Alabama's former 1908 statue of the education reformer Jabez Lamar Monroe Curry.
Archival material of Helen Keller stored in New York was lost when the Twin Towers were destroyed in the September 11 attacks.
The Helen Keller Archives are owned by the American Foundation for the Blind.
https://en.wikipedia.org/wiki?curid=14254
Haddocks' Eyes
Haddocks' Eyes is a song sung by the White Knight in Lewis Carroll's 1871 novel "Through the Looking-Glass".
"Haddocks' Eyes" is an example used to elaborate on the symbolic status of the concept of "name": a name as identification marker may be assigned to anything, including another name, thus introducing different levels of symbolization. It was discussed in several works on logic and philosophy.
The White Knight explains to Alice a confusing nomenclature for the song.
‘You are sad,’ the Knight said in an anxious tone: ‘let me sing you a song to comfort you.’
‘Is it very long?’ Alice asked, for she had heard a good deal of poetry that day.
‘It’s long,’ said the Knight, ‘but very, VERY beautiful. Everybody that hears me sing it--either it brings the TEARS into their eyes, or else--’
‘Or else what?’ said Alice, for the Knight had made a sudden pause.
‘Or else it doesn’t, you know. The name of the song is called “HADDOCKS’ EYES.”’
‘Oh, that’s the name of the song, is it?’ Alice said, trying to feel interested.
‘No, you don’t understand,’ the Knight said, looking a little vexed. ‘That’s what the name is CALLED. The name really IS “THE AGED AGED MAN.”’
‘Then I ought to have said “That’s what the SONG is called”?’ Alice corrected herself.
‘No, you oughtn’t: that’s quite another thing! The SONG is called “WAYS AND MEANS”: but that’s only what it’s CALLED, you know!’
‘Well, what IS the song, then?’ said Alice, who was by this time completely bewildered.
‘I was coming to that,’ the Knight said. ‘The song really IS “A-SITTING ON A GATE”: and the tune’s my own invention.’
To summarize:
The complicated terminology distinguishes between 'the song', 'what the song is called', 'the name of the song', and 'what the name of the song is called'; the passage thereby both uses and mentions the use–mention distinction.
The White Knight sings the song to a tune he claims as his own invention, but which Alice recognises as "I give thee all, I can no more". By the time Alice heard it, she was already tired of poetry.
The song parodies the plot, but not the style or metre, of "Resolution and Independence" by William Wordsworth.
Like "Jabberwocky," another poem published in "Through the Looking-Glass," "Haddocks' Eyes" appears to have been revised over the course of many years. In 1856, Carroll had published an earlier version of the poem anonymously under the title "Upon the Lonely Moor". It bears an obvious resemblance to "Haddocks' Eyes".
https://en.wikipedia.org/wiki?curid=14257
Hoosier
Hoosier is the official demonym for a resident of the U.S. state of Indiana. The origin of the term remains a matter of debate within the state, but "Hoosier" was in general use by the 1840s, having been popularized by Richmond resident John Finley's 1833 poem "The Hoosier's Nest". Anyone born in Indiana, or resident there at the time, is considered to be a Hoosier. Indiana adopted the nickname "The Hoosier State" more than 150 years ago.
"Hoosier" is used in the names of numerous Indiana-based businesses and organizations. "Hoosiers" is also the name of the Indiana University athletic teams and seven active and one disbanded athletic conferences in the Indiana High School Athletic Association have the word "Hoosier" in their name. As there is no accepted embodiment of a Hoosier, the IU schools are represented through their letters and colors alone. In addition to general acceptance by residents of Indiana, the term is also the official demonym according to the U.S. Government Publishing Office.
In addition to "The Hoosier's Nest", the term also appeared in the "Indianapolis Journal"'s "Carrier's Address" on January 1, 1833. There are many suggestions for the derivation of the word but none is universally accepted. In 1833 the Pittsburgh Statesman said the term had been in use for "some time past" and suggested it originated from census workers calling "Who's here?". Also in 1833, former Indiana Governor J.B. Ray began publishing a newspaper titled "The Hoosier".
In 1900, Meredith Nicholson wrote "The Hoosiers", an early attempt to study the etymology of the word as applied to Indiana residents. Jacob Piatt Dunn, longtime secretary of the Indiana Historical Society, published "The Word Hoosier", a similar attempt, in 1907. Both chronicled some of the popular and satirical etymologies circulating at the time and focused much of their attention on the use of the word in the Upland South to refer to woodsmen, yokels, and rough people. Dunn traced the word back to the Cumbrian "hoozer", meaning anything unusually large, derived from the Old English "hoo" (as at Sutton Hoo), meaning "high" and "hill". The importance of immigrants from northern England and southern Scotland was reflected in numerous placenames including the Cumberland Mountains, the Cumberland River, and the Cumberland Gap. Nicholson defended the people of Indiana against such an association, while Dunn concluded that the early settlers had adopted the nickname self-mockingly and that it had lost its negative associations by the time of Finley's poem.
Johnathan Clark Smith subsequently showed that Nicholson and Dunn's earliest sources within Indiana were mistaken. A letter by James Curtis cited by Dunn and others as the earliest known use of the term was actually written in 1846, not 1826. Similarly, the use of the term in an 1859 newspaper item quoting an 1827 diary entry by Sandford Cox was more likely an editorial comment and not from the original diary. Smith's earliest sources led him to argue that the word originated as a term along the Ohio River for flatboatmen from Indiana and did not acquire its pejorative meanings until 1836, "after" Finley's poem.
William Piersen, a history professor at Fisk University, argued for a connection to the Methodist minister Rev. Harry Hosier (–May 1806), who evangelized the American frontier at the beginning of the 19th century as part of the Second Great Awakening. "Black Harry" had been born a slave in North Carolina and sold north to Baltimore, Maryland, before gaining his freedom and beginning his ministry around the end of the American Revolution. He was a close associate and personal friend of Bishop Francis Asbury, the "Father of the American Methodist Church". Benjamin Rush said of him that "making allowances for his illiteracy, he was the greatest orator in America". His sermons called on Methodists to reject slavery and to champion the common working man. Piersen proposed that Methodist communities inspired by his example took or were given a variant spelling of his name (possibly influenced by the "yokel" slang) during the decades after his ministry.
According to Washington County newspaper reports of the time, Abraham Stover was Colonel of the Indiana Militia. He was a colorful figure in early Washington County history. Along with his son-in-law, John B. Brough, he was considered one of the two strongest men in Washington County. He was always being challenged to prove his might, and seems to have won several fights over men half his age. After whipping six or eight men in a fist fight in Louisville, Kentucky, he cracked his fists and said, "Ain't I a husher", which was changed in the news to "Hoosier", and thus originated the name of Hoosier in connection with Indiana men.
Jorge Santander Serrano, a PhD student from Indiana University, has also suggested that "Hoosier" might come from the French words for 'redness' or 'red-faced'. According to this hypothesis, the early pejorative use of the word "Hoosier" may have a link to the color red ("rouge" in French), which is associated with indigenous peoples, pejoratively called "red men" or "red-skins", and also with poor white people by calling them "red-necks".
Humorous folk etymologies for the term "hoosier" have a long history, as recounted by Dunn in "The Word Hoosier".
One account traces the word to the necessary caution of approaching houses on the frontier. In order to avoid being shot, a traveler would call out from afar to let themselves be known. The inhabitants of the cabin would then reply "Who's here?" which in the Appalachian English of the early settlers slurred into "Who'sh 'ere?" and thence into "Hoosier?" A variant of this account had the Indiana pioneers calling out "Who'sh 'ere?" as a general greeting and warning when hearing someone in the bushes and tall grass, to avoid shooting a relative or friend in error.
The poet James Whitcomb Riley facetiously suggested that the fierce brawling that took place in Indiana involved enough biting that the expression "Whose ear?" became notable. This arose from or inspired the story of two 19th-century French immigrants brawling in a tavern in the foothills of southern Indiana. One was cut and a third Frenchman walked in to see an ear on the dirt floor of the tavern, prompting him to slur out "Whosh ear?"
Two related stories trace the origin of the term to gangs of workers from Indiana under the direction of a Mr. Hoosier.
The account related by Dunn is that a Louisville contractor named Samuel Hoosier preferred to hire workers from communities on the Indiana side of the Ohio River like New Albany rather than Kentuckians. During the excavation of the first canal around the Falls of the Ohio from 1826 to 1833, his employees became known as "Hoosier's men" and then simply "Hoosiers". The usage spread from these hard-working laborers to all of the Indiana boatmen in the area and then spread north with the settlement of the state. The story was told to Dunn in 1901 by a man who had heard it from a Hoosier relative while traveling in southern Tennessee. Dunn could not find any family of the given name in any directory in the region or anyone else in southern Tennessee who had heard the story and accounted himself dubious. This version was subsequently retold by Gov. Evan Bayh and Sen. Vance Hartke, who introduced the story into the "Congressional Record" in 1975, and matches the timing and location of Smith's subsequent research. However, the U.S. Army Corps of Engineers has been unable to find any record of a Hoosier or Hosier in surviving canal company records.
The word "hoosier" is still used in Greater St. Louis to denote a "yokel" or "white trash". It is also encountered in sea shanties: Stan Hugill's book "Shanties from the Seven Seas" records its former use to denote cotton-stowers, who would move bales of cotton to and from the holds of ships and force them in tightly by means of jackscrews. "To hoosier" is sometimes still encountered as a verb meaning "to trick" or "to swindle".
A Hoosier cabinet, often shortened to "hoosier", is a type of free-standing kitchen cabinet popular in the early decades of the twentieth century. Almost all of these cabinets were produced by companies located in Indiana and the name derives from the largest of them, the Hoosier Manufacturing Co. of New Castle, Indiana. Other Indiana businesses include Hoosier Racing Tire and the Hoosier Bat Company, manufacturer of wooden baseball bats.
The word may have originated from the term for shooing a pig away from its mate at feeding time.
The RCA Dome, former home of the Indianapolis Colts, was known as the "Hoosier Dome" before RCA purchased the naming rights in 1994. The RCA Dome was replaced by Lucas Oil Stadium in 2008.
https://en.wikipedia.org/wiki?curid=14260
Horner's method
In mathematics, the term Horner's rule (or Horner's method, Horner's scheme, etc.) refers to a method for approximating the roots of polynomials that was described by William George Horner in 1819. It gave a convenient way of implementing the Newton–Raphson method for polynomials that was suitable for efficient hand calculation, and it was widely used until computers came into general use in about 1970.
One aspect of this method was the use of synthetic division (aka Ruffini's rule) for implementing polynomial long division, and some elementary level texts use Horner's method as yet another name for this ancient trick which Horner ascribed to Joseph-Louis Lagrange but which can be traced back many hundreds of years to Chinese and Persian mathematicians.
After the introduction of computers Horner's root-finding method went out of use and as a result the term Horner's method (rule etc) has often been used to mean just the algorithm for polynomial evaluation that underlies synthetic division.
The key idea behind this is the identity:
a_0 + a_1x + a_2x^2 + a_3x^3 + ... + a_nx^n = a_0 + x(a_1 + x(a_2 + x(a_3 + ... + x(a_(n-1) + x·a_n)...))).
This allows evaluation of a polynomial of degree "n" with only "n" multiplications and "n" additions. This is optimal, since there are polynomials of degree "n" that cannot be evaluated with fewer arithmetic operations.
Given the polynomial
p(x) = a_0 + a_1x + a_2x^2 + ... + a_nx^n,
where a_0, ..., a_n are constant coefficients, the problem is to evaluate the polynomial at a specific value x_0 of x.
For this, a new sequence of constants is defined recursively as follows:
b_n = a_n
b_(n-1) = a_(n-1) + b_n·x_0
...
b_1 = a_1 + b_2·x_0
b_0 = a_0 + b_1·x_0
Then b_0 is the value of p(x_0).
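The b-recurrence maps directly to a short loop. A minimal Python sketch (the function name is our own), using the worked example from the section below:

```python
def horner(coeffs, x0):
    """Evaluate a polynomial at x0 by the b-recurrence.

    `coeffs` lists a_n, a_{n-1}, ..., a_0 (highest power first).
    """
    b = 0
    for a in coeffs:
        b = b * x0 + a  # b_i = a_i + b_{i+1} * x0
    return b

# 2x^3 - 6x^2 + 2x - 1 evaluated at x0 = 3
print(horner([2, -6, 2, -1], 3))  # -> 5
```

Each loop iteration performs exactly one multiplication and one addition, matching the optimal operation count stated above.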
To see why this works, the polynomial can be written in the form
p(x) = a_0 + x(a_1 + x(a_2 + ... + x(a_(n-1) + a_n·x)...)).
Thus, by iteratively substituting the b_i into the expression,
p(x_0) = a_0 + x_0(a_1 + x_0(a_2 + ... + x_0(a_(n-1) + a_n·x_0)...)) = a_0 + x_0·b_1 = b_0.
Now, it can be proven that
p(x) = (b_1 + b_2x + b_3x^2 + ... + b_nx^(n-1))(x − x_0) + b_0.
This expression constitutes Horner's practical application, as it offers a very quick way of determining the outcome of
p(x) / (x − x_0),
with b_0 (which is equal to p(x_0)) being the division's remainder, as is demonstrated by the examples below. If x_0 is a root of p(x), then b_0 = 0 (meaning the remainder is 0), so p(x) can be factored as (x − x_0) times the quotient.
As for finding the consecutive b-values, one starts by determining b_n, which is simply equal to a_n. One then works down to the other b's, using the formula
b_i = a_i + b_(i+1)·x_0,
until arriving at b_0.
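Because the b-values are exactly the synthetic-division entries, the same loop can also return the quotient's coefficients alongside the remainder. A Python sketch (names ours):

```python
def synthetic_division(coeffs, x0):
    """Divide the polynomial (coefficients highest-degree first) by (x - x0).

    Returns (quotient_coeffs, remainder); the remainder equals p(x0).
    """
    rows = []
    b = 0
    for a in coeffs:
        b = b * x0 + a  # the b-recurrence
        rows.append(b)
    # b_n .. b_1 are the quotient's coefficients; b_0 is the remainder
    return rows[:-1], rows[-1]

q, r = synthetic_division([2, -6, 2, -1], 3)
print(q, r)  # quotient 2x^2 + 0x + 2, remainder 5
```

This reproduces by computer the third row of the hand tableau described in the examples below.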
Evaluate f(x) = 2x^3 − 6x^2 + 2x − 1 for x = 3.
We use synthetic division as follows:
 x_0 = 3 │  2   −6    2   −1
         │       6    0    6
         └──────────────────────
            2    0    2    5
The entries in the third row are the sum of those in the first two. Each entry in the second row is the product of the "x"-value (3 in this example) with the third-row entry immediately to the left. The entries in the first row are the coefficients of the polynomial to be evaluated. Then the remainder of formula_19 on division by formula_20 is 5.
But by the polynomial remainder theorem, we know that the remainder is f(3). Thus f(3) = 5.
In this example, if a_3 = 2, a_2 = −6, a_1 = 2, a_0 = −1, we can see that b_3 = 2, b_2 = 0, b_1 = 2, b_0 = 5, the entries in the third row. So, synthetic division is based on Horner's method.
As a consequence of the polynomial remainder theorem, the first three entries in the third row are the coefficients of the second-degree polynomial 2x^2 + 0x + 2, the quotient of f(x) on division by (x − 3).
The remainder is 5. This makes Horner's method useful for polynomial long division.
Divide x^3 − 6x^2 + 11x − 6 by x − 2:
 x_0 = 2 │  1   −6   11   −6
         │       2   −8    6
         └──────────────────────
            1   −4    3    0
The quotient is x^2 − 4x + 3 and the remainder is 0.
Let f_1(x) = 4x^4 − 6x^3 + 3x − 5 and f_2(x) = 2x − 1. Divide f_1(x) by f_2(x) using Horner's method.
  2  │  4   −6    0    3   −5
     │       2   −2   −1    1
     └────────────────────────────
        2   −2   −1    1   −4
The third row is the sum of the first two rows, divided by 2 (the final entry, the remainder, is left unhalved). Each entry in the second row is the product of 1 with the third-row entry to the left. The answer is
f_1(x) / f_2(x) = 2x^3 − 2x^2 − x + 1 − 4 / (2x − 1).
Evaluation using the monomial form of a degree-"n" polynomial requires at most "n" additions and ("n"^2 + "n")/2 multiplications, if powers are calculated by repeated multiplication and each monomial is evaluated individually. (This can be reduced to "n" additions and 2"n" − 1 multiplications by evaluating the powers of "x" iteratively.) If numerical data are represented in terms of digits (or bits), then the naive algorithm also entails storing approximately 2"n" times the number of bits of "x" (the evaluated polynomial has approximate magnitude "x"^"n", and one must also store "x"^"n" itself). By contrast, Horner's method requires only "n" additions and "n" multiplications, and its storage requirements are only "n" times the number of bits of "x". Alternatively, Horner's method can be computed with "n" fused multiply–adds. Horner's method can also be extended to evaluate the first "k" derivatives of the polynomial with "kn" additions and multiplications.
Horner's method is optimal, in the sense that any algorithm to evaluate an arbitrary polynomial must use at least as many operations. Alexander Ostrowski proved in 1954 that the number of additions required is minimal. Victor Pan proved in 1966 that the number of multiplications is minimal. However, when "x" is a matrix, Horner's method is not optimal.
This assumes that the polynomial is evaluated in monomial form and no preconditioning of the representation is allowed, which makes sense if the polynomial is evaluated only once. However, if preconditioning is allowed and the polynomial is to be evaluated many times, then faster algorithms are possible. They involve a transformation of the representation of the polynomial. In general, a degree-"n" polynomial can be evaluated using only ⌊"n"/2⌋ + 2 multiplications and "n" additions.
A disadvantage of Horner's rule is that all of the operations are sequentially dependent, so it is not possible to take advantage of instruction level parallelism on modern computers. In most applications where the efficiency of polynomial evaluation matters, many low-order polynomials are evaluated simultaneously (for each pixel or polygon in computer graphics, or for each grid square in a numerical simulation), so it is not necessary to find parallelism within a single polynomial evaluation.
If, however, one is evaluating a single polynomial of very high order, it may be useful to break it up as follows:
More generally, the summation can be broken into "k" parts:
where the inner summations may be evaluated using separate parallel instances of Horner's method. This requires slightly more operations than the basic Horner's method, but allows "k"-way SIMD execution of most of them.
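A 2-way ("k" = 2) instance of this split can be sketched in Python (names ours): the polynomial is rewritten as p_even(x^2) + x·p_odd(x^2), and the two inner Horner evaluations are independent of each other, so a SIMD or superscalar machine can run them concurrently.

```python
def horner(coeffs, x):
    """Plain Horner evaluation; coeffs are a_n, ..., a_0."""
    b = 0
    for a in coeffs:
        b = b * x + a
    return b

def horner_2way(coeffs, x):
    """Evaluate p(x) as p_even(x^2) + x * p_odd(x^2).

    The two inner evaluations share no data, so they may proceed
    in parallel on hardware with sufficient issue width.
    """
    low_first = coeffs[::-1]          # a_0, a_1, ..., a_n
    even = low_first[0::2][::-1]      # coefficients of 1, x^2, x^4, ...
    odd = low_first[1::2][::-1]       # coefficients of x, x^3, x^5, ...
    x2 = x * x
    return horner(even, x2) + x * horner(odd, x2)
```

The split costs one extra multiplication (forming x^2) but roughly halves the length of each dependent chain.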
Horner's method is a fast, code-efficient method for multiplication and division of binary numbers on a microcontroller with no hardware multiplier. One of the binary numbers to be multiplied is represented as a trivial polynomial, where (using the above notation) the a_i are the bits of the number and x = 2. Then, "x" (or "x" to some power) is repeatedly factored out. In this binary numeral system (base 2), x = 2, so powers of 2 are repeatedly factored out.
For example, to find the product of two numbers (0.15625) and "m":
(0.15625)m = (0.00101_b)m = (2^−3 + 2^−5)m = 2^−3(m + 2^−2·m).
To find the product of two binary numbers "d" and "m", the same factoring applies. In general, for a binary number with bit values (d_3 d_2 d_1 d_0) the product is
(d_3·2^3 + d_2·2^2 + d_1·2^1 + d_0·2^0)·m = d_3·2^3·m + d_2·2^2·m + d_1·2^1·m + d_0·2^0·m.
At this stage in the algorithm, terms with zero-valued coefficients are dropped, so that only binary coefficients equal to one are counted; thus the problem of multiplication or division by zero is not an issue, despite the implication in the factored equation. The denominators all equal one (or the term is absent), so this reduces to
d_3·2^3·m + d_2·2^2·m + d_1·2^1·m + d_0·2^0·m,
or equivalently (as consistent with the "method" described above)
((d_3·m·2 + d_2·m)·2 + d_1·m)·2 + d_0·m.
In binary (base-2) math, multiplication by a power of 2 is merely a register shift operation. Thus, multiplying by 2 is calculated in base-2 by an arithmetic shift. The factor 2^−1 is a right arithmetic shift, a factor of 2^0 results in no operation (since 2^0 = 1 is the multiplicative identity element), and a factor of 2^1 results in a left arithmetic shift.
The multiplication product can now be quickly calculated using only arithmetic shift operations, addition and subtraction.
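The shift-and-add loop above can be sketched in Python (names ours): the arbitrary-precision `int` and `<<` stand in for the machine register and the left arithmetic shift, and "d" is assumed non-negative.

```python
def shift_add_multiply(d, m):
    """Multiply non-negative integer d by m using only shifts and adds.

    Scans the bits of d from most to least significant, which is
    exactly Horner's recurrence with x0 = 2 and coefficients d_i * m.
    """
    product = 0
    for bit in bin(d)[2:]:   # most significant bit first
        product <<= 1        # multiply accumulator by the base, 2
        if bit == '1':
            product += m     # add m when the coefficient d_i is 1
    return product

print(shift_add_multiply(13, 11))  # 1101_b * 11 -> 143
```

Zero-valued bits cost only a shift, matching the remark above that zero coefficients drop out of the computation.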
The method is particularly fast on processors supporting a single-instruction shift-and-addition-accumulate. Compared to a C floating-point library, Horner's method sacrifices some accuracy; however, it is nominally 13 times faster (16 times faster when the "canonical signed digit" (CSD) form is used) and uses only 20% of the code space.
Horner's method can be used to convert between different positional numeral systems – in which case "x" is the base of the number system, and the "a""i" coefficients are the digits of the base-"x" representation of a given number – and can also be used if "x" is a matrix, in which case the gain in computational efficiency is even greater. In fact, when "x" is a matrix, further acceleration is possible which exploits the structure of matrix multiplication, and only O(√"n") instead of "n" multiplies are needed (at the expense of requiring more storage) using the 1973 method of Paterson and Stockmeyer.
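The base-conversion use is the same loop with x_0 set to the base. A minimal Python sketch (name ours), interpreting a digit list as an integer:

```python
def digits_to_int(digits, base):
    """Interpret `digits` (most significant first) in the given base.

    This is Horner's b-recurrence with x0 = base and the digits as
    the polynomial's coefficients.
    """
    value = 0
    for d in digits:
        value = value * base + d
    return value

print(digits_to_int([1, 1, 0, 1], 2))  # -> 13
print(digits_to_int([2, 5, 5], 10))    # -> 255
```

Python's own `int(s, base)` does the same job for strings; the loop is shown only to make the Horner structure explicit.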
Using the long division algorithm in combination with Newton's method, it is possible to approximate the real roots of a polynomial. The algorithm works as follows. Given a polynomial formula_47 of degree formula_2 with zeros formula_49, make some initial guess formula_50 such that formula_51. Now iterate the following two steps:
These two steps are repeated until all real zeros are found for the polynomial. If the approximated zeros are not precise enough, the obtained values can be used as initial guesses for Newton's method but using the full polynomial rather than the reduced polynomials.
Consider the polynomial
which can be expanded to
From the above we know that the largest root of this polynomial is 7, so we are able to make an initial guess of 8. Using Newton's method, the first zero, 7, is found as shown in black in the figure to the right. Next, formula_61 is divided by formula_62 to obtain
which is drawn in red in the figure to the right. Newton's method is used to find the largest zero of this polynomial, with an initial guess of 7. The largest zero of this polynomial, which corresponds to the second-largest zero of the original polynomial, is found at 3 and is circled in red. The degree-5 polynomial is now divided by formula_64 to obtain
which is shown in yellow. The zero for this polynomial is found at 2 again using Newton's method and is circled in yellow. Horner's method is now used to obtain
which is shown in green and found to have a zero at −3. This polynomial is further reduced to
which is shown in blue and yields a zero of −5. The final root of the original polynomial may be found by either using the final zero as an initial guess for Newton's method, or by reducing formula_68 and solving the linear equation. As can be seen, the expected roots of −8, −5, −3, 2, 3, and 7 were found.
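The procedure worked through above can be condensed into a short Python sketch (an illustrative implementation; the function names are mine, and it assumes, as in this example, that every root is real and simple):

```python
def horner(coeffs, x):
    """Evaluate p(x) by Horner's rule (coefficients listed from the highest
    degree down); also return the quotient of the division of p(t) by (t - x)."""
    q = []
    b = 0
    for a in coeffs:
        b = b * x + a
        q.append(b)
    return q[-1], q[:-1]        # remainder p(x), synthetic-division quotient

def derivative(coeffs):
    """Coefficients of p'(t), highest degree first."""
    n = len(coeffs) - 1
    return [c * (n - i) for i, c in enumerate(coeffs[:-1])]

def newton_deflate(coeffs, x0, iters=80):
    """Find the real roots largest-first: Newton's method locates each root,
    then Horner division deflates the polynomial by (t - root)."""
    roots = []
    while len(coeffs) > 1:
        x, dcoeffs = x0, derivative(coeffs)
        for _ in range(iters):
            p, _ = horner(coeffs, x)
            dp, _ = horner(dcoeffs, x)
            x -= p / dp
        roots.append(x)
        _, coeffs = horner(coeffs, x)   # divide out the factor (t - x)
        x0 = x                          # the found root seeds the next search
    return roots
```

Starting to the right of the largest root, each Newton run converges down to the next root, exactly as in the figure described above.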
Horner's method can be modified to compute the divided difference formula_69 Given the polynomial (as before)
proceed as follows
At completion, we have
This computation of the divided difference is subject to less
round-off error than evaluating formula_61 and formula_74 separately, particularly when
formula_75. Substituting
formula_76 in this method gives formula_77, the derivative of formula_61.
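The recurrence just described can be rendered in Python (an illustrative sketch; the function name is mine): the two Horner accumulations, one for the value at "x" and one for the divided difference, are interleaved in a single pass.

```python
def horner_divided_difference(coeffs, x, y):
    """One combined Horner pass (coefficients highest degree first,
    degree >= 1) returning p(x) and the divided difference
    (p(y) - p(x)) / (y - x). With y == x the second value is p'(x)."""
    b = coeffs[0]        # Horner accumulation of p at x
    d = b                # accumulation of the divided difference
    for a in coeffs[1:-1]:
        b = b * x + a
        d = d * y + b
    return b * x + coeffs[-1], d
```

For p(t) = t^2 with x = 1, y = 2 this returns p(1) = 1 and (4 − 1)/(2 − 1) = 3 without ever forming the difference quotient explicitly, which is what avoids the round-off loss when y is close to x.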
Horner's paper, titled "A new method of solving numerical equations of all orders, by continuous approximation", was read before the Royal Society of London at its meeting on July 1, 1819, with Davies Gilbert, Vice-President and Treasurer, in the chair; this was the final meeting of the session before the Society adjourned for its summer recess. When a sequel was read before the Society in 1823, it was again at the final meeting of the session. On both occasions, papers by James Ivory, FRS, were also read. In 1819, it was Horner's paper that got through to publication in the "Philosophical Transactions" later in the year, Ivory's paper falling by the way, despite Ivory being a Fellow; in 1823, when a total of ten papers were read, fortunes as regards publication were reversed. Gilbert, who had strong connections with the West of England and may have had social contact with Horner, resident as Horner was in Bristol and Bath, published his own survey of Horner-type methods earlier in 1823.
Horner's paper in Part II of "Philosophical Transactions of the Royal Society of London" for 1819 was warmly and expansively welcomed by a reviewer in the issue of "The Monthly Review: or, Literary Journal" for April, 1820; in comparison, a technical paper by Charles Babbage is dismissed curtly in this review. However, the reviewer noted that another, similar method had also recently been published by the architect and mathematical expositor, Peter Nicholson. This theme is developed in a further review of some of Nicholson's books in the issue of "The Monthly Review" for December, 1820, which in turn ends with notice of the appearance of a booklet by Theophilus Holdred, from whom Nicholson acknowledges he obtained the gist of his approach in the first place, although claiming to have improved upon it. The sequence of reviews is concluded in the issue of "The Monthly Review" for September, 1821, with the reviewer concluding that whereas Holdred was the first person to discover a direct and general practical solution of numerical equations, he had not reduced it to its simplest form by the time of Horner's publication, and saying that had Holdred published forty years earlier when he first discovered his method, his contribution could be more easily recognized. The reviewer is exceptionally well-informed, even having cited Horner's preparatory correspondence with Peter Barlow in 1818, seeking the work of Budan. The Bodleian Library, Oxford, has the Editor's annotated copy of "The Monthly Review" from which it is clear that the most active reviewer in mathematics in 1814 and 1815 (the last years for which this information has been published) was none other than Peter Barlow, one of the foremost specialists on approximation theory of the period, suggesting that it was Barlow who wrote this sequence of reviews.
As it also happened, Henry Atkinson, of Newcastle, devised a similar approximation scheme in 1809; he had consulted his fellow Geordie, Charles Hutton, another specialist and a senior colleague of Barlow at the Royal Military Academy, Woolwich, only to be advised that, while his work was publishable, it was unlikely to have much impact. J. R. Young, writing in the mid-1830s, concluded that Holdred's first method replicated Atkinson's, while his improved method was only added to Holdred's booklet some months after its first appearance in 1820, when Horner's paper was already in circulation.
The feature of Horner's writing that most distinguishes it from his English contemporaries is the way he draws on the Continental literature, notably the work of Arbogast. The advocacy, as well as the detraction, of Horner's Method has this as an unspoken subtext. Quite how he gained that familiarity has not been determined. Horner is known to have made a close reading of John Bonnycastle's book on algebra. Bonnycastle recognizes that Arbogast has the general, combinatorial expression for the reversion of series, a project going back at least to Newton. But Bonnycastle's main purpose in mentioning Arbogast is not to praise him, but to observe that Arbogast's notation is incompatible with the approach he adopts. The gap in Horner's reading was the work of Paolo Ruffini, except that, as far as awareness of Ruffini goes, citations of Ruffini's work by authors, including medical authors, in "Philosophical Transactions" speak volumes, as there are none: Ruffini's name only appears in 1814, recording a work he donated to the Royal Society. Ruffini might have done better if his work had appeared in French, as had Malfatti's Problem in the reformulation of Joseph Diez Gergonne, or had he written in French, as had Cagnoli, a source quoted by Bonnycastle on series reversion. (Today, Cagnoli is in the Italian Wikipedia, as shown, but has yet to make it into either French or English.)
Fuller showed that the method in Horner's 1819 paper differs from what afterwards became known as "Horner's method" and that in consequence the priority for this method should go to Holdred (1820). This view may be compared with the remarks concerning the works of Horner and Holdred in the previous paragraph. Fuller also takes aim at Augustus De Morgan. Precocious though Augustus De Morgan was, he was not the reviewer for "The Monthly Review", while several others — Thomas Stephens Davies, J. R. Young, Stephen Fenwick, T. T. Wilkinson — wrote Horner firmly into their records, not least Horner, as he published extensively up until the year of his death in 1837. His paper in 1819 was one that would have been difficult to miss. In contrast, the only other mathematical sighting of Holdred is a single named contribution to "The Gentleman's Mathematical Companion", an answer to a problem.
It is questionable to what extent it was De Morgan's advocacy of Horner's priority in discovery that led to "Horner's method" being so called in textbooks, but it is true that those suggesting this tend to know of Horner largely through intermediaries, of whom De Morgan made himself a prime example. However, this method "qua" method was known long before Horner. In reverse chronological order, Horner's method was already known to:
However, this observation on its own masks significant differences in conception and also, as noted with Ruffini's work, issues of accessibility.
Qin Jiushao, in his "Shu Shu Jiu Zhang" ("Mathematical Treatise in Nine Sections"; 1247), presents a portfolio of methods of Horner-type for solving polynomial equations, which was based on earlier works of the 11th century Song dynasty mathematician Jia Xian; for example, one method is specifically suited to bi-quintics, of which Qin gives an instance, in keeping with the then Chinese custom of case studies. The first person writing in English to note the connection with Horner's method was Alexander Wylie, writing in "The North China Herald" in 1852; perhaps conflating and misconstruing different Chinese phrases, Wylie calls the method "Harmoniously Alternating Evolution" (which does not agree with his Chinese, "linglong kaifang", not that at that date he uses pinyin), working the case of one of Qin's quartics and giving, for comparison, the working with Horner's method. Yoshio Mikami in "Development of Mathematics in China and Japan" published in Leipzig in 1913, gave a detailed description of Qin's method, using the quartic illustrated to the above right in a worked example; he wrote:
However, as Mikami is also aware, it was "not altogether impossible" that a related work, "Si Yuan Yu Jian" ("Jade Mirror of the Four Unknowns"; 1303) by Zhu Shijie might make the shorter journey across to Japan, but seemingly it never did, although another work of Zhu, "Suan Xue Qi Meng", had a seminal influence on the development of traditional mathematics in the Edo period, starting in the mid-1600s. Ulrich Libbrecht (at the time teaching in school, but subsequently a professor of comparative philosophy) gave a detailed description in his doctoral thesis of Qin's method; he concluded: "It is obvious that this procedure is a Chinese invention ... the method was not known in India". He said Fibonacci probably learned of it from Arabs, who perhaps borrowed it from the Chinese. Here, the problem is that there is no more evidence for this speculation than there is of the method being known in India. Of course, the extraction of square and cube roots along similar lines is already discussed by Liu Hui in connection with Problems IV.16 and 22 in "Jiu Zhang Suan Shu", while Wang Xiaotong in the 7th century supposes his readers can solve cubics by an approximation method described in his book Jigu Suanjing.
Hapworth 16, 1924
"Hapworth 16, 1924" was the last original work J. D. Salinger published in his lifetime. It appeared in the June 19, 1965, edition of "The New Yorker", infamously taking up almost the entire magazine. It is the "youngest" of Salinger's Glass family stories, in the sense that the narrated events happen chronologically before those in the rest of the series.
Both contemporary and later literary critics harshly panned "Hapworth 16, 1924"; writing in "The New York Times", Michiko Kakutani called it "a sour, implausible and, sad to say, completely charmless story ... filled with digressions, narcissistic asides and ridiculous shaggy-dog circumlocutions." Even kind critics have regarded the work as "a long-winded sob story" that many have found "simply unreadable", and it has been speculated that this response was the reason Salinger decided to quit publishing. But Salinger is also said to have considered the story a "high point of his writing" and made tentative steps to have it reprinted, though those came to nothing.
The story is presented in the form of a letter from camp written by a seven-year-old Seymour Glass (the main character of "A Perfect Day for Bananafish"). In this respect, the premise is identical to that of Salinger's previous unpublished story "The Ocean Full of Bowling Balls", written 18 years earlier in 1947. In the course of requesting an inordinate quantity of reading matter from home, Seymour predicts his brother's success as a writer as well as his own death and offers critical assessments of a number of major writers.
After the story's appearance in "The New Yorker", Salinger—who had already withdrawn to his home in New Hampshire—stopped publishing altogether. Since the story never appeared in book form, readers had to seek out that issue or find it on microfilm. Finally, with the release of "The Complete New Yorker" on DVD in 2005, the story was once again widely available.
In 1996, Orchises Press, a small Virginia publishing house, started a process of publishing "Hapworth" in book form. Orchises Press owner Roger Lathbury has described the effort in "The Washington Post" and, three months after Salinger's death, in "New York" magazine. According to Lathbury, Salinger was deeply concerned with the proposed book's appearance, even visiting Washington to examine the cloth for the binding. Salinger also sent Lathbury numerous "infectious and delightful and loving" letters.
Following publishing norms, Lathbury applied for Library of Congress Cataloging in Publication data, unaware of how publicly available the information would be. A writer in Seattle, researching an article on Jeff Bezos, came across the "Hapworth" publication date, and told his sister, a journalist for the "Washington Business Journal", who wrote an article about the upcoming book. This led to substantial coverage in the press. Shortly before the books were to be shipped, Salinger changed his mind, and Orchises withdrew the book. New publication dates were repeatedly announced, but it never appeared. Lathbury said, "I never reached back out. I thought about writing some letters, but it wouldn't have done any good."
Hypnotic
Hypnotic (from Greek "Hypnos", sleep), or soporific, drugs, commonly known as sleeping pills, are a class of psychoactive drugs whose primary function is to induce sleep; they are used in the treatment of insomnia (sleeplessness) or for surgical anesthesia.
This group is related to sedatives. Whereas the term "sedative" describes drugs that serve to calm or relieve anxiety, the term "hypnotic" generally describes drugs whose main purpose is to initiate, sustain, or lengthen sleep. Because these two functions frequently overlap, and because drugs in this class generally produce dose-dependent effects (ranging from anxiolysis to loss of consciousness) they are often referred to collectively as sedative-hypnotic drugs.
Hypnotic drugs are regularly prescribed for insomnia and other sleep disorders, with over 95% of insomnia patients being prescribed hypnotics in some countries. Many hypnotic drugs are habit-forming and, due to many factors known to disturb the human sleep pattern, a physician may instead recommend changes in the environment before and during sleep, better sleep hygiene, the avoidance of caffeine or other stimulating substances, or behavioral interventions such as cognitive behavioral therapy for insomnia (CBT-I) before prescribing medication for sleep. When prescribed, hypnotic medication should be used for the shortest period of time necessary.
Among individuals with sleep disorders, 13.7% are taking or prescribed nonbenzodiazepines, while 10.8% are taking benzodiazepines, as of 2010. Early classes of drugs, such as barbiturates, have fallen out of use in most practices but are still prescribed for some patients. In children, prescribing hypnotics is not yet acceptable unless used to treat night terrors or sleepwalking. Elderly people are more sensitive to potential side effects of daytime fatigue and cognitive impairments, and a meta-analysis found that the risks generally outweigh any marginal benefits of hypnotics in the elderly. A review of the literature regarding benzodiazepine hypnotics and Z-drugs concluded that these drugs can have adverse effects, such as dependence and accidents, and that optimal treatment uses the lowest effective dose for the shortest therapeutic time period, with gradual discontinuation in order to improve health without worsening of sleep.
Falling outside the above-mentioned categories, the neuro-hormone melatonin has a hypnotic function.
Hypnotica was a class of somniferous drugs and substances tested in medicine of the 1890s and later, including: Urethan, Acetal, Methylal, Sulfonal, paraldehyde, Amylenhydrate, Hypnon, Chloralurethan, and Chloralamid or Chloralimid.
Research about using medications to treat insomnia evolved throughout the last half of the 20th century. Treatment for insomnia in psychiatry dates back to 1869, when chloral hydrate was first used as a soporific. Barbiturates, which emerged in the early 1900s, were the first class of such drugs; chemical substitution then allowed derivative compounds. Although the best drug family at the time (less toxic and with fewer side effects), they were dangerous in overdose and tended to cause physical and psychological dependence.
During the 1970s, quinazolinones and benzodiazepines were introduced as safer alternatives to replace barbiturates; by the late 1970s benzodiazepines emerged as the safer drug.
Benzodiazepines are not without their drawbacks; substance dependence is possible, and deaths from overdoses sometimes occur, especially in combination with alcohol and/or other depressants. Questions have been raised as to whether they disturb sleep architecture.
Nonbenzodiazepines are the most recent development (1990s–present). Although it is clear that they are less toxic than their predecessors, the barbiturates, comparative efficacy over benzodiazepines has not been established. Without longitudinal studies, it is hard to determine; however, some psychiatrists recommend these drugs, citing research suggesting they are equally potent with less potential for abuse.
Other sleep remedies that may be considered "sedative-hypnotics" exist; psychiatrists will sometimes prescribe medicines off-label if they have sedating effects. Examples of these include mirtazapine (an antidepressant), clonidine (generally prescribed to regulate blood pressure), quetiapine (an antipsychotic), and the over-the-counter sleep aid diphenhydramine (Benadryl – an antihistamine). Off-label sleep remedies are particularly useful when first-line treatment is unsuccessful or deemed unsafe (for example, in patients with a history of substance abuse).
Barbiturates are drugs that act as central nervous system depressants, and can therefore produce a wide spectrum of effects, from mild sedation to total anesthesia. They are also effective as anxiolytics, hypnotics, and anticonvulsants, and have analgesic effects; however, these effects are somewhat weak, preventing barbiturates from being used in surgery in the absence of other analgesics. They have dependence liability, both physical and psychological. Barbiturates have now largely been replaced by benzodiazepines in routine medical practice – for example, in the treatment of anxiety and insomnia – mainly because benzodiazepines are significantly less dangerous in overdose. However, barbiturates are still used in general anesthesia, for epilepsy, and for assisted suicide. Barbiturates are derivatives of barbituric acid.
The principal mechanism of action of barbiturates is believed to be positive allosteric modulation of GABAA receptors.
Examples include amobarbital, pentobarbital, phenobarbital, secobarbital, and sodium thiopental.
Quinazolinones are also a class of drugs that function as hypnotics/sedatives and contain a 4-quinazolinone core. Their use has also been proposed in the treatment of cancer.
Examples of quinazolinones include cloroqualone, diproqualone, etaqualone (Aolan, Athinazone, Ethinazone), mebroqualone, mecloqualone (Nubarene, Casfen), and methaqualone (Quaalude).
Benzodiazepines can be useful for short-term treatment of insomnia. Their use beyond 2 to 4 weeks is not recommended due to the risk of dependence. It is preferred that benzodiazepines be taken intermittently and at the lowest effective dose. They improve sleep-related problems by shortening the time spent in bed before falling asleep, prolonging the sleep time, and, in general, reducing wakefulness. Like alcohol, benzodiazepines are commonly used to treat insomnia in the short-term (both prescribed and self-medicated), but worsen sleep in the long-term. Although benzodiazepines can put people to sleep (i.e., they inhibit NREM stage 1 and 2 sleep), the drugs disrupt sleep architecture during sleep: decreasing sleep time, delaying time to REM sleep, and decreasing deep slow-wave sleep (the most restorative part of sleep for both energy and mood).
Other drawbacks of hypnotics, including benzodiazepines, are possible tolerance to their effects, reduced slow-wave sleep, and a withdrawal period typified by rebound insomnia and a prolonged period of anxiety and agitation. The list of benzodiazepines approved for the treatment of insomnia is fairly similar among most countries, but which benzodiazepines are officially designated as first-line hypnotics prescribed for the treatment of insomnia can vary distinctly between countries. Longer-acting benzodiazepines such as nitrazepam and diazepam have residual effects that may persist into the next day and are, in general, not recommended.
It is not clear as to whether the new nonbenzodiazepine hypnotics (Z-drugs) are better than the short-acting benzodiazepines. The efficacy of these two groups of medications is similar. According to the US Agency for Healthcare Research and Quality, indirect comparison indicates that side-effects from benzodiazepines may be about twice as frequent as from nonbenzodiazepines. Some experts suggest using nonbenzodiazepines preferentially as a first-line long-term treatment of insomnia. However, the UK National Institute for Health and Clinical Excellence (NICE) did not find any convincing evidence in favor of Z-drugs. A NICE review pointed out that short-acting Z-drugs were inappropriately compared in clinical trials with long-acting benzodiazepines. There have been no trials comparing short-acting Z-drugs with appropriate doses of short-acting benzodiazepines. Based on this, NICE recommended choosing the hypnotic based on cost and the patient's preference.
Older adults should not use benzodiazepines to treat insomnia unless other treatments have failed to be effective.
Examples of antipsychotics with sedation as a side effect:
The use of sedative medications in older people generally should be avoided. These medications are associated with poorer health outcomes, including cognitive decline.
Therefore, sedatives and hypnotics should be avoided in people with dementia according to the clinical guidelines known as the Medication Appropriateness Tool for Comorbid Health Conditions in Dementia (MATCH-D). The use of these medications can further impede cognitive function for people with dementia, who are also more sensitive to side effects of medications.
HMS Dunraven
HMS "Dunraven" was a Q-Ship of the Royal Navy during World War I.
On 8 August 1917, 130 miles southwest of Ushant in the Bay of Biscay, disguised as the collier "Boverton" and commanded by Gordon Campbell, VC, "Dunraven" spotted the German submarine "UC-71", commanded by "Oberleutnant zur See" Reinhold Saltzwedel. Saltzwedel believed the disguised ship was a merchant vessel. The U-boat submerged and closed with "Dunraven" before surfacing astern at 11:43 am and opening fire at long range. "Dunraven" made smoke and sent off a panic party (a small number of men who "abandon ship" during an attack to continue the impersonation of a merchant).
Shells began hitting "Dunraven", detonating her depth charges and setting her stern afire. Her crew remained hidden, letting the fires burn. Then a 4-inch (102 mm) gun and crew were blown away, revealing "Dunraven"s identity as a warship, and "UC-71" submerged. A second "panic party" abandoned ship. "Dunraven" was hit by a torpedo. A third "panic party" went over the side, leaving only two guns manned. "UC-71" surfaced, shelled "Dunraven", and again submerged. Campbell replied with two torpedoes that missed, and around 3 pm the undamaged U-boat left the area. Only one of "Dunraven"s crew was killed, but the Q-Ship was sinking.
The British destroyer picked up "Dunraven"s survivors and took her in tow for Plymouth, but "Dunraven" sank at 1:30 am on 10 August 1917 to the north of Ushant.
In recognition, two Victoria Crosses were awarded, one to the ship's First Lieutenant, Lt. Charles George Bonner RNR, and the other, by ballot, to a gunlayer, Petty Officer Ernest Herbert Pitcher.
Captain Campbell later wrote:
Captain Campbell had been previously awarded the Victoria Cross, in February 1917, for the sinking of .
Hacker ethic
Hacker ethic is a term for the moral values and philosophy that are common in hacker culture. Practitioners of the hacker ethic believe that sharing information and data responsibly is beneficial and helpful. While the philosophy originated at the Massachusetts Institute of Technology in the 1950s–1960s, the term "hacker ethic" is attributed to journalist Steven Levy, who described it in his 1984 book "Hackers: Heroes of the Computer Revolution". The key points within this ethic are access, freedom of information, and improvement to quality of life. While some tenets of the hacker ethic were described in other texts like "Computer Lib/Dream Machines" (1974) by Ted Nelson, Levy appears to have been the first to document both the philosophy and the founders of the philosophy.
Steven Levy explains that MIT housed an early IBM 704 computer inside the Electronic Accounting Machinery (EAM) room in 1959. This room became the staging grounds for early hackers, as MIT students from the Tech Model Railroad Club sneaked inside the EAM room after hours to attempt programming the 30-ton computer.
The MIT group defined a "hack" as a project undertaken or a product built to fulfill some constructive goal but also out of pleasure for mere involvement. The term "hack" arose from MIT lingo, as the word had long been used to describe college pranks that MIT students would regularly devise. However, Levy's hacker ethic has arguably often been quoted out of context and misunderstood as a reference to hacking in the sense of breaking into computers, and thus many sources incorrectly imply that it describes the ideals of white-hat hackers. Yet Levy's philosophy does not necessarily have to do with computer security; rather, it seeks to address broader issues.
The hacker ethic was described as a "new way of life, with a philosophy, an ethic and a dream". However, the elements of the hacker ethic were not openly debated and discussed; rather they were implicitly accepted and silently agreed upon.
The free software movement was born in the early 1980s from followers of the hacker ethic. Its founder, Richard Stallman, is referred to by Steven Levy as "the last true hacker". Modern hackers who hold true to the hacker ethics—especially the Hands-On Imperative—are usually supporters of free and open source software. This is because free and open source software allows hackers to get access to the source code used to create the software, to allow it to be improved or reused in other projects.
Richard Stallman describes:
The hacker ethic refers to the feelings of right and wrong, to the ethical ideas this community of people had—that knowledge should be shared with other people who can benefit from it, and that important resources should be utilized rather than wasted.
and states more precisely that hacking (which Stallman defines as playful cleverness) and ethics are two separate issues:
Just because someone enjoys hacking does not mean he has an ethical commitment to treating other people properly. Some hackers care about ethics—I do, for instance—but that is not part of being a hacker, it is a separate trait. [...] Hacking is not primarily about an ethical issue. [...] hacking tends to lead a significant number of hackers to think about ethical questions in a certain way. I would not want to completely deny all connection between hacking and views on ethics.
As Levy summarized in the preface of "Hackers", the general tenets or principles of hacker ethic include:
In addition to those principles, Levy also described more specific hacker ethics and beliefs in chapter 2, "The Hacker Ethic"; the ethics he described there are:
From the early days of modern computing through to the 1970s, it was far more common for computer users to have the freedoms that are provided by an ethic of open sharing and collaboration. Software, including source code, was commonly shared by individuals who used computers. Most companies had a business model based on hardware sales, and provided or bundled the associated software free of charge. According to Levy's account, sharing was the norm and expected within the non-corporate hacker culture. The principle of sharing stemmed from the open atmosphere and informal access to resources at MIT. During the early days of computers and programming, the hackers at MIT would develop a program and share it with other computer users.
If the hack was deemed particularly good, then the program might be posted on a board somewhere near one of the computers. Other programs that could be built upon and improved were saved to tapes and added to a drawer of programs, readily accessible to all the other hackers. At any time, a fellow hacker might reach into the drawer, pick out the program, and begin adding to it or "bumming" it to make it better. Bumming referred to the process of making the code more concise so that more can be done in fewer instructions, saving precious memory for further enhancements.
In the second generation of hackers, sharing was about sharing with the general public in addition to sharing with other hackers. A particular organization of hackers that was concerned with sharing computers with the general public was a group called Community Memory. This group of hackers and idealists put computers in public places for anyone to use. The first community computer was placed outside of Leopold's Records in Berkeley, California.
Another sharing of resources occurred when Bob Albrecht provided considerable resources for a non-profit organization called the People's Computer Company (PCC). PCC opened a computer center where anyone could use the computers there for fifty cents per hour.
This second generation practice of sharing contributed to the battles of free and open software. In fact, when Bill Gates' version of BASIC for the Altair was shared among the hacker community, Gates claimed to have lost a considerable sum of money because few users paid for the software. As a result, Gates wrote an Open Letter to Hobbyists. This letter was published by several computer magazines and newsletters, most notably that of the Homebrew Computer Club where much of the sharing occurred.
Many of the principles and tenets of hacker ethic contribute to a common goal: the Hands-On Imperative. As Levy described in Chapter 2, "Hackers believe that essential lessons can be learned about the systems—about the world—from taking things apart, seeing how they work, and using this knowledge to create new and more interesting things."
Employing the Hands-On Imperative requires free access, open information, and the sharing of knowledge. To a true hacker, if the Hands-On Imperative is restricted, then the ends justify the means to make it unrestricted "so that improvements can be made". When these principles are not present, hackers tend to work around them. For example, when the computers at MIT were protected either by physical locks or login programs, the hackers there systematically worked around them in order to have access to the machines. Hackers assumed a "willful blindness" in the pursuit of perfection.
This behavior was not malicious in nature: the MIT hackers did not seek to harm the systems or their users. This deeply contrasts with the modern, media-encouraged image of hackers who crack secure systems in order to steal information or complete an act of cyber-vandalism.
Throughout writings about hackers and their work processes, a common value of community and collaboration is present. For example, in Levy's "Hackers", each generation of hackers had geographically based communities where collaboration and sharing occurred. For the hackers at MIT, it was the labs where the computers were running. For the hardware hackers (second generation) and the game hackers (third generation) the geographic area was centered in Silicon Valley where the Homebrew Computer Club and the People's Computer Company helped hackers network, collaborate, and share their work.
The concept of community and collaboration is still relevant today, although hackers are no longer limited to collaboration in geographic regions. Now collaboration takes place via the Internet. Eric S. Raymond identifies and explains this conceptual shift in "The Cathedral and the Bazaar":
Before cheap Internet, there were some geographically compact communities where the culture encouraged Weinberg's egoless programming, and a developer could easily attract a lot of skilled kibitzers and co-developers. Bell Labs, the MIT AI and LCS labs, UC Berkeley: these became the home of innovations that are legendary and still potent.
Raymond also notes that the success of Linux coincided with the wide availability of the World Wide Web. The value of community is still in high practice and use today.
Levy identifies several "true hackers" who significantly influenced the hacker ethic. Some well-known "true hackers" include:
Levy also identified the "hardware hackers" (the "second generation", mostly centered in Silicon Valley) and the "game hackers" (or the "third generation"). All three generations of hackers, according to Levy, embodied the principles of the hacker ethic. Some of Levy's "second-generation" hackers include:
Levy's "third generation" practitioners of hacker ethic include:
In this digital age, and due to our reliance on technology, hackers are able to gather more information on us than ever before. One way to combat this is to teach students to hack in such a way that they become what are called "white hat hackers". White hat hackers follow ethical guidelines that proscribe harming either other people or the systems on which other people depend. Through their efforts, they have the ability to prevent malicious attacks. The movement of ethical hacking has gained traction through programs such as the L0pht and GhettoHackers, and courses have become integrated into university- and college-level curricula.
Security researcher and application security engineer Joe Gervais pointed out that intellectually curious students may start to experiment with computers without thinking of the ethical repercussions of their actions. He notes that many classes exist for gifted students in areas such as math and reading, but there do not seem to be courses that address the kind of curiosity a young hacker may have.
Hacking courses can create a moral compass for young hackers. They require a constructive environment that allows students to satiate their desire to understand computers. Students in these classes can pursue what they are passionate about while also learning the ethical boundaries that should not be violated; an integral part of such a curriculum is preventing the development of black hat hackers.
There is a shortage of skilled cybersecurity experts, and there does not seem to be sufficient curriculum to teach individuals the skills required to protect systems against malicious action. Teaching ethical hacking is a plausible way to fill this gap in the supply of technically skilled people capable of implementing defensive mechanisms against real-world threats.
In 2012, Ymir Vigfusson gave a talk at TEDxReykjavik entitled "Why I Teach People to Hack", illustrating his reasons for supporting the teaching of ethical hacking.
In 2001, Finnish philosopher Pekka Himanen promoted the hacker ethic in opposition to the Protestant work ethic. In Himanen's opinion, the hacker ethic is more closely related to the virtue ethics found in the writings of Plato and of Aristotle. Himanen explained these ideas in a book, "The Hacker Ethic and the Spirit of the Information Age", with a prologue contributed by Linus Torvalds and an epilogue by Manuel Castells.
In this manifesto, the authors wrote about a hacker ethic centering on passion, hard work, creativity and joy in creating software. Both Himanen and Torvalds were inspired by the Sampo in Finnish mythology. The Sampo, described in the Kalevala saga, was a magical artifact constructed by Ilmarinen, the blacksmith god, that brought good fortune to its holder; nobody knows exactly what it was supposed to be. The Sampo has been interpreted in many ways: a world pillar or world tree, a compass or astrolabe, a chest containing a treasure, a Byzantine coin die, a decorated Vendel period shield, a Christian relic, etc. Kalevala saga compiler Lönnrot interpreted it to be a "quern" or mill of some sort that made flour, salt, and gold out of thin air.
The hacker ethic and its wider context can be associated with liberalism and anarchism.
|
https://en.wikipedia.org/wiki?curid=14275
|
Hotel
A hotel is an establishment that provides paid lodging on a short-term basis. Facilities provided inside a hotel room may range from a modest-quality mattress in a small room to large suites with bigger, higher-quality beds, a dresser, a refrigerator and other kitchen facilities, upholstered chairs, a flat screen television, and en-suite bathrooms. Small, lower-priced hotels may offer only the most basic guest services and facilities. Larger, higher-priced hotels may provide additional guest facilities such as a swimming pool, business centre (with computers, printers, and other office equipment), childcare, conference and event facilities, tennis or basketball courts, gymnasium, restaurants, day spa, and social function services. Hotel rooms are usually numbered (or named in some smaller hotels and B&Bs) to allow guests to identify their room. Some boutique, high-end hotels have custom decorated rooms. Some hotels offer meals as part of a room and board arrangement. In the United Kingdom, a hotel is required by law to serve food and drinks to all guests within certain stated hours. In Japan, capsule hotels provide a tiny room suitable only for sleeping and shared bathroom facilities.
The precursor to the modern hotel was the inn of medieval Europe. For a period of about 200 years from the mid-17th century, coaching inns served as a place for lodging for coach travelers. Inns began to cater to richer clients in the mid-18th century. One of the first hotels in a modern sense was opened in Exeter in 1768. Hotels proliferated throughout Western Europe and North America in the early 19th century, and luxury hotels began to spring up in the later part of the 19th century.
Hotel operations vary in size, function, complexity, and cost. Most hotels and major hospitality companies have set industry standards to classify hotel types. An upscale full-service hotel offers luxury amenities, full-service accommodations, an on-site restaurant, and the highest level of personalized service, such as a concierge, room service, and clothes-pressing staff. Boutique hotels are smaller, independent, non-branded hotels that often contain upscale facilities. Small to medium-sized hotel establishments offer a limited number of on-site amenities. Economy hotels are small to medium-sized establishments that offer basic accommodations with little to no services. Extended-stay hotels are small to medium-sized hotels that offer longer-term, full-service accommodations compared to a traditional hotel.
Timeshare and destination clubs are a form of property ownership involving ownership of an individual unit of accommodation for seasonal usage. A motel is a small-sized low-rise lodging with direct access to individual rooms from the car park. Boutique hotels are typically hotels with a unique environment or intimate setting. A number of hotels have entered the public consciousness through popular culture, such as the Ritz Hotel in London. Some hotels are built specifically as destinations in themselves, for example casinos and holiday resorts.
Most hotel establishments are run by a General Manager who serves as the head executive (often referred to as the "Hotel Manager"), department heads who oversee various departments within a hotel (e.g., food service), middle managers, administrative staff, and line-level supervisors. The organizational chart and volume of job positions and hierarchy varies by hotel size, function and class, and is often determined by hotel ownership and managing companies.
The word "hotel" is derived from the French "hôtel" (coming from the same origin as "hospital"), which referred to a building seeing frequent visitors and providing care, rather than a place offering accommodation. In contemporary French usage, "hôtel" now has the same meaning as the English term, and "hôtel particulier" is used for the old meaning, as well as "hôtel" in some place names such as Hôtel-Dieu (in Paris), which has been a hospital since the Middle Ages. The French spelling, with the circumflex, was also used in English, but is now rare. The circumflex replaces the 's' found in the earlier "hostel" spelling, which over time took on a new, but closely related meaning. Grammatically, hotels usually take the definite article – hence "The Astoria Hotel" or simply "The Astoria."
Facilities offering hospitality to travellers have been a feature of the earliest civilizations. In Greco-Roman culture and ancient Persia, hospitals for recuperation and rest were built at thermal baths. Japan's Nishiyama Onsen Keiunkan, founded in 705, was officially recognised by the Guinness World Records as the oldest hotel in the world. During the Middle Ages, various religious orders at monasteries and abbeys would offer accommodation for travellers on the road.
The precursor to the modern hotel was the inn of medieval Europe, possibly dating back to the rule of Ancient Rome. These would provide for the needs of travellers, including food and lodging, stabling and fodder for the traveller's horse(s) and fresh horses for the mail coach. Famous London examples of inns include the George and the Tabard. A typical layout of an inn had an inner court with bedrooms on the two sides, with the kitchen and parlour at the front and the stables at the back.
For a period of about 200 years from the mid-17th century, coaching inns served as a place for lodging for coach travellers (in other words, a roadhouse). Coaching inns stabled teams of horses for stagecoaches and mail coaches and replaced tired teams with fresh teams. Traditionally they were seven miles apart, but this depended very much on the terrain.
Some English towns had as many as ten such inns and rivalry between them was intense, not only for the income from the stagecoach operators but for the revenue for food and drink supplied to the wealthy passengers. By the end of the century, coaching inns were being run more professionally, with a regular timetable being followed and fixed menus for food.
Inns began to cater to richer clients in the mid-18th century, and consequently grew in grandeur and the level of service provided. One of the first hotels in a modern sense was opened in Exeter in 1768, although the idea only really caught on in the early 19th century. In 1812, Mivart's Hotel opened its doors in London, later changing its name to Claridge's.
Hotels proliferated throughout Western Europe and North America in the 19th century. Luxury hotels, including the 1829 Tremont House in Boston, the 1836 Astor House in New York City, the 1889 Savoy Hotel in London, and the Ritz chain of hotels in London and Paris in the late 1890s, catered to an ever more wealthy clientele.
Hotels cater to travelers from many countries, speaking many languages, since no one country dominates the travel industry.
Hotel operations vary in size, function, and cost. Most hotels and major hospitality companies that operate hotels have set widely accepted industry standards to classify hotel types. General categories include the following:
International luxury hotels offer high quality amenities, full service accommodations, on-site full-service restaurants, and the highest level of personalized and professional service in major or capital cities. International luxury hotels are classified with at least a Five Diamond rating or Five Star hotel rating depending on the country and local classification standards. "Examples include: Grand Hyatt, Conrad, InterContinental, Sofitel, Mandarin Oriental, Four Seasons, The Peninsula, Rosewood, JW Marriott and The Ritz-Carlton."
Lifestyle luxury resorts are branded hotels that appeal to a guest with lifestyle or personal image in specific locations. They are typically full-service and classified as luxury. A key characteristic of lifestyle resorts is focus on providing a unique guest experience as opposed to simply providing lodging. Normally, lifestyle resorts are classified with a Five Star hotel rating depending on the country and local classification standards. Examples include: Waldorf Astoria, St. Regis, Shangri-La, Oberoi, Belmond, Jumeirah, Aman, Taj Hotels, Hoshino, Raffles, Fairmont, Banyan Tree, Regent and Park Hyatt.
Upscale full service hotels often provide a wide array of guest services and on-site facilities. Commonly found amenities may include: on-site food and beverage (room service and restaurants), meeting and conference services and facilities, fitness center, and business center. Upscale full-service hotels range in quality from upscale to luxury. This classification is based upon the quality of facilities and amenities offered by the hotel. Examples include: W Hotels, Sheraton, Langham, Kempinski, Kimpton Hotels, Hilton, Lotte, Renaissance, Marriott and Hyatt Regency brands.
Boutique hotels are smaller, independent, non-branded hotels that often contain mid-scale to upscale facilities of varying size in unique or intimate settings, with full-service accommodations. These hotels generally have 100 rooms or fewer.
Focused or select service hotels are small to medium-sized establishments that offer a limited number of on-site amenities and cater and market only to a specific demographic of travelers, such as the single business traveler. Most focused or select service hotels may still offer full-service accommodations but may lack leisure amenities such as an on-site restaurant or a swimming pool. Examples include Hyatt Place, Holiday Inn, Courtyard by Marriott and Hilton Garden Inn.
Limited service hotels are small to medium-sized establishments that offer very few on-site amenities and often only basic accommodations with little to no services; they normally cater and market to a specific demographic of travelers, such as the budget-minded traveler seeking a "no frills" accommodation. Limited service hotels often lack an on-site restaurant but in return may offer a limited complimentary food and beverage amenity, such as on-site continental breakfast service. Examples include Ibis Budget, Hampton Inn, Aloft, Holiday Inn Express, Fairfield Inn, Four Points by Sheraton.
Extended stay hotels are small to medium-sized hotels that offer longer term full service accommodations compared to a traditional hotel. Extended stay hotels may offer non-traditional pricing methods such as a weekly rate that caters towards travelers in need of short-term accommodations for an extended period of time. Similar to limited and select service hotels, on-site amenities are normally limited and most extended stay hotels lack an on-site restaurant. Examples include Staybridge Suites, Candlewood Suites, Homewood Suites by Hilton, Home2 Suites by Hilton, Residence Inn by Marriott, Element, and Extended Stay America.
Timeshare and destination clubs are a form of property ownership, also referred to as vacation ownership, involving the purchase and ownership of an individual unit of accommodation for seasonal usage during a specified period of time. Timeshare resorts often offer amenities similar to those of a full-service hotel, with on-site restaurant(s), swimming pools, recreation grounds, and other leisure-oriented amenities. Destination clubs, on the other hand, may offer more exclusive private accommodations, such as private houses in a neighborhood-style setting. Examples of timeshare brands include Hilton Grand Vacations, Marriott Vacation Club International, Westgate Resorts, Disney Vacation Club, and Holiday Inn Club Vacations.
A motel, an abbreviation for "motor hotel", is a small-sized low-rise lodging establishment similar to a limited service, lower-cost hotel, but typically with direct access to individual rooms from the car park. Motels were built to serve road travellers, including travellers on road trip vacations and workers who drive for their job (travelling salespeople, truck drivers, etc.). Common during the 1950s and 1960s, motels were often located adjacent to a major highway, where they were built on inexpensive land at the edge of towns or along stretches of freeway.
New motel construction is rare in the 2000s, as hotel chains have been building economy-priced, limited-service franchised properties at freeway exits that compete for much the same clientele, largely saturating the market by the 1990s. Motels are still useful in less populated areas for driving travelers, but the more populated an area becomes, the more hotels move in to meet the demand for accommodation. While many motels are unbranded and independent, many of the other motels which remain in operation joined national franchise chains, often rebranding themselves as hotels, inns or lodges. Some examples of chains with motels include EconoLodge, Motel 6, Super 8, and Travelodge.
Motels in some parts of the world are more often regarded as places for romantic assignations where rooms are often rented by the hour. This is fairly common in parts of Latin America.
Hotels may offer rooms for microstays, a type of booking for less than 24 hours where the customer chooses the check-in time and the length of the stay. This allows the hotel to increase revenue by reselling the same room several times a day.
Hotel management is a globally accepted professional career field and academic field of study. Degree programs such as hospitality management studies, a business degree, and/or certification programs formally prepare hotel managers for industry practice.
Most hotel establishments consist of a General Manager who serves as the head executive (often referred to as the "Hotel Manager"), department heads who oversee various departments within a hotel, middle managers, administrative staff, and line-level supervisors. The organizational chart and volume of job positions and hierarchy varies by hotel size, function, and is often determined by hotel ownership and managing companies.
Boutique hotels are typically hotels with a unique environment or intimate setting.
Some hotels have gained their renown through tradition, by hosting significant events or persons, such as Schloss Cecilienhof in Potsdam, Germany, which derives its fame from the Potsdam Conference of the World War II allies Winston Churchill, Harry Truman and Joseph Stalin in 1945. The Taj Mahal Palace & Tower in Mumbai is one of India's most famous and historic hotels because of its association with the Indian independence movement. Some establishments have given name to a particular meal or beverage, as is the case with the Waldorf Astoria in New York City, United States where the Waldorf Salad was first created or the Hotel Sacher in Vienna, Austria, home of the Sachertorte. Others have achieved fame by association with dishes or cocktails created on their premises, such as the Hotel de Paris where the crêpe Suzette was invented or the Raffles Hotel in Singapore, where the Singapore Sling cocktail was devised.
A number of hotels have entered the public consciousness through popular culture, such as the Ritz Hotel in London, through its association with Irving Berlin's song, 'Puttin' on the Ritz'. The Algonquin Hotel in New York City is famed as the meeting place of the literary group, the Algonquin Round Table, and Hotel Chelsea, also in New York City, has been the subject of a number of songs and the scene of the stabbing of Nancy Spungen (allegedly by her boyfriend Sid Vicious).
Some hotels are built specifically as destinations in themselves to create a captive trade, for example at casinos, amusement parks and holiday resorts. Though hotels have always been built in popular destinations, the defining characteristic of a resort hotel is that it exists purely to serve another attraction, the two having the same owners.
On the Las Vegas Strip there is a tradition of one-upmanship with luxurious and extravagant hotels in a concentrated area. This trend now has extended to other resorts worldwide, but the concentration in Las Vegas is still the world's highest: nineteen of the world's twenty-five largest hotels by room count are on the Strip, with a total of over 67,000 rooms.
The Null Stern Hotel in Teufen, Appenzellerland, Switzerland and the Concrete Mushrooms in Albania are former nuclear bunkers transformed into hotels.
The Cuevas Pedro Antonio de Alarcón (named after the author) in Guadix, Spain, as well as several hotels in Cappadocia, Turkey, are notable for being built into natural cave formations, some with rooms underground. The Desert Cave Hotel in Coober Pedy, South Australia is built into the remains of an opal mine.
Hotels located on the coast but high above sea level offer unobstructed panoramic views and a great sense of privacy without the feeling of total isolation. Some examples from around the globe are the Riosol Hotel in Gran Canaria, Caruso Belvedere Hotel on the Amalfi Coast (Italy), Aman Resorts Amankila in Bali, Birkenhead House in Hermanus (South Africa), The Caves in Jamaica and Caesar Augustus in Capri.
Capsule hotels are a type of economical hotel first introduced in Japan, where people sleep in stacks of rectangular containers.
Some hotels fill daytime occupancy with day rooms, for example, Rodeway Inn and Suites near Port Everglades in Fort Lauderdale, Florida. Day rooms are booked in a block of hours, typically between 8 am and 5 pm, rather than overnight. These are similar to transit hotels in that they appeal to travelers; however, unlike transit hotels, they do not eliminate the need to go through Customs.
Garden hotels, famous for their gardens before they became hotels, include Gravetye Manor, the home of garden designer William Robinson, and Cliveden, designed by Charles Barry with a rose garden by Geoffrey Jellicoe.
The Ice Hotel in Jukkasjärvi, Sweden, was the first ice hotel in the world; first built in 1990, it is rebuilt each winter and melts every spring. The Hôtel de Glace in Duchesnay, Canada, opened in 2001 and is North America's only ice hotel. It is redesigned and rebuilt in its entirety every year.
Ice hotels can also be included within larger ice complexes; for example, the Mammut Snow Hotel in Finland is located within the walls of the Kemi snow castle, and the Lainio Snow Hotel is part of a snow village near Ylläs, Finland. There is an arctic snow hotel in Rovaniemi in Lapland, Finland, along with glass igloos. The first glass igloos were built in 1999 in Finland; they became the Kakslauttanen Arctic Resort, with 65 buildings: 53 small ones for two people and 12 large ones for four people. Glass igloos, with roofs made of thermal glass, allow guests to admire auroras comfortably from their beds.
A love hotel (also 'love motel', especially in Taiwan) is a type of short-stay hotel found around the world, operated primarily for the purpose of allowing guests privacy for sexual activities, typically for one to three hours, but with overnight as an option. Styles of premises vary from extremely low-end to extravagantly appointed. In Japan, love hotels have a history of over 400 years.
A referral hotel is a hotel chain that offers branding to independently operated hotels; the chain itself is founded by or owned by the member hotels as a group. Many former referral chains have been converted to franchises; the largest surviving member-owned chain is Best Western.
The first recorded purpose-built railway hotel was the Great Western Hotel, which opened adjacent to Reading railway station in 1844, shortly after the Great Western Railway opened its line from London. The building still exists, and although it has been used for other purposes over the years, it is now again a hotel and a member of the Malmaison hotel chain.
Frequently, expanding railway companies built grand hotels at their termini, such as the Midland Hotel, Manchester, next to the former Manchester Central Station, and, in London, the ones above St Pancras and Charing Cross railway stations. London also has the Chiltern Court Hotel above Baker Street tube station, and Canada likewise has its grand railway hotels. They are or were mostly, but not exclusively, used by those traveling by rail.
The Maya Guesthouse in Nax Mont-Noble in the Swiss Alps is the first hotel in Europe built entirely of straw bales. Owing to the insulation value of its walls, it needs no conventional heating or air-conditioning system, despite its Alpine altitude.
Transit hotels are short stay hotels typically used at international airports where passengers can stay while waiting to change airplanes. The hotels are typically on the airside and do not require a visa for a stay or re-admission through security checkpoints.
Some hotels are built with living trees as structural elements, for example the Treehotel near Piteå, Sweden, the Costa Rica Tree House in the Gandoca-Manzanillo Wildlife Refuge, Costa Rica; the Treetops Hotel in Aberdare National Park, Kenya; the Ariau Towers near Manaus, Brazil, on the Rio Negro in the Amazon; and Bayram's Tree Houses in Olympos, Turkey.
Some hotels have accommodation underwater, such as Utter Inn in Lake Mälaren, Sweden. Hydropolis, a project in Dubai, would have had suites on the bottom of the Persian Gulf, and Jules' Undersea Lodge in Key Largo, Florida, requires scuba diving to access its rooms.
A resort island is an island or archipelago containing resorts, hotels, overwater bungalows, restaurants, tourist attractions and their amenities. The Maldives has the most overwater-bungalow resorts.
In 2006, "Guinness World Records" listed the First World Hotel in Genting Highlands, Malaysia, as the world's largest hotel with a total of 6,118 rooms (and which has now expanded to 7,351 rooms). The Izmailovo Hotel in Moscow has the most beds, with 7,500, followed by The Venetian and The Palazzo complex in Las Vegas (7,117 rooms) and MGM Grand Las Vegas complex (6,852 rooms).
According to the Guinness Book of World Records, the oldest hotel in operation is the Nishiyama Onsen Keiunkan in Yamanashi, Japan. The hotel, first opened in AD 707, has been operated by the same family for forty-six generations. The title was held until 2011 by the Hoshi Ryokan, in the Awazu Onsen area of Komatsu, Japan, which opened in 718, as the history of the Nishiyama Onsen Keiunkan was virtually unknown.
The Rosewood Guangzhou, located on the top 39 floors of the 108-story Guangzhou CTF Finance Centre in Tianhe District, Guangzhou, China, reaches 530 meters at its highest point, earning it the status of the world's highest hotel.
In October 2014, the Anbang Insurance Group, based in China, purchased the Waldorf Astoria New York in Manhattan for US$1.95 billion, making it the world's most expensive hotel ever sold.
A number of public figures have notably chosen to take up semi-permanent or permanent residence in hotels.
|
https://en.wikipedia.org/wiki?curid=14276
|
Michael Milken
Michael Robert Milken (born July 4, 1946) is an American financier and philanthropist. He is noted for his role in the development of the market for high-yield bonds ("junk bonds"), and his conviction and sentence following a guilty plea on felony charges for violating U.S. securities laws. Since his release from prison, he has also become known for his charitable giving. Milken was pardoned by President Donald Trump on February 18, 2020.
Milken was indicted for racketeering and securities fraud in 1989 in an insider trading investigation. As the result of a plea bargain, he pleaded guilty to securities and reporting violations but not to racketeering or insider trading. Milken was sentenced to ten years in prison, fined $600 million, and permanently barred from the securities industry by the Securities and Exchange Commission. His sentence was later reduced to two years for cooperating with testimony against his former colleagues and for good behavior. Since his release from prison, Milken has funded medical research.
He is co-founder of the Milken Family Foundation, chairman of the Milken Institute, and founder of medical philanthropies funding research into melanoma, cancer and other life-threatening diseases. A prostate cancer survivor, Milken has devoted significant resources to research on the disease. In a November 2004 cover article, "Fortune" magazine called him "The Man Who Changed Medicine" for changes in approach to funding and results that he initiated. Milken's compensation while head of the high-yield bond department at Drexel Burnham Lambert in the late 1980s exceeded $1 billion over a four-year period, a record for U.S. income at that time. With an estimated net worth of around $3.7 billion as of 2018, he is ranked by "Forbes" magazine as the 606th richest person in the world.
Milken was born into a middle-class Jewish family in Encino, California.
He graduated from Birmingham High School, where he was the head cheerleader and worked at a diner while in school. His classmates included future Disney president Michael Ovitz and actresses Sally Field and Cindy Williams. In 1968, he graduated from the University of California, Berkeley, with a B.S. with highest honors; there he was elected to Phi Beta Kappa and was a member of the Sigma Alpha Mu fraternity. He received his MBA from the Wharton School of the University of Pennsylvania. While at Berkeley, Milken was influenced by credit studies authored by W. Braddock Hickman, a former president of the Federal Reserve Bank of Cleveland, who noted that a portfolio of non-investment grade bonds offered "risk-adjusted" returns greater than that of an investment-grade portfolio.
Through his Wharton professors, Milken landed a summer job at Drexel Harriman Ripley, an old-line investment bank, in 1969. After completing his MBA, he joined Drexel (by then known as Drexel Firestone) as director of low-grade bond research. He was also given control of some capital and permitted to trade. Over the next 17 years, he had only four down months.
Drexel merged with Burnham and Company in 1973 to form Drexel Burnham. Despite the firm's name, Burnham was the nominal survivor; the Drexel name came first only at the insistence of the more powerful investment banks, whose blessing was necessary for the merged firm to inherit Drexel's position as a "major" firm.
Milken was one of the few prominent holdovers from the Drexel side of the merger, and became the merged firm's head of convertibles. He persuaded his new boss, fellow Wharton alumnus Tubby Burnham, to let him start a high-yield bond trading department—an operation that soon earned a 100% return on investment. By 1976, Milken's income at what was now Drexel Burnham Lambert was estimated at $5 million a year. In 1978, Milken moved the high-yield bond operation to Century City in Los Angeles.
By the mid-1980s, Milken's network of high-yield bond buyers (notably Fred Carr's Executive Life Insurance Company and Tom Spiegel's Columbia Savings & Loan) had reached a size that enabled him to raise large amounts of money quickly.
This money-raising ability also facilitated the activities of leveraged buyout (LBO) firms such as Kohlberg Kravis Roberts and of the so-called "greenmailers". Most of them were armed with a "highly confident letter" from Drexel, a tool Drexel's corporate finance wing crafted that promised to raise the necessary debt in time to fulfill the buyer's obligations. It carried no legal status, but by this time, Milken had a reputation for being able to make markets for any bonds that he underwrote. For this reason, "highly confident letters" were considered to reliably demonstrate capacity to pay. Supporters, like George Gilder in his book, "Telecosm" (2000), state that Milken was "a key source of the organizational changes that have impelled economic growth over the last twenty years. Most striking was the productivity surge in capital, as Milken... and others took the vast sums trapped in old-line businesses and put them back into the markets."
Amongst his significant detractors have been Martin Fridson formerly of Merrill Lynch and author Ben Stein. Milken's high-yield "pioneer" status has proved dubious as studies show "original issue" high-yield issues were common during and after the Great Depression. Milken himself points out that high-yield bonds go back hundreds of years, having been issued by the Massachusetts Bay Colony in the 17th century and by America's first Treasury Secretary Alexander Hamilton. Others such as Stanford Phelps, an early co-associate and rival at Drexel, have also contested his credit as having pioneered the modern high-yield market.
Despite his influence in the financial world during the 1980s (at least one source called him the most powerful American financier since J. P. Morgan), Milken was an intensely private man who shunned publicity; he reportedly owned almost all photographs taken of him.
Milken and his brother Lowell founded Knowledge Universe in 1996, as well as Knowledge Learning Corporation (KLC), the parent company of KinderCare Learning Centers, the largest for-profit child care provider in the country. He is currently chairman of the company.
He established K12 Inc., a publicly traded education management organization (EMO) that provides online schooling, including to charter school students, for whom services are paid by tax dollars, which is the largest EMO in terms of enrollment.
Dan Stone, a former Drexel executive, wrote in his book "April Fools" that Milken was under nearly constant scrutiny from the Securities and Exchange Commission from 1979 onward due to unethical and sometimes illegal behavior in the high-yield department.
Milken's role in such behavior has been much debated. Stone claims that Milken viewed the securities laws, rules and regulations with a degree of contempt, feeling they hindered the free flow of trade. However, Stone said that while Milken condoned questionable and illegal acts by his colleagues, Milken himself personally followed the rules. He often called Drexel's president and CEO, Fred Joseph—known for his strict view of the securities laws—with ethical questions.
On the other hand, several of the sources James B. Stewart used for "Den of Thieves" told him that Milken often tried to take as much as five times the maximum markup on trades permitted at the time.
Harvey A. Silverglate, a prominent defense attorney who represented Milken during the appellate process, disputes that view in his book "Three Felonies a Day": "Milken's biggest problem was that some of his most ingenious but entirely lawful maneuvers were viewed, by those who initially did not understand them, as felonious, precisely because they were novel – and often extremely profitable."
The SEC inquiries never got beyond the investigation phase until 1986, when arbitrageur Ivan Boesky pleaded guilty to securities fraud as part of a larger insider trading investigation. As part of his plea, Boesky implicated Milken in several illegal transactions, including insider trading, stock manipulation, fraud and stock parking (buying stocks for the benefit of another). This led to an SEC probe of Drexel, as well as a separate criminal probe by Rudy Giuliani, then United States Attorney for the Southern District of New York. Although both investigations were almost entirely focused on Milken's department, Milken refused to talk with Drexel (which launched its own internal investigation) except through his lawyers. It turned out that Milken's legal team believed Drexel would be forced to cooperate with the government at some point, believing that a securities firm would not survive the bad publicity of a long criminal and SEC probe.
For two years, Drexel insisted that nothing illegal occurred, even when the SEC sued Drexel in 1988. Later that year, Giuliani began considering an indictment of Drexel under the powerful Racketeer Influenced and Corrupt Organizations Act. Drexel management immediately began plea bargain talks, concluding that a financial institution could not possibly survive a RICO indictment. However, talks collapsed on December 19, when Giuliani made several demands that Drexel found too harsh, including one that Milken leave the firm if indicted.
Only a day later, however, Drexel lawyers discovered suspicious activity in one of the limited partnerships Milken set up to allow members of his department to make their own investments. That entity, MacPherson Partners, had acquired several warrants for the stock of Storer Broadcasting in 1985. At the time, Kohlberg Kravis Roberts was in the midst of a leveraged buyout of Storer, and Drexel was lead underwriter for the bonds being issued. One of Drexel's other clients bought several Storer warrants and sold them back to the high-yield bond department. The department in turn sold them to MacPherson. This partnership included Milken, other Drexel executives, and a few Drexel customers. However, it also included several managers of money market funds who had worked with Milken in the past. It appeared that the money managers bought the warrants for themselves and did not offer the same opportunity to the funds they managed. Some of Milken's children also received warrants, according to Stewart, raising the appearance of Milken self-dealing.
However, the warrants to money managers were especially problematic. At the very least, Milken's actions were a serious breach of Drexel's internal regulations, and the money managers had breached their fiduciary duty to their clients. At worst, the warrants could have been construed as bribes to the money managers to influence decisions they made for their funds.
Indeed, several money managers were eventually convicted on bribery charges. The discovery of MacPherson Partners—whose very existence had not been known to the public at the time—seriously eroded Milken's credibility with the board. On December 21, 1988, Drexel entered a guilty plea to six counts of stock parking and stock manipulation. Drexel said it "was not in a position to dispute" the allegations made by the government. As part of the deal, Drexel agreed that Milken had to leave the firm if indicted.
In March 1989, a federal grand jury indicted Milken on 98 counts of racketeering and fraud. The indictment accused Milken of a litany of misconduct, including insider trading, stock parking (concealing the real owner of a stock), tax evasion and numerous instances of repayment of illicit profits. One charge was that Boesky paid Drexel $5.3 million in 1986 for Milken's share of profits from illegal trading. This payment was represented as a consulting fee to Drexel. Shortly afterward, Milken resigned from Drexel and formed his own firm, International Capital Access Group.
On April 24, 1990, Milken pleaded guilty to six counts of securities and tax violations. Three of them involved dealings with Boesky to conceal the real owner of a stock.
Two other counts were related to tax evasion in transactions Milken carried out for a client of the firm, David Solomon, a fund manager.
The last count was for conspiracy to commit these five violations.
As part of his plea, Milken agreed to pay $200 million in fines. At the same time, he agreed to a settlement with the SEC in which he paid $400 million to investors who had been hurt by his actions. He also accepted a lifetime ban from any involvement in the securities industry. In a related civil lawsuit against Drexel he agreed to pay $500 million to Drexel's investors.
Critics charge that the government indicted Milken's brother Lowell to pressure Milken to settle, a tactic some legal scholars condemn as unethical. "I am troubled by - and other scholars are troubled by - the notion of putting relatives on the bargaining table," said Vivian Berger, a professor at Columbia University Law School, in a 1990 interview with "The New York Times". As part of the deal, the case against Lowell was dropped. Federal investigators also questioned some of Milken's relatives about their investments.
At Milken's sentencing, Judge Kimba Wood told him:
In statements to a parole board in 1991, Judge Wood estimated that the "total loss from Milken's crimes" was $318,000, far less than the government's estimate of $4.7 million, and she recommended that he be eligible for parole in three years. Milken's sentence was later reduced from ten years to two; he served 22 months.
In June 2018 it was reported that some of president Donald Trump's advisers, including Rudy Giuliani, the onetime federal prosecutor whose criminal investigation led to Milken's conviction, were urging the president to pardon Milken. Milken's attempts to secure a presidential pardon have spanned multiple administrations.
On February 18, 2020, Trump granted clemency to Milken. However, the trading license Milken lost following his conviction remained void; he would have to apply for a new license in order to return to trading securities.
In February 2013, the SEC announced that they were investigating whether Milken violated his lifetime ban from the securities industry. The investigation revolved around Milken allegedly providing investment advice through Guggenheim Partners.
Since 2011, the SEC has been investigating Guggenheim's relationship with Milken.
In 1982, Milken and his brother Lowell founded the Milken Family Foundation to support medical research and education. Through the Milken Educator Awards (founded in 1985), the MFF has awarded a total of more than $60 million to more than 2,500 teachers. Among the other initiatives of the Milken Family Foundation are the:
Upon his release from prison in 1993, Milken founded the Prostate Cancer Foundation for prostate cancer research, which by 2010 was "the largest philanthropic source of funds for research into prostate cancer". Milken himself was diagnosed with advanced prostate cancer in the same month he was released. His cancer is currently in remission. The Prostate Cancer Foundation works closely with Major League Baseball through its Home Run Challenge program to promote awareness of prostate cancer and raise money for medical research. Each season in the weeks leading up to Father's Day, Milken visits many ballparks and appears on TV and radio broadcasts during the games.
In 2003, Milken launched a Washington, D.C.-based think tank called FasterCures, which seeks greater efficiency in researching all serious diseases. Key initiatives of FasterCures are TRAIN, Partnering for Cures and the Philanthropy Advisory Service.
Fortune magazine called Milken "The Man Who Changed Medicine" in a 2004 cover story on his philanthropy.
In September 2012, Milken and the director of the National Institutes of Health, Dr. Francis Collins, jointly hosted 1,000 senior medical scientists, patients, activists, philanthropists, regulators, and members of Congress at a three-day conference to demonstrate the return on investment in medical research.
On March 11, 2014, President Steven Knapp of George Washington University in Washington, D.C. announced the university was renaming its public health school after Milken as a result of a total of $80 million in gifts: $50 million from the Milken Institute and the Milken Family Foundation, and $30 million from Viacom chairman Sumner Redstone. These gifts are designated for research and scholarship on public health issues.
Milken became the first recipient of the Ig Nobel Economics Prize in 1991.
Ayad Akhtar's play "Junk", set during the bond trading scandals of the 1980s, is partly based on Milken's "fall from grace"; its main character is inspired by him.
In the "Futurama" episode "Future Stock", Milken is mentioned by a character (a stock trader) from the 20th century who says, "Back in the 1980s, I was the toast of Wall Street. I was having whiskey with Boesky and cookies with Milken."
Milken is married to Lori Anne Hackel, whom he had dated in high school. They have three children. He reportedly follows a vegetarian-like diet rich in fruits and vegetables for its presumed health benefits and has co-authored a vegan cookbook with Beth Ginsberg.
Musical keyboard
A musical keyboard is the set of adjacent depressible levers or keys on a musical instrument. Keyboards typically contain keys for playing the twelve notes of the Western musical scale, with a combination of larger, longer keys and smaller, shorter keys that repeats at the interval of an octave. Depressing a key on the keyboard makes the instrument produce sounds—either by mechanically striking a string or tine (acoustic and electric piano, clavichord), plucking a string (harpsichord), causing air to flow through a pipe organ, striking a bell (carillon), or, on electric and electronic keyboards, completing a circuit (Hammond organ, digital piano, synthesizer). Since the most commonly encountered keyboard instrument is the piano, the keyboard layout is often referred to as the "piano keyboard".
The twelve notes of the Western musical scale are laid out with the lowest note on the left. The longer keys (for the seven "natural" notes of the C major scale: C, D, E, F, G, A, B) jut forward. Because these keys were traditionally covered in ivory they are often called the "white notes" or "white keys". The keys for the remaining five notes, which are not part of the C major scale (i.e., C♯/D♭, D♯/E♭, F♯/G♭, G♯/A♭, A♯/B♭; see Sharp and Flat), are raised and shorter. Because these keys receive less wear, they are often made of black colored wood and called the "black notes" or "black keys". The pattern repeats at the interval of an octave.
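This repeating layout can be sketched in a few lines of code. The MIDI numbering convention (middle C as note number 60) used below is an assumption for illustration, not something stated in the text above:

```python
# Hypothetical sketch of the repeating twelve-note octave pattern.
NOTE_NAMES = ["C", "C#/Db", "D", "D#/Eb", "E", "F",
              "F#/Gb", "G", "G#/Ab", "A", "A#/Bb", "B"]
WHITE = {"C", "D", "E", "F", "G", "A", "B"}  # the seven naturals

def describe_key(midi_number: int) -> str:
    """Name, octave, and key colour for a MIDI key number."""
    name = NOTE_NAMES[midi_number % 12]   # the layout repeats every 12 keys
    octave = midi_number // 12 - 1        # MIDI octave numbering convention
    colour = "white" if name in WHITE else "black"
    return f"{name}{octave} ({colour} key)"

print(describe_key(60))  # middle C -> C4 (white key)
print(describe_key(61))  # -> C#/Db4 (black key)
```

The modulo operation captures the octave repetition directly: any two keys twelve apart share a name and a colour.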
The arrangement of longer keys for C major with intervening, shorter keys for the intermediate semitones dates to the 15th century. Many keyboard instruments dating from before the nineteenth century, such as harpsichords and pipe organs, have a keyboard with the colours of the keys reversed: the "white notes" are made of ebony and the "black notes" are covered with softer white bone. A few electric and electronic instruments from the 1960s and subsequent decades have also done this: Vox's electronic organs of the 1960s, Farfisa's FAST portable organs, Hohner's Clavinet L, one version of Korg's Poly-800 synthesizer, and Roland's digital harpsichords.
Some 1960s electronic organs used reverse colors or gray sharps or naturals to indicate the lower part (or parts) of a single keyboard divided into two parts, each controlling a different registration or sound. Such keyboards accommodate melody and contrasting accompaniment without the expense of a second manual, and were a regular feature in Spanish and some English organs of the renaissance and baroque eras. The break was between middle C and C-sharp, or outside of Iberia between B and C. Broken keyboards reappeared in 1842 with the harmonium, the split occurring at E4/F4.
The reverse-colored keys on Hammond organs such as the B3, C3 and A100 are latch-style radio buttons for selecting pre-set sounds.
The chromatic range (also called "compass") of keyboard instruments has tended to increase. Harpsichords often extended over five octaves (61+ keys) in the 18th century, while most pianos manufactured since about 1870 have 88 keys. Some modern pianos have even more notes (a Bösendorfer 290 "Imperial" has 97 keys and a Stuart & Sons model has 108 keys). While modern synthesizer keyboards commonly have either 61, 76 or 88 keys, small MIDI controllers are available with 25 notes. (Digital systems allow shifting octaves, pitch, and "splitting" ranges dynamically, which, in some cases, reduce the need for dedicated keys. However, smaller keyboards will typically limit which musical scores can be played.) Organs normally have 61 keys per manual, though some spinet models have 44 or 49. An organ pedalboard is a keyboard with long pedals played by the organist's feet. Pedalboards vary in size from 12 to 32 notes.
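The 88-key compass mentioned above has a simple numerical description in equal temperament: each key is a factor of 2^(1/12) above its neighbour, so a key's frequency follows from its distance to concert A (440 Hz), which is key 49 when counting from the lowest key. A minimal sketch:

```python
def piano_key_frequency(n: int) -> float:
    """Equal-tempered frequency of key n on a standard 88-key piano,
    counting from 1 (the lowest key, A0), with concert A (440 Hz) as key 49."""
    return 440.0 * 2 ** ((n - 49) / 12)

print(round(piano_key_frequency(1), 2))   # lowest key, A0 -> 27.5
print(round(piano_key_frequency(88), 2))  # highest key, C8 -> 4186.01
```

The same formula explains why digital octave-shifting works: transposing by an octave just multiplies or divides every frequency by two.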
In a typical keyboard layout, "black note" keys have uniform width, and "white note" keys have uniform width and uniform spacing at the front of the keyboard. In the larger gaps between the "black" keys, the width of the natural notes C, D and E differ slightly from the width of keys F, G, A and B. This allows close to uniform spacing of 12 keys per octave while maintaining uniformity of seven "natural" keys per octave.
Over the last three hundred years, the octave span found on historical keyboard instruments (organs, virginals, clavichords, harpsichords, and pianos) has varied considerably; modern piano keyboards are built to a standard octave span, with black keys narrower than white keys at the base. Several reduced-size standards have been proposed and marketed: a 15/16-size keyboard and the 7/8 "DS Standard" keyboard were developed by Christopher Donison in the 1970s and later refined and marketed by Steinbuhler & Company. U.S. pianist Hannah Reimann has promoted piano keyboards with narrower octave spans and holds a U.S. patent on apparatus and methods for modifying existing pianos to accept interchangeable keyboards of different sizes.
There have been variations in the design of the keyboard to address technical and musical issues. The earliest keyboards were based heavily on the notes used in Gregorian chant (the seven diatonic notes plus B-flat) and as such would often include both B♭ and B♮ as diatonic "white notes," with the B♭ at the leftmost side of the keyboard and the B♮ at the rightmost. Thus, an octave would have "eight" white keys and only four black keys. The emphasis on these eight notes continued for a few centuries after the "seven and five" system was adopted, in the form of the short octave: the eight aforementioned notes were arranged at the leftmost side of the keyboard, compressed into the keys between E and C (at the time, accidentals that low were very uncommon and thus not needed). During the sixteenth century, when instruments were often tuned in meantone temperament, some harpsichords were constructed with the G♯ and E♭ keys split into two. One portion of the G♯ key operated a string tuned to G♯ and the other a string tuned to A♭; similarly, one portion of the E♭ key operated a string tuned to E♭ and the other a string tuned to D♯. This type of layout, known as the enharmonic keyboard, extended the flexibility of the harpsichord, enabling composers to write keyboard music calling for harmonies containing the so-called "wolf" fifth (G-sharp to E-flat) without producing aural discomfort in the listeners ("see: Split sharp"). The "broken octave," a variation of the short octave, similarly used split keys to add accidentals left out of the short octave. Other examples of variations in keyboard design include the Jankó keyboard and the chromatic keyboard systems of the chromatic button accordion and bandoneón.
Simpler electronic keyboards have switches under each key. Depressing a key connects a circuit, which triggers tone generation. Most keyboards use a keyboard matrix circuit, in which eight rows and eight columns of wires cross — thus, 16 wires can provide (8x8=) 64 crossings, which the keyboard controller scans to determine which key was pressed. The problem with this system is that it provides only a crude binary on/off signal for each key. Better electronic keyboards employ two sets of switches for each key that are slightly offset. By determining the timing between the activation of the first and second switches, the velocity of a key press can be determined — greatly improving the performance dynamic of a keyboard. The best electronic keyboards have dedicated circuits for each key, providing polyphonic aftertouch.
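The two-switch velocity scheme described above can be sketched as follows. The controller measures the time between the first and second contact closing and converts it to a key speed; the contact spacing used here (1.5 mm) and the linear conversion are illustrative assumptions, not a real controller's calibration:

```python
def key_velocity(t_first: float, t_second: float, travel_mm: float = 1.5) -> float:
    """Estimate key speed (mm/s) from the closing times (in seconds) of the
    two offset switches; travel_mm is the assumed spacing between contacts."""
    dt = t_second - t_first
    if dt <= 0:
        raise ValueError("second contact must close after the first")
    return travel_mm / dt  # shorter gap between contacts = faster, louder press

# A fast press closes the two contacts 2 ms apart:
print(key_velocity(0.000, 0.002))  # -> 750.0 mm/s
```

A real instrument would then map this speed onto a loudness scale (e.g., a MIDI velocity from 0 to 127).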
Advanced electronic keyboards may provide hundreds of key touch levels and have 88 keys, as most pianos do.
Despite their visual similarity, different keyboard instrument types require different techniques. The piano hammer mechanism produces a louder note the faster the key is pressed, while the harpsichord's plectrum mechanism does not perceptibly vary the volume of the note with different touch on the keyboard. The pipe organ's volume and timbre are controlled by the flow of air from the bellows and the stops preselected by the player. Players of these instruments therefore use different techniques to color the sound. An arranger keyboard may be preset to produce any of a range of voices as well as percussion and other accompaniments that respond to chords played by the left hand.
Even though the keyboard layout is simple and all notes are easily accessible, playing requires skill. A proficient player has undertaken much training to play accurately and in tempo. Beginners seldom produce a passable rendition of even a simple piece due to lack of technique. The sequences of movements of the player's hands can be very complicated. Problems include wide-spanned chords, which can be difficult for people with small hands, chords requiring unusual hand positions that can initially be uncomfortable, and fast scales, trills and arpeggios.
Playing instruments with "velocity sensitive" (or "dynamic") keyboards (i.e., that respond to varying playing velocity) may require finger independence, so that some fingers play "harder" while others play more softly. Pianists call this control of touch velocity "voicing" (not to be confused with a piano technician's "voicing" of a piano by modifying the hardness of the hammers). Keyboardists speak of playing harder and softer, or with more or less force. This may accurately describe the player's experience—but in the mechanics of the keyboard, velocity controls musical dynamics. The faster the player depresses the key, the louder the note. Players must learn to coordinate two hands and use them independently. Most music is written for two hands; typically the right hand plays the melody in the treble range, while the left plays an accompaniment of bass notes and chords in the bass range. Examples of music written for the left hand alone include several of Leopold Godowsky's 53 Studies on Chopin's Etudes, Maurice Ravel's Piano Concerto for the Left Hand and Sergei Prokofiev's Piano Concerto No. 4 for the left hand. In music that uses counterpoint technique, both hands play different melodies at the same time.
A number of percussion instruments—such as the xylophone, marimba, vibraphone, or glockenspiel— have pitched elements arranged in the keyboard layout. Rather than depress a key, the performer typically strikes each element (bell, metal or wood bar, etc.) with a mallet.
There are some examples of a musical keyboard layout used for non-musical devices. For example, some of the earliest printing telegraph machines used a layout similar to a piano keyboard.
There are some rare variations of keyboards with more or fewer than 12 keys per octave, mostly used in microtonal music, after the discoveries and theoretical developments of musician and inventor Julián Carrillo (1875–1965).
Some free-reed instrument keyboards such as accordions and Indian harmoniums include microtones. Electronic music pioneer Pauline Oliveros played one of these. Egyptian belly-dance musicians like Hassam Ramzy use custom-tuned accordions so they can play traditional scales. The small Garmon accordion played in the Music of Azerbaijan sometimes has keys that can play microtones when a "shift" key is pressed.
Musical tuning
In music, there are two common meanings for tuning:
Tuning is the process of adjusting the pitch of one or many tones from musical instruments to establish typical intervals between these tones. Tuning is usually based on a fixed reference, such as A = 440 Hz. The term "out of tune" refers to a pitch/tone that is either too high (sharp) or too low (flat) in relation to a given reference pitch. While an instrument might be in tune relative to its own range of notes, it may not be considered 'in tune' if it does not match the chosen reference pitch. Some instruments become 'out of tune' with temperature, humidity, damage, or simply time, and must be readjusted or repaired.
Different methods of sound production require different methods of adjustment:
The sounds of some instruments such as cymbals are "inharmonic"—they have irregular overtones not conforming to the harmonic series.
Tuning may be done aurally by sounding two pitches and adjusting one of them to match or relate to the other. A tuning fork or electronic tuning device may be used as a reference pitch, though in ensemble rehearsals often a piano is used (as its pitch cannot be adjusted for each performance). Symphony orchestras and concert bands usually tune to an A440 or a B♭, respectively, provided by the principal oboist or clarinetist, who tune to the keyboard if part of the performance.
When only strings are used, then the principal string (violinist) typically has sounded the tuning pitch, but some orchestras have used an electronic tone machine for tuning.
Interference beats are used to objectively measure the accuracy of tuning. As the two pitches approach a harmonic relationship, the frequency of beating decreases. When tuning a unison or octave it is desired to reduce the beating frequency until it cannot be detected. For other intervals, this is dependent on the tuning system being used.
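For two nearly matched pitches, the beating rate is simply the difference of their frequencies, which is why the beats slow down and vanish as the pitches converge. A minimal sketch:

```python
def beat_frequency(f1: float, f2: float) -> float:
    """Audible beats per second between two nearly matched pitches."""
    return abs(f1 - f2)

# A string at 438 Hz against a 440 Hz reference beats twice per second;
# the beating slows and disappears as the string is brought up to pitch.
print(beat_frequency(440.0, 438.0))  # -> 2.0
```

For non-unison intervals the same principle applies to coinciding harmonics rather than fundamentals, which is why the target beat rate depends on the tuning system in use.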
Harmonics may be used to facilitate tuning of strings that are not themselves tuned to the unison. For example, lightly touching the highest string of a cello at the middle (at a node) while bowing produces the same pitch as doing the same a third of the way down its second-highest string. The resulting unison is more easily and quickly judged than the quality of the perfect fifth between the fundamentals of the two strings.
In music, the term open string refers to the fundamental note of the unstopped, full string.
The strings of a guitar are normally tuned to fourths (excepting the G and B strings in standard tuning, which are tuned to a third), as are the strings of the bass guitar and double bass. Violin, viola, and cello strings are tuned to fifths. However, non-standard tunings (called scordatura) exist to change the sound of the instrument or create other playing options.
To tune an instrument, often only one reference pitch is given. This reference is used to tune one string, to which the other strings are tuned in the desired intervals. On a guitar, often the lowest string is tuned to an E. From this, each successive string can be tuned by fingering the fifth fret of an already tuned string and comparing it with the next higher string played open. This works with the exception of the G string, which must be stopped at the fourth fret to sound B against the open B string above. Alternatively, each string can be tuned to its own reference tone.
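The fifth-fret procedure above amounts to a chain of interval checks: each open string sits five semitones above the previous one, except the B string, which sits four semitones above the G string. A small sketch verifying those intervals for standard tuning (the note-to-semitone bookkeeping below is the usual convention, not taken from the text):

```python
# Standard guitar tuning, low to high, with the fret at which each string
# should match the next open string (semitone intervals: 5, 5, 5, 4, 5).
OPEN_STRINGS = ["E2", "A2", "D3", "G3", "B3", "E4"]
FRET_AGAINST_NEXT = [5, 5, 5, 4, 5]

NOTE_TO_SEMITONE = {"C": 0, "D": 2, "E": 4, "F": 5, "G": 7, "A": 9, "B": 11}

def semitone(note: str) -> int:
    """Absolute semitone index of a natural note written like 'E2'."""
    return NOTE_TO_SEMITONE[note[0]] + 12 * int(note[1])

for lower, upper, fret in zip(OPEN_STRINGS, OPEN_STRINGS[1:], FRET_AGAINST_NEXT):
    # The fretted lower string must sound the same pitch as the open upper string.
    assert semitone(lower) + fret == semitone(upper)
print("standard tuning intervals check out")
```

Each fret raises a string's pitch by one semitone, so fretting at the fifth (or fourth) fret reproduces the next string's open pitch exactly.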
Note that while the guitar and other modern stringed instruments with fixed frets are tuned in equal temperament, string instruments without frets, such as those of the violin family, are not. The violin, viola, and cello are tuned to beatless just perfect fifths, and ensembles such as string quartets and orchestras tend to play in fifths-based Pythagorean tuning, or to compensate and play in equal temperament when performing with instruments such as the piano. Because the just perfect fifth is about two cents wider than the equal-tempered fifth, tuning the cello in pure fifths down from A (A220) compounds that error across its lower strings, leaving its lowest string, C, about six cents flatter than the equal-tempered C.
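The "two cents" and "six cents" figures can be checked directly. A cent is 1/1200 of an octave on a logarithmic scale, so a frequency ratio r spans 1200·log2(r) cents:

```python
import math

def cents(ratio: float) -> float:
    """Size of a frequency ratio in cents (1200 cents per octave)."""
    return 1200 * math.log2(ratio)

just_fifth = cents(3 / 2)            # ~701.955 cents
per_fifth_error = just_fifth - 700   # equal-tempered fifth is exactly 700 cents
print(round(per_fifth_error, 3))     # -> 1.955 ("about two cents")

# Stacking pure fifths down from A to the cello's low C compounds the error:
print(round(3 * per_fifth_error, 1))  # -> 5.9 ("about six cents")
```

The logarithmic cent scale is what makes the errors additive: stacking intervals multiplies ratios, which adds their cent values.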
This table lists open strings on some common string instruments and their standard tunings from low to high unless otherwise noted.
Violin scordatura was employed in the 17th and 18th centuries by Italian and German composers, namely, Biagio Marini, Antonio Vivaldi, Heinrich Ignaz Franz Biber (who in the "Rosary Sonatas" prescribes a great variety of scordaturas, including crossing the middle strings), Johann Pachelbel and Johann Sebastian Bach, whose "Fifth Suite For Unaccompanied Cello" calls for the lowering of the A string to G. In Mozart's "Sinfonia Concertante" in E-flat major (K. 364), all the strings of the solo viola are raised one half-step, ostensibly to give the instrument a brighter tone so the solo violin does not overshadow it.
Scordatura for the violin was also used in the 19th and 20th centuries in works by Niccolò Paganini, Robert Schumann, Camille Saint-Saëns and Béla Bartók. In Saint-Saëns' "Danse Macabre", the high string of the violin is lowered half a tone to E♭ so that the most accented note of the main theme sounds on an open string. In Bartók's "Contrasts", the violin is tuned G♯-D-A-E♭ to facilitate the playing of tritones on open strings.
American folk violinists of the Appalachians and Ozarks often employ alternate tunings for dance songs and ballads. The most commonly used tuning is A-E-A-E. Likewise, banjo players in this tradition use many tunings to play melody in different keys. A common alternative banjo tuning for playing in D is A-D-A-D-E. Many folk guitar players also use tunings other than standard, such as D-A-D-G-A-D, which is very popular for Irish music.
A musical instrument that has had its pitch deliberately lowered during tuning is said to be "down-tuned" or "tuned down". Common examples include the electric guitar and electric bass in contemporary heavy metal music, whereby one or more strings are often tuned lower than concert pitch. This is not to be confused with electronically changing the fundamental frequency, which is referred to as pitch shifting.
Many percussion instruments are "tuned" by the player, including pitched percussion instruments such as timpani and tabla, and unpitched percussion instruments such as the snare drum.
Tuning pitched percussion follows the same patterns as tuning any other instrument, but tuning unpitched percussion does not produce a specific pitch. For this reason and others, the traditional terms "tuned percussion" and "untuned percussion" are avoided in recent organology.
A "tuning system" is the system used to define which tones, or pitches, to use when playing music. In other words, it is the choice of number and spacing of frequency values used.
Due to the psychoacoustic interaction of tones and timbres, various tone combinations sound more or less "natural" in combination with various timbres. For example, using harmonic timbres:
More complex musical effects can be created through other relationships.
The creation of a tuning system is complicated because musicians want to make music with more than just a few differing tones. As the number of tones is increased, conflicts arise in how each tone combines with every other. Finding a successful combination of tunings has been the cause of debate, and has led to the creation of many different tuning systems across the world. Each tuning system has its own characteristics, strengths and weaknesses.
It is impossible to tune the twelve-note chromatic scale so that all intervals are pure. For instance, three pure major thirds stack up to 125/64, which at 1159 cents is nearly a quarter tone away from the octave (1200 cents). So there is no way to have both the octave and the major third in just intonation for all the intervals in the same twelve-tone system. Similar issues arise with the fifth 3/2, and the minor third 6/5 or any other choice of harmonic-series based pure intervals.
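The arithmetic behind this mismatch is easy to check. The short Python sketch below (illustrative, not part of the original text) stacks three pure major thirds (5/4) and measures the gap from the octave in cents; for comparison it also computes the analogous gap for twelve pure fifths against seven octaves, the Pythagorean comma.

```python
import math

def cents(ratio: float) -> float:
    """Size of a frequency ratio in cents (1200 cents = one octave)."""
    return 1200 * math.log2(ratio)

# Three pure major thirds (5/4) stacked: 125/64, about 41 cents short of the octave.
three_thirds = (5 / 4) ** 3
print(round(cents(three_thirds), 1))  # 1158.9

# Twelve pure fifths (3/2) vs. seven octaves: the Pythagorean comma.
twelve_fifths = (3 / 2) ** 12
seven_octaves = 2 ** 7
print(round(cents(twelve_fifths / seven_octaves), 1))  # 23.5
```

The same helper can be used to quantify how far any just interval drifts from its nearest equal-tempered counterpart.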
Many different compromise methods are used to deal with this, each with its own characteristics, and advantages and disadvantages.
The main ones are just intonation, Pythagorean tuning, meantone temperament, well temperament, and equal temperament.
Tuning systems that are not produced with exclusively just intervals are usually referred to as "temperaments".
|
https://en.wikipedia.org/wiki?curid=20817
|
Miller–Urey experiment
The Miller–Urey experiment (or Miller experiment) was a chemical experiment that simulated the conditions thought at the time (1952) to be present on the early Earth and tested the chemical origin of life under those conditions. The experiment at the time supported Alexander Oparin's and J. B. S. Haldane's hypothesis that putative conditions on the primitive Earth favoured chemical reactions that synthesized more complex organic compounds from simpler inorganic precursors. Considered to be the classic experiment investigating abiogenesis, it was conducted in 1952 by Stanley Miller, with assistance from Harold Urey, at the University of Chicago and later the University of California, San Diego and published the following year.
After Miller's death in 2007, scientists examining sealed vials preserved from the original experiments were able to show that there were actually well over 20 different amino acids produced in Miller's original experiments. That is considerably more than what Miller originally reported, and more than the 20 that naturally occur in life. More recent evidence suggests that Earth's original atmosphere might have had a composition different from the gas used in the Miller experiment, but prebiotic experiments continue to produce racemic mixtures of simple-to-complex compounds under varying conditions.
The experiment used water (H2O), methane (CH4), ammonia (NH3), and hydrogen (H2). The chemicals were all sealed inside a sterile 5-liter glass flask, fitted with a pair of electrodes, connected to a 500 ml flask half-full of water. The water in the smaller flask was heated to induce evaporation, and the water vapour was allowed to enter the larger flask. Continuous electrical sparks were fired between the electrodes to simulate lightning in the water vapour and gaseous mixture, and then the simulated atmosphere was cooled again so that the water condensed and trickled into a U-shaped trap at the bottom of the apparatus.
After a day, the solution collected at the trap had turned pink in colour, and after a week of continuous operation the solution was deep red and turbid. The boiling flask was then removed, and mercuric chloride was added to prevent microbial contamination. The reaction was stopped by adding barium hydroxide and sulfuric acid, and evaporated to remove impurities. Using paper chromatography, Miller identified five amino acids present in the solution: glycine, α-alanine and β-alanine were positively identified, while aspartic acid and α-aminobutyric acid (AABA) were less certain, due to the spots being faint.
In a 1996 interview, Stanley Miller recollected his lifelong experiments following his original work and stated: "Just turning on the spark in a basic pre-biotic experiment will yield 11 out of 20 amino acids."
In 2017, the original experiment remained under the care of Miller and Urey's former student Jeffrey Bada, a professor at the UCSD Scripps Institution of Oceanography. The apparatus used to conduct the experiment was on display at the Denver Museum of Nature and Science.
One-step reactions among the mixture components can produce hydrogen cyanide (HCN), formaldehyde (CH2O), and other active intermediate compounds (acetylene, cyanoacetylene, etc.):
CO2 → CO + [O] (atomic oxygen)
CH4 + 2[O] → CH2O + H2O
CO + NH3 → HCN + H2O
CH4 + NH3 → HCN + 3H2
The formaldehyde, ammonia, and HCN then react by Strecker synthesis to form amino acids and other biomolecules:
CH2O + HCN + NH3 → NH2-CH2-CN + H2O (aminoacetonitrile)
NH2-CH2-CN + 2H2O → NH3 + NH2-CH2-COOH (glycine)
Furthermore, water and formaldehyde can react, via Butlerov's reaction to produce various sugars like ribose.
The experiments showed that simple organic compounds of building blocks of proteins and other macromolecules can be formed from gases with the addition of energy.
This experiment inspired many others. In 1961, Joan Oró found that the nucleotide base adenine could be made from hydrogen cyanide (HCN) and ammonia in a water solution. His experiment produced a large amount of adenine, the molecules of which were formed from 5 molecules of HCN.
Also, many amino acids are formed from HCN and ammonia under these conditions.
Experiments conducted later showed that the other RNA and DNA nucleobases could be obtained through simulated prebiotic chemistry with a reducing atmosphere.
There also had been similar electric discharge experiments related to the origin of life contemporaneous with Miller–Urey. An article in "The New York Times" (March 8, 1953:E9), titled "Looking Back Two Billion Years" describes the work of Wollman (William) M. MacNevin at The Ohio State University, before the Miller "Science" paper was published in May 1953. MacNevin was passing 100,000 volt sparks through methane and water vapor and produced "resinous solids" that were "too complex for analysis." The article describes other early earth experiments being done by MacNevin. It is not clear if he ever published any of these results in the primary scientific literature.
K. A. Wilde submitted a paper to "Science" on December 15, 1952, before Miller submitted his paper to the same journal on February 10, 1953. Wilde's paper was published on July 10, 1953. Wilde used voltages up to only 600 V on a binary mixture of carbon dioxide (CO2) and water in a flow system. He observed only small amounts of carbon dioxide reduction to carbon monoxide, and no other significant reduction products or newly formed carbon compounds.
Other researchers were studying UV-photolysis of water vapor with carbon monoxide. They found that various alcohols, aldehydes, and organic acids were synthesized in the reaction mixture.
More recent experiments by chemists Jeffrey Bada, one of Miller's graduate students, and Jim Cleaves at Scripps Institution of Oceanography of the University of California, San Diego were similar to those performed by Miller. However, Bada noted that in current models of early Earth conditions, carbon dioxide and nitrogen (N2) create nitrites, which destroy amino acids as fast as they form. When Bada performed the Miller-type experiment with the addition of iron and carbonate minerals, the products were rich in amino acids. This suggests the origin of significant amounts of amino acids may have occurred on Earth even with an atmosphere containing carbon dioxide and nitrogen.
Some evidence suggests that Earth's original atmosphere might have contained fewer of the reducing molecules than was thought at the time of the Miller–Urey experiment. There is abundant evidence of major volcanic eruptions 4 billion years ago, which would have released carbon dioxide, nitrogen, hydrogen sulfide (H2S), and sulfur dioxide (SO2) into the atmosphere. Experiments using these gases in addition to the ones in the original Miller–Urey experiment have produced more diverse molecules. The experiment created a mixture that was racemic (containing both L and D enantiomers) and experiments since have shown that "in the lab the two versions are equally likely to appear"; however, in nature, L amino acids dominate. Later experiments have confirmed disproportionate amounts of L or D oriented enantiomers are possible.
Originally it was thought that the primitive secondary atmosphere contained mostly ammonia and methane. However, it is likely that most of the atmospheric carbon was CO2 with perhaps some CO and the nitrogen mostly N2. In practice gas mixtures containing CO, CO2, N2, etc. give much the same products as those containing CH4 and NH3 so long as there is no O2. The hydrogen atoms come mostly from water vapor. In fact, in order to generate aromatic amino acids under primitive earth conditions it is necessary to use less hydrogen-rich gaseous mixtures. Most of the natural amino acids, hydroxyacids, purines, pyrimidines, and sugars have been made in variants of the Miller experiment.
More recent results may question these conclusions. The University of Waterloo and University of Colorado conducted simulations in 2005 that indicated that the early atmosphere of Earth could have contained up to 40 percent hydrogen—implying a much more hospitable environment for the formation of prebiotic organic molecules. The escape of hydrogen from Earth's atmosphere into space may have occurred at only one percent of the rate previously believed, based on revised estimates of the upper atmosphere's temperature. One of the authors, Owen Toon, notes: "In this new scenario, organics can be produced efficiently in the early atmosphere, leading us back to the organic-rich soup-in-the-ocean concept... I think this study makes the experiments by Miller and others relevant again." Outgassing calculations using a chondritic model for the early earth complement the Waterloo/Colorado results in re-establishing the importance of the Miller–Urey experiment.
In contrast to the general notion of early earth's reducing atmosphere, researchers at the Rensselaer Polytechnic Institute in New York reported the possibility of oxygen available around 4.3 billion years ago. Their study reported in 2011 on the assessment of Hadean zircons from the earth's interior (magma) indicated the presence of oxygen traces similar to modern-day lavas. This study suggests that oxygen could have been released in the earth's atmosphere earlier than generally believed.
Conditions similar to those of the Miller–Urey experiments are present in other regions of the solar system, often substituting ultraviolet light for lightning as the energy source for chemical reactions. The Murchison meteorite that fell near Murchison, Victoria, Australia in 1969 was found to contain over 90 different amino acids, nineteen of which are found in Earth life. Comets and other icy outer-solar-system bodies are thought to contain large amounts of complex carbon compounds (such as tholins) formed by these processes, darkening surfaces of these bodies. The early Earth was bombarded heavily by comets, possibly providing a large supply of complex organic molecules along with the water and other volatiles they contributed. This has been used to infer an origin of life outside of Earth: the panspermia hypothesis.
In recent years, studies have been made of the amino acid composition of the products of "old" areas in "old" genes, defined as those that are found to be common to organisms from several widely separated species, assumed to share only the last universal ancestor (LUA) of all extant species. These studies found that the products of these areas are enriched in those amino acids that are also most readily produced in the Miller–Urey experiment. This suggests that the original genetic code was based on a smaller number of amino acids – only those available in prebiotic nature – than the current one.
Jeffrey Bada, himself Miller's student, inherited the original equipment from the experiment when Miller died in 2007. Examination of sealed vials from the original experiment showed that, although the experiment was a success, Miller was never able to determine its full extent with the equipment available to him. Later researchers have been able to isolate even more different amino acids, 25 altogether. Bada has estimated that more accurate measurements could easily bring out 30 or 40 more amino acids in very low concentrations, but the researchers have since discontinued the testing. Miller's experiment was therefore a remarkable success at synthesizing complex organic molecules from simpler chemicals, considering that all known life uses just 20 different amino acids.
In 2008, a group of scientists examined 11 vials left over from Miller's experiments of the early 1950s. In addition to the classic experiment, reminiscent of Charles Darwin's envisioned "warm little pond", Miller had also performed more experiments, including one with conditions similar to those of volcanic eruptions. This experiment had a nozzle spraying a jet of steam at the spark discharge. By using high-performance liquid chromatography and mass spectrometry, the group found more organic molecules than Miller had. They found that the volcano-like experiment had produced the most organic molecules, 22 amino acids, 5 amines and many hydroxylated molecules, which could have been formed by hydroxyl radicals produced by the electrified steam. The group suggested that volcanic island systems became rich in organic molecules in this way, and that the presence of carbonyl sulfide there could have helped these molecules form peptides.
The main problem of theories based around amino acids is the difficulty in obtaining spontaneous formation of peptides. Since John Desmond Bernal's suggestion that clay surfaces could have played a role in abiogenesis, scientific efforts have been dedicated to investigating clay-mediated peptide bond formation, with limited success. The peptides that formed remained over-protected and showed no evidence of inheritance or metabolism. In December 2017, a theoretical model developed by Erastova and collaborators suggested that peptides could form at the interlayers of layered double hydroxides such as green rust in early earth conditions. According to the model, drying of the intercalated layered material should provide the energy and co-alignment required for peptide bond formation in a ribosome-like fashion, while re-wetting should allow mobilising the newly formed peptides and repopulating the interlayer with new amino acids. This mechanism is expected to lead to the formation of peptides 12 or more amino acids long within 15-20 washes. The researchers also observed slightly different adsorption preferences for different amino acids, and postulated that, if coupled to a diluted solution of mixed amino acids, such preferences could lead to sequencing.
In October 2018, researchers at McMaster University on behalf of the Origins Institute announced the development of a new technology, called a "Planet Simulator", to help study the origin of life on planet Earth and beyond.
Below is a table of amino acids produced and identified in the "classic" 1952 experiment, as published by Miller in 1953, the 2008 re-analysis of vials from the volcanic spark discharge experiment, and the 2010 re-analysis of vials from the H2S-rich spark discharge experiment.
|
https://en.wikipedia.org/wiki?curid=20821
|
Majority function
In Boolean logic, the majority function (also called the median operator) is a function from "n" inputs to one output. The value of the operation is false when "n"/2 or more arguments are false, and true otherwise.
Alternatively, representing true values as 1 and false values as 0, we may use the formula
Majority("p"1, ..., "p""n") = ⌊1/2 + (("p"1 + ... + "p""n") − 1/2) / "n"⌋.
The "−1/2" in the formula serves to break ties in favor of zeros when "n" is even. If the term "−1/2" is omitted, the formula can be used for a function that breaks ties in favor of ones.
Most applications deliberately force an odd number of inputs so they don't have to deal with the question of what happens when exactly half the inputs are 0 and exactly half the inputs are 1. The few systems that calculate the majority function on an even number of inputs are often biased towards "0": they produce "0" when exactly half the inputs are 0; for example, a 4-input majority gate outputs 0 only when two or more 0's appear at its inputs. In a few systems, the tie can be broken randomly.
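As an illustrative sketch (not from the original text), both tie-breaking conventions described above can be expressed with the floor formula in a few lines of Python; exact rational arithmetic via `fractions.Fraction` avoids floating-point boundary effects at exact ties.

```python
import math
from fractions import Fraction

def majority(bits, ties_to_one=False):
    """Majority of a sequence of 0/1 values.

    With ties_to_one=False the -1/2 correction is applied, so an exact
    tie on an even number of inputs yields 0; with ties_to_one=True the
    correction is dropped and ties yield 1.
    """
    n = len(bits)
    offset = Fraction(0) if ties_to_one else Fraction(1, 2)
    return math.floor(Fraction(1, 2) + (sum(bits) - offset) / n)

# A 4-input gate biased towards 0: a 2-2 tie gives 0.
print(majority([1, 1, 0, 0]))                    # 0
print(majority([1, 1, 0, 0], ties_to_one=True))  # 1
print(majority([0, 1, 1]))                       # 1
```

With an odd number of inputs the two conventions agree, which is one reason odd input counts are preferred in practice.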
A "majority gate" is a logical gate used in circuit complexity and other applications of Boolean circuits. A majority gate returns true if and only if more than 50% of its inputs are true.
For instance, in a full adder, the carry output is found by applying a majority function to the three inputs, although frequently this part of the adder is broken down into several simpler logical gates.
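A minimal sketch of that relationship (illustrative Python, with hypothetical function names): the carry-out of a one-bit full adder is exactly the three-input majority, while the sum bit is the XOR of the inputs.

```python
def majority3(a, b, c):
    # True (1) when at least two of the three bits are 1.
    return (a & b) | (b & c) | (c & a)

def full_adder(a, b, carry_in):
    """One-bit full adder built from XOR and the majority function."""
    total = a ^ b ^ carry_in              # sum bit
    carry_out = majority3(a, b, carry_in)
    return total, carry_out

# Exhaustive check against ordinary integer addition:
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            s, cout = full_adder(a, b, c)
            assert 2 * cout + s == a + b + c
```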
Many systems have triple modular redundancy; they use the majority function for majority logic decoding to implement error correction.
A major result in circuit complexity asserts that the majority function cannot be computed by AC0 circuits of subexponential size.
For any "x", "y", and "z", the ternary median operator 〈"x", "y", "z"〉 satisfies the following equations:
〈"x", "y", "y"〉 = "y"
〈"x", "y", "z"〉 = 〈"z", "x", "y"〉
〈"x", "y", "z"〉 = 〈"x", "z", "y"〉
〈〈"x", "w", "y"〉, "w", "z"〉 = 〈"x", "w", 〈"y", "w", "z"〉〉
An abstract system satisfying these as axioms is a median algebra.
For "n" = 1 the median operator is just the unary identity operation "x". For "n" = 3 the ternary median operator can be expressed using conjunction and disjunction as "xy" + "yz" + "zx". Remarkably, this expression denotes the same operation independently of whether the symbol + is interpreted as inclusive or or exclusive or.
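This equivalence is small enough to verify exhaustively. The sketch below (illustrative Python) evaluates "xy" + "yz" + "zx" with + read as inclusive or and as exclusive or, over all eight Boolean triples:

```python
from itertools import product

def med_or(x, y, z):
    # '+' read as inclusive or
    return (x & y) | (y & z) | (z & x)

def med_xor(x, y, z):
    # '+' read as exclusive or
    return (x & y) ^ (y & z) ^ (z & x)

# The two readings agree on every Boolean triple.
assert all(med_or(*t) == med_xor(*t) for t in product((0, 1), repeat=3))
```

The agreement is explained by noting that exactly two of the three products can never be 1 at once: if "xy" = 1 and "yz" = 1 then all three variables are 1, forcing "zx" = 1 as well, so inclusive and exclusive or never get a chance to disagree.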
For an arbitrary "n" there exists a monotone formula for majority of size O("n"^5.3). This is proved using the probabilistic method, so the proof is non-constructive.
Approaches exist for an explicit formula for majority of polynomial size:
|
https://en.wikipedia.org/wiki?curid=20823
|
Modula
The Modula programming language is a descendant of the Pascal language. It was developed in Switzerland, at ETH Zurich, in the mid-1970s by Niklaus Wirth, the same person who designed Pascal. The main innovation of Modula over Pascal is a module system, used for grouping sets of related declarations into program units; hence the name "Modula". The language is defined in a report by Wirth called "Modula. A language for modular multiprogramming", published in 1976.
Modula was first implemented by Wirth on a PDP-11. Very soon, other implementations followed, most importantly, the compilers developed for University of York Modula, and one at Philips Laboratories named PL Modula, which generated code for the LSI-11 microprocessor.
The development of Modula was discontinued soon after its publication. Wirth then concentrated his efforts on Modula's successor, Modula-2.
|
https://en.wikipedia.org/wiki?curid=20824
|
Monolithic kernel
A monolithic kernel is an operating system architecture where the entire operating system runs in kernel space. The monolithic model differs from other operating system architectures (such as the microkernel architecture) in that it alone defines a high-level virtual interface over computer hardware. A set of primitives or system calls implements all operating system services such as process management, concurrency, and memory management. Device drivers can be added to the kernel as modules.
Modular operating systems such as OS-9 and most modern monolithic operating systems such as OpenVMS, Linux, BSD, SunOS, AIX, and MULTICS can dynamically load (and unload) executable modules at runtime.
This modularity of the operating system is at the binary (image) level and not at the architecture level. Modular monolithic operating systems are not to be confused with the architectural level of modularity inherent in server-client operating systems (and its derivatives sometimes marketed as hybrid kernel) which use microkernels and servers (not to be mistaken for modules or daemons).
Practically speaking, dynamically loading modules is simply a more flexible way of handling the operating system image at runtime—as opposed to rebooting with a different operating system image. The modules allow easy extension of the operating systems' capabilities as required. Dynamically loadable modules incur a small overhead when compared to building the module into the operating system image.
However, in some cases, loading modules dynamically (as-needed) helps to keep the amount of code running in kernel space to a minimum; for example, to minimize operating system footprint for embedded devices or those with limited hardware resources. Namely, an unloaded module need not be stored in scarce random access memory.
|
https://en.wikipedia.org/wiki?curid=20825
|
Mithraism
Mithraism, also known as the Mithraic mysteries, was a Roman mystery religion centered on the god Mithras. The religion was inspired by Iranian worship of the Zoroastrian Angelic Divinity ("yazata") Mithra, though the Greek "Mithras" was linked to a new and distinctive imagery, and the level of continuity between Persian and Greco-Roman practice is debated. The mysteries were popular among the Roman military from about the 1st to the 4th century CE.
Worshippers of Mithras had a complex system of seven grades of initiation and communal ritual meals. Initiates called themselves "syndexioi", those "united by the handshake". They met in underground temples, now called "mithraea" (singular "mithraeum"), which survive in large numbers. The cult appears to have had its centre in Rome, and was popular throughout the western half of the empire, as far south as Roman Africa and Numidia, as far north as Roman Britain, and to a lesser extent in Roman Syria in the east.
Mithraism is viewed as a rival of early Christianity. In the 4th century, Mithraists faced persecution from Christians and the religion was subsequently suppressed and eliminated in the empire by the end of the century.
Numerous archaeological finds, including meeting places, monuments and artifacts, have contributed to modern knowledge about Mithraism throughout the Roman Empire. The iconic scenes of Mithras show him being born from a rock, slaughtering a bull, and sharing a banquet with the god Sol (the Sun). About 420 sites have yielded materials related to the cult. Among the items found are about 1000 inscriptions, 700 examples of the bull-killing scene (tauroctony), and about 400 other monuments. It has been estimated that there would have been at least 680 mithraea in Rome. No written narratives or theology from the religion survive; limited information can be derived from the inscriptions and brief or passing references in Greek and Latin literature. Interpretation of the physical evidence remains problematic and contested.
The term "Mithraism" is a modern convention. Writers of the Roman era referred to it by phrases such as "Mithraic mysteries", "mysteries of Mithras" or "mysteries of the Persians". Modern sources sometimes refer to the Greco-Roman religion as "Roman Mithraism" or "Western Mithraism" to distinguish it from Persian worship of Mithra.
The name "Mithras" (Latin, equivalent to Greek ""Μίθρας"") is a form of Mithra, the name of an Old Persian god – a relationship understood by Mithraic scholars since the days of Franz Cumont. An early example of the Greek form of the name is in a 4th-century BCE work by Xenophon, the Cyropaedia, which is a biography of the Persian king Cyrus the Great.
The exact form of a Latin or classical Greek word varies due to the grammatical process of declension. There is archaeological evidence that in Latin worshippers wrote the nominative form of the god's name as "Mithras". However, in Porphyry's Greek text "De Abstinentia" ("Περὶ ἀποχῆς ἐμψύχων"), there is a reference to the now-lost histories of the Mithraic mysteries by Euboulus and Pallas, the wording of which suggests that these authors treated the name "Mithra" as an indeclinable foreign word.
Related deity-names in other languages include Iranian "Mithra" and Sanskrit "Mitra", both believed to come from an Indo-Iranian word "mitra" meaning "contract" / "agreement" / "covenant".
Modern historians have different conceptions about whether these names refer to the same god or not. John R. Hinnells has written of Mitra / Mithra / Mithras as a single deity worshipped in several different religions. On the other hand, David Ulansey considers the bull-slaying Mithras to be a new god who began to be worshipped in the 1stcentury BCE, and to whom an old name was applied.
Mary Boyce, a researcher of ancient Iranian religions, writes that even though Roman Empire Mithraism seems to have had less Iranian content than historians used to think, nonetheless "as the name "Mithras" alone shows, this content was of some importance".
Much about the cult of Mithras is only known from reliefs and sculptures. There have been many attempts to interpret this material.
Mithras-worship in the Roman Empire was characterized by images of the god slaughtering a bull. Other images of Mithras are found in the Roman temples, for instance Mithras banqueting with Sol, and depictions of the birth of Mithras from a rock. But the image of bull-slaying (tauroctony) is always in the central niche. Textual sources for a reconstruction of the theology behind this iconography are very rare. (See section Interpretations of the bull-slaying scene below.)
The practice of depicting the god slaying a bull seems to be specific to Roman Mithraism. According to David Ulansey, this is "perhaps the most important example" of evident difference between Iranian and Roman traditions: "...there is no evidence that the Iranian god Mithra ever had anything to do with killing a bull."
In every mithraeum the centrepiece was a representation of Mithras killing a sacred bull, an act called the tauroctony. The image may be a relief, or free-standing, and side details may be present or omitted. The centre-piece is Mithras clothed in Anatolian costume and wearing a Phrygian cap; who is kneeling on the exhausted bull, holding it by the nostrils with his left hand, and stabbing it with his right. As he does so, he looks over his shoulder towards the figure of Sol. A dog and a snake reach up towards the blood. A scorpion seizes the bull's genitals. A raven is flying around or is sitting on the bull. Three ears of wheat are seen coming out from the bull's tail, sometimes from the wound. The bull was often white. The god is sitting on the bull in an unnatural way with his right leg constraining the bull's hoof and the left leg is bent and resting on the bull's back or flank. The two torch-bearers are on either side, dressed like Mithras, Cautes with his torch pointing up and Cautopates with his torch pointing down. An image search for tauroctony will show many examples of the variations. Sometimes Cautes and Cautopates carry shepherds' crooks instead of torches.
The event takes place in a cavern, into which Mithras has carried the bull, after having hunted it, ridden it and overwhelmed its strength. Sometimes the cavern is surrounded by a circle, on which the twelve signs of the zodiac appear. Outside the cavern, top left, is Sol the sun, with his flaming crown, often driving a quadriga. A ray of light often reaches down to touch Mithras. At the top right is Luna, with her crescent moon, who may be depicted driving a biga.
In some depictions, the central tauroctony is framed by a series of subsidiary scenes to the left, top and right, illustrating events in the Mithras narrative; Mithras being born from the rock, the water miracle, the hunting and riding of the bull, meeting Sol who kneels to him, shaking hands with Sol and sharing a meal of bull-parts with him, and ascending to the heavens in a chariot. In some instances, as is the case in the stucco icon at Santa Prisca Mithraeum in Rome, the god is shown heroically nude. Some of these reliefs were constructed so that they could be turned on an axis. On the back side was another, more elaborate feasting scene. This indicates that the bull killing scene was used in the first part of the celebration, then the relief was turned, and the second scene was used in the second part of the celebration. Besides the main cult icon, a number of mithraea had several secondary tauroctonies, and some small portable versions, probably meant for private devotion, have also been found.
The second most important scene after the tauroctony in Mithraic art is the so-called banquet scene. The banquet scene features Mithras and Sol Invictus banqueting on the hide of the slaughtered bull. On the specific banquet scene on the Fiano Romano relief, one of the torchbearers points a caduceus towards the base of an altar, where flames appear to spring up. Robert Turcan has argued that since the caduceus is an attribute of Mercury, and in mythology Mercury is depicted as a psychopomp, the eliciting of flames in this scene is referring to the dispatch of human souls and expressing the Mithraic doctrine on this matter. Turcan also connects this event to the tauroctony: The blood of the slain bull has soaked the ground at the base of the altar, and from the blood the souls are elicited in flames by the caduceus.
Mithras is depicted as being born from a rock. He is shown as emerging from a rock, already in his youth, with a dagger in one hand and a torch in the other. He is nude, standing with his legs together, and is wearing a Phrygian cap.
However, there are variations. Sometimes he is shown as coming out of the rock as a child, and in one instance he has a globe in one hand; sometimes a thunderbolt is seen. There are also depictions in which flames are shooting from the rock and also from Mithras' cap. One statue had its base perforated so that it could serve as a fountain, and the base of another has the mask of the water god. Sometimes Mithras also has other weapons such as bows and arrows, and there are also animals such as dogs, serpents, dolphins, eagles, other birds, lion, crocodiles, lobsters and snails around. On some reliefs, there is a bearded figure identified as Oceanus, the water god, and on some there are the gods of the four winds. In these reliefs, the four elements could be invoked together. Sometimes Victoria, Luna, Sol, and Saturn also seem to play a role. Saturn in particular is often seen handing over the dagger or short sword to Mithras, used later in the tauroctony.
In some depictions, Cautes and Cautopates are also present; sometimes they are depicted as shepherds.
On some occasions, an amphora is seen, and a few instances show variations like an egg birth or a tree birth. Some interpretations show that the birth of Mithras was celebrated by lighting torches or candles.
One of the most characteristic and poorly understood features of the Mysteries is the naked lion-headed figure often found in Mithraic temples, named by modern scholars with descriptive terms such as "leontocephaline" (lion-headed) or "leontocephalus" (lion-head).
His body is a naked man's, entwined by a serpent (or two serpents, like a caduceus), with the snake's head often resting on the lion's head. The lion's mouth is often open, giving a horrifying impression. He is usually represented as having four wings, two keys (sometimes a single key), and a sceptre in his hand. Sometimes the figure is standing on a globe inscribed with a diagonal cross. On the figure from the Ostia Antica Mithraeum (left, CIMRM 312), the four wings carry the symbols of the four seasons, and a thunderbolt is engraved on his chest. At the base of the statue are the hammer and tongs of Vulcan and Mercury's cock and wand (caduceus). A rare variation of the same figure is also found with a human head and a lion's head emerging from its chest.
Although animal-headed figures are prevalent in contemporary Egyptian and Gnostic mythological representations, no exact parallel to the Mithraic leontocephaline figure has been found.
Based on dedicatory inscriptions for altars, the name of the figure is conjectured to be "Arimanius", a Latinized form of the name "Ahriman" – a demonic figure in the Zoroastrian pantheon. Arimanius is known from inscriptions to have been a god in the Mithraic cult as seen, for example, in images from the "Corpus Inscriptionum et Monumentorum Religionis Mithriacae" (CIMRM) such as CIMRM 222 from Ostia, CIMRM 369 from Rome, and CIMRM 1773 and 1775 from Pannonia.
Some scholars identify the lion-man as Aion, or Zurvan, or Cronus, or Chronos, while others assert that it is a version of the Zoroastrian Ahriman or Vedic Aryaman.
There is also speculation that the figure is the Gnostic demiurge, Ialdabaoth. Although the exact identity of the lion-headed figure is debated by scholars, it is largely agreed that the god is associated with time and seasonal change.
However, the occultist D. Jason Cooper speculates to the contrary that the lion-headed figure is not a god, but rather represents the spiritual state achieved in Mithraism's "adept" level, the Leo (lion) degree.
According to M. J. Vermaseren and C. C. van Essen, the Mithraic New Year and the birthday of Mithras was on December 25. However, Beck disagrees strongly. Clauss states: "the Mithraic Mysteries had no public ceremonies of its own. The festival of "Natalis Invicti", held on 25 December, was a general festival of the Sun, and by no means specific to the Mysteries of Mithras."
Mithraic initiates were required to swear an oath of secrecy and dedication, and some grade rituals involved the recital of a catechism, wherein the initiate was asked a series of questions pertaining to the initiation symbolism and had to reply with specific answers. An example of such a catechism, apparently pertaining to the Leo grade, was discovered in a fragmentary Egyptian papyrus (P.Berolinensis 21196), and reads:
Almost no Mithraic scripture or first-hand account of its highly secret rituals survives, with the exception of the aforementioned oath and catechism and the document known as the Mithras Liturgy, from 4th-century Egypt, whose status as a Mithraist text has been questioned by scholars including Franz Cumont. The walls of mithraea were commonly whitewashed, and where this survives it tends to carry extensive repositories of graffiti; these, together with inscriptions on Mithraic monuments, form the main source for Mithraic texts.
Nevertheless, it is clear from the archaeology of numerous mithraea that most rituals were associated with feasting – as eating utensils and food residues are almost invariably found. These tend to include both animal bones and also very large quantities of fruit residues. The presence of large amounts of cherry-stones in particular would tend to confirm mid-summer (late June, early July) as a season especially associated with Mithraic festivities. The Virunum "album", in the form of an inscribed bronze plaque, records a Mithraic festival of commemoration as taking place on 26 June 184. Beck argues that religious celebrations on this date are indicative of special significance being given to the summer solstice; but this time of the year coincides with ancient recognition of the solar maximum at midsummer, whilst iconographically identical holidays such as Litha, St John's Eve, and Jāņi are observed also.
For their feasts, Mithraic initiates reclined on stone benches arranged along the longer sides of the mithraeum; typically there might be room for 15 to 30 diners, but very rarely many more than 40 men. Counterpart dining rooms, or "triclinia", were to be found above ground in the precincts of almost any temple or religious sanctuary in the Roman empire, and such rooms were commonly used for their regular feasts by Roman 'clubs', or "collegia". Mithraic feasts probably performed a very similar function for Mithraists as the "collegia" did for those entitled to join them; indeed, since qualification for Roman "collegia" tended to be restricted to particular families, localities or traditional trades, Mithraism may have functioned in part as providing clubs for the unclubbed. However, the size of the mithraeum is not necessarily an indication of the size of the congregation.
Each mithraeum had several altars at the further end, underneath the representation of the tauroctony, and also commonly contained considerable numbers of subsidiary altars, both in the main mithraeum chamber and in the ante-chamber or narthex. These altars, which are of the standard Roman pattern, each carry a named dedicatory inscription from a particular initiate, who dedicated the altar to Mithras "in fulfillment of his vow", in gratitude for favours received. Burned residues of animal entrails are commonly found on the main altars indicating regular sacrificial use. However, mithraea do not commonly appear to have been provided with facilities for ritual slaughter of sacrificial animals (a highly specialised function in Roman religion), and it may be presumed that a mithraeum would have made arrangements for this service to be provided for them in co-operation with the professional "victimarius" of the civic cult. Prayers were addressed to the Sun three times a day, and Sunday was especially sacred.
It is doubtful whether Mithraism had a monolithic and internally consistent doctrine. It may have varied from location to location. However, the iconography is relatively coherent. It had no predominant sanctuary or cultic centre; and, although each mithraeum had its own officers and functionaries, there was no central supervisory authority. In some mithraea, such as that at Dura Europos, wall paintings depict prophets carrying scrolls, but no named Mithraic sages are known, nor does any reference give the title of any Mithraic scripture or teaching. It is known that initiates could transfer with their grades from one Mithraeum to another.
Temples of Mithras are sunk below ground, windowless, and very distinctive. In cities, the basement of an apartment block might be converted; elsewhere they might be excavated and vaulted over, or converted from a natural cave. Mithraic temples are common in the empire; although unevenly distributed, with considerable numbers found in Rome, Ostia, Numidia, Dalmatia, Britain and along the Rhine/Danube frontier, while being somewhat less common in Greece, Egypt, and Syria. According to Walter Burkert, the secret character of Mithraic rituals meant that Mithraism could only be practiced within a Mithraeum. Some new finds at Tienen show evidence of large-scale feasting and suggest that the mystery religion may not have been as secretive as was generally believed.
For the most part, mithraea tend to be small, externally undistinguished, and cheaply constructed; the cult generally preferred to create a new centre rather than expand an existing one. The mithraeum represented the cave to which Mithras carried and then killed the bull; and where stone vaulting could not be afforded, the effect would be imitated with lath and plaster. They are commonly located close to springs or streams; fresh water appears to have been required for some Mithraic rituals, and a basin is often incorporated into the structure. There is usually a narthex or ante-chamber at the entrance, and often other ancillary rooms for storage and the preparation of food. The extant mithraea present us with actual physical remains of the architectural structures of the sacred spaces of the Mithraic cult. Mithraeum is a modern coinage; Mithraists referred to their sacred structures as "speleum" or "antrum" (cave), "crypta" (underground hallway or corridor), "fanum" (sacred or holy place), or even "templum" (a temple or a sacred space).
In their basic form, mithraea were entirely different from the temples and shrines of other cults. In the standard pattern of Roman religious precincts, the temple building functioned as a house for the god, who was intended to be able to view, through the opened doors and columnar portico, sacrificial worship being offered on an altar set in an open courtyard—potentially accessible not only to initiates of the cult, but also to "colitores" or non-initiated worshippers. Mithraea were the antithesis of this.
In the Suda under the entry "Mithras", it states that “No one was permitted to be initiated into them (the mysteries of Mithras), until he should show himself holy and steadfast by undergoing several graduated tests.” Gregory Nazianzen refers to the “tests in the mysteries of Mithras”.
There were seven grades of initiation into Mithraism, which are listed by St. Jerome. Manfred Clauss states that the number of grades, seven, must be connected to the planets. A mosaic in the Mithraeum of Felicissimus, Ostia Antica depicts these grades, with symbolic emblems that are connected either to the grades or are symbols of the planets. The grades also have an inscription beside them commending each grade into the protection of the different planetary gods. In ascending order of importance, the initiatory grades were:
Elsewhere, as at Dura-Europos, Mithraic graffiti survive giving membership lists, in which initiates of a mithraeum are named with their Mithraic grades. At Virunum, the membership list or "album sacratorum" was maintained as an inscribed plaque, updated year by year as new members were initiated. By cross-referencing these lists it is possible to track some initiates from one mithraeum to another; and also speculatively to identify Mithraic initiates with persons on other contemporary lists such as military service rolls and lists of devotees of non-Mithraic religious sanctuaries. Names of initiates are also found in the dedication inscriptions of altars and other cult objects. Clauss noted in 1990 that overall, only about 14% of Mithraic names inscribed before 250 CE identify the initiate's grade – and hence questioned the traditional view that all initiates belonged to one of the seven grades. Clauss argues that the grades represented a distinct class of priests, "sacerdotes". Gordon maintains the former theory of Merkelbach and others, especially noting such examples as Dura where all names are associated with a Mithraic grade. Some scholars maintain that practice may have differed over time, or from one Mithraeum to another.
The highest grade, "pater", is by far the most common one found on dedications and inscriptions – and it would appear not to have been unusual for a mithraeum to have several men with this grade. The form "pater patrum" (father of fathers) is often found, which appears to indicate the "pater" with primary status. There are several examples of persons, commonly those of higher social status, joining a mithraeum with the status "pater" – especially in Rome during the 'pagan revival' of the 4th century. It has been suggested that some mithraea may have awarded honorary "pater" status to sympathetic dignitaries.
The initiate into each grade appears to have been required to undertake a specific ordeal or test, involving exposure to heat, cold or threatened peril. An 'ordeal pit', dating to the early 3rd century, has been identified in the mithraeum at Carrawburgh. Accounts of the cruelty of the emperor Commodus describe his amusing himself by enacting Mithraic initiation ordeals in homicidal form. By the later 3rd century, the enacted trials appear to have been abated in rigor, as 'ordeal pits' were floored over.
Admission into the community was completed with a handshake with the "pater", just as Mithras and Sol shook hands. The initiates were thus referred to as "syndexioi" (those united by the handshake). The term is used in an inscription by Proficentius and derided by Firmicus Maternus in "De errore profanarum religionum", a 4th-century Christian work attacking paganism. In ancient Iran, taking the right hand was the traditional way of concluding a treaty or signifying some solemn understanding between two parties.
Activities of the most prominent deities in Mithraic scenes, Sol and Mithras, were imitated in rituals by the two most senior officers in the cult's hierarchy, the "Pater" and the "Heliodromus". The initiates held a sacramental banquet, replicating the feast of Mithras and Sol.
Reliefs on a cup found in Mainz, appear to depict a Mithraic initiation. On the cup, the initiate is depicted as being led into a location where a "Pater" would be seated in the guise of Mithras with a drawn bow. Accompanying the initiate is a mystagogue, who explains the symbolism and theology to the initiate. The Rite is thought to re-enact what has come to be called the ‘Water Miracle’, in which Mithras fires a bolt into a rock, and from the rock now spouts water.
Roger Beck has hypothesized a third processional Mithraic ritual, based on the Mainz cup and on Porphyry. This scene, called ‘Procession of the Sun-Runner’, shows the "Heliodromus" escorted by two figures representing Cautes and Cautopates (see below) and preceded by an initiate of the grade "Miles" leading a ritual enactment of the solar journey around the mithraeum, which was intended to represent the cosmos.
Consequently, it has been argued that most Mithraic rituals involved a re-enactment by the initiates of episodes in the Mithras narrative, a narrative whose main elements were: birth from the rock, striking water from stone with an arrow shot, the killing of the bull, Sol's submission to Mithras, Mithras and Sol feasting on the bull, the ascent of Mithras to heaven in a chariot. A noticeable feature of this narrative (and of its regular depiction in surviving sets of relief carvings) is the absence of female personages (the sole exception being Luna watching the tauroctony in the upper corner opposite Helios).
Only male names appear in surviving inscribed membership lists. Historians including Cumont and Richard Gordon have concluded that the cult was for men only.
The ancient scholar Porphyry refers to female initiates in Mithraic rites. However, the early 20th-century historian A.S. Geden writes that this may be due to a misunderstanding. According to Geden, while the participation of women in the ritual was not unknown in the Eastern cults, the predominant military influence in Mithraism makes it unlikely in this instance. It has recently been suggested by David Jonathan that "Women were involved with Mithraic groups in at least some locations of the empire."
Soldiers were strongly represented amongst Mithraists, and also merchants, customs officials and minor bureaucrats. Few, if any, initiates came from leading aristocratic or senatorial families until the 'pagan revival' of the mid-4th century; but there were always considerable numbers of freedmen and slaves.
Clauss suggests that a statement by Porphyry, that people initiated into the Lion grade must keep their hands pure from everything that brings pain and harm and is impure, means that moral demands were made upon members of congregations. A passage in the "Caesares" of Julian the Apostate refers to "commandments of Mithras". Tertullian, in his treatise "On the Military Crown" records that Mithraists in the army were officially excused from wearing celebratory coronets on the basis of the Mithraic initiation ritual that included refusing a proffered crown, because "their only crown was Mithras".
According to the archaeologist Maarten Vermaseren, 1st-century BCE evidence from Commagene demonstrates the "reverence paid to Mithras" but does not refer to "the mysteries". In the colossal statuary erected by King Antiochus I (69–34 BCE) at Mount Nemrut, Mithras is shown beardless, wearing a Phrygian cap (or the similar headdress, the Persian tiara), in Iranian (Parthian) clothing, and was originally seated on a throne alongside other deities and the king himself. On the back of the thrones there is an inscription in Greek, which includes the name Apollo Mithras Helios in the genitive case (Ἀπόλλωνος Μίθρου Ἡλίου). Vermaseren also reports a Mithras cult in the 3rd-century BCE Fayum. R. D. Barnett has argued that the royal seal of King Saussatar of Mitanni from c. 1450 BCE depicts a tauroctonous Mithras.
The origins and spread of the Mysteries have been intensely debated among scholars, and there are radically differing views on these issues. According to Clauss, the mysteries of Mithras were not practiced until the 1st century CE. According to Ulansey, the earliest evidence for the Mithraic mysteries places their appearance in the middle of the 1st century BCE: the historian Plutarch says that in 67 BCE the pirates of Cilicia (a province on the southeastern coast of Asia Minor) were practicing "secret rites" of Mithras. However, according to Daniels, whether any of this relates to the origins of the mysteries is unclear. The unique underground temples or mithraea appear suddenly in the archaeology in the last quarter of the 1st century CE.
Inscriptions and monuments related to the Mithraic Mysteries are catalogued in a two-volume work by Maarten J. Vermaseren, the "Corpus Inscriptionum et Monumentorum Religionis Mithriacae" (or CIMRM). The earliest monument showing Mithras slaying the bull is thought to be CIMRM 593, found in Rome. There is no date, but the inscription tells us that it was dedicated by a certain Alcimus, steward of T. Claudius Livianus. Vermaseren and Gordon believe that this Livianus is a certain Livianus who was commander of the Praetorian guard in 101 CE, which would give an earliest date of 98–99 CE.
Five small terracotta plaques of a figure holding a knife over a bull have been excavated near Kerch in the Crimea, dated by Beskow and Clauss to the second half of the 1st century BCE, and by Beck to 50 BCE–50 CE. These may be the earliest tauroctonies, if they are accepted to be a depiction of Mithras. The bull-slaying figure wears a Phrygian cap, but is described by Beck and Beskow as otherwise unlike standard depictions of the tauroctony. Another reason for not connecting these artifacts with the Mithraic Mysteries is that the first of these plaques was found in a woman's tomb.
An altar or block from near SS. Pietro e Marcellino on the Esquiline in Rome was inscribed with a bilingual inscription by an Imperial freedman named T. Flavius Hyginus, probably between 80–100 CE. It is dedicated to "Sol Invictus Mithras".
CIMRM 2268 is a broken base or altar from Novae/Steklen in Moesia Inferior, dated 100 CE, showing Cautes and Cautopates.
Other early archaeology includes the Greek inscription from Venosia by Sagaris "actor" probably from 100–150 CE; the Sidon "cippus" dedicated by Theodotus priest of Mithras to Asclepius, 140–141 CE; and the earliest military inscription, by C. Sacidius Barbarus, centurion of XV Apollinaris, from the bank of the Danube at Carnuntum, probably before 114 CE.
According to C. M. Daniels, the Carnuntum inscription is the earliest Mithraic dedication from the Danube region, which along with Italy is one of the two regions where Mithraism first struck root. The earliest dateable mithraeum outside Rome dates from 148 CE. The Mithraeum at Caesarea Maritima is the only one in Palestine, and its date is inferred.
According to Roger Beck, the attested locations of the Roman cult in the earliest phase (circa 80–120 CE) are as follows:
Mithraea datable from pottery
Datable dedications
According to Boyce, the earliest literary references to the mysteries are by the Latin poet Statius, about 80 CE, and Plutarch (c. 100 CE).
The Thebaid (c. 80 CE), an epic poem by Statius, pictures Mithras in a cave, wrestling with something that has horns. The context is a prayer to the god Phoebus. The cave is described as "persei", which in this context is usually translated "Persian"; however, according to the translator J. H. Mozley it literally means "Persean", referring to "Perses", the son of Perseus and Andromeda, this Perses being the ancestor of the Persians according to Greek legend.
The early Christian apologist, Justin Martyr, writing in approximately 145 CE, charges the cult of Mithras with imitating the Christian communion. After describing the Eucharist, he writes,
The Greek biographer Plutarch (46–127 CE) says that "secret mysteries ... of Mithras" were practiced by the pirates of Cilicia, the coastal province in the southeast of Anatolia, who were active in the 1st century BCE: "They likewise offered strange sacrifices; those of Olympus I mean; and they celebrated certain secret mysteries, among which those of Mithras continue to this day, being originally instituted by them." He mentions that the pirates were especially active during the Mithridatic wars (between the Roman Republic and King Mithridates VI of Pontus) in which they supported the king. The association between Mithridates and the pirates is also mentioned by the ancient historian Appian. The 4th-century commentary on Vergil by Servius says that Pompey settled some of these pirates in Calabria in southern Italy.
The historian Dio Cassius (2nd to 3rd century CE) tells how the name of Mithras was spoken during the state visit to Rome of Tiridates I of Armenia, during the reign of Nero. (Tiridates was the son of Vonones II of Parthia, and his coronation by Nero in 66 CE confirmed the end of a war between Parthia and Rome.) Dio Cassius writes that Tiridates, as he was about to receive his crown, told the Roman emperor that he revered him "as Mithras". Roger Beck thinks it possible that this episode contributed to the emergence of Mithraism as a popular religion in Rome.
The philosopher Porphyry (3rd–4th century CE) gives an account of the origins of the Mysteries in his work "De antro nympharum" (The Cave of the Nymphs). Citing Eubulus as his source, Porphyry writes that the original temple of Mithras was a natural cave, containing fountains, which Zoroaster found in the mountains of Persia. To Zoroaster, this cave was an image of the whole world, so he consecrated it to Mithras, the creator of the world. Later in the same work, Porphyry links Mithras and the bull with planets and star-signs: Mithras himself is associated with the sign of Aries and the planet Mars, while the bull is associated with Venus.
Porphyry is writing close to the demise of the cult, and Robert Turcan has challenged the idea that Porphyry's statements about Mithraism are accurate. His case is that far from representing what Mithraists believed, they are merely representations by the Neoplatonists of what it suited them in the late 4th century to read into the mysteries. However, Merkelbach and Beck believe that Porphyry’s work "is in fact thoroughly coloured with the doctrines of the Mysteries". Beck holds that classical scholars have neglected Porphyry’s evidence and have taken an unnecessarily skeptical view of Porphyry. According to Beck, Porphyry's "De antro" is the only clear text from antiquity which tells us about the intent of the Mithraic Mysteries and how that intent was realized. David Ulansey finds it important that Porphyry "confirms ... that astral conceptions played an important role in Mithraism."
In later antiquity, the Greek name of Mithras ("Μίθρας") occurs in the text known as the "Mithras Liturgy", a part of the "Paris Greek Magical Papyrus" (Paris Bibliothèque Nationale Suppl. gr. 574); here Mithras is given the epithet "the great god", and is identified with the sun god Helios. There have been different views among scholars as to whether this text is an expression of Mithraism as such. Franz Cumont argued that it is not; Marvin Meyer thinks it is; while Hans Dieter Betz sees it as a synthesis of Greek, Egyptian, and Mithraic traditions.
Scholarship on Mithras begins with Franz Cumont, who published a two-volume collection of source texts and images of monuments in French in 1894–1900, "Textes et monuments figurés relatifs aux mystères de Mithra" [French: "Texts and Illustrated Monuments Relating to the Mysteries of Mithra"]. An English translation of part of this work was published in 1903, with the title "The Mysteries of Mithra". Cumont’s hypothesis, as the author summarizes it in the first 32 pages of his book, was that the Roman religion was "the Roman form of Mazdaism", the Persian state religion, disseminated from the East. He identified the ancient Aryan deity who appears in Persian literature as Mithras with the Hindu god Mitra of the Vedic hymns. According to Cumont, the god Mithra came to Rome "accompanied by a large representation of the Mazdean Pantheon". Cumont considers that while the tradition "underwent some modification in the Occident ... the alterations that it suffered were largely superficial".
Cumont's theories came in for severe criticism from John R. Hinnells and R.L. Gordon at the First International Congress of Mithraic Studies held in 1971. John Hinnells was unwilling to reject entirely the idea of Iranian origin, but wrote: "we must now conclude that his reconstruction simply will not stand. It receives no support from the Iranian material and is in fact in conflict with the ideas of that tradition as they are represented in the extant texts. Above all, it is a theoretical reconstruction which does not accord with the actual Roman iconography." He discussed Cumont’s reconstruction of the bull-slaying scene and stated "that the portrayal of Mithras given by Cumont is not merely unsupported by Iranian texts but is actually in serious conflict with known Iranian theology." Another paper by R.L. Gordon argued that Cumont severely distorted the available evidence by forcing the material to conform to his predetermined model of Zoroastrian origins. Gordon suggested that the theory of Persian origins was completely invalid and that the Mithraic mysteries in the West were an entirely new creation.
A similar view has been expressed by Luther H. Martin: "Apart from the name of the god himself, in other words, Mithraism seems to have developed largely in and is, therefore, best understood from the context of Roman culture."
However, according to Hopfe, "All theories of the origin of Mithraism acknowledge a connection, however vague, to the Mithra/Mitra figure of ancient Aryan religion." Reporting on the Second International Congress of Mithraic Studies, 1975, Ugo Bianchi says that although he welcomes "the tendency to question in historical terms the relations between Eastern and Western Mithraism", it "should not mean obliterating what was clear to the Romans themselves, that Mithras was a 'Persian' (in wider perspective: an Indo-Iranian) god."
Boyce states that "no satisfactory evidence has yet been adduced to show that, before Zoroaster, the concept of a supreme god existed among the Iranians, or that among them Mithra – or any other divinity – ever enjoyed a separate cult of his or her own outside either their ancient or their Zoroastrian pantheons." However, she also says that although recent studies have minimized the Iranizing aspects of the self-consciously Persian religion "at least in the form which it attained under the Roman Empire", the name "Mithras" is enough to show "that this aspect is of some importance". She also says that "the Persian affiliation of the Mysteries is acknowledged in the earliest literary references to them."
Beck tells us that since the 1970s scholars have generally rejected Cumont, but adds that recent theories about the character of Zoroastrianism during the centuries BCE now make some new form of Cumont's east-west transfer possible. He says that
... an indubitable residuum of things Persian in the Mysteries and a better knowledge of what constituted actual Mazdaism have allowed modern scholars to postulate for Roman Mithraism a continuing Iranian theology. This indeed is the main line of Mithraic scholarship, the Cumontian model which subsequent scholars accept, modify, or reject. For the transmission of Iranian doctrine from East to West, Cumont postulated a plausible, if hypothetical, intermediary: the Magusaeans of the Iranian diaspora in Anatolia. More problematic – and never properly addressed by Cumont or his successors – is how real-life Roman Mithraists subsequently maintained a quite complex and sophisticated Iranian theology behind an occidental facade. Other than the images at Dura of the two 'magi' with scrolls, there is no direct and explicit evidence for the carriers of such doctrines. ... Up to a point, Cumont’s Iranian paradigm, especially in Turcan’s modified form, is certainly plausible.
He also says that "the old Cumontian model of formation in, and diffusion from, Anatolia ... is by no means dead – nor should it be."
Beck theorizes that the cult was created in Rome, by a single founder who had some knowledge of both Greek and Oriental religion, but suggests that some of the ideas used may have passed through the Hellenistic kingdoms. He observes that "Mithras – moreover, a Mithras who was identified with the Greek Sun god Helios" was among the gods of the syncretic Greco-Armenian-Iranian royal cult at Nemrut, founded by Antiochus I of Commagene in the mid-1st century BCE. While proposing the theory, Beck says that his scenario may be regarded as Cumontian in two ways. Firstly, because it looks again at Anatolia and Anatolians, and more importantly, because it harks back to the methodology first used by Cumont.
Merkelbach suggests that its mysteries were essentially created by a particular person or persons and created in a specific place, the city of Rome, by someone from an eastern province or border state who knew the Iranian myths in detail, which he wove into his new grades of initiation; but that he must have been Greek and Greek-speaking because he incorporated elements of Greek Platonism into it. The myths, he suggests, were probably created in the milieu of the imperial bureaucracy, and for its members. Clauss tends to agree. Beck calls this "the most likely scenario" and states "Until now, Mithraism has generally been treated as if it somehow evolved Topsy-like from its Iranian precursor – a most implausible scenario once it is stated explicitly."
Archaeologist Lewis M. Hopfe notes that there are only three mithraea in Roman Syria, in contrast to further west. He writes: "Archaeology indicates that Roman Mithraism had its epicenter in Rome ... the fully developed religion known as Mithraism seems to have begun in Rome and been carried to Syria by soldiers and merchants."
Taking a different view from other modern scholars, Ulansey argues that the Mithraic mysteries began in the Greco-Roman world as a religious response to the discovery by the Greek astronomer Hipparchus of the astronomical phenomenon of the precession of the equinoxes – a discovery that amounted to discovering that the entire cosmos was moving in a hitherto unknown way. This new cosmic motion, he suggests, was seen by the founders of Mithraism as indicating the existence of a powerful new god capable of shifting the cosmic spheres and thereby controlling the universe.
However, A. D. H. Bivar, L. A. Campbell and G. Widengren have variously argued that Roman Mithraism represents a continuation of some form of Iranian Mithra worship.
According to Antonia Tripolitis, Roman Mithraism originated in Vedic India and picked up many features of the cultures which it encountered in its westward journey.
Michael Speidel, who specializes in military history, associates Mithras with the Sun god Orion.
The first important expansion of the mysteries in the Empire seems to have happened quite quickly, late in the reign of Antoninus Pius (b. 121 CE, d. 161 CE) and under Marcus Aurelius. By this time all the key elements of the mysteries were in place.
Mithraism reached the apogee of its popularity during the 2nd and 3rd centuries, spreading at an "astonishing" rate at the same period when the worship of Sol Invictus was incorporated into the state-sponsored cults. At this period a certain Pallas devoted a monograph to Mithras, and a little later Euboulus wrote a "History of Mithras", although both works are now lost. According to the 4th-century Historia Augusta, the emperor Commodus participated in its mysteries, but it never became one of the state cults.
The historian Jacob Burckhardt writes:
The religion and its followers faced persecution in the 4th century from Christianization, and Mithraism came to an end at some point between its last decade and the 5th century. Ulansey states that "Mithraism declined with the rise to power of Christianity, until the beginning of the fifth century, when Christianity became strong enough to exterminate by force rival religions such as Mithraism." According to Speidel, Christians fought fiercely with this feared enemy and suppressed it during the late 4th century. Mithraic sanctuaries were destroyed, and the religion was no longer a matter of personal choice. According to Luther H. Martin, Roman Mithraism came to an end with the anti-pagan decrees of the Christian emperor Theodosius during the last decade of the 4th century.
Clauss states that Mithras appears among the cults listed on inscriptions by Roman senators who had not converted to Christianity, as part of the "pagan revival" among the elite in the second half of the 4th century. Beck states that "Quite early in the [fourth] century the religion was as good as dead throughout the empire." However, archaeological evidence indicates the continuance of the cult of Mithras up until the end of the 4th century. In particular, large numbers of votive coins deposited by worshippers have been recovered at the Mithraeum at Pons Sarravi (Sarrebourg) in Gallia Belgica, in a series that runs from Gallienus (r. 253–268) to Theodosius I (r. 379–395). These were scattered over the floor when the mithraeum was destroyed, as Christians apparently regarded the coins as polluted, thereby providing reliable dates for the functioning of the mithraeum up until near the end of the century.
Franz Cumont states that Mithraism may have survived in certain remote cantons of the Alps and Vosges into the 5th century. According to Mark Humphries, the deliberate concealment of Mithraic cult objects in some areas suggests that precautions were being taken against Christian attacks. However, in areas like the Rhine frontier, barbarian invasions may have also played a role in the end of Mithraism.
At some of the mithraeums that have been found below churches, such as the Santa Prisca Mithraeum and the San Clemente Mithraeum, the ground plan of the church above was made in a way to symbolize Christianity's domination of Mithraism. The cult disappeared earlier than that of Isis. Isis was still remembered in the Middle Ages as a pagan deity, but Mithras was already forgotten in late antiquity. "John, the Lord Chamberlain", a 1999–2014 series of historical mystery novels, depicts a secret Mithraist community still active in Justinian's court (r. 527–565), but there is no historical evidence for such a late survival of the religion.
According to Cumont, the imagery of the tauroctony was a Graeco-Roman representation of an event in Zoroastrian cosmogony described in a 9th-century Zoroastrian text, the Bundahishn. In this text the evil spirit Ahriman (not Mithra) slays the primordial creature Gavaevodata, which is represented as a bovine. Cumont held that a version of the myth must have existed in which Mithras, not Ahriman, killed the bovine. But according to Hinnells, no such variant of the myth is known, and this is merely speculation: "In no known Iranian text [either Zoroastrian or otherwise] does Mithra slay a bull."
David Ulansey finds astronomical evidence from the mithraeum itself. He reminds us that the Platonic writer Porphyry wrote in the 3rd century CE that the cave-like temple Mithraea depicted "an image of the world" and that Zoroaster consecrated a cave resembling the world fabricated by Mithras. The ceiling of the Caesarea Maritima Mithraeum retains traces of blue paint, which may mean the ceiling was painted to depict the sky and the stars.
Beck has given the following celestial anatomy of the Tauroctony:
Several celestial identities for the Tauroctonous Mithras (TM) himself have been proposed. Beck summarizes them in the table below.
Ulansey has proposed that Mithras seems to have been derived from the constellation of Perseus, which is positioned just above Taurus in the night sky. He sees iconographic and mythological parallels between the two figures: both are young heroes, carry a dagger, and wear a Phrygian cap. As further evidence he cites the similarity between the image of Perseus killing the Gorgon and the tauroctony, both figures' association with caverns, and both figures' connections to Persia.
Michael Speidel associates Mithras with the constellation of Orion because of its proximity to Taurus, and the consistent nature of the depiction of the figure as having wide shoulders, a garment flared at the hem, and narrowed at the waist with a belt, thus taking on the form of the constellation.
Beck has criticized Speidel and Ulansey for their adherence to a literal cartographic logic, describing their theories as a "will-o'-the-wisp" that "lured them down a false trail". He argues that a literal reading of the tauroctony as a star chart raises two major problems: it is difficult to find a constellation counterpart for Mithras himself (despite efforts by Speidel and Ulansey) and that, unlike in a star chart, each feature of the tauroctony might have more than a single counterpart. Rather than seeing Mithras as a constellation, Beck argues that Mithras is the prime traveller on the celestial stage (represented by the other symbols of the scene), the Unconquered Sun moving through the constellations. But again, Meyer holds that the Mithras Liturgy reflects the world of Mithraism and may be a confirmation for Ulansey's theory of Mithras being held responsible for the precession of equinoxes.
The cult of Mithras was part of the syncretic nature of ancient Roman religion. Almost all Mithraea contain statues dedicated to gods of other cults, and it is common to find inscriptions dedicated to Mithras in other sanctuaries, especially those of Jupiter Dolichenus. Mithraism was not an alternative to Rome's other traditional religions, but was one of many forms of religious practice, and many Mithraic initiates can also be found participating in the civic religion, and as initiates of other mystery cults.
Early Christian apologists noted similarities between Mithraic and Christian rituals, but nonetheless took an extremely negative view of Mithraism: they interpreted Mithraic rituals as evil copies of Christian ones. For instance, Tertullian wrote that as a prelude to the Mithraic initiation ceremony, the initiate was given a ritual bath and at the end of the ceremony, received a mark on the forehead. He described these rites as a diabolical counterfeit of the baptism and chrismation of Christians. Justin Martyr contrasted Mithraic initiation communion with the Eucharist:
Ernest Renan suggested in 1882 that, under different circumstances, Mithraism might have risen to the prominence of modern-day Christianity. Renan wrote: "If the growth of Christianity had been arrested by some mortal malady, the world would have been Mithraic". However, this theory has since been contested. Leonard Boyle wrote in 1987 that "too much ... has been made of the 'threat' of Mithraism to Christianity", pointing out that there are only fifty known mithraea in the entire city of Rome. J. Alvar Ezquerra holds that since the two religions did not share similar aims, there was never any real threat of Mithraism taking over the Roman world. Mithraism had backing from the Roman aristocracy during a time when their conservative values were seen as under attack during the rising tides of Christianity.
According to Mary Boyce, Mithraism was a potent enemy for Christianity in the West, though she is sceptical about its hold in the East. Filippo Coarelli (1979) has tabulated forty actual or possible Mithraea and estimated that Rome would have had "not less than 680–690" mithraea. Lewis M. Hopfe states that more than 400 Mithraic sites have been found. These sites are spread all over the Roman empire from places as far as Dura Europos in the east, and England in the west. He, too, says that Mithraism may have been a rival of Christianity. David Ulansey thinks Renan's statement "somewhat exaggerated", but does consider Mithraism "one of Christianity's major competitors in the Roman Empire". Ulansey sees the study of Mithraism as important for understanding "the cultural matrix out of which the Christian religion came to birth".
https://en.wikipedia.org/wiki?curid=20826
Minivan
A minivan is a van designed to transport passengers in the rear seating row(s), with reconfigurable seats in two or three rows. The equivalent terms in British English are multi-purpose vehicle (MPV), people carrier and people mover. Minivans often have a 'one-box' or 'two-box' body configuration, a higher roof, a flat floor, a sliding door for rear passengers, and high H-point seating.
Compared with a full-size van, minivans are typically based on a passenger car platform and have a lower body (to fit inside a typical garage door opening). Some early versions, such as the Ford Aerostar, utilized the compact-sized Ranger pickup truck platform.
The largest size of minivans is also referred to as 'Large MPV' and became popular following the introduction of the 1984 Renault Espace and Dodge Caravan. Typically, these have platforms derived from D-segment passenger cars or compact pickups. Since the 1990s, the smaller Compact MPV and Mini MPV sizes of minivans have also become popular. If the term 'minivan' is used without specifying a size, it usually refers to the largest size (i.e. Large MPV).
The term "minivan" originated in North America in order to differentiate the smaller passenger vehicles from full-size vans (such as the Ford E-Series, Dodge Ram Van and Chevrolet Van), which were then simply called 'vans'.
The first known use of the term minivan was in 1959; however, it was not until the 1980s that the term became commonly used.
The 1936 Stout Scarab is often regarded as the first minivan. The passenger seats in the Scarab were moveable and could be configured for the passengers to sit around a table in the rear of the cabin. Passengers entered and exited the Scarab via a centrally-mounted door.
The DKW Schnellaster — manufactured from 1949 to 1962 — featured front-wheel drive, a transverse engine, flat floor and multi-configurable seating, all of which would later become characteristics of minivans.
In 1950, the Volkswagen Type 2 adapted a bus-shaped body to the chassis of a small passenger car (the Volkswagen Beetle). When Volkswagen introduced a sliding side door to the Type 2 in 1968, it then had the prominent features that would later come to define a minivan: compact length, three rows of forward-facing seats, station wagon-style top-hinged tailgate/liftgate, sliding side door, and passenger car base.
The 1956-1969 Fiat Multipla also had many features in common with modern minivans. The Multipla was based on the chassis of the Fiat 600 and had a rear engine and cab forward layout.
In the late 1970s, Chrysler began a development program to design "a small affordable van that looked and handled more like a car." The result of this program was the first American minivan, the 1984 Plymouth Voyager. The Voyager debuted the minivan design features of front-wheel drive, a flat floor and a sliding door for rear passengers. The badge-engineered Dodge Caravan was also released for the 1984 model year, and was sold alongside the Voyager.
The term minivan came into use largely in comparison to full-size vans; at six feet tall or lower, 1980s minivans were intended to fit inside a typical garage door opening. In 1984, "The New York Times" described minivans as "the hot cars coming out of Detroit," noting that "analysts say the mini-van has created an entirely new market, one that may well overshadow the... station wagon."
In response to the popularity of the Voyager/Caravan, General Motors released the 1985 Chevrolet Astro and GMC Safari badge-engineered twins, and Ford released the 1986 Ford Aerostar. These vehicles used a traditional rear-wheel drive layout, unlike the Voyager/Caravan. By the end of the 1980s, demand for minivans as family vehicles had largely superseded full-size station wagons in the United States.
During the 1990s, the minivan segment underwent several major changes. Many models switched to the front-wheel drive layout used by the Voyager/Caravan minivans; for example, Ford replaced the Aerostar with the front-wheel drive Mercury Villager (a rebadged Nissan Quest) for 1993 and the Ford Windstar for 1995. The models also increased in size, as a result of the extended-wheelbase ("Grand") versions of the Voyager and Caravan which were introduced in 1987. An increase in luxury features and interior equipment was seen in the Eddie Bauer version of the 1988 Ford Aerostar, the 1990 Chrysler Town & Country and the 1990 Oldsmobile Silhouette. The third-generation Plymouth Voyager, Dodge Caravan and Chrysler Town & Country — released for the 1996 model year — were available with an additional sliding door on the driver's side.
The highest selling year for minivans was in 2000, when 1.4 million units were sold. However, in the following years, the increasing popularity of sport utility vehicles (SUVs) began to erode sales of minivans. North American sales of the Volkswagen Transporter (sold as the 'Volkswagen Eurovan') ceased in 2003. Ford exited the segment in 2006, when the Ford Freestar was cancelled, Chrysler discontinued its short-wheelbase minivans in 2007 (although long-wheelbase minivans remained in production in the form of the Chrysler RT-platform minivans) and General Motors exited the segment in 2009 with the cancellation of the Chevrolet Uplander. It has been suggested that the lesser popularity of minivans is due to the minivan's image as a vehicle for older, domestically-oriented drivers.
In 2013, sales of the segment reached approximately 500,000 (one-third of its 2000 peak). Despite the declining sales for the segment in the late 2000s, several European brands launched minivans in the North American market. The Volkswagen Routan (a rebadged Dodge Grand Caravan) was sold from 2009 to 2013. In 2010, Ford began North American sales of the European-built Ford Transit Connect Wagon. North American sales of the Mercedes-Benz Vito (sold as the 'Mercedes-Benz Metris') began in 2016. However, the Nissan Quest and Mazda MPV were both discontinued in 2016.
The five highest selling models in the United States in 2018 were the Dodge Grand Caravan, Chrysler Pacifica, Honda Odyssey, Toyota Sienna and Kia Sedona.
Introduced several months after the Chrysler minivans, the 1984 Renault Espace was the first European-developed minivan designed primarily for passenger use (the Volkswagen Caravelle/Vanagon being a derivative of a commercial van). Development began in the 1970s under the European subsidiaries of Chrysler; the Espace was intended as a successor for the Matra Rancho (a primitive CUV), leading to its use of front-hinged doors. While slow-selling at the time of its release, the Espace would go on to become the most successful of the European-brand minivans.
Renault initially intended to sell the Espace in the United States, but the 1987 sale of AMC to Chrysler cancelled those plans. At the end of the 1980s, Chrysler and Ford commenced sales of American-brand minivans in Europe, selling the Chrysler Voyager and Ford Aerostar (with varying degrees of success). Deriving its minivans from American designs, General Motors sold the Oldsmobile Silhouette in Europe (branded as the Pontiac Trans Sport), later marketing the American-produced Opel/Vauxhall Sintra.
In the 1990s, several joint ventures produced long-running minivan designs. In 1994, the Sevel Nord-produced Eurovans were introduced, marketed by Citroën, Fiat, Lancia, and Peugeot; two generations were produced through 2014. In contrast to the Espace, the Eurovans were produced with two sliding doors; to increase interior space, the gearshift was relocated to the dashboard and the handbrake was moved. In 1995, Ford of Europe and Volkswagen entered a joint venture, producing the Ford Galaxy, SEAT Alhambra, and Volkswagen Sharan. Adopting a similar configuration as the Espace, the three model lines were launched with front-hinged doors; for 2010, SEAT and Volkswagen introduced a second generation, adopting sliding doors. Despite high expectations during the 1990s, the full-size MPV market fell short, and some consumers thought of MPVs as being large like vans. Other MPVs on the market at the time included the Mitsubishi Space Wagon and Honda Shuttle. Renault set a new "compact MPV" standard with the Renault Scenic in 1996, which became a hit.
The five highest selling minivans in Europe in 2018 were the Ford S-Max, SEAT Alhambra, Volkswagen Sharan, Renault Espace and Ford Galaxy.
Contrasting with compact passenger vans developed from commercial vehicles, Japanese manufacturers commenced development of minivans starting from compact MPVs in the 1980s. In 1982, the Nissan Prairie became one of the first compact minivans. Derived closely from a compact sedan, the Prairie was configured with sliding doors, folding rear seats, and a lifting rear hatch. The Mitsubishi Chariot (exported to North America as the Colt Vista) adopted nearly the same form factor, using wagon-style front-hinged doors.
In 1989, the Mazda MPV was introduced as the first full-size minivan (derived from the Mazda 929 sedan). Developed primarily for American sales, the MPV exceeded Japanese compact size regulations; it was also sold in Japan and other markets. In line with American minivans, a single passenger-side rear door was used, though hinged rather than sliding; a driver-side door was introduced for 1996.
For 1990, the Toyota Previa mid-engine minivan was introduced (sold as the "Estima" in Japan). While largely retaining the configuration of its TownAce predecessor, the Previa was designed solely as a passenger vehicle, with nearly panoramic window glass (excluding the B and D-pillars). Replaced in North America by the locally produced Sienna, the Previa remains in production for Japanese and Australian markets; the larger Alphard is produced as a luxury vehicle.
Following the introduction of the Nissan Quest (co-developed with Ford for North America), Nissan introduced the Nissan Elgrand in 1997 for worldwide markets; the Nissan Serena has grown into the large MPV segment as well.
Honda has produced its Honda Odyssey line of minivans since 1994; since 1999, a separate (larger) version has been produced for the United States and Canada. Until 2013, the Japan-produced version of the Odyssey was designed with front-hinged doors. In a design feature that was adopted by other manufacturers, the first generation of the Odyssey featured a rear seat that folded flat into the floor.
Expanding beyond compact MPVs, Mitsubishi entered the minivan segment in 2003 with the Mitsubishi Grandis, using front-hinged doors. Sold outside of North America, the Grandis was marketed through 2011.
Adapting a similar layout to the Chrysler minivans, the Kia Carnival (also sold as the Kia Sedona) was introduced in 1998 with dual sliding doors. Sharing its configuration with the Honda Odyssey, the Hyundai Trajet was sold from 1999 to 2008 in markets outside of North America; the Hyundai Entourage was a rebadged Kia Sedona.
Introduced in 2004, the SsangYong Rodius is the highest-capacity minivan, seating up to 11 passengers.
In 1999, Shanghai GM commenced production of the Buick GL8 minivan, derived from a minivan platform designed by GM in the United States. After two generations of production, the GL8 is the last minivan still produced by General Motors or its joint ventures.
Compact MPV — an abbreviation for Compact Multi-Purpose Vehicle — is a vehicle size class for the middle size of MPVs/minivans. The Compact MPV size class sits between the mini MPV and minivan size classes.
Compact MPVs remain predominantly a European phenomenon, although they are also built and sold in many Latin American and Asian markets. As of 2016, the only compact MPV sold widely in the United States is the Ford C-Max.
Mini MPV — an abbreviation for Mini Multi-Purpose Vehicle — is a vehicle size class for the smallest size of minivans (MPVs). The Mini MPV size class sits below the compact MPV size class and the vehicles are often built on the platforms of B-segment hatchback models.
Several PSA Peugeot Citroën minivans based on B-segment platforms have been marketed as 'leisure activity vehicles' in Europe. These include the Citroën Berlingo (1996–present).
A leisure activity vehicle (abbreviated LAV) is a small van or minivan; the segment is popular primarily in Europe. One of the first LAVs was the 1977 Matra Rancho (among the first crossover SUVs and a precursor to the Renault Espace), with European manufacturers expanding the segment in the late 1990s, following the introduction of the Citroën Berlingo and Renault Kangoo.
Leisure activity vehicles are typically derived from supermini or subcompact car platforms, differing from mini MPVs in body design. To maximize interior space, LAVs are taller in height with a vertically-oriented liftgate (or the side-hinged doors of a cargo van); the body typically features a more vertically-oriented windshield and longer hood/bonnet. Marketed as an alternative to sedan-derived small family cars, LAVs have seating with a lower H-point than MPVs or minivans, offering two (or three) rows of seating.
Though sharing underpinnings with superminis, subcompacts, and mini MPVs, the use of an extended wheelbase can make leisure activity vehicles longer than the vehicles they are derived from. For example, the Fiat Doblò is one of the longest LAVs, exceeding the length of both the Opel Meriva (a mini MPV) and the Peugeot 206 SW (a supermini).
List of leisure activity vehicles (similar vehicles are grouped together):
https://en.wikipedia.org/wiki?curid=20828
Mary Tyler Moore
Mary Tyler Moore (December 29, 1936 – January 25, 2017) was an American stage, film, and television actress, as well as a producer and social advocate. She was widely known for her prominent television sitcom roles in "The Dick Van Dyke Show" (1961–1966) and "The Mary Tyler Moore Show" (1970–1977).
Her film work included 1967's "Thoroughly Modern Millie" and 1980's "Ordinary People", the latter earning Moore a nomination for the Academy Award for Best Actress. Moore was an advocate for animal rights, vegetarianism and diabetes prevention.
With her two most prominent roles challenging gender stereotypes and norms, "The New York Times" said Moore's "performances on ["The Mary Tyler Moore Show" and "The Dick Van Dyke Show"] helped define a new vision of American womanhood." "The Guardian" said "her outwardly bubbly personality and trademark broad, toothy smile disguised an inner fragility that appealed to an audience facing the new trials of modern-day existence."
Moore was born in Brooklyn, New York, to George Tyler Moore (1913–2006), a clerk, and Marjorie Hackett (1916–1992). Her Irish-Catholic family lived on Ocean Parkway in the Flatbush section of Brooklyn, and Moore was the oldest of three children, with a younger brother and sister, John and Elizabeth.
When she was eight, Moore's family moved to Los Angeles at the recommendation of Moore's uncle, an MCA employee. She was raised Catholic, and attended St. Rose of Lima Parochial School in Brooklyn until the third grade. She then attended Saint Ambrose School in Los Angeles, followed by Immaculate Heart High School in Los Feliz, California. Moore's sister, Elizabeth, died aged 21 "from a combination of... painkillers and alcohol", while her brother died at the age of 47 from kidney cancer.
Moore's paternal great-grandfather, Lieutenant Colonel Lewis Tilghman Moore, owned the house which is now the Stonewall Jackson's Headquarters Museum in Winchester, Virginia.
Moore decided at the age of 17 that she wanted to be a dancer. Her television career began with a job as "Happy Hotpoint", a tiny elf dancing on Hotpoint appliances in TV commercials during the 1950s series "Ozzie and Harriet". After appearing in 39 Hotpoint commercials in five days, she received approximately $6,000. She became pregnant while still working as "Happy", and Hotpoint ended her work when it became too difficult to conceal her pregnancy with the elf costume. Moore modeled anonymously on the covers of a number of record albums, and auditioned for the role of the elder daughter of Danny Thomas for his long-running TV show, but was turned down. Much later, Thomas explained that "she missed it by a nose... no daughter of mine could ever have a nose that small".
Moore's first regular television role was as a mysterious and glamorous telephone receptionist in "Richard Diamond, Private Detective". In the show her voice was heard but only her legs appeared on camera, adding to the character's mystique. About this time, she guest-starred in John Cassavetes' NBC detective series "Johnny Staccato", and also in "Bachelor Father" in the episode titled "Bentley and the Big Board". In 1961, Moore appeared in several prominent roles in films and on television, including "Bourbon Street Beat", "77 Sunset Strip", "Surfside 6", "Wanted: Dead or Alive", "Steve Canyon", "Hawaiian Eye", "Thriller" and "Lock-Up".
In 1961, Carl Reiner cast Moore in "The Dick Van Dyke Show", a weekly series based on Reiner's own life and career as a writer for Sid Caesar's television variety show "Your Show of Shows", telling the cast from the outset that it would run for no more than five years. The show was produced by Danny Thomas' company, and Thomas himself recommended her. He remembered Moore as "the girl with three names" whom he had turned down earlier. Moore's energetic comic performances as Van Dyke's character's wife, begun at age 24 (11 years Van Dyke's junior), made both the actress and her signature tight capri pants extremely popular, and she became internationally known. When she won her first Emmy Award for her portrayal of Laura Petrie, she said, "I know this will never happen again." She would go on to win six more.
In 1970, after having appeared earlier in a pivotal one-hour musical special called "Dick Van Dyke and the Other Woman", Moore and husband Grant Tinker successfully pitched a sitcom centered on Moore to CBS. "The Mary Tyler Moore Show" was a half-hour newsroom sitcom featuring Ed Asner as her gruff boss Lou Grant. "The Mary Tyler Moore Show" became a touchpoint of the Women's Movement for its portrayal of an independent working woman, which challenged the traditional woman's role in marriage and family.
Moore's show proved so popular that three other regular characters, Valerie Harper as Rhoda Morgenstern, Cloris Leachman as Phyllis Lindstrom and Ed Asner as Lou Grant, were also spun off into their own series. The premise of the single working woman's life, alternating during the program between work and home, became a television staple. After six years of ratings in the top 20, the show slipped to number 39 during season seven. Producers asked that the series be canceled because of falling ratings, afraid that the show's legacy might be damaged if it were renewed for another season. Despite the decline in ratings, the 1977 season went on to garner the show's third straight Emmy Award for Outstanding Comedy. During its seven seasons, the program won 29 Emmys in total (Moore herself winning three times for Best Lead Actress in a sitcom). That record remained unbroken until 2002, when the NBC sitcom "Frasier" won its 30th Emmy.
During season six of "The Mary Tyler Moore Show", Moore appeared in a musical/variety special for CBS titled "Mary's Incredible Dream", which featured Ben Vereen. In 1978, she starred in a second CBS special, "How to Survive the '70s and Maybe Even Bump Into Happiness". This time, she received significant support from a strong lineup of guest stars: Bill Bixby, John Ritter, Harvey Korman and Dick Van Dyke. In the 1978–79 season, Moore tried the musical-variety genre, starring in two unsuccessful CBS variety series in a row. The first, "Mary", featured David Letterman, Michael Keaton, Swoosie Kurtz and Dick Shawn in the supporting cast; CBS canceled the series. In March 1979, the network brought Moore back in a new, retooled show, "The Mary Tyler Moore Hour", which was described as a "sit-var" (part situation comedy/part variety series) with Moore portraying a TV star putting on a variety show. The program lasted just 11 episodes.
In the 1985–86 season, she returned to CBS in a series titled "Mary", which suffered from poor reviews, sagging ratings, and internal strife within the production crew. According to Moore, she asked CBS to pull the show as she was unhappy with the direction of the program and the producers. She also starred in the short-lived "Annie McGuire" in 1988. In 1995, after another lengthy break from TV series work, Moore was cast as tough, unsympathetic newspaper owner Louise "the Dragon" Felcott on the CBS drama "New York News", her third series in which her character worked in the news industry. As with her previous series "Mary" (1985), Moore quickly became unhappy with the nature of her character and was negotiating with producers to get out of her contract for the series when it was canceled.
In the mid-1990s, Moore had a cameo and a guest-starring role as herself on two episodes of "Ellen". She also guest-starred on Ellen DeGeneres's next TV show, "The Ellen Show", in 2001. In 2004, Moore reunited with her "Dick Van Dyke Show" castmates for a reunion "episode" called "The Dick Van Dyke Show Revisited".
In 2006, Moore guest-starred as Christine St. George, a high-strung host of a fictional TV show, in three episodes of the Fox sitcom "That '70s Show". Moore's scenes were shot on the same soundstage where "The Mary Tyler Moore Show" was filmed in the 1970s. Moore made a guest appearance on the season two premiere of "Hot in Cleveland", which starred her former co-star Betty White. This marked the first time that White and Moore had worked together since "The Mary Tyler Moore Show" ended in 1977. In the fall of 2013, Moore reprised her role on "Hot in Cleveland" in a season four episode which not only reunited Moore and White, but also former MTM cast members Cloris Leachman, Valerie Harper and Georgia Engel as well. This reunion coincided with Harper's public announcement that she had been diagnosed with terminal brain cancer and was given only a few months to live.
Moore appeared in several Broadway plays. She starred in "Whose Life Is It Anyway" with James Naughton, which opened on Broadway at the Royale Theatre on February 24, 1980, and ran for 96 performances, and in "Sweet Sue", which opened at the Music Box Theatre on January 8, 1987, later transferred to the Royale Theatre, and ran for 164 performances. She was the star of a new musical version of "Breakfast at Tiffany's" in December 1966, but the show, titled "Holly Golightly", was a flop that closed in previews before opening on Broadway. In reviews of performances in Philadelphia and Boston, critics "murdered" the play, in which Moore said she had been singing while suffering from bronchial pneumonia.
Moore appeared in previews of the Neil Simon play "Rose's Dilemma" at the off-Broadway Manhattan Theatre Club in December 2003 but quit the production after receiving a critical letter from Simon instructing her to "learn your lines or get out of my play". Moore had been using an earpiece on stage to feed her lines to the repeatedly rewritten play.
During the 1980s, Moore and her production company produced five plays: "Noises Off", "The Octette Bridge Club", "Joe Egg", "Benefactors", and "Safe Sex".
Moore made her film debut in 1961's "X-15". Following her success on "The Dick Van Dyke Show", she appeared in a string of films in the late 1960s (after signing an exclusive contract with Universal Pictures), including 1967's hit "Thoroughly Modern Millie", as a would-be actress in 1920s New York who is taken under the wing of Julie Andrews' title character, and the 1968 films "What's So Bad About Feeling Good?" with George Peppard, and "Don't Just Stand There!" with Robert Wagner. In 1969, she starred opposite Elvis Presley as a nun in "Change of Habit". Moore's future television castmate Ed Asner also appeared in that film as a police officer.
Moore did not appear in another feature film for eleven years. On her return to the big screen in 1980, she received her only Oscar nomination for her role in the coming-of-age drama "Ordinary People", as a grieving mother unable to cope either with the drowning death of one of her sons or the subsequent suicide attempt of her surviving son, played by Timothy Hutton, who won the Academy Award for Best Supporting Actor for his performance. Despite that success, Moore's only two films in the next fifteen years were the poorly received "Six Weeks" (1982) and "Just Between Friends" (1986). In 1996 she made her return to films with the independent hit "Flirting with Disaster".
Moore appeared in the television movie "Run a Crooked Mile" (1969), and after the conclusion of her series in 1977, she starred in several prominent movies for television, including "First, You Cry" (1978), which brought her an Emmy nomination for portraying NBC correspondent Betty Rollin's struggle with breast cancer. Her later TV films included the medical drama "Heartsounds" (1984) with James Garner, which brought her another Emmy nomination, "Finnegan Begin Again" (1985) with Robert Preston, which earned her a CableACE Award nomination, the 1988 mini-series "Lincoln", which brought her another Emmy nod for playing Mary Todd Lincoln, and "Stolen Babies", for which she won an Emmy Award for Outstanding Supporting Actress in 1993. Later she reunited with old co-stars in "Mary and Rhoda" (2000) with Valerie Harper, and "The Gin Game" (2003) (based on the Broadway play), reuniting her with Dick Van Dyke. Moore also starred in "Like Mother, Like Son" (2001), playing convicted murderer Sante Kimes.
Moore wrote two memoirs. In the first, "After All", published in 1995, she acknowledges being a recovering alcoholic, while in "Growing Up Again: Life, Loves, and Oh Yeah, Diabetes" (2009), she focuses on living with type 1 diabetes.
Moore and her husband Grant Tinker founded MTM Enterprises, Inc. in 1969. The company produced "The Mary Tyler Moore Show" and several other successful television shows and films, and also included a record label, MTM Records. MTM Enterprises produced a variety of American sitcoms and drama television series such as "Rhoda", "Lou Grant" and "Phyllis" (all spin-offs from "The Mary Tyler Moore Show"), "The Bob Newhart Show", "The Texas Wheelers", "WKRP in Cincinnati", "The White Shadow", "Friends and Lovers", "St. Elsewhere", "Newhart", and "Hill Street Blues". It was sold in 1988 to Television South, an ITV franchise holder. The MTM logo resembles the Metro-Goldwyn-Mayer logo, but with a cat named Mimsie instead of a lion.
At age 18 in 1955, Moore married Richard Carleton Meeker, whom she described as "the boy next door", and within six weeks she was pregnant with her only child, Richard Jr. (born July 3, 1956). Meeker and Moore divorced in 1961. Moore married Grant Tinker, a CBS executive and later chairman of NBC, in 1962, and in 1970 they formed the television production company MTM Enterprises, which created and produced the company's first television series, "The Mary Tyler Moore Show". Moore and Tinker divorced in 1981.
On October 14, 1980, Moore's son Richard died at the age of 24 of an accidental gunshot to the head while handling a small .410 shotgun. The model was later taken off the market because of its "hair trigger".
Moore married Robert Levine on November 23, 1983, at the Pierre Hotel in New York City. They met when he treated Moore's mother in New York City on a weekend house call, after Moore and her mother returned from a visit to the Vatican where they had a personal audience with Pope John Paul II.
Moore was a recovering alcoholic and had been diagnosed with type 1 diabetes in 1969 after having a miscarriage. In 2011, she had surgery to remove a meningioma, a benign brain tumor. In 2014, friends reported that Moore had heart and kidney problems in addition to being nearly blind.
Moore died at the age of 80 on January 25, 2017, at Greenwich Hospital in Greenwich, Connecticut, from cardiopulmonary arrest complicated by pneumonia after having been placed on a respirator the previous week. She was interred in Oak Lawn Cemetery in Fairfield, Connecticut, during a private ceremony.
In addition to her acting work, Moore was the International Chairman of JDRF (formerly the Juvenile Diabetes Research Foundation). In this role, she used her celebrity status to help raise funds and awareness of diabetes mellitus type 1.
In 2007, in honor of Moore's dedication to the Foundation, JDRF created the "Forever Moore" research initiative which will support JDRF's Academic Research and Development and JDRF's Clinical Development Program. The program works on translating basic research advances into new treatments and technologies for those living with type 1 diabetes.
A long-time animal rights activist, Moore supported charities such as the ASPCA and Farm Sanctuary. She helped raise awareness about factory farming methods and promoted more compassionate treatment of farm animals. She was a pescetarian.
Moore appeared as herself in 1996 on an episode of the Ellen DeGeneres sitcom "Ellen". The storyline of the episode includes Moore honoring Ellen for trying to save a 65-year-old lobster from being eaten at a seafood restaurant. She was also a co-founder of Broadway Barks, an annual animal adopt-a-thon held in New York City. Moore and friend Bernadette Peters worked to make it a no-kill city and to encourage adopting animals from shelters.
In honor of her father, George Tyler Moore, a lifelong American Civil War enthusiast, in 1995 Moore donated funds to acquire a historic structure in Shepherdstown, West Virginia, for Shepherd College (now Shepherd University) to be used as a center for Civil War studies. The center, named the George Tyler Moore Center for the Study of the Civil War, is housed in the historic Conrad Shindler House (c. 1795), which is named in honor of her great-great-great-grandfather, who owned the structure from 1815 to 1852.
Moore also contributed to the renovation of a historic house in Winchester, Virginia, that had been used as headquarters by Confederate Major General Thomas J. "Stonewall" Jackson during his Shenandoah Valley campaign in 1861–62. The house, now known as the Stonewall Jackson's Headquarters Museum, had been owned by Moore's great-grandfather, Lieutenant Colonel Lewis Tilghman Moore, commander of the 4th Virginia Infantry in Jackson's Stonewall Brigade.
During the 1960s and 1970s, Moore had a reputation as a liberal or moderate, although she endorsed President Richard Nixon for re-election in 1972. She endorsed President Jimmy Carter for re-election in a 1980 campaign television ad. In 2011, her friend and former co-star Ed Asner said during an interview on "The O'Reilly Factor" that Moore "has become much more conservative of late". Bill O'Reilly, host of that program, stated that Moore had been a viewer of his show and that her political views had leaned conservative in recent years. In a "Parade" magazine article from March 22, 2009, Moore identified herself as a libertarian centrist who watched Fox News. She stated: "...when one looks at what's happened to television, there are so few shows that interest me. I do watch a lot of Fox News. I like Charles Krauthammer and Bill O'Reilly... If McCain had asked me to campaign for him, I would have." In an interview for the 2013 PBS series "Pioneers of Television," Moore said that she was recruited to join the feminist movement of the 1970s by Gloria Steinem but did not agree with Steinem's views. Moore said she believed that women have an important role in raising children and that she did not believe in Steinem's view that women owe it to themselves to have a career.
In February 1981, Moore was nominated for the Academy Award for Best Actress for her role in the drama film "Ordinary People" but lost to Sissy Spacek for her role in "Coal Miner's Daughter". In 1981 she won the Golden Globe Award for Best Actress in a Drama for that role.
Moore received a total of seven Emmy Awards, including three for portraying Mary Richards on "The Mary Tyler Moore Show" and one for her portrayal of Laura Petrie.
On Broadway, Moore received a special Tony Award for her performance in "Whose Life Is It Anyway?" in 1980, and was nominated for a Drama Desk Award as well. In addition, as a producer, she received nominations for Tony Awards and Drama Desk Awards for MTM's productions of "Noises Off" in 1984 and "Benefactors" in 1986, and won a Tony Award for Best Revival of a Play or Musical in 1985 for "Joe Egg".
In 1986, she was inducted into the Television Hall of Fame. In 1987, she received a Lifetime Achievement Award in Comedy from the American Comedy Awards.
Moore's contributions to the television industry were recognized in 1992 with a star on the Hollywood Walk of Fame. The star is located at 7021 Hollywood Boulevard.
On May 8, 2002, Moore was present when cable network TV Land and the City of Minneapolis dedicated a statue in downtown Minneapolis of the television character she made famous on "The Mary Tyler Moore Show". The statue, by artist Gwendolyn Gillen, was chosen from designs submitted by 21 sculptors. The bronze sculpture was located in front of the Dayton's department store – now Macy's – near the corner of 7th Street South and Nicollet Mall. It depicts the iconic moment in the show's opening credits where Moore tosses her Tam o' Shanter in the air, in a freeze-frame at the end of the montage. While Dayton's is clearly seen in the opening sequence, the store in the background of the hat toss is actually Donaldson's, which was, like Dayton's, a locally based department store with a long history at 7th and Nicollet. In late 2015, the statue was relocated to the city's visitor center during renovations; it was reinstalled in its original location in 2017.
Moore received the Screen Actors Guild Life Achievement Award in 2011. In New York City in 2012, Moore and Bernadette Peters were honored by the Ride of Fame, and a double-decker bus was dedicated to them and their charity work on behalf of "Broadway Barks", which the duo co-founded.
|
https://en.wikipedia.org/wiki?curid=20833
|
Mason Remey
Charles Mason Remey (May 15, 1874 – February 4, 1974) was a prominent and controversial American Baháʼí who was appointed in 1951 a Hand of the Cause, and president of the International Baháʼí Council. He was the architect for the Baháʼí Houses of Worship in Uganda and Australia, and Shoghi Effendi approved his design of the unbuilt House of Worship in Haifa, Israel.
Shoghi Effendi died in 1957 without explicitly appointing a successor Guardian, and Remey was among the nine Hands of the Cause elected as an interim authority until the election of the first Universal House of Justice in 1963. However, in 1960 Remey declared himself to be the successor of Shoghi Effendi and expected the allegiance of the world's Baháʼís. Subsequently, he and his followers were declared Covenant-breakers by the Hands, who reasoned that he did not fulfill the qualifications set forth by ʻAbdu'l-Bahá in his Will and Testament: Remey lacked a formal appointment from Shoghi Effendi, an appointment that would have needed confirmation by the rest of the Hands, and the office was confined to male descendants of Baháʼu'lláh, the Aghsan, which Remey was not. Almost the whole Baháʼí world rejected his claim, but he gained the support of a small group of Baháʼís. His claim resulted in the largest schism in the history of the Baháʼí Faith, with a few groups still holding the belief that Remey was the successor of Shoghi Effendi. Various dated references show membership at less than a hundred each in two of the surviving groups.
Born in Burlington, Iowa, on May 15, 1874, Mason was the eldest son of Rear Admiral George Collier Remey and Mary Josephine Mason Remey, the daughter of Charles Mason, the first Chief Justice of Iowa. Remey's parents raised him in the Episcopal Church. Remey trained as an architect at Cornell University (1893–1896), and the École des Beaux-Arts in Paris, France (1896–1903) where he first learned of the Baháʼí Faith.
With a background in architecture, Remey was asked to design the Ugandan and Australian Baháʼí Houses of Worship which still stand today and are the mother temples for Australasia and Africa respectively. Upon the request of Shoghi Effendi, he also provided designs for a Baháʼí House of Worship in Tehran, for Haifa, and the Shrine of ʻAbdu'l-Bahá, however only the Haifa temple was approved before the death of Shoghi Effendi, and none have so far been built.
Remey traveled extensively to promote the Baháʼí Faith during the ministry of ʻAbdu'l-Bahá. Shoghi Effendi recorded that Remey and his Baháʼí companion, Howard Struven, were the first Baháʼís to circle the globe teaching the religion.
A prolific writer, Remey wrote numerous published and personal articles promoting the Baháʼí Faith, including "ʻAbdu'l-Bahá – The Center of the Covenant" and the five volume "A Comprehensive History of the Baháʼí Movement (1927)", "The Baháʼí Revelation and Reconstruction" (1919), "Constructive Principles of The Baháʼí Movement" (1917), and "The Baháʼí Movement: A Series of Nineteen Papers" (1912) are a few of the titles of the many works Remey produced while ʻAbdu'l-Bahá was still alive. Remey's life was recorded in his diaries, and in 1940 he provided copies and selected writings to several public libraries. Included in most of the collections were the letters ʻAbdu'l-Bahá wrote to him.
According to Juliet Thompson's diary, ʻAbdu'l-Bahá suggested that she marry Remey, and in 1909 asked her how she felt about it, reportedly requesting of her: "Give my greatest love to Mr. Remey and say: You are very dear to me. You are so dear to me that I think of you day and night. You are my real son. Therefore I have an idea for you. I hope it may come to pass ... He told me He loved Mason Remey so much," Thompson writes, "and He loved me so much that he wished us to marry. That was the meaning of His message to Mason. He said it would be a perfect union and good for the Cause. Then he asked me how I felt about it." They did not marry, although Thompson anguished over her decision, which she felt would disappoint ʻAbdu'l-Bahá. In 1932 Remey married Gertrude Heim Klemm Mason (1887–1933), who died the following year.
Remey lived for some time in Washington, D.C., in the 1930s and 1940s. In 1950 Remey moved his residence from Washington, D.C., to Haifa, Israel, at the request of Shoghi Effendi. In January 1951, Shoghi Effendi issued a proclamation announcing the formation of the International Baháʼí Council (IBC), representing the first international Baháʼí body. The council was intended to be a forerunner to the Universal House of Justice, the supreme ruling body of the Baháʼí Faith. Remey was appointed president of the council in March, with Amelia Collins as vice-president, then in December 1951 Remey was appointed a Hand of the Cause.
When Shoghi Effendi died in 1957, Remey and the other Hands of the Cause met in a private conclave at Bahjí, near Acre, and determined that he had not appointed a successor. During this conclave the Hands of the Cause decided that the Guardian's having died without being able to appoint a successor was a situation not dealt with in the texts that define the Baháʼí administration, and that it would need to be reviewed and adjudicated by the Universal House of Justice, which had not yet been elected. Remey signed a unanimous declaration of the Hands that Shoghi Effendi had died "without having appointed his successor".
Three years later, in 1960, Remey made a written announcement that his appointment as president of the international council represented an appointment by Shoghi Effendi as Guardian, because the appointed council was a precursor to the elected Universal House of Justice, which has the Guardian as its president.
He also attempted to usurp the control of the Faith which the Hands had themselves assumed at the passing of Shoghi Effendi stating:
It is from and through the Guardianship that infallibility is vested and that the Hands of the Faith receive their orders ... I now command the Hands of the Faith to stop all of their preparations for 1963, and furthermore I command all believers both as individual Baháʼís and as assemblies of Baháʼís to immediately cease cooperating with and giving support to this fallacious program for 1963.
He claimed to believe that the Guardianship was an institution intended to endure forever, and that he was the second Guardian by virtue of his appointment to the IBC. Almost the whole Baháʼí world rejected his claim, although he gained the support of a small group of Baháʼís. Among the most notable exceptions were several members of the French National Spiritual Assembly, led by Joel Marangella, who elected to support Remey; the Assembly was consequently disbanded by the Hands. The remaining 26 Hands of the Cause, after several efforts at reconciliation with Remey, unanimously declared him a Covenant-breaker and expelled him from the Baháʼí community. Remey himself declared that being the Guardian gave him the exclusive right to declare who was or was not a Covenant-breaker, and that those who opposed him and followed the Hands of the Cause were Covenant-breakers.
Among the Baháʼís who accepted Mason Remey as the second Guardian of the Cause of God, several further divisions occurred based on opinions of legitimacy and the proper succession of authority. Most of his long-term followers were Americans, who distinguished themselves as "Baháʼís Under the Hereditary Guardianship". In his later years, Remey made confused and contradictory appointments of a successor, and when he died his followers split into rival factions, each believing in a different Guardian. Donald Harvey (d. 1991) was appointed by Remey as "Third Guardian" in 1967. Joel Marangella was president of Remey's "Second International Baháʼí Council" and claimed in 1969 to have been secretly appointed by Remey as Guardian several years earlier; his followers are now known as Orthodox Baháʼís. Another of Remey's followers, Leland Jensen (d. 1996), who made several religious claims of his own, formed a sect known as the Baháʼís Under the Provisions of the Covenant following Remey's death; he believed that Remey was the adopted son of ʻAbdu'l-Bahá, and that Remey's adopted son Joseph Pepe was the third Guardian.
Those who accept Mason Remey as the second Guardian do not accept the Universal House of Justice established in 1963.
From 1937 to 1958, Remey spent most of his fortune designing and building an underground mausoleum in Virginia, called the "Remeum", as a memorial to his family. It was replete with statues, tombs, alcoves, and bas-reliefs depicting the lives of saints, and had electric chandeliers, ventilation, and plumbing. Never finished because of legal issues, the complex was frequently vandalized over the years.
On February 4, 1974, Mason Remey died at the age of 99.
|
https://en.wikipedia.org/wiki?curid=20835
|
Minimalism
In visual arts, music, and other mediums, minimalism is an art movement that began in post–World War II Western art, most strongly with American visual arts in the 1960s and early 1970s. Prominent artists associated with minimalism include Donald Judd, Agnes Martin, Dan Flavin, Carl Andre, Robert Morris, Anne Truitt, and Frank Stella. The movement is often interpreted as a reaction against abstract expressionism and modernism; it anticipated contemporary postminimal art practices, which extend or reflect on minimalism's original objectives.
Minimalism in music often features repetition and gradual variation, such as the works of La Monte Young, Terry Riley, Steve Reich, Philip Glass, Julius Eastman, and John Adams. The term "minimalist" often colloquially refers to anything that is spare or stripped to its essentials. It has accordingly been used to describe the plays and novels of Samuel Beckett, the films of Robert Bresson, the stories of Raymond Carver, and the automobile designs of Colin Chapman.
Minimalism in visual art, generally referred to as "minimal art", "literalist art", and "ABC Art", emerged in New York in the early 1960s as new and older artists moved toward geometric abstraction: in painting, in the cases of Frank Stella, Kenneth Noland, Al Held, Ellsworth Kelly, Robert Ryman, and others; and in sculpture, in the works of artists including David Smith, Anthony Caro, Tony Smith, Sol LeWitt, Carl Andre, Dan Flavin, Donald Judd, and others. Judd's sculpture was showcased in 1964 at Green Gallery in Manhattan, as were Flavin's first fluorescent light works, while other leading Manhattan galleries, such as Leo Castelli Gallery and Pace Gallery, also began to showcase artists focused on geometric abstraction. In addition, there were two seminal and influential museum exhibitions: "Primary Structures: Younger American and British Sculpture", shown from April 27 to June 12, 1966, at the Jewish Museum in New York and organized by the museum's Curator of Painting and Sculpture, Kynaston McShine; and "Systemic Painting" at the Solomon R. Guggenheim Museum, curated by Lawrence Alloway, also in 1966, which showcased geometric abstraction in the American art world via shaped canvas, color field, and hard-edge painting. In the wake of those exhibitions and a few others, the art movement called "minimal art" emerged.
In a more broad and general sense, one finds European roots of minimalism in the geometric abstractions of painters associated with the Bauhaus, in the works of Kazimir Malevich, Piet Mondrian and other artists associated with the De Stijl movement, and the Russian Constructivist movement, and in the work of the Romanian sculptor Constantin Brâncuși.
In France between 1947 and 1948, Yves Klein conceived his "Monotone Symphony" (1949, formally "The Monotone-Silence Symphony") that consisted of a single 20-minute sustained chord followed by a 20-minute silence – a precedent to both La Monte Young's drone music and John Cage's "4′33″". Klein had painted monochromes as early as 1949, and held the first private exhibition of this work in 1950—but his first public showing was the publication of the Artist's book "" in November 1954.
Minimal art is also inspired in part by the paintings of Barnett Newman, Ad Reinhardt, Josef Albers, and the works of artists as diverse as Pablo Picasso, Marcel Duchamp, Giorgio Morandi, and others. Minimalism was also a reaction against the painterly subjectivity of Abstract Expressionism that had been dominant in the New York School during the 1940s and 1950s.
As artist and critic Thomas Lawson noted in his 1981 Artforum essay "Last Exit: Painting", minimalism did not reject Clement Greenberg's claims about modernist painting's reduction to surface and materials so much as take his claims literally. According to Lawson, minimalism was the result, even though the term "minimalism" was not generally embraced by the artists associated with it, and many practitioners of art designated minimalist by critics did not identify it as a movement as such. Clement Greenberg himself also took exception to this claim; in his 1978 postscript to his essay "Modernist Painting" he disavowed this interpretation of what he said, writing:
There have been some further constructions of what I wrote that go over into preposterousness: I regard flatness and the inclosing of flatness not just as the limiting conditions of pictorial art, but as criteria of aesthetic quality in pictorial art; that the further a work advances the self-definition of an art, the better that work is bound to be. The philosopher or art historian who can envision me—or anyone at all—arriving at aesthetic judgments in this way reads shockingly more into himself or herself than into my article.
In contrast to the previous decade's more subjective Abstract Expressionists (with the exceptions of Barnett Newman and Ad Reinhardt), minimalists were also influenced by composers John Cage and La Monte Young, poet William Carlos Williams, and the landscape architect Frederick Law Olmsted. They very explicitly stated that their art was not about self-expression; unlike the previous decade's more subjective philosophy of art-making, theirs was 'objective'. In general, minimalism's features included geometric, often cubic forms purged of much metaphor, equality of parts, repetition, neutral surfaces, and industrial materials.
Robert Morris, a theorist and artist, wrote a three-part essay, "Notes on Sculpture 1–3", originally published across three issues of "Artforum" in 1966. In these essays, Morris attempted to define a conceptual framework and formal elements for himself, one that would also embrace the practices of his contemporaries. These essays paid great attention to the idea of the gestalt – "parts... bound together in such a way that they create a maximum resistance to perceptual separation." Morris later described an art represented by a "marked lateral spread and no regularized units or symmetrical intervals..." in "Notes on Sculpture 4: Beyond Objects", originally published in "Artforum" in 1969, continuing on to say that "indeterminacy of arrangement of parts is a literal aspect of the physical existence of the thing." The general shift in theory of which this essay is an expression suggests the transition into what would later be referred to as "postminimalism".
One of the first artists specifically associated with minimalism was the painter Frank Stella, four of whose early "black paintings" were included in the 1959 show, "," organized by Dorothy Miller at the Museum of Modern Art in New York. The width of the stripes in Frank Stella's black paintings was often determined by the dimensions of the lumber he used for stretchers to support the canvas, visible against the canvas as the depth of the painting when viewed from the side. Stella's decisions about structures on the front surface of the canvas were therefore not entirely subjective, but pre-conditioned by a "given" feature of the physical construction of the support. In the show catalog, Carl Andre noted, "Art excludes the unnecessary. Frank Stella has found it necessary to paint stripes. There is nothing else in his painting." These reductive works were in sharp contrast to the energy-filled and apparently highly subjective and emotionally charged paintings of Willem de Kooning or Franz Kline and, in terms of precedent among the previous generation of abstract expressionists, leaned more toward the less gestural, often somber, color field paintings of Barnett Newman and Mark Rothko. Stella received immediate attention from the MoMA show, but other artists—including Kenneth Noland, Gene Davis, Robert Motherwell, and Robert Ryman—had also begun to explore stripes, monochromatic, and hard-edge formats from the late 1950s through the 1960s.
Because of a tendency in minimal art to exclude the pictorial, illusionistic and fictive in favor of the literal, there was a movement away from painterly and toward sculptural concerns. Donald Judd had started as a painter, and ended as a creator of objects. His seminal essay, "Specific Objects" (published in Arts Yearbook 8, 1965), was a touchstone of theory for the formation of minimalist aesthetics. In this essay, Judd found a starting point for a new territory for American art, and a simultaneous rejection of residual inherited European artistic values. He pointed to evidence of this development in the works of an array of artists active in New York at the time, including Jasper Johns, Dan Flavin and Lee Bontecou. Of "preliminary" importance for Judd was the work of George Earl Ortman, who had concretized and distilled painting's forms into blunt, tough, philosophically charged geometries. These Specific Objects inhabited a space not then comfortably classifiable as either painting or sculpture. That the categorical identity of such objects was itself in question, and that they avoided easy association with well-worn and over-familiar conventions, was a part of their value for Judd.
This movement was criticized by modernist formalist art critics and historians. Some critics thought minimal art represented a misunderstanding of the modern dialectic of painting and sculpture as defined by critic Clement Greenberg, arguably the dominant American critic of painting in the period leading up to the 1960s. The most notable critique of minimalism was produced by Michael Fried, a formalist critic, who objected to the work on the basis of its "theatricality". In "Art and Objecthood" (published in Artforum in June 1967) he declared that the minimal work of art, particularly minimal sculpture, was based on an engagement with the physicality of the spectator. He argued that work like Robert Morris's transformed the act of viewing into a type of spectacle, in which the artifice of the act of observation and the viewer's participation in the work were unveiled. Fried saw this displacement of the viewer's experience from an aesthetic engagement within the artwork to an event outside of it as a failure of minimal art. Fried's essay was immediately challenged by postminimalist and earth artist Robert Smithson in a letter to the editor in the October issue of Artforum. Smithson stated the following: "What Fried fears most is the consciousness of what he is doing—namely being himself theatrical."
In addition to the already mentioned Robert Morris, Frank Stella, Carl Andre, Robert Ryman and Donald Judd other minimal artists include: Robert Mangold, Larry Bell, Dan Flavin, Sol LeWitt, Charles Hinman, Ronald Bladen, Paul Mogensen, Ronald Davis, David Novros, Brice Marden, Blinky Palermo, Agnes Martin, Jo Baer, John McCracken, Ad Reinhardt, Fred Sandback, Richard Serra, Tony Smith, Patricia Johanson, and Anne Truitt.
Ad Reinhardt, actually an artist of the Abstract Expressionist generation, but one whose reductive nearly all-black paintings seemed to anticipate minimalism, had this to say about the value of a reductive approach to art:
"The more stuff in it, the busier the work of art, the worse it is. More is less. Less is more. The eye is a menace to clear sight. The laying bare of oneself is obscene. Art begins with the getting rid of nature."
Reinhardt's remark directly addresses and contradicts Hans Hofmann's regard for nature as the source of his own abstract expressionist paintings. A famous exchange between Hofmann and Jackson Pollock, as told by Lee Krasner in a November 2, 1964, interview with Dorothy Strickler for the Smithsonian Institution's Archives of American Art, illustrates the point. In Krasner's words:
When I brought Hofmann up to meet Pollock and see his work which was before we moved here, Hofmann's reaction was—one of the questions he asked Jackson was, "Do you work from nature?" There were no still lifes around or models around and Jackson's answer was, "I am nature." And Hofmann's reply was, "Ah, but if you work by heart, you will repeat yourself." To which Jackson did not reply at all. The meeting between Pollock and Hofmann took place in 1942.
In software and user interface design, minimalism describes the use of fewer design elements, flat design, fewer options and features, and generally less occupied screen space.
One example is the user interface of the Samsung Galaxy S6, where many options and items from menus and settings were pruned.
The update to Android Lollipop removed the shortcuts to "Silent", "Vibration only", and "Sound on" in the stand-by menu.
The Android Lollipop update (late 2014 to 2015), applied to both stock Android and Samsung TouchWiz devices, changed the appearance of the user interface, especially the settings menus, in which the use of icons, border lines, edges, and contrast elements was reduced to a minimum.
Furthermore, the remaining icons have become less skeuomorphic and more abstract, in keeping with the flat design language.
The density of the elements on the user interface has decreased. There is more whitespace, or unoccupied screen space.
Similar changes happened with the update from iOS 6 to iOS 7.
Furthermore, in 2014, the icons from context menus of Samsung's TouchWiz applications (Samsung Gallery, "S Browser", telephone app, etc.) were pruned.
Prior to Samsung's TouchWiz Nature UX 3.0, menu options that were currently unavailable (e.g. "Search for text in page" in the Internet browser during a page load) were shown but grayed out, which had the advantage of informing the user that the option existed but could not be used. Since then, unavailable options have been hidden completely, which makes the context menu occupy less screen space but may keep the user from realizing immediately that the feature is merely unavailable rather than absent.
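The tradeoff between the two menu strategies can be sketched as follows. This is an illustrative model only, not Samsung's actual API; the function and option names are hypothetical.

```python
# Two ways a context menu can treat currently unavailable options:
# gray them out (older TouchWiz behavior) or hide them (newer behavior).

def menu_graying(options):
    """Show every option; mark unavailable ones as disabled."""
    return [(label, "enabled" if available else "grayed out")
            for label, available in options]

def menu_hiding(options):
    """Show only the available options; the rest disappear entirely."""
    return [(label, "enabled") for label, available in options if available]

options = [("Reload", True), ("Search for text in page", False)]

print(menu_graying(options))  # both entries listed, one grayed out
print(menu_hiding(options))   # only "Reload" remains
```

The graying variant preserves discoverability at the cost of screen space; the hiding variant reverses that tradeoff, which is the design shift described above.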
In a trend started by the Safari browser for iOS and adopted by Samsung's "S Browser", some browsers show only the domain name instead of the full URL, even when there is spare space in the URL bar.
In June 2015, the layout of Instagram's website was fully redesigned to resemble the mobile application and mobile website, pruning many user interface elements, for example, the "slideshow banner".
Another example is the Skype redesign, in which many icons from context menus were removed, color gradients were replaced with flat colors, and the density of user interface elements was decreased.
The term minimalism is also used to describe a trend in design and architecture, wherein the subject is reduced to its necessary elements. Minimalist architectural designers focus on the connection between two perfect planes, elegant lighting, and the void spaces left by the removal of three-dimensional shapes in an architectural design.
Minimalistic design has been highly influenced by Japanese traditional design and architecture. The works of De Stijl artists are a major reference: De Stijl expanded the ideas of expression by meticulously organizing basic elements such as lines and planes. With regard to home design, more attractive "minimalistic" designs are not truly minimalistic because they are larger, and use more expensive building materials and finishes.
There are observers who describe the emergence of minimalism as a response to the brashness and chaos of urban life. In Japan, for example, minimalist architecture began to gain traction in the 1980s when its cities experienced rapid expansion and a booming population. The design was considered an antidote to the "overpowering presence of traffic, advertising, jumbled building scales, and imposing roadways." The chaotic environment was driven not only by urbanization, industrialization, and technology but also by the Japanese experience of constantly having to demolish structures on account of the destruction wrought by World War II and by earthquakes and the calamities they entail, such as fire. The minimalist design philosophy did not arrive in Japan by way of another country, as it was already part of Japanese culture, rooted in Zen philosophy. There are those who specifically attribute the design movement to Japan's spirituality and view of nature.
Architect Ludwig Mies van der Rohe (1886–1969) adopted the motto "Less is more" to describe his aesthetic. His tactic was one of arranging the necessary components of a building to create an impression of extreme simplicity—he enlisted every element and detail to serve multiple visual and functional purposes; for example, designing a floor to also serve as the radiator, or a massive fireplace to also house the bathroom. Designer Buckminster Fuller (1895–1983) adopted the engineer's goal of "Doing more with less", but his concerns were oriented toward technology and engineering rather than aesthetics.
Luis Barragán is an exemplary modern minimalist designer. Other contemporary minimalist architects include Kazuyo Sejima, John Pawson, Eduardo Souto de Moura, Álvaro Siza Vieira, Tadao Ando, Alberto Campo Baeza, Yoshio Taniguchi, Peter Zumthor, Hugh Newell Jacobsen, Vincent Van Duysen, Claudio Silvestrin, Michael Gabellini, and Richard Gluckman.
Minimalist architecture became popular in the late 1980s in London and New York, where architects and fashion designers worked together in the boutiques to achieve simplicity, using white elements, cold lighting, large space with minimum objects and furniture.
The concept of minimalist architecture is to strip everything down to its essential quality and achieve simplicity. The idea is not to be completely without ornamentation, but to reduce all parts, details, and joinery to a stage where nothing further can be removed to improve the design.
The considerations for 'essences' are light, form, detail of material, space, place, and the human condition. Minimalist architects consider not only the physical qualities of the building but also the spiritual dimension and the invisible, by listening to the figure and paying attention to details, people, space, nature, and materials, believing this reveals the abstract quality of something that is invisible and aids the search for the essence of those invisible qualities, such as natural light, sky, earth, and air. In addition, they "open a dialogue" with the surrounding environment to decide on the most essential materials for the construction and create relationships between buildings and sites.
In minimalist architecture, design elements strive to convey the message of simplicity. The basic geometric forms, elements without decoration, simple materials and the repetitions of structures represent a sense of order and essential quality. The movement of natural light in buildings reveals simple and clean spaces. In the late 19th century as the arts and crafts movement became popular in Britain, people valued the attitude of 'truth to materials' with respect to the profound and innate characteristics of materials. Minimalist architects humbly 'listen to figure,' seeking essence and simplicity by rediscovering the valuable qualities in simple and common materials.
The idea of simplicity appears in many cultures, especially in the Japanese traditional culture of Zen philosophy. The Japanese translate Zen culture into aesthetic and design elements for their buildings. This idea of architecture has influenced Western society, especially in America since the mid-18th century. Moreover, it inspired minimalist architecture in the 19th century.
Zen concepts of simplicity transmit the ideas of freedom and essence of living. Simplicity is not only aesthetic value, it has a moral perception that looks into the nature of truth and reveals the inner qualities and essence of materials and objects. For example, the sand garden in Ryoanji temple demonstrates the concepts of simplicity and the essentiality from the considered setting of a few stones and a huge empty space.
The Japanese aesthetic principle of Ma refers to empty or open space. It removes all the unnecessary internal walls and opens up the space. The emptiness of spatial arrangement reduces everything down to the most essential quality.
The Japanese aesthetic of Wabi-sabi values the quality of simple and plain objects. It appreciates the absence of unnecessary features, treasures a life in quietness and aims to reveal the innate character of materials. For example, the Japanese floral art, also known as Ikebana, has the central principle of letting the flower express itself. People cut off the branches, leaves and blossoms from the plants and only retain the essential part of the plant. This conveys the idea of essential quality and innate character in nature.
However, far from being just a spatial concept, Ma is present in all aspects of Japanese daily life, as it applies to time as well as to daily tasks.
The Japanese minimalist architect Tadao Ando conveys the Japanese traditional spirit and his own perception of nature in his works. His design concepts are materials, pure geometry and nature. He normally uses concrete or natural wood and basic structural form to achieve austerity and rays of light in space. He also sets up dialogue between the site and nature to create relationship and order with the buildings. Ando's works and the translation of Japanese aesthetic principles are highly influential on Japanese architecture.
Another Japanese minimalist architect, Kazuyo Sejima, works on her own and in conjunction with Ryue Nishizawa, as SANAA, producing iconic Japanese minimalist buildings. Credited with creating and influencing a particular genre of Japanese minimalism, Sejima's delicate, intelligent designs may use white color, thin construction sections, and transparent elements to create the phenomenal building type often associated with minimalism. Works include the New Museum (2010) in New York City, the Small House (2000) in Tokyo, and the House Surrounded by Plum Trees (2003) in Tokyo.
In the Vitra Conference Pavilion, Weil am Rhein, 1993, the concepts are to bring together the relationships between building, human movement, site, and nature, which is one main point of minimalist ideology: establishing a dialogue between the building and the site. The building uses the simple forms of circle and rectangle to contrast the filled and void space of the interior and nature. In the foyer, there is a large landscape window that looks out to the exterior. This achieves the simplicity and silence of the architecture and enhances the light, wind, time, and nature in the space.
John Pawson is a British minimalist architect; his design concepts are soul, light, and order. He believes that through reducing clutter and simplifying the interior to a point that goes beyond the idea of essential quality, there is a sense of clarity and richness of simplicity instead of emptiness. The materials in his design reveal the perception toward space, surface, and volume. Moreover, he likes to use natural materials because of their aliveness, sense of depth, and individual quality. He is also attracted by the important influences of Japanese Zen philosophy.
Calvin Klein Madison Avenue, New York, 1995–96, is a boutique that conveys Calvin Klein's ideas of fashion. John Pawson's interior design concepts for this project are to create simple, peaceful and orderly spatial arrangements. He used stone floors and white walls to achieve simplicity and harmony for space. He also emphasises reduction and eliminates the visual distortions, such as the air conditioning and lamps to achieve a sense of purity for interior.
Alberto Campo Baeza is a Spanish architect and describes his work as essential architecture. He values the concepts of light, idea and space. Light is essential and achieves the relationship between inhabitants and the building. Ideas are to meet the function and context of space, forms, and construction. Space is shaped by the minimal geometric forms to avoid decoration that is not essential.
Gasper House, Zahora, 1992, is a residence that the client wanted to be independent. High walls create the enclosed space, and the stone floors used in the house and courtyard show the continuity of interior and exterior. The white color of the walls reveals the simplicity and unity of the building. The structural features form lines that give the house its continuous horizontality, so natural light projects horizontally through the building.
Literary minimalism is characterized by an economy with words and a focus on surface description. Minimalist writers eschew adverbs and prefer allowing context to dictate meaning. Readers are expected to take an active role in creating the story, to "choose sides" based on oblique hints and innuendo, rather than react to directions from the writer.
Some 1940s-era crime fiction of writers such as James M. Cain and Jim Thompson adopted a stripped-down, matter-of-fact prose style to considerable effect; some classify this prose style as minimalism.
Another strand of literary minimalism arose in response to the metafiction trend of the 1960s and early 1970s (John Barth, Robert Coover, and William H. Gass). These writers were also sparse with prose and kept a psychological distance from their subject matter.
Minimalist writers, or those who are identified with minimalism during certain periods of their writing careers, include the following: Raymond Carver, Ann Beattie, Bret Easton Ellis, Charles Bukowski, Ernest Hemingway, K. J. Stevens, Amy Hempel, Bobbie Ann Mason, Tobias Wolff, Grace Paley, Sandra Cisneros, Mary Robison, Frederick Barthelme, Richard Ford, Patrick Holland, Cormac McCarthy, and Alicia Erian.
American poets such as Stephen Crane, William Carlos Williams, early Ezra Pound, Robert Creeley, Robert Grenier, and Aram Saroyan are sometimes identified with their "minimalist" style. The term "minimalism" is also sometimes associated with the briefest of poetic genres, haiku, which originated in Japan, but has been domesticated in English literature by poets such as Nick Virgilio, Raymond Roseliep, and George Swede.
The Irish writer Samuel Beckett is well known for his minimalist plays and prose, as is the Norwegian writer Jon Fosse.
Dimitris Lyacos's "With the People from the Bridge", combining elliptical monologues with a pared-down prose narrative, is a contemporary example of minimalist playwriting.
In his novel "The Easy Chain", Evan Dara includes a 60-page section written in the style of musical minimalism, in particular inspired by composer Steve Reich. Intending to represent the psychological state (agitation) of the novel's main character, the section's successive lines of text are built on repetitive and developing phrases.
The term "minimal music" was derived around 1970 by Michael Nyman from the concept of minimalism, which was earlier applied to the visual arts. More precisely, it was in a 1968 review in "The Spectator" that Nyman first used the term, to describe a ten-minute piano composition by the Danish composer Henning Christiansen, along with several other unnamed pieces played by Charlotte Moorman and Nam June Paik at the Institute of Contemporary Arts in London.
The term usually is associated with filmmakers such as Robert Bresson, Carl Theodor Dreyer, and Yasujirō Ozu. Their films typically tell a simple story with straightforward camera usage and minimal use of score. Paul Schrader named their kind of cinema "transcendental cinema". Abbas Kiarostami and Elia Suleiman are also considered creators of minimalistic films.
Joshua Fields Millburn and Ryan Nicodemus directed and produced a movie called "Minimalism: A Documentary" that showcased the idea of minimal living in the modern world.
To portray global warming to non-scientists, in 2018 British climate scientist Ed Hawkins developed warming stripes graphics that are deliberately devoid of scientific or technical indicia. Hawkins explained that "our visual system will do the interpretation of the stripes without us even thinking about it".
Warming stripe graphics resemble color field paintings in stripping out all distractions and using only color to convey meaning. Color field pioneer artist Barnett Newman said he was "creating images whose reality is self-evident", an ethos that Hawkins is said to have applied to the problem of climate change, leading one commentator to remark that the graphics are "fit for the Museum of Modern Art or the Getty."
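The idea can be sketched in plain Python: map each year's temperature anomaly directly to a color, with no axes, labels, or numbers (the anomaly values below are invented for illustration, not Hawkins's actual data):

```python
# Map yearly temperature anomalies to blue-to-red stripe colors.
# Hypothetical anomalies in degrees C; the real graphics use observed records.
anomalies = [-0.3, -0.2, -0.1, 0.0, 0.2, 0.5, 0.9, 1.1]

def stripe_colors(anoms):
    """Normalize anomalies to [0, 1] and map to an RGB blue-to-red ramp."""
    lo, hi = min(anoms), max(anoms)
    colors = []
    for a in anoms:
        t = (a - lo) / (hi - lo)          # 0 = coolest year, 1 = warmest
        colors.append((t, 0.0, 1.0 - t))  # blue -> red, nothing else drawn
    return colors

colors = stripe_colors(anomalies)
# the coolest year maps to pure blue, the warmest to pure red
```

Rendering each tuple as one vertical stripe reproduces the format: color alone carries the message, with every scientific or technical indicium stripped away.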
https://en.wikipedia.org/wiki?curid=20836
Ignition magneto
An ignition magneto, or high tension magneto, is a magneto that provides current for the ignition system of a spark-ignition engine, such as a petrol engine. It produces pulses of high voltage for the spark plugs. The older term "tension" means "voltage".
The use of ignition magnetos is now confined mainly to engines where there is no other available electrical supply, for example in lawnmowers and chainsaws. It is also widely used in aviation piston engines even though an electrical supply is usually available. In this case the magneto's self-powered operation is considered to offer increased reliability; in theory the magneto should continue operation as long as the engine is turning.
Firing the gap of a spark plug, particularly in the combustion chamber of a high-compression engine, requires a greater voltage (or "higher tension") than can be achieved by a simple magneto. The "high-tension magneto" combines an alternating current magneto generator and a transformer. A high current at low voltage is generated by the magneto, then transformed to a high voltage (even though this is now a far smaller current) by the transformer.
The first person to develop the idea of a high-tension magneto was Andre Boudeville, but his design omitted a condenser (capacitor); Frederick Richard Simms, in partnership with Robert Bosch, was the first to develop a practical high-tension magneto.
Magneto ignition was introduced on the 1899 Daimler Phönix. This was followed by Benz, Mors, Turcat-Mery, and Nesseldorf, and soon was used on most cars up until about 1918 in both low voltage (voltage for secondary coils to fire the spark plugs) and high voltage magnetos (to fire the spark plug directly, similar to coil ignitions, introduced by Bosch in 1903).
In the type known as a "shuttle magneto", the engine rotates a coil of wire between the poles of a magnet. In the "inductor magneto", the magnet is rotated and the coil remains stationary.
As the magnet moves with respect to the coil, the magnetic flux linkage of the coil changes. This induces an EMF in the coil, which in turn causes a current to flow. One or more times per revolution, just as the magnet pole moves away from the coil and the magnetic flux begins to decrease, a cam opens the contact breaker (called “the points” in reference to the two points of a circuit breaker) and interrupts the current. This causes the electromagnetic field in the primary coil to collapse rapidly. As the field collapses rapidly there is a large voltage induced (as described by Faraday's Law) across the primary coil.
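The size of that induced voltage can be estimated from the inductor form of Faraday's law, V = L · di/dt. A minimal sketch with assumed component values (real magneto figures vary widely):

```python
# Estimate the primary-coil voltage spike when the points open.
# V = L * di/dt for an inductor; all values below are assumptions.
L_primary = 5e-3   # primary inductance, henries (assumed)
i_break = 3.0      # current flowing when the points open, amperes (assumed)
t_open = 50e-6     # time for the current to collapse, seconds (assumed)

v_spike = L_primary * i_break / t_open
print(f"primary voltage spike ~ {v_spike:.0f} V")  # roughly 300 V here
```

The faster the field collapses (smaller `t_open`), the larger the spike, which is why interrupting the current abruptly at the points is the key to the design.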
As the points begin to open, point spacing is initially such that the voltage across the primary coil would arc across the points. A capacitor is placed across the points which absorbs the energy stored in the leakage inductance of the primary coil, and slows the rise time of the primary winding voltage to allow the points to open fully. The capacitor's function is similar to that of a snubber as found in a flyback converter.
A second coil, with many more turns than the primary, is wound on the same iron core to form an electrical transformer. The ratio of turns in the secondary winding to the number of turns in the primary winding, is called the "turns ratio". Voltage across the primary coil results in a proportional voltage being induced across the secondary winding of the coil. The turns ratio between the primary and secondary coil is selected so that the voltage across the secondary reaches a very high value, enough to arc across the gap of the spark plug. As the voltage of the primary winding rises to several hundred volts, the voltage on the secondary winding rises to several tens of thousands of volts, since the secondary winding typically has 100 times as many turns as the primary winding.
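The step-up is just the transformer turns-ratio relation, V_secondary = V_primary × (N_secondary / N_primary). A quick sketch using the roughly 100:1 ratio mentioned above (the absolute turn counts are assumptions):

```python
# Transformer step-up: secondary voltage scales with the turns ratio.
n_primary = 200      # primary turns (assumed)
n_secondary = 20000  # secondary turns, ~100x the primary, per the text
v_primary = 300.0    # volts across the primary as the field collapses (assumed)

turns_ratio = n_secondary / n_primary
v_secondary = v_primary * turns_ratio
print(f"secondary ~ {v_secondary:.0f} V")  # 300 V * 100 = 30,000 V
```

A few hundred volts on the primary thus becomes tens of thousands of volts on the secondary, enough to arc across the spark plug gap.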
The capacitor and the coil together form a resonant circuit which allows the energy to oscillate from the capacitor to the coil and back again. Due to the inevitable losses in the system, this oscillation decays fairly rapidly. This dissipates the energy that was stored in the condenser in time for the next closure of the points, leaving the condenser discharged and ready to repeat the cycle.
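The ringing described here is the natural oscillation of an LC circuit, with frequency f = 1/(2π√(LC)). A small sketch with assumed component values:

```python
import math

# Resonant frequency of the primary coil plus the points capacitor.
# f = 1 / (2 * pi * sqrt(L * C)); component values are assumed.
L = 5e-3     # primary inductance, henries (assumed)
C = 0.22e-6  # points capacitor ("condenser"), farads (assumed)

f_resonant = 1.0 / (2.0 * math.pi * math.sqrt(L * C))
print(f"ringing frequency ~ {f_resonant / 1000:.1f} kHz")
```

With these values the oscillation rings at a few kilohertz; circuit losses damp it out well before the points close again, leaving the condenser discharged for the next cycle as described above.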
On more advanced magnetos the cam ring can be rotated by an external linkage to alter the ignition timing.
In a modern installation, the magneto has only a single low-tension winding, which is connected to an external ignition coil that has not only a low-tension winding but also a secondary winding of many thousands of turns to deliver the high voltage required for the spark plug(s). Such a system is known as an "energy transfer" ignition system. Initially this was done because it was easier to provide good insulation for the secondary winding of an external coil than for a coil buried in the construction of the magneto (early magnetos had the coil assembly external to the rotating parts to make it easier to insulate, at the expense of efficiency). In more modern times, insulation materials have improved to the point where constructing self-contained magnetos is relatively easy, but energy transfer systems are still used where the ultimate in reliability is required, such as in aviation engines.
Because it requires no battery or other source of electrical energy, the magneto is a compact and reliable self-contained ignition system, which is why it remains in use in many general aviation applications.
Since the beginning of World War I in 1914, magneto-equipped aircraft engines have typically been "dual-plugged", whereby each cylinder has two spark plugs, with each plug having a separate magneto system. Dual plugs provide both redundancy should a magneto fail, and better engine performance (through enhanced combustion). Twin sparks provide two flame fronts within the cylinder, these two flame fronts decreasing the time needed for the fuel charge to burn. As the size of the combustion chamber determines the time to burn the fuel charge, dual ignition was especially important for the large-bore aircraft engines around World War II where it was necessary to combust the entire fuel mixture in a shorter time than a single plug could provide, in order to build peak cylinder pressure at the rpm desired.
Because the magneto has low voltage output at low speed, starting an engine is more difficult. Therefore, some magnetos have an impulse coupling, a springlike mechanical linkage between the engine and magneto drive shaft which "winds up" and "lets go" at the proper moment for spinning the magneto shaft. The impulse coupling uses a spring, a hub cam with flyweights, and a shell. The hub of the magneto rotates while the drive shaft is held stationary, and the spring tension builds up. When the magneto is supposed to fire, the flyweights are released by the action of the body contacting the trigger ramp. This allows the spring to unwind giving the rotating magnet a rapid rotation and letting the magneto spin at such a speed to produce a spark.
Some aviation engines as well as some early luxury cars have had dual-plugged systems with one set of plugs fired by a magneto, and the other set wired to a coil, dynamo, and battery circuit. This was often done to ease engine starting, as larger engines may be too difficult to crank at sufficient speed to operate a magneto, even with an impulse coupling. As the reliability of battery ignition systems improved, the magneto fell out of favour for general automotive use, but may still be found in sport or racing engines.
https://en.wikipedia.org/wiki?curid=20837
Mystery Science Theater 3000
Mystery Science Theater 3000 (abbreviated as MST3K) is an American television comedy series created by Joel Hodgson and currently produced by Alternaversal Productions, LLC. The show premiered on KTMA-TV (now WUCW) in Minneapolis, Minnesota, on November 24, 1988. It later aired on The Comedy Channel/Comedy Central for seven seasons until its cancellation in 1996. Thereafter, it was picked up by The Sci-Fi Channel and aired for three seasons until another cancellation in August 1999. A 60-episode syndication package titled "The Mystery Science Theater Hour" was produced in 1993 and syndicated to stations in 1995. In 2015, Hodgson led a crowdfunded revival of the series with 14 episodes in its eleventh season, first released on Netflix on April 14, 2017, with another six-episode season following on November 22, 2018. In all, 217 episodes and a feature film have been produced, as well as two live tours.
The show initially starred Hodgson as Joel Robinson, a janitor trapped by two mad scientists ("The Mads") on the Earth-orbiting "Satellite of Love", and forced to watch a series of B movies in order to monitor his reaction to them. To keep his sanity, Joel crafts sentient robot companions, including Tom Servo, Crow T. Robot, and Gypsy, to keep him company and help him humorously comment on each movie as it plays, a process known as riffing. Each two-hour episode would feature a single movie (often edited for time constraints), sometimes preceded by various old shorts and educational films, with Joel, Tom, and Crow watching in silhouette from a row of theater seats at the bottom of the screen. These "theater segments" were framed with interstitial sketches called "host segments". The show's cast changed over its duration; most notably, the character of Joel was replaced by Mike Nelson (played by Michael J. Nelson) halfway through the show's fifth season. Other cast members, most of whom were also writers for the show, include Trace Beaulieu, Josh Weinstein, Jim Mallon, Kevin Murphy, Frank Conniff, Mary Jo Pehl, Bill Corbett, Paul Chaplin, and Bridget Jones Nelson. The 2017 revival features a primarily new cast, including Jonah Ray who plays the new human test subject Jonah Heston, along with Felicia Day and Patton Oswalt as "The Mads" and Baron Vaughn, Hampton Yount, and Rebecca Hanson voicing Tom Servo, Crow T. Robot, and Gypsy, respectively.
"MST3K"'s original run did not garner high viewership numbers. However, the show's popularity spread through online word-of-mouth by its fans, known as "MSTies" or "Mysties" (who would remind others to "Keep circulating the tapes"), frequent repeats, syndication, and home media offerings produced by Rhino Entertainment. Currently, this popularity continues through Shout! Factory, who along with Hodgson now own the rights to the show and supported the revived series. "MST3K" was listed as one of "Time" magazine's "100 Best TV Shows of All-TIME" in 2007, and "TV Guide" has noted "MST3K" as one of the top cult television shows. The show won a Peabody Award in 1993, was nominated for two Emmy Awards in 1994 and 1995, and for the CableACE Award from 1992 to 1997. The show was considered highly influential, contributing towards the practice of social television, and former cast members launched similar projects based on the riffing of films, including The Film Crew, RiffTrax (ongoing as of 2020), and Cinematic Titanic. "MST3K" also brought to light several older movies that had fallen into obscurity or had received little or no public attention when originally released. Many of these films were subsequently identified as among the worst movies ever made, most notably "Manos: The Hands of Fate".
While the cast of "MST3K" has changed throughout its history, the basic premise of the show remains consistent: a human test subject—first Joel Robinson (Joel Hodgson), then Mike Nelson (Michael J. Nelson), and most recently Jonah Heston (Jonah Ray)—has been imprisoned aboard the spacecraft "Satellite of Love" by mad scientists (collectively called "The Mads") and is forced to watch a series of bad movies in order to find one that will drive the test subject insane.
In an attempt to keep his sanity, Joel built sentient robots ("the bots") from parts aboard the "Satellite of Love", and they subsequently remained aboard with Joel's successors as test subjects. The Bots include Tom Servo, Crow T. Robot, Gypsy (who is in charge of satellite operations) and Cambot, the silent recorder of the experiments. Crow and Servo join the human test subject in watching the film in the satellite's theater. To keep from going mad, the trio frequently comment and wisecrack during the movie, a process known as "riffing". At regular intervals throughout the movie, the hosts leave the theater and return to the bridge of the satellite to perform sketches (commonly called "host segments") that often satirize the film being watched.
The general format of an "MST3K" episode has remained the same throughout the series' run. Episodes are approximately 90 minutes in running time (excluding commercial breaks) and begin with a short introductory segment in which the human host and the 'bots interact with the Mads before being sent the movie. During Joel Hodgson and Jonah Ray's tenures as hosts (and for a brief period at the start of the Mike Nelson era), the hosts and the Mads engage in an "invention exchange" in which they each show off their latest inventions. Sirens and flashing lights ("Movie Sign") then signal the characters to enter the theater.
In the theater, the human host and 'bots' Tom and Crow sit in a row of theater seats, shown in silhouette along the bottom of the screen, an approach Hodgson called "Shadowramma". The three then riff on the film (which is sometimes accompanied by one or more shorts) as it plays for both them and the audience. Occasionally the silhouette format is used as a source of humor or as a means of creating unobtrusive censor bars for scenes containing nudity. The show transitions into and out of the theater via a "door sequence", a series of six doors that open or close as the camera (ostensibly Cambot) passes through them.
At regular intervals throughout the episode, the characters leave the theater and perform sketches usually inspired by the events of the film or short being shown, frequently making use of original songs and prop comedy. Some sketches bring in new or recurring characters or other devices; the host would consult an external camera "Rocket Number Nine" to show events happening outside the "Satellite", and the "Hexfield Viewscreen" would be used to communicate with other characters from the ship's bridge. At the end of each sketch, "Movie Sign" is triggered again and the characters must re-enter the theater.
During Hodgson's period on the show, the final sketch aboard the "Satellite" often included reading of fan mail from the "MST3K Info Club". Fan mail readings decreased during Mike Nelson's tenure as host and were dropped entirely once the show moved onto the Sci-Fi Channel. The final sketch of an episode typically ends on the Mads, with the lead Mad asking their lackey to "push the button" to end the transmission and transitioning to the credit sequence. After the credits, a humorous short clip from the featured film (or the accompanying short, on occasion) is replayed as a "stinger" to end the episode.
In November 1993, a limited selection of episodes were repackaged into an hour-long show titled "Mystery Science Theater Hour", meant to be better suited for off-network syndication. In these, the original episode was split into two parts of roughly 45 minutes each excluding commercials. New skits leading and ending each episode incorporated Mike Nelson portraying television host Jack Perkins in a parody of Perkins' "Biography" series in mock flattery of the "MST3K" episode being shown.
Hodgson is credited for devising the show's concept. Prior to the show, Hodgson was an up-and-coming comedian from Minneapolis who had moved to Los Angeles and made appearances on "Late Night with David Letterman" and "Saturday Night Live". He had been invited by Brandon Tartikoff to be on an NBC sitcom co-starring Michael J. Fox, but Hodgson felt the material was not funny and declined (the proposed sitcom went unrealized). He became further dissatisfied with Hollywood attitudes when they tried to double their offer, acquiring what he called a "healthy disrespect" for the industry. He moved back to Minneapolis-St. Paul, taking a job in a T-shirt printing factory that allowed him to conceive of new comedy ideas while he was bored. One such idea was the basis of "MST3K": a show that would comment humorously on movies and also allow him to showcase his own prop comedy-style humor. Hodgson referred to these jokes as "riffs", based both on the idea of musical riffs and on the idea of comedy riffs, a term he attributes to "The Simpsons" writer Dana Gould. In terms of movie selection, Hodgson recalled that his college roommate had a copy of "The Golden Turkey Awards", and he had previously wondered why no one had made any program about the "adorable, weird movies" listed within it.
Hodgson said that part of the idea for "MST3K" came from the illustration for the song "I've Seen That Movie Too" (drawn by Mike Ross) in the liner notes from Elton John's "Goodbye Yellow Brick Road" album, showing silhouettes of two people in a theater watching a movie. Hodgson also likened the show's setting to the idea of a pirate radio station broadcasting from space. Hodgson credits "Silent Running", a 1972 science-fiction film directed by Douglas Trumbull, as being perhaps the biggest direct influence on the show's concept. The film is set in the future and centers on a human, Freeman Lowell (Bruce Dern), who is the last crew member of a spaceship containing Earth's last surviving forests. His remaining companions consist only of three robot drones. "MST3K" and the Joel Robinson character occasionally reflected Lowell's hippie-like nature. Hodgson wanted the feel of the show to appear homemade, and cited the example of a crude mountain prop used during the "Saturday Night Live" sketch "Night on Freak Mountain" that received a humorous reaction from the studio audience as the type of aesthetic he wanted for the show.
Both old movies and music inspired several of the show's character names as developed by Hodgson. The show's name came from the promotional phrase "Mystery Scientist" used by magician Harlan Tarbell and a play on the name of Sun Ra's band, the Myth Science Arkestra. The "3000" was added to spoof the common practice of appending "2000" to show and product names in light of the then-upcoming 21st century; Hodgson thought making his show "3000" would set it apart. Dr. Forrester was named after the main character of "The War of the Worlds". The "Satellite of Love" was named after the song of the same name by Lou Reed. Crow T. Robot was inspired by the song "Crow" from Jim Carroll's "Catholic Boy", while Rocket Number 9's name was inspired by the original name of Sun Ra's album "Interstellar Low Ways".
The theater shots, the primary component of an episode, were taped in "Shadowrama". The "seats" were a black-painted foam core board sitting behind the seat (toward the camera) for the host, with stages for the Crow and Tom puppets. The human host wore black clothing, while the robot puppets were painted black; the screen they watched was a white luma-key screen, creating the appearance of silhouettes. The actors followed the movie and the script through television monitors located in front of them, completing the overall theater illusion.
The "door sequence" was created to transition from host segments to the theater segments; Hodgson took inspiration for it from the "Mickey Mouse Club", noting that any commonality with the title credits of "Get Smart" was coincidental. Devising this sequence also led Beaulieu to create the dogbone-like shape of the "Satellite of Love", with additional inspiration taken from the bone-to-ship transition in the film "". Hodgson had wanted to use a "motivated camera" for taping, a concept related to motivated lighting; in this mode, all the shots would appear to have been taken from an actual camera that was part of the scene, making the scene appear more realistic. This led to the creation of Cambot as a robot that the host would speak to during host segments, or that would record them while in the theater, and Rocket Number Nine to show footage outside of the Satellite of Love.
The show's theme song, the "Love Theme from Mystery Science Theater 3000", was written by Hodgson and Weinstein, and it helped to cement some of the broader narrative elements of the show, such as the Mads and Joel being part of an experiment. The song was composed by Charlie Erickson with help from Hodgson in the style of Devo, The Replacements, and The Rivieras (particularly their cover of the song "California Sun") and sung by Hodgson. Initial shows used foam letters for the show's title, but the crew later created the spinning-moon logo out of a 2-foot (0.6 m) diameter fiberglass ball, covered with foam insulation and lettering cut from additional foam pieces. Hodgson felt they needed a logo with a rotating effect as opposed to a flat 2D image; though they had envisioned a more detailed prop, with the letters forming the tops of buildings on this moon, they had neither the time nor the budget for a project of that complexity and went with what they had. Musical numbers were also used as part of the host segments, which Hodgson said came naturally out of the riffing process: they would at times find themselves singing along with the movie instead of just riffing on it, and extended that into songs in the host segments.
Hodgson approached Jim Mallon, at the time the production manager of KTMA, a low-budget local television station, with his idea of a show based on riffing on movies, using robots created out of common objects. Mallon agreed to help produce a pilot episode, and Hodgson hired local area comedians J. Elvis Weinstein (initially going by Josh Weinstein, but later changed to J. Elvis to distinguish himself from Josh Weinstein, a well-known writer for "The Simpsons") and Trace Beaulieu to develop the pilot show. By September 1988, Hodgson, Mallon, Weinstein, and Beaulieu had shot a 30-minute pilot episode using segments from the 1968 science-fiction film "The Green Slime". The robots and the set were built by Hodgson in an all-nighter. Joel watched the movie by himself, and was aided during the host segments by his robots, Crow (Beaulieu), Beeper, and Gypsy (Weinstein). Hodgson used the narrative that his character, named "Joel Hodgson" (not yet using his character name of Robinson), had built the "Satellite of Love" and launched himself into space. Camera work was by Kevin Murphy, who was employed by KTMA. Murphy also created the first doorway sequence and theater seat design. These initial episodes were recorded at the now-defunct Paragon Cable studios and customer service center in Hopkins, Minnesota.
Mallon met with KTMA station manager Donald O'Conner the next month and got the show signed up for thirteen episodes. Show production was generally done on a 24-hour cycle, starting with Mallon offering a few films from KTMA's library for the writers to select from. Riffing in these episodes was ad-libbed during taping using notes made during preliminary viewings of the selected film. The show had some slight alterations from the pilot: the set was lit differently, the robots (now Crow, Servo, and Gypsy) joined Joel in the theater, and a new doorway countdown sequence between the host and theater segments was shot. The puppeteers worked personalities into their robots: Crow (Beaulieu) was considered a robotic Groucho Marx, Tom Servo (Weinstein) a "smarmy AM radio DJ", and Gypsy (Mallon) was modeled after Mallon's mother, who had a "heart of gold" but would become disoriented when confronted with a difficult task. The development of the show's theme song helped establish elements of the show's ongoing premise, with Hodgson now portraying himself as the character Joel Robinson.
"Mystery Science Theater 3000" premiered on KTMA at 6:00 p.m. on Thanksgiving Day, November 24, 1988 with its first episode, "Invaders from the Deep", followed by a second episode, "Revenge of the Mysterons from Mars" at 8:00 p.m. The choice of running the premiere on Thanksgiving was by happenstance, as the station felt the show was ready to go at that point, according to Hodgson. Initially, the show's response was unknown, until Mallon set up a phone line for viewers to call in. Response was so great that the initial run of 13 episodes was extended to 21, with the show running to May 1989. Hodgson and Mallon negotiated to secure the rights for the show for themselves, creating Best Brains, Inc., agreeing to split ownership of the idea equally. During this time a fan club was set up and the show held its first live show at Scott Hansen's Comedy Gallery in Minneapolis, to a crowd of over 600.
Despite the show's success, the station's overall declining fortunes forced it to file for bankruptcy reorganization in July 1989. At the same time, HBO was looking to build a stable of shows for their upcoming Comedy Channel cable network. HBO approached Best Brains and requested a sample of their material. Hodgson and Mallon provided a seven-minute demo reel, which led to the network greenlighting "MST3K" as one of the first two shows picked up by the new network.
The Comedy Channel offered Best Brains $35,000 per episode but allowed Best Brains to retain the show's rights. Best Brains was also able to keep production local to Minnesota instead of acceding to the network's desire to film in New York City or Los Angeles, which would have cost four times more per episode, according to Hodgson. Best Brains established an office and warehouse space in Eden Prairie for filming. With an expanded but still limited budget, they were able to hire more writers, including Mike Nelson, Mary Jo Pehl, and Frank Conniff, and build more expansive sets and robot puppets. They created the characters of Dr. Forrester (Beaulieu) and Dr. Erhardt (Weinstein) and crafted the larger narrative of each episode being an "experiment" they test on Joel. The show began its national run shortly after the Comedy Channel went on the air in November 1989.
"MST3K" was considered the Comedy Channel's signature program, generating positive press despite the cable channel's limited availability nationwide. After the second season, The Comedy Channel and rival comedy cable network HA! merged to become Comedy Central (initially known as CTV: The Comedy Network). During this period, "MST3K" became the newly merged cable channel's signature series, expanding from 13 to 24 episodes a year. To take advantage of the show's status, Comedy Central ran "Turkey Day", a 30-hour marathon of "MST3K" episodes, during Thanksgiving 1991. The name of the event was inspired not only by the traditional turkey meal served on Thanksgiving but also by the use of "turkey" in "The Golden Turkey Awards" to represent bad movies. This tradition continued through the rest of the Comedy Central era. Though the show did not draw large audience numbers compared to other programming on Comedy Central, such as reruns of "Saturday Night Live", its dedicated fans and the attention it garnered kept the show on the network.
"" was produced during the latter half of the Comedy Central era and had a very limited theatrical release in 1996 through Universal Pictures and Gramercy Pictures. It featured Mike and the bots subjected to the film "This Island Earth" by Dr. Forrester. Though well received by critics and fans, the film was a financial disappointment due to its limited distribution.
The cable network was able to provide a wider library of films for Best Brains to riff on. To ensure that they could produce a funny episode, at least one member of the staff would watch each suggested film in its entirety, confirming that the movie offered material for jokes throughout. Conniff stated that he often had to watch around twenty films in their entirety before selecting one for the show. In one specific case, the second-season episode featuring the film "Sidehackers", they had only skimmed the first part of the movie before deciding to use it, and only later discovered that it contained a rape scene. They stayed committed to the film, but cut the offending scene and had to explain the sudden absence of the affected character to the audience. After this, they carefully scrutinized entire films for other such offensive content, and once a film was selected and its rights assured, they committed to completing the episode with it. Obtaining the rights was handled by the cable networks. Some licensing required buying film rights in packages, with the selected bad movies included in a catalog of otherwise good films, making the negotiations odd since the network was only interested in the bad film. Other times, the rights to the film were poorly documented, and the network would follow the chain of custody to locate the copyright owner in order to secure broadcast rights.
In contrast to the ad-libbed riffs of the KTMA era, the riffs were now scripted ahead of time by the writers. An average episode (approximately 90 minutes running time) would contain more than 600 such riffs, and some had upwards of 800. Riffs were developed with the entire writing staff watching the film together several times through, tossing off off-the-cuff quips and jokes as the film went along, or identifying where additional material would help the comedy. The best jokes were polished into the script for the show. Riffs were written to keep in line with the characterization of Joel, Mike, and the 'bots. Further, the writers tried to maintain respect for the films and avoided making negative riffs about them, considering that Joel, Mike, and the 'bots were companions to the audience while watching the movie; they did not want to come off sounding like jerks even if a negative riff would be funny. Hodgson stated that their goal in writing riffs was not to ridicule films, as some have mistaken, but rather to treat what they were doing as "a variety show built on the back of a movie".
Production of an average episode of "MST3K" during the Comedy Central period took about five to nine days once the movie was selected and its rights secured. The first few days were generally used for watching the movie and scripting out the riffs and live-action segments. The subsequent days were used to construct any props or sets needed for the live-action segments while the writers honed the script. A full dress rehearsal would then be held to make sure the segments and props worked and to fine-tune the script. The host segments would then be taped on one day, and the theater segments on the next. A final day was used to review the completed work and correct any major flaws before considering the episode complete. Live scenes used only practical special effects, and there was minimal post-editing once taping was completed.
Weinstein left the show after the first Comedy Channel season, reportedly in disagreement with Hodgson about moving toward using scripted rather than ad-libbed jokes. Murphy replaced him as the voice of Tom Servo, portraying the 'bot as a cultured individual, while Dr. Erhardt was replaced with TV's Frank (Conniff).
Hodgson decided to leave the series halfway through Season Five due to his dislike of being on camera and his disagreements with producer Mallon over creative control of the program. Hodgson also stated that Mallon's insistence on producing a feature film version of the show led to his departure, giving up his rights on the "MST3K" property to Mallon. Hodgson later told an interviewer: "If I had the presence of mind to try and work it out, I would rather have stayed. 'Cause I didn't want to go, it just seemed like I needed to." Though they held casting calls for an on-camera replacement for Hodgson, the crew found that none of the potential actors really fit the role; instead, having reviewed a test run that Nelson had done with the 'bots, the crew agreed that having Nelson (who had already appeared in several guest roles on the show) replace Hodgson would be the least jarring approach. The replacement of Joel by Mike led to an often tongue-in-cheek "Joel vs. Mike" flame war among fans, similar to the "Kirk vs. Picard" debates in "Star Trek" fandom.
Conniff left the show after Season Six, looking to get into writing TV sitcoms in Hollywood. TV's Frank was soon replaced on the show by Dr. Forrester's mother, Pearl (Pehl).
By the show's sixth season in 1996, Comedy Central had started creating an identity for its network under the new leadership of Doug Herzog, which would lead to successful shows like "The Daily Show", "Win Ben Stein's Money", and "South Park", leaving "MST3K" as an oddity on the network taking up limited program space. Herzog, though stating that "MST3K" "helped put the network on the map" and that its fans were "passionate", believed it was necessary to change things due to the show's declining and lackluster ratings. The network cancelled "MST3K" after a six-episode seventh season.
The show staff continued to operate for as long as they still had finances to work with. "MST3K"s fan base staged a write-in campaign to keep the show alive. This effort led the Sci-Fi Channel, a subsidiary of USA Networks, to pick up the series. Rod Perth, then-president of programming for USA Networks, helped to bring the show to the Sci-Fi Channel, stating himself to be a huge fan of the show and believing that "the sci-fi genre took itself too seriously and that this show was a great way of lightening up our own presentation".
Writing and production of the show remained relatively unchanged from the Comedy Central period. Before Season Eight commenced filming, Beaulieu opted to leave the show, feeling that anything creative that would be produced by Best Brains would belong to Mallon, and wanted to have more creative ownership himself. To replace Dr. Forrester, two new sidekicks to Pearl were introduced: Professor Bobo (Murphy) and the Observer a.k.a. "Brain Guy" (Corbett). In addition, Corbett took over Crow's voice and puppetry and Best Brains staffer Patrick Brantseg took over Gypsy in the middle of Season Eight. With this replacement, the series' entire original cast had been turned over.
"MST3K" ran for three more seasons on the Sci-Fi Channel. During the Sci-Fi era, Best Brains found themselves more limited by the network: the pool of available films was smaller and they were required to use science fiction films (as per the network's name and programming focus), and the USA Network executives managing the show wanted to see a story arc and had more demands on how the show should be produced. Conflict between Best Brains and the network executives would eventually lead to the show's second cancellation. Peter Keepnews, writing for "The New York Times", noted that the frequent cast changes, as well as the poorer selection of films that he felt were more boring than bizarre in their execution, had caused the show to lose its original appeal. Another campaign to save the show was mounted, including several "MST3K" fans taking contributions for a full-page ad in the trade publication "Daily Variety" magazine, but unlike the first effort, this campaign was unsuccessful.
The season 10 finale, "", premiered on August 8, 1999, during which, in the show's narrative, Pearl Forrester sent the "Satellite of Love" out of orbit, with Mike and the 'bots escaping and taking up residence in an apartment near Milwaukee, where they continue to riff movies. A "lost" episode produced earlier in the season but delayed due to rights issues, "Merlin's Shop of Mystical Wonders", was the final season 10 episode of "MST3K" (and the last of the original run), broadcast on September 12, 1999. Reruns continued to air on the Sci Fi Channel for several years, ending with "The Screaming Skull" on January 31, 2004. The shows later moved to off-network syndication.
Starting in 2010, Hodgson had been trying to bring back "MST3K", spurred on by fan appreciation of the cast and crew 25 years since the show's premiere and the success of his "Cinematic Titanic" project. Hodgson also considered the timing to be ideal, with non-traditional outlets like Netflix picking up original series, and the success of crowdfunding for entertainment projects. However, Hodgson needed to reacquire the rights to the series, at that point still held by Mallon and Best Brains. By 2013, Hodgson was working closely with Shout! Factory, the distribution company handling the home media releases of "MST3K", and completed negotiations with Mallon to buy the rights for "MST3K" for a seven-figure sum by August 2015, enabling a Kickstarter campaign to fund the revival to move forward. Hodgson felt the Kickstarter approach was necessary so that the show's style and approach would be determined by fans rather than through a network if he had sought traditional broadcast funding, as well as to demonstrate the demand for the show through a successful campaign.
The Kickstarter was launched in November 2015, seeking $2 million for the production of three episodes, with stretch goals providing additional funding for up to 12 episodes. The Kickstarter effort was led by Ivan Askwith, a consultant who had also worked on the "Veronica Mars" and "Reading Rainbow" Kickstarter campaigns. Hodgson estimated each episode would take $250,000 to make, in addition to five-figure movie licensing fees, in contrast to the $100,000 needed per episode for the original series. The campaign reached its base funding within a week of its launch. On the final day of the campaign, Hodgson and Shout! ran a streaming telethon featuring appearances from the newly selected cast and crew, as well as various celebrities who supported the revival, to help exceed the target funding levels for twelve episodes. The campaign ended on December 11, 2015, with total funding of $5,764,229 from 48,270 backers, plus an additional $600,000 in backer add-ons, which allowed Hodgson to plan two additional episodes, including a Christmas episode, bringing the total season to fourteen episodes. The Kickstarter became the largest in the Film & Video category, surpassing the $5.70 million raised for the "Veronica Mars" film, but was itself surpassed in March 2019 by a campaign for an animated series based on the web series "Critical Role".
Hodgson believed that the revival would need a whole new cast, pointing out that the cast had completely turned over in the original series. Comedian Jonah Ray plays Jonah Heston, the new host aboard the Satellite of Love, watching and riffing on the films. Hodgson had met Ray while recording an episode of "The Nerdist Podcast", and felt he would be a good fit. The voices of Crow and Tom Servo are provided by comedians Hampton Yount and Baron Vaughn, respectively, both of whom Ray recommended to Hodgson. Hodgson felt it was important for Ray to have his say on who would play these parts, since it would help Ray be comfortable in the role. Felicia Day plays Kinga Forrester, Clayton Forrester's daughter and one of the new Mads in charge of the experiments, now operating out of a moon base known as "Moon 13". Day had been one of the last to be cast, as Hodgson had scripted out the concept for Forrester's daughter while casting Ray and the others. Hodgson had met Day at the 2015 Salt Lake Comic Con, where she stated her love of "MST3K" to him. Hodgson had seen Day's performance in shows like "The Guild" and "Dr. Horrible's Sing-Along Blog", and felt she matched his idea for the character he had envisioned. Patton Oswalt plays Kinga's henchman, Max, or as his character prefers to be known, "TV's Son of TV's Frank"; Hodgson had already planned to invite Oswalt, a longtime friend and self-professed MST3K fan, as a special guest writer for an episode of the revived series, but decided during the Kickstarter that he would also be a good fit on-camera. Rebecca Hanson, an alum of The Second City, took the role of Gypsy as well as Synthia, a clone of Pearl Forrester who assists Kinga. Har Mar Superstar leads the "Skeleton Crew", a house band in Kinga's lair.
Pehl, Corbett, and Murphy cameo on the revival, reprising their roles as Pearl, Brain Guy, and Professor Bobo, respectively. Hodgson opened the show up to any of the other original cast members who wished to make cameo appearances or aid in the creative process. However, Nelson and Beaulieu stated that they would not be involved with the "MST3K" revival; Nelson said, "The brand does not belong to me, and I make and have made (almost) zero dollars off it since it stopped production in 1999." Conniff noted on his Twitter that Shout! Factory would be "cutting [the former cast members] in, financially at least" on the profits from the series. In addition, other cameos in the new episodes include Neil Patrick Harris, Jerry Seinfeld, and Mark Hamill. Weinstein initially stated that he had no interest in returning to the show, but eventually reprised his role as Dr. Laurence Erhardt in the second season of the Netflix revival.
Hodgson aimed to follow in the pattern of what made for fan-favorite episodes from the original series, borrowing equally from the Joel and Mike eras; he noted there were about 30 episodes that he and fans universally agreed were the show's best, and expected to use these as templates as the basis of the new show. The new episodes include the Invention Exchange that had been part of the Joel era (and some of the Mike era) of the show. Additionally, while not required by the streaming format of Netflix, the new episodes include bumpers that would have wrapped around commercial breaks if shown on network television; Hodgson considered these breaks necessary as a "palate cleanser" as well as to support the narrative for Kinga attempting to commercialize on the "MST3K" brand.
Behind the scenes, the lead writer was Elliott Kalan, former head writer for "The Daily Show with Jon Stewart" and host of "The Flop House", a podcast about bad movies. Dan Harmon and Joel McHale also wrote for the show, along with the on-screen cast members. Hodgson also brought in guest writers for certain episodes that included Justin Roiland, Rob Schrab, Nell Scovell, Ernie Cline, Pat Rothfuss, and Dana Gould. Additionally, Paul & Storm and Robert Lopez composed original songs for the new episodes.
The revival retains the live, handcrafted look of the original, a decision Hodgson had to defend against others involved in production. Set and prop designers included Wayne White, Pendleton Ward, Rebecca and Steven Sugar, and Guy Davis, while live and practical special effects were planned by Adam Savage. Other returning "MST3K" staff included Charlie Erickson, who composed the original show's theme song and wrote the new show's theme and other musical arrangements; Beth "Beez" McKeever, who worked on the original show's props and designed costumes and props for the new show; Crist Ballas, who handled hair and makeup design; and Paul Chaplin, one of the show's original writers, who helped write the new shows, along with contributions from Pehl and Corbett. Hodgson himself remained primarily off-camera as the executive producer for the revival, though he does appear briefly as Ardy, one of Kinga's henchmen, who sends Jonah the episode's movie. Hodgson was assisted by Kalan, Richard Foos, Bob Emmer, Garson Foos, Jonathan Stern, and Harold Buchholz. The revival was produced by the companies Satellite of Love, LLC, Alternaversal Productions, and Abominable Pictures.
Production for the new season began on January 4, 2016, with movie selection and script writing. The film selection had been narrowed down to about twenty movies as of February 2016, with the rights obtained for about half of them, while Shout! Factory worked to secure worldwide distribution rights for the others. Hodgson noted that the films were more recent than those used in the original series, with "maybe one" from the 1950s/1960s, but did not want to reveal what the films were until the episodes were broadcast, so as to have the biggest comedic effect on the audience.
Recording and most of the production was completed over September and October 2016 in Los Angeles on a very condensed schedule. In the revival, Ray, Yount, and Vaughn recorded the riffs for all fourteen episodes in a sound studio over a period of a week, allowing them to better synchronize the riffs with the film. This also helped to simplify the process of recording the theater segments, since they then only needed to act out their parts. The 'bots were controlled by multiple puppeteers both in the theater and in skits; Yount and Vaughn used radio-controlled equipment to move the 'bots' mouths, while members of The Jim Henson Company helped with manipulating the bodies, allowing them to achieve effects they could not do in the series' original run, such as having Crow appear to walk on his own. All skits for the episodes were completed within a single day, which did not allow for multiple takes unless necessary.
Campaign backers at higher tiers were able to see the first episode at limited "Red Carpet Kickstarter Screening" events shown in a few theaters during February and March 2017. The fourteen episodes were released on Netflix on April 14, 2017, though Kickstarter backers had the opportunity to see the episodes in the days preceding this.
During the 2017 "Turkey Day" Marathon, Hodgson announced that Netflix had greenlighted a twelfth season of "MST3K". Shooting of the twelfth season started on June 4, 2018; the season would have six episodes, written to encourage binge-watching and make the series more amenable to non-fans. Further, they created a stronger narrative in the host segments, so that casual viewers would recognize the series as having a definitive start, middle, and end. Other changes included Rob Schrab coming on as co-director, and actress Deanna Rooney, Ray's wife, playing Dr. Donna St. Phibes, a "B-movie monster conservationist" who works with the Mads. Former cast member Weinstein returned to reprise his role as Dr. Erhardt. Hodgson had tried to also bring back both Beaulieu and Conniff for this season, but could not work out the logistics in time.
The 12th season was broadcast on Netflix on Thanksgiving aka "Turkey Day", November 22, 2018, which coincided with the show's 30th anniversary. To avoid conflicting with the new season's release, the annual Turkey Day Marathon was pushed forward to November 18, 2018.
In November 2019, Hodgson confirmed to Kickstarter backers that the show would not return for a third season on Netflix, but that he would look into alternative outlets to carry the show. The two seasons made for Netflix will remain on the service. Ray stated in an April 2020 interview that "Joel's got some ideas in the pipeline, and it's pretty exciting, what he's working on", and expected further news later in the year.
During "MST3K"s original run between 1988 and 1999, a total of 197 episodes across eleven seasons were produced. This does not include "The Green Slime" pilot episode, which was used to sell the concept to KTMA but was otherwise never broadcast. The revived show currently consists of two seasons of fourteen and six episodes respectively, bringing the total episode count to 217.
None of the KTMA episodes were rerun nationally or have been released onto home video, primarily due to rights issues. For many years, the first three KTMA episodes were considered to be "missing episodes", as no fan copies are known to exist, but master copies of all these episodes still exist according to Mallon. In November 2016, Hodgson reported that master copies of two of the episodes, "Invaders from the Deep" and "Revenge of the Mysterons from Mars", had been found. The episodes were made available to Kickstarter backers of the new series on November 25, 2016.
The credits in the first four seasons on Comedy Central included the phrase "Keep circulating the tapes" to encourage fans to share the VHS tapings they made with others (as Comedy Central was not widely distributed then), despite the questionable copyright practice. Though the phrase was removed from the credits for legal reasons, the concept of "keep circulating the tapes" was carried on by the show's fans, helping to introduce others to the show after its broadcast run.
An annual event in the Comedy Central era was the Turkey Day marathon that ran on or near the Thanksgiving holiday. The marathon would show between six and twelve rebroadcasts of episodes, often with new material between the episodes from the cast and crew. While the show was on Sci-Fi, one Thanksgiving Day marathon of "MST3K" was held during its first season, but lacked any new interstitial material.
Following its acquisition of the series rights, Shout! Factory has streamed Turkey Day marathons on Thanksgiving since 2013, broadcasting six "MST3K" episodes wrapped with introductions from Hodgson, at times alongside other cast members. The event was intended to be a one-off, but the fans' reaction led Hodgson and Shout! to continue the tradition in subsequent years. The 2015 Turkey Day coincided with the Kickstarter for the show's revival, while the 2016 Turkey Day included the revival's new host Ray co-hosting alongside Hodgson. The 2017 Turkey Day was hosted by Hodgson, Ray, and Felicia Day, and concluded with a surprise announcement that the show had been renewed by Netflix for another season.
Home video releases of "MST3K" episodes are complicated by the licensing rights of the featured film and any shorts, and as such many of the nationally televised episodes have not yet been released onto home video. Through the current distributor, Shout! Factory, over 100 of the films have been cleared for home media distribution. With Shout's release of the 39th volume of "MST3K" episodes in 2017, the company anticipated that only about a dozen episodes out of 197 from the original series' run will never make it to home video due to licensing rights issues of the movies featured.
Original home media releases were issued by Rhino Entertainment, initially starting with single disc releases before switching to semi-regular four-episode volume sets. According to Hodgson, the people at Rhino who were involved in the distribution of "MST3K" eventually left Rhino and joined Shout!, helping to convince that publisher to acquire the rights from Rhino. Since 2008, all releases of "MST3K" have been through Shout! (including some reprints of the first Rhino volume set) and have typically been multi-episode volumes or themed packs.
In 2014, 80 episodes of the show were made available for purchase or rental on the video streaming site Vimeo. Shout! has uploaded some episodes to YouTube with annotations, as documented by "The Annotated MST" fansite, to explain some of the sources of the jokes in the riffs. In February 2015, Shout! launched its own streaming service, Shout! Factory TV, which included selected episodes of "MST3K". Selected episodes were also made available on demand through RiffTrax starting in November 2015. Twenty episodes from previous "MST3K" seasons were released by Netflix in all regions in anticipation of the revival series.
All episodes of Season 11 were released on a DVD/Blu-ray box set on April 17, 2018, which includes a documentary on the making of the first revival season.
In 1993, the show's staff selected 30 episodes to split into 60 one-hour segments for "The Mystery Science Theater Hour". The repackaged series' first-run airings of these half-shows ran from November 1993 to July 1994. Reruns continued through December 1994, and it was syndicated to local stations from September 1995 to September 1996, allowing stations to run the series in a one-hour slot or the original two-hour version. "MST3K" returned to television for the first time in ten years in July 2014, when RetroTV began broadcasting the series on Saturday nights, with an encore on Sunday evenings. The following year, the series began airing on PBS member stations. In the summer of 2016, Sinclair Broadcast Group and MGM's joint venture sci-fi network Comet picked up the series for a weekly Sunday night double-run; by coincidence, Sinclair's CW station, WUCW in the Twin Cities, which had originated the series when it was KTMA-TV, carries Comet on its second subchannel, returning the series to its original home for the first time in 27 years. The show premiered on IFC on January 7, 2020. It also airs on Z Living.
In 1996, Universal Pictures released "Mystery Science Theater 3000: The Movie", a film adaptation in which Mike and the bots riffed "This Island Earth". The film was released on DVD in the United States by Image Entertainment. Universal re-released the film on DVD on May 6, 2008, with a new anamorphic widescreen transfer, Dolby Digital 5.1 Surround Sound mix and the film's original trailer.
In 1996, the book "The Amazing Colossal Episode Guide", written by many of the cast members, was released; it contained a synopsis for every episode from seasons one through six and included some behind-the-scenes stories as well. In it, Murphy related two tales about celebrity reactions he encountered. In one, the cast went to a taping of Dennis Miller's eponymous show; when they were brought backstage to meet Miller, the comedian proceeded to criticize the "MST3K" cast for their choice of movie to mock in the then-recent episode "Space Travelers" (a re-branded version of the Oscar-winning film "Marooned"). Murphy also discussed how he met Kurt Vonnegut, one of his literary heroes. When he had mentioned the show and its premise to Vonnegut, the author suggested that even people who work hard on bad films deserve some respect. Murphy then invited Vonnegut to dine with his group, which Vonnegut declined, claiming that he had other plans. When Murphy and friends ate later that night, he saw Vonnegut dining alone in the same restaurant, and remarked that he had been "faced...but "nicely" faced" by one of his literary heroes.
Dark Horse Comics announced on February 16, 2017 that it planned a "MST3K" comic book series set for initial release that year. In June 2018, Dark Horse confirmed that the six-issue series would launch in September 2018 and would feature Jonah and the bots riffing on public domain comic books. The first issue was released on September 12, 2018; it follows Jonah and the Bots trying to escape the world of comics while saving Crow, who begins to become a monster in the pages of Horrific. Hodgson oversaw the writing.
The first "MST3K" live event was held on June 5 and 6, 1989 at the Comedy Gallery in Minneapolis. Jim Mallon served as the emcee of the event that featured stand-up sets by Joel, Josh Weinstein, and Trace Beaulieu. A Q&A session about the show was conducted, and the show's original pilot was shown. The robots and various props were on display for attendees to see.
The first live riffing event, called "MST Alive!", was held at the Uptown Theater in Minneapolis on July 11, 1992. There were two showings, both with live riffing of the feature film "World Without End", as well as sing-alongs of different songs from the show, followed by a Q&A session. The event was hosted, in character, by Dr. Forrester, TV's Frank, Joel, and the Bots. A second version of "MST Alive!" was presented as a part of the first ever MST3K "ConventioCon ExpoFest-A-Rama" in 1994. In this show, Forrester and Frank forced Mike and the bots to watch "This Island Earth", a film which was later riffed as part of "Mystery Science Theater 3000: The Movie".
Hodgson and the team for the 2017 revival announced an "MST3K" "Watch Out For Snakes Tour" during mid-2017 covering 29 cities in the United States and Canada. Jonah and the Bots riff on one of two films live for audiences, either "Eegah" (which had already been featured on the original run of "MST3K", and which popularized the riff "Watch out for snakes", but featured new riffs for this tour) or an unannounced surprise film: "Argoman the Fantastic Superman". The tour featured Ray, Yount and Hanson reprising their roles as Jonah Heston, Crow and Gypsy/Synthia. Vaughn was unavailable to perform Servo due to the birth of his child and the role was covered by Tim Ryder. The tour also featured Grant Baciocco as Terry the Bonehead, pre-recorded appearances from Day and Oswalt as Kinga and Max, and a live introduction from Hodgson.
Hodgson and Ray also toured in late 2018 as part of a 30th anniversary of "MST3K" in a similar format to the 2017 tour. Hodgson reprised the role of Joel Robinson and riffed movies alongside Ray and the bots during these shows. Ryder continued to perform Tom Servo, while Grant Baciocco, the lead puppeteer, voiced Crow. Rebecca Hanson also joined in her role as Synthia as the host of the show. Movies riffed at these shows included "The Brain" and "Deathstalker II". During the tour, Hodgson announced that Deanna Rooney would join the cast in the twelfth season as a new "Mad" working with Kinga and Max.
The 2019 live tour, The Great Cheesy Movie Circus Tour, was promoted as Hodgson's "final live tour". Most of the tour dates featured the film "No Retreat, No Surrender", while a few showed "Circus of Horrors". The tour was promoted with video production updates and on-the-road updates via the tour's official website. In addition to Joel, the parts of Crow and Servo were portrayed by Nate Begle and Conor McGiffin respectively. Yvonne Freese played the part of Gypsy and Mega-Synthia, a clone of both Pearl Forrester and the original Synthia. Emily Marsh also featured in the tour as the new character Crenshaw.
In 1996, during promotion for the film, Nelson and the bots were interviewed in-character on MTV, and seen in silhouettes heckling footage from MTV News featuring the band Radiohead. Also that year, Hodgson was a featured guest on Cartoon Network's "Space Ghost Coast to Coast".
In 1997, the videogame magazine "PlayStation Underground" (Volume 2, Number 1) included a Best Brains-produced "MST3K" short on one of their promotional discs. The video opened with a host segment of Mike and the Bots playing some PlayStation games, only to go into the theater to riff on some videos from the magazine's past. The feature is about seven minutes long. An Easter egg on the disc has some behind-the-scenes footage of Best Brains filming the sequences.
Nelson and the robot characters appeared in silhouette on an episode of "Cheap Seats", a TV series in which The Sklar Brothers commented on clips of sporting events in a manner similar to MST3K.
In 2007, a new online animated web series, referred to as "The Bots Are Back!", was produced by Mallon. The series was planned as a weekly adventure featuring Crow, Tom Servo, and Gypsy, with Mallon reprising his role as Gypsy and Paul Chaplin as Crow. However, only a handful of episodes were released, and the series was abandoned due to budgetary issues. The general internet response to the webisodes was largely negative.
During the COVID-19 pandemic, Hodgson announced a special "Mystery Science Theater 3000 Live Riff-Along" to first air live on May 3, 2020; the show featured Hodgson along with Emily Marsh, Conor McGiffin, Nate Begle, and Yvonne Freese, who had joined him during the 2019 "MST3K" live tour, riffing on a new short, "Circus Days", and then riffing atop the "MST3K" season one episode featuring "Moon Zero Two".
In 2004, the show was listed in a featured "TV Guide" article, "25 Top Cult Shows Ever!", which included a sidebar that read, "Mike Nelson, writer and star (replacing creator Joel Hodgson), recently addressed a college audience: 'There was nobody over the age of 25. I had to ask, "Where are you seeing this show?" I guess we have some sort of timeless quality.'" Three years later, "TV Guide" rewrote the article, and bumped "MST3K" to #13. In 2007, the show was listed as one of "Time" magazine's "100 Best TV Shows of All-Time". In 2012, the show was listed as #3 in "Entertainment Weekly"s "25 Best Cult TV Shows from the Past 25 Years", with the comment that ""MST3K" taught us that snarky commentary can be way more entertaining than the actual media."
The 2017 relaunch was met with critical acclaim; the first reboot season currently has a 100% rating on Rotten Tomatoes.
The reactions of those parodied by "MST3K" have been mixed. Some notable negative reactions include that of Sandy Frank, who held the rights to several "Gamera" films parodied on the show. He said he was "intensely displeased" by the mockery directed at him. (The crew once sang the "Sandy Frank Song", which said that Frank was "the source of all our pain", "thinks that people come from trees", Steven Spielberg "won't return his calls", and implied that he was too lazy to make his own films.) Because of this, Frank reportedly refused to allow the shows to be rebroadcast once "MST3K"s rights ran out. However, this may in fact be a rumor, as other rumors indicate that the "Gamera" films' distribution rights prices were increased beyond what BBI could afford as a result of the show's success. According to Shout! Factory, the Japanese movie studio Kadokawa Pictures was so horrified with "MST3K"s treatment of five "Gamera" films that it refused to let Shout! release the episodes on home video. Brian Ward (one of the members of Shout! Factory) explained to fans on the forums of the official Shout! Factory website that they tried their best to convince them, but the Japanese take their "Gamera" films very seriously and do not appreciate them being mocked. However, Shout! was eventually able to clear the episodes for a special 2011 release after the North American rights shifted from the Japanese studio to a North American entity that had no such qualms.
Kevin Murphy has said that Joe Don Baker wanted to "beat up" the writers of the show for attacking him during riffing of "Mitchell". Murphy later stated that Baker probably meant it in a joking manner, although Mike Nelson has said that he had deliberately avoided encountering Baker while the two happened to be staying at the same hotel. Jeff Lieberman, director of "Squirm", was also quite angry at the "MST3K" treatment of his film.
Director Rick Sloane was shocked at his treatment at the conclusion of "Hobgoblins", in which Sloane himself was mercilessly mocked over the film's end credits. In a 2008 interview, however, Sloane clarified his comments, saying that "I laughed through the entire "MST3K" episode, until the very end. I wasn't expecting the humor to suddenly be at my own expense. I was mortified when they dragged out the cardboard cutout and pretended to do an interview with me. I was caught off guard. I had never seen them rip apart any other director before on the show." However, he credits the success of the "MST3K" episode with inspiring him to make a sequel to "Hobgoblins", released in 2009.
Others, however, have been more positive: Robert Fiveson and Myrl Schriebman, producers of "", said they were "flattered" to see the film appear on "MST3K". Actor Miles O'Keeffe, the star of the film "Cave Dwellers", called Best Brains and personally requested a copy of the "MST3K" treatment of the film, saying he enjoyed their skewering of what he had considered to be a surreal experience; according to Hodgson, O'Keeffe said his friends always heckled his performance in the film when it was on, and he appreciated the "MST3K" treatment. In the form of an essay and E. E. Cummings-esque poem, Mike Nelson paid tribute to O'Keeffe with a humorous mix of adulation and fear.
Rex Reason, star of "This Island Earth", made appearances at several "MST3K" events and credits "MST3K" with introducing the film to a new generation. The crew of "Time Chasers" held a party the night the "MST3K" treatment of their film aired and, while reactions were mixed, director David Giancola said, "Most of us were fans and knew what to expect and we roared with laughter and drank way too much. I had a blast, never laughed so hard in my life."
Actor Adam West, star of the 1960s "Batman" TV series, co-starred in "Zombie Nightmare", another film "MST3K" mocked. West apparently held no grudges, as he hosted the 1994 "Turkey Day" marathon in which the episode featuring "Zombie Nightmare" had its broadcast premiere. Mamie Van Doren (who appeared in episode 112, "Untamed Youth", and episode 601, "Girls Town"), Robert Vaughn (star of episode 315, "Teenage Cave Man", which he called the worst movie ever made) and Beverly Garland (who had appeared in many "MST3K"-featured Roger Corman films) also hosted at the marathon.
In 1993, "MST3K" won a Peabody Award for "producing an ingenious eclectic series": "With references to everything from Proust to "Gilligan's Island", "Mystery Science Theater 3000" fuses superb, clever writing with wonderfully terrible B-grade movies". In 1994 and 1995, the show was nominated for the Primetime Emmy Award for Outstanding Individual Achievement in Writing for a Variety or Music Program, but lost both times to "Dennis Miller Live". Every year from 1992 to 1997, it was also nominated for CableACE Awards. Its DVD releases have been nominated for Saturn Awards in 2004, 2006, 2007, and 2018.
Through "MST3K", many obscure films have been more visible to the public, and several have since been considered some of the worst films ever made and are voted into the Bottom 100 on the Internet Movie Database. Of note is "", which was riffed on by "MST3K" in its fourth season. "Manos" was a very low-budget film produced by Hal Warren, a fertilizer salesman at the time, taking on a dare from a screenwriter friend to show that anyone could make a horror film. The film suffered from numerous production issues due to its limited filming equipment, and many critics describe the result using a riff from "MST3K", in that "every frame of this movie looks like someone's last-known photograph". The "MST3K" episode featuring "Manos" was considered one of its most popular and best episodes, and brought "Manos" into the public light as one of the worst films ever produced. The film gained a cult following, and an effort was made to restore the film to high-definition quality from its original film reels. "MST3K" also riffed on three films directed by Coleman Francis; "Red Zone Cuba", "The Skydivers", and "The Beast of Yucca Flats", which brought awareness of Francis' poor direction and low-budget films, similar to that of Ed Wood. "MST3K" also brought to the limelight lackluster works by Bert I. Gordon, primarily giant monster B-movies, that gained attention through the show, and many Japanese kaiju movies imported and dubbed through producer Sandy Frank (jokingly referred to as "the source of all our pain"), particularly those in the "Gamera" series.
"MST3K"s riffing style to poke fun at bad movies, films, and TV shows, have been used in other works. In 2003, the television series "Deadly Cinema", starring Jami Deadly, debuted, which featured the cast making fun of bad movies, "MST3K"-style. In 2004, the ESPN Classic series "Cheap Seats", debuted, which featured two brothers making fun of clips of old sporting events, "MST3K"-style, and is noteworthy for containing an episode in which Mike, Crow, and Tom Servo briefly appeared in a cameo to make fun of the hosts' own skits. In 2008, the internet and direct-to-DVD comedy series "Incognito Cinema Warriors XP", debuted, which used the same "host segment-movie segment" format the show established, while featuring completely original characters and plot. "ICWXP" gained a similar cult following, even earning the praises of former "MST3K" host Michael J. Nelson. In 2010, the television series "This Movie Sucks!" (and its predecessor "Ed's Nite In"), starring Ed the Sock and co-hosts Liana K and Ron Sparks, debuted. It features the cast making fun of bad movies. Creator Steven Kerzner, however, was quick to point out that "MST3K" was not "the creator of this kind of format, they're just the most recent and most well-known". In 2011, the theater silhouette motif was parodied by golf commentator and talk show host David Feherty in an episode of "Feherty". He is shown sitting in front of a large screen and "riffing" while viewing footage of golfer Johnny Miller and is joined in the theater by his stuffed rooster (Frank) and his gnome statue (Costas).
Further, the riffing style of "MST3K" is considered part of the influence for DVD commentaries and successful YouTube reviewers and Let's Play-style commentators. DVD releases for both "Ghostbusters" and "Men in Black" used a format similar to Shadowrama for an "in-vision" commentary feature. The concept of social television, where social media is integrated into the television viewing experience, was significantly influenced by "MST3K". The social media practice of live-tweeting riffs and jokes on broadcast shows, such as for films like "Sharknado", has its roots in "MST3K". The "MST3K" approach has also inspired Internet movie critics to create comedic movie-review formats, such as those of RedLetterMedia and Screen Junkies, which are considered more than just snarking on the movie but aim to help the viewer understand film and story techniques and their flawed use in poorly received films.
Public performances of live riffing have been hosted by various groups in different cities across the U.S. and Canada, including Cineprov (Atlanta, Georgia), Master Pancake Theater (Austin, TX), The Gentlemen Hecklers (Vancouver, BC Canada), Counterclockwise Comedy (Kansas City, Missouri), FilmRoasters (Richmond, Virginia), Moxie Skinny Theatre 3000 (Springfield, Missouri), Riff Raff Theatre (Iowa City, Iowa), Twisted Flicks (Seattle, Washington), and Turkey Shoot (Metro Cinema at the Garneau, Edmonton, Alberta, Canada). Canadian sketch comedy group LoadingReadyRun produced the show "Unskippable" for The Escapist website, which applied the MST3K premise to video game cut scenes.
The Center for Puppetry Arts crowdfunded and successfully acquired Tom Servo and Crow T. Robot in 2019.
"MST3K", broadcasting during the emergence of the Internet for public use, developed a large fan base during its initial broadcast; which has continued to thrive since then. The show had already had its postal-based fan club, which people could write into and which some letters and drawings read on subsequent episodes, and the producers encouraged fans to share recordings of their episodes with others. At its peak, the "MST3K Fan Club" had over 50,000 members, and Best Brains were receiving over 500 letters each week. Fans of the show generally refer to themselves as "MSTies". Usenet newsgroups rec.arts.tv.mst3k.misc and rec.arts.tv.mst3k.announce were established in the mid-1990s for announcements and discussions related to the show. A type of fan fiction called MiSTings, in which fans would add humorous comments to other, typically bad, fan fiction works, was popular on these groups. The fan-run website "Satellite News" continues to track news and information about the show and related projects from its cast members. Another fan site, "The Annotated MST", attempts to catalog and describe all the obscure popular culture references used in a given episode.
In addition to the show's fandom, a number of celebrities have expressed their love for the show. One of the earliest known celebrity fans was Frank Zappa, who went so far as to telephone Best Brains, calling "MST3K" "the funniest fucking thing on TV" (according to Beaulieu). Zappa became a friend of the show, and following his death, episode 523 was dedicated to him. Other known celebrity fans include Al Gore, Neil Patrick Harris, Penn Jillette, and Patton Oswalt (who would later become TV's Son of TV's Frank in the revival).
There were two official fan conventions in Minneapolis (run by the series' production company Best Brains) called "ConventioCon ExpoFest-A-Rama" (1994) and "ConventioCon ExpoFest-A-Rama 2: Electric Bugaloo" (1996). At least 2,500 people attended the first convention.
The various cast and crew from the show's broadcast run have continued to produce comedy works following the show. Two separate projects were launched that specifically borrowed the theme of riffing on bad movies. After the short-lived "The Film Crew" in 2006, Nelson started RiffTrax, providing downloadable audio files containing "MST3K"-style riffs that the viewer can synchronize to their personal copy of a given popular movie (such as ""); this was done to avoid copyright and licensing issues with such films. RiffTrax's cast expanded to include Murphy and Corbett along with occasional guest stars, and the group is able to use a wider range of films, including films and shorts in the public domain and films which they could get the license to stream and distribute. In addition, they launched production of "RiffTrax Live" shows for various films, where they perform their riffing in front of a live audience that is simultaneously broadcast to other movie theaters across the country and later made available as on-demand video. RiffTrax continues to offer new material and shows. As part of a tribute to their roots, RiffTrax has performed some works that previously appeared on "MST3K", including "Manos: the Hands of Fate", "Santa Claus", and "Time Chasers".
Similarly, Hodgson, after some experimental creative works such as "The TV Wheel", started "Cinematic Titanic" with Beaulieu, Weinstein, Conniff, and Pehl in 2007. Like "MST3K", the five riffed on bad movies they were able to acquire the licenses for (including "Santa Claus Conquers the Martians"), which were then distributed through on-demand video and streaming options. They later did a number of live shows across the United States, some of which were made available through digital on-demand.
Other related projects by the "MST3K" crew following the show's end include:
In 2000, most of the cast of the Sci-Fi era of the show collaborated on a humor website, Timmy Big Hands, that closed in 2001.
In 2001, Mike Nelson, Patrick Brantseg, Bill Corbett, Kevin Murphy and Paul Chaplin created "The Adventures of Edward the Less", an animated parody of J. R. R. Tolkien's "The Lord of the Rings" and others in the fantasy genre, with additional vocals by Mary Jo Pehl and Mike Dodge, for the Sci Fi Channel website.
In 2008, Bill Corbett and fellow writer Rob Greenberg wrote the screenplay for "Meet Dave", a family comedy starring Eddie Murphy about a tiny "Star Trek"-like crew operating a spaceship that looks like a man. The captain of the crew and the spaceship were both played by Murphy. Originally conceived as a series called "Starship Dave" for SciFi.com, it was dropped in favor of "Edward the Less". The script (along with the title) was changed drastically by studio executives and other writers, although Corbett and Greenberg received sole screenwriter credit.
In 2010, Trace Beaulieu, Frank Conniff, Joel Hodgson, Mary Jo Pehl, Josh Weinstein, Beth McKeever and Clive Robertson voiced characters for "", a computer game created by J. Allen Williams.
In 2013, Frank Conniff and animation historian Jerry Beck debuted "Cartoon Dump", a series of classically bad cartoons, which are also occasionally performed live.
Trace Beaulieu and Joel Hodgson were featured in the Yahoo! Screen series "Other Space" in 2015, with Beaulieu voicing a robot companion of Hodgson's character, a burned-out spaceship engineer. Series creator Paul Feig, a fan of "Mystery Science Theater 3000", said that he envisioned Hodgson and Beaulieu as their respective characters while writing them.
Also in 2015, Trace Beaulieu and Frank Conniff began performing together as "The Mads", riffing movies at live screenings across the U.S.
In 2008, to commemorate the show's 20th anniversary, the principal cast and writers from all eras of the show reunited for a panel discussion at the San Diego Comic-Con, which was hosted by actor-comedian Patton Oswalt (who would later go on to star in the revived series). The event was recorded and included as a bonus feature on the 20th Anniversary DVD release via Shout! Factory. Also that year, several original "MST3K" members (including Joel Hodgson, Trace Beaulieu and Frank Conniff) reunited to shoot a brief sketch to be included on the web-exclusive DVD release of "The Giant Gila Monster". The new disc was added to Volume 10 of the "MST3K Collection" DVD boxed set series, replacing the "Godzilla vs. Megalon" disc which could no longer be sold due to copyright conflicts. The new package was sold under the name "Volume 10.2", and the sketch was presented as a seminar to instruct consumers on how to "upgrade" their DVD set, which merely consists of "disposing" of the old disc and inserting the new one.
In 2013, Joel Hodgson and Trace Beaulieu reprised their roles as Joel Robinson and Crow T. Robot for cameo appearances in the fourth season of "Arrested Development".
As part of its live show events for 2016, RiffTrax presented a "MST3K" reunion at a live show in Minneapolis in June 2016. Hodgson, Bridget Nelson, Pehl, Conniff, and Beaulieu all joined the three regulars along with Jonah Ray from the revived series. The gathered cast riffed on a variety of shorts as part of the event.
Another cult sci-fi series, "Futurama", featured silhouetted robots resembling Crow and Servo in one of its episodes.
The poster for the 1996 film version can be seen in the 2005 comedy "The 40-Year-Old Virgin".
Vaporwave artist death's dynamic shroud.wmv sampled the episode "Pod People" on his 2014 albums "DERELICT MEGATOWER II" and "DERELICT MEGATOWER III".
Music radio
Music radio is a radio format in which music is the main broadcast content. After television replaced old time radio's dramatic content, music formats became dominant in many countries. Radio drama and comedy continue, often on public radio.
Music drives radio technology, including wide-band FM, modern digital radio systems such as Digital Radio Mondiale, and even the rise of internet radio and music streaming services (such as Pandora and Spotify).
When radio was the main form of entertainment, regular programming, mostly stories and variety shows, was the norm. If there was music, it was normally a live concert or part of a variety show. Backstage sound engineers who jockeyed discs (records) from one turntable to another to keep up with the live programming were often called disc jockeys.
With the mass production and popularity of records in the mid-1940s, as well as the birth of TV, stations discovered they could fill a show by simply playing records, with a disc jockey hired to host the program. One of the first disc jockeys (later called DJs) was Dick Clark. Others followed suit, and today music radio is the most common format.
The radio station provides programming to attract listeners. Commercial radio stations make profits by selling advertising. Public and community radio stations are sustained by listener donations and grants. Young people are targeted by advertisers because their product preferences can be changed more easily. Therefore, the most commercially successful stations target young audiences.
The programming usually cycles from the least attractive item to the most attractive, followed by commercials. The purpose of this plan is to build listener interest during the programming.
Because dead air does not attract listeners, the station tries to fill its broadcast day with sound. Audiences will only tolerate a certain number of commercials before tuning away. In some regions, government regulators specify how many commercials can be played in a given hour.
There are several standard ways of selecting the music, such as free-form, top-40, album-oriented rock, and Jack. These can be applied to all types of music.
Jingles are radio's equivalent of neon signs. Jingles are brief, bright pieces of choral music that promote the station's call letters, frequency, and sometimes a disc jockey or program segment. Jingles are produced for radio stations by commercial specialty services such as JAM in Texas.
Jingles are often replaced by recorded voice-overs (called "stingers", also depending on region more often "liners").
In order to build station loyalty, the station announces the time, station call letters and frequency as often as six to twelve times per hour. Jingles and stingers (liners) help to give the station a branded sound in a pleasant, minimal amount of air-time. The legal requirement for station identification in the U.S. is once per hour, approximately at the top of the hour, or at the conclusion of a transmission.
News, time-checks, real-time travel advice and weather reports are often valuable to listeners. The news headlines and station identification are therefore given just before a commercial. Time, traffic and weather are given just after. The engineer typically sets the station clocks to standard local time each day, by listening to WWV or WWVH (see atomic clock). These segments are less valued by the most targeted market, young people, so many commercial stations shorten or omit these segments in favor of music.
While most music stations that offer news reports simply "tear and read" news items (from the newswires or the Internet), larger stations (generally those affiliated with news/talk stations) may employ an editor to rewrite headlines, and provide summaries of local news. Summaries fit more news in less air-time. Some stations share news collection with TV or newspapers in the same media conglomerate. An emerging trend is to use the radio station's web site to provide in-depth coverage of news and advertisers headlined on the air. Many stations contract with agencies such as Smartraveler and AccuWeather for their weather and traffic reports instead of using in-house staff.
Fewer radio stations now maintain a call-in telephone line for promotions and gags or to take record requests, except in medium and major markets, depending on daypart. In non-major markets, DJs of commercial stations generally do not answer the phone or edit calls while music plays, as the programming is either delivered via satellite or voice-tracked using a computer. More and more stations take requests by e-mail and online chat only.
The value of a station's advertising is set by the number, age and wealth of its listeners. Arbitron, a commercial statistical service, historically used listener diaries to statistically measure the number of listeners. Arbitron diaries were collected on Thursdays, and for this reason, most radio stations have run special promotions on Thursdays, hoping to persuade last-minute Arbitron diarists to give them a larger market-share. Arbitron contractually prevents mention of its name on the air.
Promotions are the on-air equivalent of lotteries for listeners. Promotional budgets usually run about $1 per listener per year. In a large market, a successful radio station can pay a full-time director of promotions, and run several lotteries per month of vacations, automobiles and other prizes. Lottery items are often bartered from advertisers, letting each company book the prize at its full retail price while incurring only the wholesale cost. For example, cruise lines often have unused capacity and, when given the choice, prefer to pay their bills by bartering cruise vacations. Since the ship will sail in any case, bartered vacations cost the cruise line little or nothing. The promotion itself advertises the company providing the prize. The FCC has defined a lottery as “any game, contest or promotion that combines the elements of prize, chance and consideration.”
Most music stations have DJs who play music from a playlist determined by the program director, arranged by blocks of time. Though practices differ by region and format, what follows is a typical arrangement in a North American urban commercial radio station.
The first block of the day is the "morning drive time" block in the early morning. Arbitron defines this block between 6 a.m. and 10 a.m., though it can begin as early as 5 a.m. (though usually not later than 6), and end as early as 9 a.m. or as late as 11 a.m. This block usually includes news bulletins and traffic and weather advisories for commuters, as well as light comedy from the morning DJ team (many shock jocks started as or still work on drive-time radio). Some stations emphasize music, and reduce gags and call-ins in this period.
The midday block (defined by Arbitron as 10 a.m. to 3 p.m., though often extended later to about 5 p.m.) is mostly music, and in many places is at least partially voicetracked from another market. For a period around noon a station may play nonstop music or go to an all-request format for people eating lunch. This block is often occupied by a "no-repeat workday;" stations that offer this feature usually target captive audiences such as retail workers, who have to listen to the station for long periods of time and can become irritated by repetition.
In the early evening, or "afternoon drive" (defined by Arbitron as 3 to 7 p.m.), the evening rush-hour programming resembles the midday programming, but adds traffic and weather advisories for commuters. Some stations insert a short snippet of stand-up comedy ("5 O'Clock Funnies") around 5 o'clock when commuters leave work, or play specifically selected "car tunes" ideal for listening while driving.
The evening block (defined by Arbitron as 7 p.m. to midnight), if present, returns to music. Syndicated programs such as Tom Kent or Delilah are popular in this shift.
The overnight programming, from midnight to the beginning of drive time, is generally low-key music with quiet, if any, announcing. Some stations play documentaries or even infomercials, while others play syndicated or voicetracked DJs. Complete automation, with no jock, is very common in this daypart. It is not uncommon to play more adventurous selections during late-night programming blocks, since late night is generally not considered significant for ratings, and late-night hours are not subject to federal content restrictions as stringently as daytime hours. Stations are permitted to sign off during this time; in areas where AM radio is still significant (especially in the United States), local stations may be required to either sign off or cut to low power to protect clear-channel stations.
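The Arbitron daypart boundaries described above can be summarized as a simple hour-to-block lookup. This is an illustrative sketch only; as noted, real stations shift these boundaries by an hour or two in either direction:

```python
def daypart(hour: int) -> str:
    """Map a 24-hour clock hour (0-23) to the Arbitron daypart it falls in."""
    if 6 <= hour < 10:
        return "morning drive"    # 6 a.m. to 10 a.m.
    if 10 <= hour < 15:
        return "midday"           # 10 a.m. to 3 p.m.
    if 15 <= hour < 19:
        return "afternoon drive"  # 3 p.m. to 7 p.m.
    if 19 <= hour < 24:
        return "evening"          # 7 p.m. to midnight
    return "overnight"            # midnight to 6 a.m.
```

For example, `daypart(8)` falls in morning drive, while `daypart(2)` lands in the largely automated overnight block.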
Weekends, especially Sundays, often carry different programming. The countdown show, ranking the top songs of the previous week, has been a staple of weekend radio programming since 1970; current hosts of countdown shows in various formats include Rick Dees, Ryan Seacrest, Jeff Foxworthy, Kix Brooks, Bob Kingsley, Crook & Chase, Randy Jackson, Walt Love, Al Gross, Dick Bartley, and (via reruns) Casey Kasem. Other types of weekend programming include niche programming, retrospective shows and world music such as the Putumayo World Music Hour. Stations may carry shows with different genres of music such as blues or jazz. Community affairs and religious programming often air on Sunday mornings, generally one of the least listened-to periods of the week. In addition, weekend evenings are particularly specialized: a dance station might host a sponsored dance party at a local club, or a classical station may play an opera. Request shows, both local and national (e.g. Dick Bartley), are very popular on Saturday nights. The longest running radio program in the country, the "Grand Ole Opry", has aired on Saturday night since its inception in 1925.
Many music stations in the United States perform news and timechecks only sparingly, preferring to put more music on the air. News is often restricted to the talk-heavy commuting hours, though weather updates are still very common throughout the day, even on these stations. ABC FM News is an example of an American news network designed for music radio stations. The BBC and ABC take a different approach, with all of their stations giving news updates (BBC Radio 1Xtra produces its own news segments under the name TX).
Some well-known music-radio formats are "Top 40", "Freeform Rock" and "AOR (Album Oriented Rock)". Most other stations (such as Rhythm & Blues outlets) use a variation of one of these formats with a different playlist. The way stations advertise themselves is not standardized. Some critical interpretation is needed to recognize classic formulas in the midst of the commercial glitz.
See List of music radio formats for further details, and note that there is a great deal of format evolution (or, to borrow a television term, channel drift) as music tastes and commercial conditions change. For example, the Beautiful music format that developed into today's Easy listening and Soft rock formats is nearly extinct due to a lack of interest from younger generations, whereas classic rock has become popular over the last 20 years or so and Jack FM has arisen only since 2000 or so.
The most popular radio format in the U.S. is country music, but rock recordings sell the most copies.
The original formulaic radio format was Top 40 music, now known within the industry as contemporary hit radio or "CHR". In this radio format, disc jockeys would select singles from a set of the forty best-sellers (usually kept in a rack) as rated by Billboard magazine, or from the station's own chart of the top locally selling songs. In general, the more aggressive "Top 40" stations could better be described as "Top 20" stations: to ward off listener boredom, they played only the most popular singles.
Top 40 radio would punctuate the music with jingles, promotions, gags, call-ins, and requests, brief news, time and weather announcements and most importantly, advertising. The distinguishing mark of a traditional top-40 station was the use of a hyperexcited disc-jockey, and high tempo jingles. The format was invented in the US and today can be heard worldwide. Todd Storz and Gordon McLendon invented Top 40 radio. Bill Drake and Rick Sklar have had a lasting modern influence.
Variants and hybrids include the freeform-like Jack FM (mentioned below under Freeform Rock) and the "Mix" formats mentioned below under Oldies. Top 40 music is heavily criticized by some music fans as being repetitive and of low quality, and is almost exclusively dominated by large media conglomerates such as iHeartMedia and CBS Corporation. Top 40 tends to be underrepresented on the Internet, being mostly the domain of commercial broadcasters such as Virgin Radio UK.
Some of the most famous Top 40 stations have been Musicradio 77 WABC/New York City, Boss Radio 93 KHJ/Los Angeles, The Big 89 WLS/Chicago, 1050 CHUM/Toronto, Famous 56 WFIL/ Philadelphia, and The Big 68 WRKO/Boston. Today, there are popular Top 40 stations such as WHTZ-Z100/New York City, 102.7 KIIS-FM/Los Angeles, and KAMP-97.1 AMP Radio/Los Angeles.
A later development was freeform radio, later commercially developed as progressive rock radio, and still later even more commercially developed as AOR (Album-Oriented Rock), in which selections from an album would be played together, with an appropriate introduction.
Traditional free-form stations prided themselves on offering their disc jockeys freedom to play significant music and make significant social commentary and humor. This approach developed commercial problems because disc jockeys attracted to this freedom often had tastes substantially different from the audience, and lost audience share. Also, freeform stations could lack predictability, and listeners' loyalty could then be put at risk. Progressive rock radio (not to be confused with the progressive rock music genre) was freeform in style but constrained so that some kind of rock music was what was always or almost always played.
Responsible jocks would realize their responsibility to the audience to produce a pleasant show, and try to keep the station sound predictable by listening to other jocks, and repeating some of their music selections. WNEW-FM in New York during the 1970s exemplified this approach to progressive rock radio.
At their best, free-form stations have never been equaled for their degree of social activism, programmatic freedom, and listener involvement. However, to succeed, the approach requires genius jocks, totally in-tune with their audience, who are also committed to the commercial success of the radio station. This is a rare combination of traits. Even if such people are available, they often command extremely high salaries. However, this may be an effective approach for a new station, if talented jocks can be recruited and motivated at low salaries.
Freeform radio is particularly popular as a college radio format; offshoots include the recent (and somewhat controversial, due to its lack of on-air personalities) eclectic-pop format known as variety hits, which plays a wide assortment of mostly top-40 music from a span of several decades; and podcast radio, a mostly talk format pioneered by Infinity Broadcasting's KYOU station in California and Adam Curry's Podcast show on Sirius Satellite Radio.
AOR (album-oriented rock) developed as a commercial compromise between top-forties-style formulas and progressive rock radio/freeform. A program director or music consultant would select some set of music "standards" and require the playlist to be followed, perhaps in an order selected by the jock. The jock would still introduce each selection, but would have a scripted introduction available to use if he was not personally familiar with a particular piece of music and its artist. Playlist-management software greatly simplifies this process.
A useful, relatively safe compromise with the artistic freedom of the jocks is that a few times each hour, usually in the least commercially valuable slots of the hour, the disc-jockey can highlight new tracks that he or she thinks might interest the audience. The audience is encouraged to comment on the new tracks, allowing the station to track audience tastes. The freedom to introduce new artists can help a station develop its library.
Significant AOR offshoots include classic rock and adult album alternative.
Classic rock or oldies formats have been described as having the weakness of not playing new artists. This is true in a creative sense, but not a commercial one. Conventional wisdom in the radio industry is that stations will not get good ratings or revenue if they frequently play songs unfamiliar to their audience. This is why "Top 40" stations played only the biggest hits and why oldies and classic rock formats do the same for the eras they cover. An inherent danger to this philosophy is that closed-campus work environments, such as retail, foster an environment where listeners hear the station continuously for several hours, multiple days in a row, which can become unpleasant after a short amount of time. Oldies and related formats do have an inherent advantage in that they have a much broader time frame (up to 30 years, compared to the current hits of a top-40 station) from which to draw their playlist and can thus play a greater variety of songs than a station bound to devote the majority of its spins to a limited list. The ideal "classic" station (of whatever format) finds the balance between playing listener favorites frequently enough to develop a base while at the same time cultivating a playlist broad enough not to breed contempt. Nevertheless, there seems to be a cottage industry of Internet stations specializing in specific forms of classic rock and oldies, particularly psychedelic rock and progressive rock.
The oldies and classic rock formats have a strong niche market, but as the audience becomes older the station becomes less attractive to advertisers. Advertisers perceive older listeners as set in their brand choices and not as responsive to advertising as younger, more impulsive listeners. Oldies stations must occasionally change to more youthful music formats; as a result, the definition of what constitutes an "oldies" station has gradually changed over the years (and the phrase "oldies" itself is falling out of use except for the stations that still regularly play music from the 1960s and earlier). This is why many oldies stations, like WCBS-FM in New York City and WJMK in Chicago, have switched over to the younger-oriented Jack FM format in recent years—although WCBS-FM adopted a Classic Hits format on July 12, 2007, and the "Jack FM" format was moved to its HD2 subchannel. Unlike WCBS-FM's pre-JACK format which was centered on the 1955-1979 era, the post-JACK station was based on the 1964-1989 era because of the aging listener demographics of the original format.
This preference for younger listeners caused the decline of the "Big Band" or "Standards" music formats that covered music from the 1930s to the 1950s. As the audience grew too old for advertisers, the radio stations that carried these formats saw a sharp loss of ratings and revenue. This left them with no choice but to adopt more youthful formats, though the Standards format (also known as the Great American Songbook from the series of albums produced by rocker Rod Stewart) has undergone something of an off-air revival, with artists such as Stewart, Tony Bennett and Queen Latifah putting their own interpretation on the music.
During the mid-to-late-1990s, the "Mix" format—a loosely defined mixture of Top-40 and classic rock with something of an emphasis on adult contemporary music—began to appear across the country. While the format has no particular standard identity, most "mix" stations have rotations consisting largely of pop and rock music from the 1980s and 1990s (and often the 1970s), with some current material mixed in. In addition, stations devoted to the pop music of the 1970s, 1980s, and 1990s on their own have developed as the audiences that grew up with that music grew older and nostalgic for the sounds of their youth. The continued play of such songs on such stations has also exposed the music to younger generations that can take a renewed interest in the music, prolonging its shelf life beyond what conventional wisdom in the industry may suggest.
The full service format is a more freeform variant of this type of format. Full-service stations will often mix the oldies, classic rock, classic hits, and adult standards music, occasionally with music found in formats such as beautiful music, adult contemporary, or classic country. On weekends, specialty or niche programs focusing on formats such as Celtic music, polka and Italian music (depending on the ethnicity of the area) are common. In addition to music, a limited amount of local talk programming is heard on most full-service stations. Full service tends to be heard primarily on rural stations.
These formats all have small but very loyal audiences in the largest markets. Most follow formats similar to the above (Top 40s, Freeform, AOR and Oldies), except with a different playlist. Public service stations following these formats tend to be "freeform" stations.
Regional Mexican is a Latin music radio format, typically including Banda, Conjunto, Corridos, Duranguense, Grupero, Huapango, Mariachi, New Mexico music, Norteña, Ranchera, and Tejano music. It is the most popular radio format targeting Hispanic Americans in the United States.[2] The large number of immigrants from Northern Mexico can lead to an emphasis upon Norteña on Regional Mexican radio stations,[1] though markets with larger Hispanic audiences can have multiple Regional Mexican stations, varying in the region from which their music is drawn and where it is popular.
Easy listening and adult contemporary are related formats that play largely down-tempo pop music of various styles. The difference is mostly in the era and styles covered: Easy Listening is mostly older music done in the style of standards from the early 20th century (typical artists include Johnny Mathis and Frank Sinatra) combined with Big Band music and more modern performers in the same style such as Céline Dion and Josh Groban, while Adult Contemporary focuses more on newer pop music from the 1970s on. An ancestor to the easy listening format is Beautiful Music, a now-rare format (though XM features one channel of it, called Sunny) focusing mostly on smooth jazz or classical arrangements of pop music and original compositions in a similar vein. Perhaps the best-known Adult Contemporary station currently in operation is WLTW in New York City, better known as 106.7 Lite FM.
Jazz stations generally play either traditional jazz forms or smooth jazz. The jazz station, more than any other except the college station, is stereotyped as having a small listenership and a somewhat overly highbrow on-air personality, and many are college-run stations. California State University Long Beach sponsors KJAZZ 88.1, which has a fairly significant online listenership as well. Two very well known smooth jazz stations are WNUA in Chicago and 94.7 The Wave in Los Angeles, both of which were introduced in 1987 and enjoyed long runs of success in the format. Also, WUCF-FM in Orlando has been playing jazz music since 1978. Both traditional and smooth jazz stations have been in severe decline, both on commercial and noncommercial stations, since the 2000s, in part because of the formats' lower profitability compared to other formats (adult contemporary for commercial stations, NPR-driven news/talk for noncommercial ones).
Blues programming is generally limited to niche programs on stations that primarily broadcast other formats. An exception to this is CIDG-FM, an all-blues station based in the Canadian city of Ottawa.
Dance music is a niche, and so-called "rhythmic pop" stations have had a fierce but not always commercially sustainable following. There was a wide spectrum of disco-format radio stations during the late 1970s, but virtually all of them died out during the disco backlash; WXKS in Boston is one of the few notable survivors, now a Clear Channel Communications-owned top-40 station of considerable influence. Nevertheless, there are a large number of dance music stations available both on the internet and on satellite radio, mostly specializing in various forms of electronica. Both major US satellite radio services include disco stations.
Rock music has a long and honorable radio tradition going back to DJs like Wolfman Jack and Alan Freed, and as a result variations on rock radio are fairly common. The classic rock and oldies formats are discussed above; in addition to those, however, there are several genres of music radio devoted to different aspects of modern rock music. Alternative rock grew out of the grunge scene of the late 1980s and early 1990s and is particularly favored by college radio and adult album alternative stations; there is a strong focus on songwriters and bands with an outsider sound or a more sophisticated sound than the "three chord wonder" cliché. Meanwhile, other stations focus on heavy metal, punk rock, or the various post-punk and pop-influenced sounds known collectively as "modern rock".
Narrow-interest rock stations are particularly common on the Internet and satellite radio scenes, broken down into genres such as punk, metal, classic rock, indie music, and the like. There is a general feeling among radio connoisseurs that rock radio is becoming badly watered down by big corporate ownership, leading to a considerable do-it-yourself spirit.
While stereotyped as rural music, the Country music format is common and popular throughout the United States and in some other countries (particularly Canada and Australia, both of which share much of the same Anglo-Saxon and Celtic roots as the United States). Country has been a popular radio format since the early days of music radio, dating back to the early days of radio itself when barn dance radio programs were widely popular; however, the format was indeed originally a predominantly rural phenomenon, especially on AM radio. Decades worth of efforts at mainstreaming the format eventually paid off when country radio became widely popular among a large number of FM radio stations that signed on in the suburban United States in the 1980s and early 1990s.
For most mainstream country stations, the emphasis is generally on current pop country, following the same process as top 40; the remaining music in a particular station's library generally uses music from the past fifteen years (shorter for "hot country" or "new country" stations), with the exact music used varying depending on the station and the style of music the listener wants to hear.
Classic country is a variant of the country music format; it is effectively the country music analog to oldies. Classic country is generally preserved in the rural AM stations that country music aired on before its mainstream expansion. Depending on the music mix, it can play either relatively recent classic country tunes from the 1970s to the 1990s (generally more favorable to advertisers) or can span all the way back to the 1920s, thus playing music far older than almost any other radio format available.
Due to increasing similarities between country music and some variants of rock music (such as southern rock, country rock and heartland rock), there have been efforts at combining country and rock formats together, most of which have been unsuccessful.
An alternative country format is Americana, which eschews the mainstream pop country songs in favor of classic-era, alt country and cult musicians. Like the music it plays, these stations can develop strong cult followings and listener loyalty, but they are also less commercially successful than pop country stations.
The explosive rise in popularity during the 1980s of rap music has led to a large number of radio stations specializing in rap/hip-hop and R&B music (with the exception of classic R&B such as Motown, which is as often as not the province of Oldies stations). This format is popular among all ethnic groups and social classes.
Dance music radio focuses on live DJ sets and hit singles from genres of techno, house, electro, drum and bass, UK garage and big beat. While some stations play all kinds of electronic dance music, others (mainly pirate radio stations) focus on particular genres. This format is popular in England, Germany, Netherlands and some other countries, but less so in the United States (where dance is a niche format often exclusive to internet radio stations).
However, the number of U.S. stations airing such content has grown; five terrestrial radio stations in the U.S. with a purely dance-oriented format (one of which airing it part-time during the night and early-morning hours) report their airplay to the Billboard Dance/Mix Show Airplay chart, while top 40 and rhythmic stations may also air EDM songs that have crossed over onto pop-oriented charts due to the recent growth in mainstream popularity of dance music.
Christian and gospel radio stations usually play popular contemporary Christian music, with others playing gospel music or praise and worship. K-Love and Air1 are among the most popular Christian radio stations. In the United States, Christian music stations (especially ostensibly noncommercial ones) often rely heavily on brokered programming: a Christian music station typically only carries music for part of the day, with the rest of the day filled by evangelists who pay to place recordings of their sermons on the station.
Some music radio is broadcast by public service organizations, such as National Public Radio or the BBC. In the United States, public radio is typically confined to three formats: news/talk, classical music, or jazz, the last of which is declining rapidly as of the late 2000s. In other countries, where national broadcasters hold significantly more clout, formats can vary more widely.
Community radio often relies heavily on the music format because it is relatively cheap and generally makes for easy listening.
Commercial stations charge advertisers for the estimated number of listeners. The larger the audience, the higher the stations' rate card can be for commercial advertising.
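As a rough illustration of how audience size drives the rate card, ad spots are commonly priced per thousand listeners reached (CPM). The function and the CPM figure below are hypothetical examples, not an industry standard:

```python
def spot_rate(estimated_listeners: int, cpm_dollars: float) -> float:
    """Estimate the price of one ad spot from audience size.

    cpm_dollars is a hypothetical cost per thousand listeners;
    real rate cards also factor in daypart, demographics and demand.
    """
    return estimated_listeners / 1000 * cpm_dollars

# A station reaching 50,000 listeners at a $4 CPM could ask about $200 per spot.
```

The same arithmetic explains why a station with twice the audience can charge roughly twice as much for the same airtime.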
Commercial stations program the format of the station to gain as large a slice of the demographic audience as possible.
A station's value is usually measured as a percentage of market share in a market of a certain size. In U.S. markets, that share has historically been measured by Arbitron's diary-based statistical service, described above.
Market share is not always a consideration, because not all radio stations are commercial. Public radio is funded by government and private donors. Since most public broadcasting operations don't have to make a profit, no commercials are necessary. (In fact, because most public broadcasting stations operate under noncommercial licenses from their country's broadcasting regulator, they may not be allowed to sell advertising at all.) Underwriting spots, which mention the name of a sponsor and some information but cannot include “calls to action” attempting to convince the listener to patronize the sponsor, may be allowed.
Also, satellite radio either charges subscribers or is operated by a public broadcasting service. Therefore, satellite radio rarely carries commercials or tries to raise money from donors. The lack of commercial interruptions in satellite radio is an important advantage. Often the only breaks in a satellite music station's programming are for station identification and DJ introductions.
Internet radio stations exist that follow all of these plans.
Much early commercial radio was completely freeform; this changed drastically with the payola scandals of the 1950s. As a result, DJs seldom have complete programming freedom. Occasionally a special situation or highly respected, long established personality is given such freedom. Most programming is done by the program director. Program directors may work for the station or at a central location run by a corporate network. The DJ's function is generally reduced to introducing and playing songs. Many stations target younger listeners, because advertisers believe that advertising can change a younger person's product choice. Older people are thought to be less easy to change.
Music radio has several possible arrangements. Originally, it had blocks of sponsored airtime that played music from a live orchestra. In the 1930s, phonograph records, especially the single, let a disc jockey introduce individual songs, or introduce blocks of songs. Since then, the program has been arranged so that commercials are followed by the content that is most valuable to the audience.
Programming is different for non-traditional broadcasting. The Jack FM format eliminates DJs entirely, as do many internet radio stations. The music is simply played. If it is announced, it is by RDS (for FM broadcast) or ID3 tags (for Internet broadcast). Satellite radio usually uses DJs, but their programming blocks are longer and not distinguished much by the time of day. In addition, receivers usually display song titles, so announcing them is not needed.
Internet and satellite broadcasting are not considered public media, so treaties and statutes concerning obscenity, transmission of ciphers and public order do not apply to those formats. So, satellite and internet radio are free to provide sexually explicit, coarse and political material. Typical providers include Playboy Radio, uncensored rap and hard rock stations, and "outlaw" country music stations.
The wide reach and selective, non-broadcast usage of the internet allows programmers access to special interest audiences. As a result, both mainstream and narrow-interest webcasts flourish; in particular, electronic music stations are much more common on the Internet than they are in satellite or broadcast media.
Outside the English-speaking world, several radio formats built around local musical genres are popular. Examples include Portuguese fado, Spanish-language Regional Mexican, reggaeton and Tejano, French Cajun music (especially in French Louisiana), Russian shanson, and (since the late 2000s) Korean K-pop.
Stations usually adopt a music format to gain the greatest number of listeners for the least expense. Since the content has already been produced, the station merely adds the low-cost on-air programming between records.
Music radio stations pay music-licensing fees to licensing agencies such as ASCAP and BMI in the United States or PRS in the UK. These fees or royalties are generally paid to the songwriters; the musicians themselves typically do not get a cut of radio royalties, even if they own a share of the performance rights, unless they wrote the song themselves. (Thus, a song that is in the public domain is free to play on the radio, regardless of who performs it or when it was performed.) For example, the industry-wide fees payable to ASCAP in 2004 were $176 million. Commercial stations often get their CDs free, but still pay royalties to play them on air. Some small neighborhood stations play unlisted locally produced music, and avoid these fees.
Licensing issues nearly destroyed early Internet radio. In the U.S., Congress intervened with a royalty structure that was expensive for small independent operators, but easier than the RIAA's standard scale. Both XM and Sirius provide commercial packages allowing exclusive license-free use (though not rebroadcast) of their music programming by businesses. Popular internet radio networks such as Pandora and Digitally Imported pay annual royalty fees to SoundExchange.
Music radio, particularly top 40, has often acted as both a barometer and an arbiter of musical taste, and radio airplay is one of the defining measures of success in the mainstream musical world. In fact, the rise of rock music to popularity is intimately tied to the history of music radio. Early forms of rock had languished in poor areas of the South, enjoyed mostly by rural black audiences, with notable exposure in Memphis, Tennessee due to the all-African American programming of WDIA. Rock music entered the mainstream during the 1950s because of controversial white DJs such as Dewey Phillips, Alan Freed, Dick Clark and Wolfman Jack who had an appreciation for black music.
For many years, many listeners have been dissatisfied with the content of radio programming since the decline of early free form rock radio. The popularity of offshore pirate radio stations in the United Kingdom was an early symptom of frustration with the often overly safe and occasionally politicized playlists of commercial radio.
The growth of Internet radio from a small experimenter's toy in the mid-1990s to a huge phenomenon allowing both small do-it-yourselfers and large commercial stations to make their offerings available worldwide was seen as a threat to over-the-air music broadcasting, and was nearly shut down by onerous licensing demands made by the recording industry. Meanwhile, the rise of satellite radio services as a major competitor has brought many of the advantages of Internet radio to an increasingly mobile listening public, including lack of censorship, greater choice, a more eclectic approach to format programming, and static-free digital sound quality. Indeed, one-size-fits-all programming is no longer seen as tenable by some, as the diversity of musical tastes among the listening public has created a proliferation of radio formats in what some might call a form of narrowcasting.
|
https://en.wikipedia.org/wiki?curid=20841
|
Massively multiplayer online role-playing game
A massively multiplayer online role-playing game (MMORPG) is a video game that combines aspects of a role-playing video game and a massively multiplayer online game.
As in all role-playing games (RPGs), the player assumes the role of a character (often in a fantasy world or science-fiction world) and takes control over many of that character's actions. MMORPGs are distinguished from single-player or small multi-player online RPGs by the number of players able to interact together, and by the game's persistent world (usually hosted by the game's publisher), which continues to exist and evolve while the player is offline and away from the game.
MMORPGs are played throughout the world. Worldwide revenues for MMORPGs exceeded half a billion dollars in 2005, and Western revenues exceeded a billion dollars in 2006. In 2008, the spending on subscription MMORPGs by consumers in North America and Europe grew to $1.4 billion.
"World of Warcraft", a popular MMORPG, has over 10 million subscribers as of November 2014. "World of Warcraft"'s total revenue was $1.04 billion in 2014. "", released in 2011, became the world's 'Fastest-Growing MMO Ever' after gaining more than 1 million subscribers within the first three days of its launch.
Although modern MMORPGs sometimes differ dramatically from their predecessors, many of them share the same basic characteristics. These include several common features: persistent game environment, some form of level progression, social interaction within the game, in-game culture, system architecture, membership in a group, and character customization.
The majority of popular MMORPGs are based on traditional fantasy themes, often occurring in an in-game universe comparable to that of "Dungeons & Dragons". Some employ hybrid themes that either merge or replace fantasy elements with those of science fiction, sword and sorcery, or crime fiction. Others draw thematic material from American comic books, the occult, and other genres. These elements are often developed using similar tasks and scenarios involving quests, monsters, and loot.
In nearly all MMORPGs, the development of the player's character is the primary goal. Nearly all MMORPGs feature a character progression system, in which players earn experience points for their actions and use those points to reach character "levels", which makes them better at whatever they do. Traditionally, combat with monsters and completing quests for non-player characters, either alone or in groups, are the primary ways to earn experience points. The accumulation of wealth (including combat-useful items) is also a way to progress in many MMORPGs. This is traditionally best accomplished via combat. The cycle produced by these conditions, combat leading to new items allowing for more combat with no change in gameplay, is sometimes pejoratively referred to as the level treadmill, or "grinding". The role-playing game "Progress Quest" was created as a parody of this trend. "Eve Online" trains skills in real time rather than using experience points as a measure of progression.
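The experience-point progression described above can be sketched in Python. The quadratic XP formula and the numbers here are illustrative assumptions, not any particular game's tuning:

```python
# A minimal sketch of an MMORPG-style character progression system.
# The xp_required formula is hypothetical; real games hand-tune these tables.
def xp_required(level):
    """XP needed to advance from `level` to `level + 1` (quadratic growth)."""
    return 100 * level * level

def gain_xp(level, xp, amount):
    """Add `amount` XP, carrying leftover XP into new levels as thresholds are met."""
    xp += amount
    while xp >= xp_required(level):
        xp -= xp_required(level)
        level += 1
    return level, xp

level, xp = 1, 0
level, xp = gain_xp(level, xp, 250)   # e.g. experience from combat or quests
```

The loop captures the "level treadmill": each level demands more of the same activity, since the threshold grows while the gameplay producing XP stays constant.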
In some MMORPGs, there is no limit to a player's level, allowing the grinding experience to continue indefinitely. MMORPGs that use this model often glorify top ranked players by displaying their avatars on the game's website or posting their stats on a high score screen. Another common practice is to enforce a maximum reachable level for all players, often referred to as a level cap. Once reached, the definition of a player's progression changes. Instead of being awarded primarily with experience for completing quests and dungeons, the player's motivation to continue playing will be replaced with collecting money and equipment.
Often, the widened range of equipment available at the maximum level has increased aesthetic value, to distinguish high-ranking players in game from lower-ranked ones. Colloquially known as endgame gear, this set of empowered weapons and armor adds a competitive edge to both scripted boss encounters and player-vs-player combat. Player motivation to outperform others is fueled by acquiring such items, which are a significant determining factor in success or failure in combat-related situations.
MMORPGs almost always have tools to facilitate communication between players. Many MMORPGs offer support for in-game guilds or clans, though these will usually form whether the game supports them or not.
In addition, most MMOGs require some degree of teamwork in parts of the game. These tasks usually require players to take on roles in the group, such as protecting other players from damage (called tanking), "healing" damage done to other players, or damaging enemies.
MMORPGs generally have Game Moderators or Game Masters (frequently referred to as GMs or "mods"), who may be paid employees or unpaid volunteers who attempt to supervise the world. Some GMs may have additional access to features and information related to the game that are not available to other players and roles.
Relationships formed in MMORPGs can often be just as intense as relationships formed between friends or partners met outside the game, and often involve elements of collaboration and trust between players.
Most MMORPGs provide different types of classes that players can choose. Among those classes, a small portion of players choose to roleplay their characters, and there are rules that provide functionality and content to those who do. Community resources such as forums and guides exist in support of this play style.
For example, if players want to play a priest role in their MMORPG world, they might buy a cope from a shop and learn priestly skills, proceeding to speak, act, and interact with others as their character would. This may or may not include pursuing other goals such as wealth or experience. Guilds or similar groups with a focus on roleplaying may develop extended in-depth narratives using the setting and resources similar to those in the game world.
Over time, the MMORPG community has developed a sub-culture with its own slang and metaphors, as well as an unwritten list of social rules and taboos. Players will often complain about 'grind' (a slang term for any repetitive, time-consuming activity in an MMORPG), or talk about 'buffs' and 'nerfs' (respectively an upgrade or downgrade of a particular game mechanic).
As with all such cultures, social rules exist for such things as invitations to join an adventuring party, the proper division of treasure, and how a player is expected to behave while grouped with other players.
Debate rages in various gaming media over the long-term effect of video game overuse.
Most MMORPGs are deployed using a client–server system architecture. The server software generates a persistent instance of the virtual world that runs continuously, and players connect to it via client software. The client software may provide access to the entire playing world, or additional 'expansions' may need to be purchased to allow access to certain areas of the game. "EverQuest" and "Guild Wars" are two examples of games that use such a format. Players generally must purchase the client software for a one-time fee, although an increasing trend is for MMORPGs to work using pre-existing "thin" clients, such as a web browser.
Some MMORPGs require payment or a monthly subscription to play. By definition, "massively multiplayer" games are always online, and most require some sort of continuous revenue (such as monthly subscriptions and advertisements) for maintenance and development purposes. Some games, such as "Guild Wars", have disposed of the 'monthly fee' model entirely, and recover costs directly through sales of the software and associated expansion packs. Still others adopt a micropayment model where the core content is free, but players are given the option to purchase additional content, such as equipment, aesthetic items, or pets. Games that make use of this model often have originated in Korea, such as "Flyff" and "MapleStory". This business model is alternately called "pay for perks" or "freemium", and games using it often describe themselves with the term "free-to-play".
Depending on the number of players and the system architecture, an MMORPG might be run on multiple separate servers, each representing an independent world, where players from one server cannot interact with those from another; "World of Warcraft" is a prominent example, with each separate server housing several thousand players. In many MMORPGs the number of players in one world is often limited to around a few thousand, but a notable example of the opposite is "EVE Online", which accommodates several hundred thousand players on the same server, with over 60,000 playing simultaneously (June 2010) at certain times. Some games allow characters to appear on any world, but not simultaneously (such as "Seal Online: Evolution"); others limit each character to the world in which it was created. "World of Warcraft" has experimented with "cross-realm" (i.e. cross-server) interaction in player vs player "battlegrounds", using server clusters or "battlegroups" to co-ordinate players looking to participate in structured player vs player content such as the Warsong Gulch or Alterac Valley battlegrounds. Additionally, patch 3.3, released on December 8, 2009, introduced a cross-realm "looking for group" system to help players form groups for instanced content (though not for open-world questing) from a larger pool of characters than their home server can necessarily provide.
MMORPG is a term coined by Richard Garriott to refer to massive multiplayer online role-playing games and their social communities. Previous to this and related coinages, these games were generally called graphical MUDs; the history of MMORPGs traces back directly through the MUD genre. Through this connection, MMORPGs can be seen to have roots in the earliest multi-user games such as "Mazewar" (1974) and "MUD1" (1978). 1985 saw the release of a roguelike (pseudo-graphical) MUD called "Island of Kesmai" on CompuServe and Lucasfilm's graphical MUD Habitat. The first fully graphical multi-user RPG was "Neverwinter Nights", which was delivered through America Online in 1991 and was personally championed by AOL President Steve Case. Other early proprietary graphical online RPGs include three on The Sierra Network: "The Shadow of Yserbius" in 1992, "The Fates of Twinion" in 1993, and "The Ruins of Cawdor" in 1995. Another milestone came in 1995 as NSFNET restrictions were lifted, opening the Internet up for game developers, which allowed for the first truly "massively"-scoped titles. Finally, MMORPGs as defined today began with "Meridian 59" in 1996, innovative both in its scope and in offering first-person 3D graphics, with "The Realm Online" appearing nearly simultaneously. "Ultima Online", released in 1997, is often credited with first popularizing the genre, though more mainstream attention was garnered by 1999's "EverQuest" and "Asheron's Call" in the West and 1996's "" in South Korea.
The financial success of these early titles has ensured competition in the genre since that time. MMORPG titles now exist on consoles and in new settings. As of 2008, the market for MMORPGs has Blizzard Entertainment's "World of Warcraft" dominating as the largest MMORPG, alongside other titles such as "Final Fantasy XIV" and "Guild Wars 2", though an additional market exists for free-to-play MMORPGs, which are supported by advertising and purchases of in-game items. This free-to-play model is particularly common in South Korean titles such as "MapleStory", "", and "Atlantica Online". Also, there are some free-to-play games, such as "RuneScape" and "Tibia", where the base game is free but a monthly subscription unlocks additional features. "Guild Wars" and its sequel avoid some degree of competition with other MMORPGs by only requiring the initial purchase of the game to play.
The cost of developing a competitive commercial MMORPG title often exceeded $10 million in 2003. These projects require multiple disciplines within game design and development such as 3D modeling, 2D art, animation, user interfaces, client/server engineering, database architecture, and network infrastructure.
The front-end (or client) component of a commercial, modern MMORPG features 3D graphics. As with other modern 3D games, the front-end requires expertise with implementing 3D engines, real-time shader techniques and physics simulation. The actual visual content (areas, creatures, characters, weapons, spaceships and so forth) is developed by artists who typically begin with two-dimensional concept art, and later convert these concepts into animated 3D scenes, models and texture maps.
Developing an MMOG server requires expertise with client/server architecture, network protocols, security, and database design. MMORPGs include reliable systems for a number of vital tasks. The server must be able to handle and verify a large number of connections, prevent cheating, and apply changes (bug fixes or added content) to the game. A system for recording the games data at regular intervals, without stopping the game, is also important.
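The requirement of recording game data at regular intervals without stopping the game can be sketched as follows. This is a simplified illustration (names and data shapes are invented): the world state is copied under a short-lived lock, and the slow disk write happens outside it, so the simulation is never paused for the duration of the save:

```python
import copy
import json
import threading

# A sketch of non-blocking periodic snapshots of MMO world state.
# The world structure here is illustrative, not any real game's schema.
world = {"players": {"alice": {"level": 12, "gold": 300}}}
world_lock = threading.Lock()

def snapshot(path):
    """Persist a consistent copy of the world without halting the game loop."""
    with world_lock:                  # hold the lock only long enough to copy
        state = copy.deepcopy(world)
    with open(path, "w") as f:        # slow I/O happens outside the lock
        json.dump(state, f)

# In a real server this would run on a timer thread; here we call it once.
snapshot("world_backup.json")
```

A production server would more likely stream incremental changes to a database, but the principle is the same: isolate the consistent-copy step from the expensive persistence step.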
Maintenance requires sufficient servers and bandwidth, and a dedicated support staff. Insufficient resources for maintenance lead to lag and frustration for the players, and can severely damage the reputation of a game, especially at launch. Care must also be taken to ensure that player population remains at an acceptable level by adding or removing servers. Peer-to-peer MMORPGs could theoretically work cheaply and efficiently in regulating server load, but practical issues such as asymmetrical network bandwidth, CPU-hungry rendering engines, unreliability of individual nodes, and inherent lack of security (opening fertile new grounds for cheating) can make them a difficult proposition. The hosted infrastructure for a commercial-grade MMORPG requires the deployment of hundreds (or even thousands) of servers. Developing an affordable infrastructure for an online game requires developers to scale large numbers of players with less hardware and network investment.
In addition, the development team will need to have expertise with the fundamentals of game design: world-building, lore and game mechanics, as well as what makes games fun.
Though the vast majority of MMORPGs are produced by companies, many small teams of programmers and artists have contributed to the genre. As shown above, the average MMORPG development project requires enormous investments of time and money, and running the game can be a long-term commitment. As a result, non-corporate (or independent, or "indie") development of MMORPGs is less common compared to other genres. Still, many independent MMORPGs do exist, representing a wide spectrum of genres, gameplay types, and revenue systems.
Some independent MMORPG projects are completely open source, while others feature proprietary content made with an open-source game engine. The WorldForge project has been active since 1998 and formed a community of independent developers who are working on creating framework for a number of open-source MMORPGs. The Multiverse Foundation has also created a platform specifically for independent MMOG developers.
As there are a number of wildly different titles within the genre, and since the genre develops so rapidly, it is difficult to definitively state that the genre is heading in one direction or another. Still, there are a few obvious developments. One of these developments is the raid group quest, or "raid", which is an adventure designed for large groups of players (often twenty or more).
Instance dungeons, sometimes shortened to "instances", are game areas that are "copied" for individual players or groups, which keeps those in the instance separated from the rest of the game world. This reduces competition and also the amount of data that needs to be sent to and from the server, reducing lag. "The Realm Online" was the first MMORPG to use a rudimentary form of this technique; "Anarchy Online" developed it further, using instances as a key element of gameplay. Since then, instancing has become increasingly common. The "raids", as mentioned above, often involve instance dungeons. Examples of games which feature instances are "World of Warcraft", "The Lord of the Rings Online", "EverQuest", "EverQuest II", "", "Final Fantasy XIV", "Guild Wars", "Rift", "RuneScape", "Star Trek Online" and "DC Universe Online".
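The idea of "copying" a game area per group can be sketched as a map keyed by zone and party. This is a toy model (zone names and contents are invented), but it shows why one party's actions never touch another party's copy:

```python
# A sketch of instanced dungeons: each party gets its own private copy of a
# zone, keyed by (zone, party id). Zone contents here are purely illustrative.
instances = {}

def enter_instance(zone, party_id):
    key = (zone, party_id)
    if key not in instances:
        # "Copy" the zone: a real server would clone monsters, loot, scripts.
        instances[key] = {"zone": zone, "party": party_id, "monsters": 10}
    return instances[key]

a = enter_instance("Deadmines", party_id=1)
b = enter_instance("Deadmines", party_id=2)
a["monsters"] -= 1   # party 1's kills do not affect party 2's copy
```

Because each key maps to independent state, the server also only needs to send each client the data for its own copy, which is the lag reduction mentioned above.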
Increased amounts of "player-created content" is another trend.
The use of intellectual property licensing common in other video game genres has also appeared in MMORPGs. 2007 saw the release of "The Lord of the Rings Online", based on J. R. R. Tolkien's Middle-earth. Other licensed MMORPGs include "The Matrix Online", based on the "Matrix" trilogy of films, "", based on Games Workshop's table top game, "Star Wars Galaxies", "Star Wars The Old Republic", "Champions Online" and "Age of Conan".
Additionally, several licenses from television have been optioned for MMORPGs, for example "Star Trek Online" and "Stargate Worlds" (which was later canceled).
The first console-based MMORPG was "Phantasy Star Online" for the Sega Dreamcast. The first console-based open-world MMORPG was "Final Fantasy XI" for the PlayStation 2.
"EverQuest Online Adventures", also on the PlayStation 2, was the first console MMORPG in North America.
Although console-based MMORPGs are considered more difficult to produce, the platform is gaining more attention.
With the popularization of Facebook and microtransactions has come a new wave of Flash- and HTML5-based MMORPGs that use the free-to-play model. They require no download outside of a browser and usually have heavily integrated social media sharing features. Examples of browser-based MMORPGs are collected in the list of multiplayer browser games.
Smartphones, with their GPS capabilities (among others), enable augmented-reality games such as "Ingress" and "Pokémon Go". The games are enhanced by location- and distance-based tracking, benchmarking goals, or facilitating trade between players.
Since the interactions between MMORPG players are real, even if the environments are virtual, psychologists and sociologists are able to use MMORPGs as tools for academic research. Sherry Turkle has found that many people have expanded their emotional range by exploring the many different roles (including gender identities) that MMORPGs allow a person to explore.
Nick Yee has surveyed more than 35,000 MMORPG players over the past several years, focusing on psychological and sociological aspects of these games. Recent findings included that 15% of players become a guild-leader at one time or another, but most generally find the job tough and thankless; and that players spend a considerable amount of time (often a third of their total time investment) doing things that are external to gameplay but part of the metagame.
Many players report that the emotions they feel while playing an MMORPG are very strong, to the extent that 8.7% of male and 23.2% of female players in a statistical study have had an online wedding. Other researchers have found that the enjoyment of a game is directly related to the social organization of a game, ranging from brief encounters between players to highly organized play in structured groups.
In a study by Zaheer Hussain and Mark D. Griffiths, it was found that just over one in five gamers (21%) said they preferred socializing online to offline. Significantly more male gamers than female gamers said that they found it easier to converse online than offline. It was also found that 57% of gamers had created a character of the opposite gender, and it is suggested that the online female persona has a number of positive social attributes.
A German fMRI study conducted by researchers of the Central Institute of Mental Health points towards impairments in social, emotional and physical aspects of the self-concept and a higher degree of avatar identification in addicted MMORPG players, compared to non-addicted and naive (nonexperienced) people. These findings generally support Davis' cognitive behavioral model of Internet addiction, which postulates that dysfunctional self-related cognitions represent central factors contributing towards the development and maintenance of MMORPG addiction. The high degree of avatar identification found by Leménager et al. in the addicted group of this study indicates that MMORPG playing may represent an attempt to compensate for impairments in self-concept. Psychotherapeutic interventions should therefore focus on the development of coping strategies for real-life situations in which addicted players tend to experience themselves as incompetent and inferior.
Richard Bartle, author of "Designing Virtual Worlds", classified multiplayer RPG-players into four primary psychological groups. His classifications were then expanded upon by Erwin Andreasen, who developed the concept into the thirty-question Bartle Test that helps players determine which category they are associated with. With over 650,000 test responses as of 2011, this is perhaps the largest ongoing survey of multiplayer game players. Based on Bartle and Yee's research, Jon Radoff has published an updated model of player motivation that focuses on immersion, competition, cooperation and achievement. These elements may be found not only in MMORPGs, but many other types of games and within the emerging field of gamification.
In "World of Warcraft", a temporary design glitch attracted the attention of psychologists and epidemiologists across North America, when the "Corrupted Blood" disease of a monster began to spread unintentionally—and uncontrollably—into the wider game world. The Centers for Disease Control used the incident as a research model to chart both the progression of a disease, and the potential human response to large-scale epidemic infection.
Many MMORPGs feature living economies. Virtual items and currency have to be gained through play and have definite value for players. Such a virtual economy can be analyzed (using data logged by the game) and has value in economic research. More significantly, these "virtual" economies can affect the economies of the real world.
One of the early researchers of MMORPGs was Edward Castronova, who demonstrated that a supply-and-demand market exists for virtual items and that it crosses over with the real world. This crossover has some requirements of the game:
The idea of attaching real-world value to "virtual" items has had a profound effect on players and the game industry, and even the courts. The virtual currency selling pioneer IGE received a lawsuit from a World of Warcraft player for interfering in the economics and intended use of the game by selling WoW gold. Castronova's first study in 2002 found that a highly liquid (if illegal) currency market existed, with the value of "EverQuest"'s in-game currency exceeding that of the Japanese yen. Some people even make a living by working these virtual economies; these people are often referred to as gold farmers, and may be employed in game sweatshops.
Game publishers usually prohibit the exchange of real-world money for virtual goods, but others actively promote the idea of linking (and directly profiting from) an exchange. In "Second Life" and "Entropia Universe", the virtual economy and the real-world economy are directly linked. This means that real money can be deposited for game money and vice versa. Real-world items have also been sold for game money in "Entropia", and some players of "Second Life" have generated revenues in excess of $100,000.
Some of the issues confronting online economies include:
Linking real-world and virtual economies is rare in MMORPGs, as it is generally believed to be detrimental to gameplay. If real-world wealth can be used to obtain greater, more immediate rewards than skillful gameplay, the incentive for strategic roleplay and real game involvement is diminished. It could also easily lead to a skewed hierarchy where richer players gain better items, allowing them to take on stronger opponents and level up more quickly than less wealthy but more committed players.
|
https://en.wikipedia.org/wiki?curid=20844
|
Multiplication
Multiplication (often denoted by the cross symbol ×, by the mid-line dot operator ·, by juxtaposition, or, on computers, by an asterisk *) is one of the four elementary mathematical operations of arithmetic, with the others being addition, subtraction and division. The result of a multiplication operation is called the product.
The multiplication of whole numbers may be thought as a repeated addition; that is, the multiplication of two numbers is equivalent to adding as many copies of one of them, the "multiplicand", as the value of the other one, the "multiplier". Both numbers can be called "factors".
For example, 4 multiplied by 3 (often written as 3 × 4 and spoken as "3 times 4") can be calculated by adding 3 copies of 4 together: 4 + 4 + 4 = 12.
Here 3 and 4 are the "factors" and 12 is the "product".
One of the main properties of multiplication is the commutative property: adding 3 copies of 4 gives the same result as adding 4 copies of 3, that is, 4 + 4 + 4 = 3 + 3 + 3 + 3 = 12.
Thus the designation of multiplier and multiplicand does not affect the result of the multiplication.
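The definition of whole-number multiplication as repeated addition, together with the commutative property above, can be sketched in Python:

```python
# Multiplication of whole numbers defined as repeated addition:
# add `multiplicand` to a running total, `multiplier` times.
def multiply(multiplier, multiplicand):
    total = 0
    for _ in range(multiplier):
        total += multiplicand
    return total

print(multiply(3, 4))   # 4 + 4 + 4 = 12
print(multiply(4, 3))   # 3 + 3 + 3 + 3 = 12, the same product
```

That the two calls agree for every pair of whole numbers is exactly the commutative property, so which factor is called "multiplier" and which "multiplicand" does not affect the result.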
The multiplication of integers (including negative numbers), rational numbers (fractions) and real numbers is defined by a systematic generalization of this basic definition.
Multiplication can also be visualized as counting objects arranged in a rectangle (for whole numbers) or as finding the area of a rectangle whose sides have given lengths. The area of a rectangle does not depend on which side is measured first, which illustrates the commutative property. The product of two measurements is a new type of measurement; for instance, multiplying the lengths of the two sides of a rectangle gives its area. This is the subject of dimensional analysis.
The inverse operation of multiplication is division. For example, since 4 multiplied by 3 equals 12, then 12 divided by 3 equals 4. Multiplication by 3, followed by division by 3, yields the original number (since the division of a number other than 0 by itself equals 1).
Multiplication is also defined for other types of numbers, such as complex numbers, and for more abstract constructs, like matrices. For some of these more abstract constructs, the order in which the operands are multiplied together matters. A listing of the many different kinds of products used in mathematics is given on a separate page.
In arithmetic, multiplication is often written using the sign "×" between the terms; that is, in infix notation. For example,
The sign is encoded in Unicode at .
There are other mathematical notations for multiplication:
In computer programming, the asterisk (as in 5*2) is still the most common notation. This is due to the fact that most computers historically were limited to small character sets (such as ASCII and EBCDIC) that lacked a multiplication sign (such as · or ×), while the asterisk appeared on every keyboard. This usage originated in the FORTRAN programming language.
The numbers to be multiplied are generally called the "factors". The number to be multiplied is the "multiplicand", and the number by which it is multiplied is the "multiplier". Usually the multiplier is placed first and the multiplicand is placed second; however sometimes the first factor is the multiplicand and the second the multiplier. Also as the result of a multiplication does not depend on the order of the factors, the distinction between "multiplicand" and "multiplier" is useful only at a very elementary level and in some multiplication algorithms, such as the long multiplication. Therefore, in some sources, the term "multiplicand" is regarded as a synonym for "factor". In algebra, a number that is the multiplier of a variable or expression (e.g., the 3 in 3"xy"2) is called a coefficient.
The result of a multiplication is called a product. A product of integers is a multiple of each factor. For example, 15 is the product of 3 and 5, and is both a multiple of 3 and a multiple of 5.
The common methods for multiplying numbers using pencil and paper require a multiplication table of memorized or consulted products of small numbers (typically any two numbers from 0 to 9); however, one method, the peasant multiplication algorithm, does not.
Multiplying numbers to more than a couple of decimal places by hand is tedious and error prone. Common logarithms were invented to simplify such calculations, since adding logarithms is equivalent to multiplying. The slide rule allowed numbers to be quickly multiplied to about three places of accuracy. Beginning in the early 20th century, mechanical calculators, such as the Marchant, automated multiplication of up to 10 digit numbers. Modern electronic computers and calculators have greatly reduced the need for multiplication by hand.
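The principle behind logarithm tables and the slide rule is that adding logarithms is equivalent to multiplying: log(a) + log(b) = log(a·b). A short demonstration (the specific numbers are arbitrary):

```python
import math

# Multiplying via logarithms, the principle behind log tables and slide rules:
# log10(a) + log10(b) = log10(a * b), so the product is 10 to that sum.
a, b = 37.0, 412.0
product_via_logs = 10 ** (math.log10(a) + math.log10(b))
print(product_via_logs)   # approximately 15244, which is 37 × 412
```

A slide rule performs the same addition mechanically, by placing the two logarithmic lengths end to end, which is why its accuracy is limited to about three significant figures.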
Methods of multiplication were documented in the Egyptian, Greek, Indian and Chinese civilizations.
The Ishango bone, dated to about 18,000 to 20,000 BC, hints at a knowledge of multiplication in the Upper Paleolithic era in Central Africa.
The Egyptian method of multiplication of integers and fractions, documented in the Ahmes Papyrus, was by successive additions and doubling. For instance, to find the product of 13 and 21 one had to double 21 three times, obtaining 42, 84, 168. The full product could then be found by adding the appropriate terms found in the doubling sequence:
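The doubling method can be sketched in Python as a modern bitwise rendering (the scribes worked with doubling tables, not binary digits, but the selection of terms is the same):

```python
# Egyptian multiplication by successive doubling: to compute a × b, double b
# repeatedly and add the doublings picked out by the binary digits of a.
def egyptian_multiply(a, b):
    total = 0
    power, doubling = 1, b
    while power <= a:
        if a & power:          # this doubling is needed (binary digit of a is 1)
            total += doubling
        power <<= 1
        doubling <<= 1         # 21, 42, 84, 168, ...
    return total

print(egyptian_multiply(13, 21))   # 21 + 84 + 168 = 273
```

Since 13 = 1 + 4 + 8, the terms selected from the doubling sequence are 21, 84 and 168, matching the example in the text. Essentially the same idea survives as the peasant multiplication algorithm and as binary shift-and-add multiplication in hardware.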
The Babylonians used a sexagesimal positional number system, analogous to the modern day decimal system. Thus, Babylonian multiplication was very similar to modern decimal multiplication. Because of the relative difficulty of remembering different products, Babylonian mathematicians employed multiplication tables. These tables consisted of a list of the first twenty multiples of a certain "principal number" "n": "n", 2"n", ..., 20"n"; followed by the multiples of 10"n": 30"n", 40"n", and 50"n". Then to compute any sexagesimal product, say 53"n", one only needed to add 50"n" and 3"n" computed from the table.
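The table method just described can be sketched in Python. This is an illustrative model of the lookup-and-add step, not a claim about Babylonian notation:

```python
# A sketch of the Babylonian table method: store multiples n..20n plus
# 30n, 40n, 50n, then form any multiple up to 59n as the sum of two lookups.
def babylonian_table(n):
    table = {k: k * n for k in range(1, 21)}
    table.update({k: k * n for k in (30, 40, 50)})
    return table

def multiply_with_table(table, k):
    # e.g. 53n = 50n + 3n, both read straight from the table
    tens, ones = (k // 10) * 10, k % 10
    return table.get(tens, 0) + table.get(ones, 0)

t = babylonian_table(7)
print(multiply_with_table(t, 53))   # 50×7 + 3×7 = 371
```

Only 23 entries per principal number are needed to cover every multiplier up to 59, which is why tables were practical despite the base-60 system's many distinct products.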
In the mathematical text "Zhoubi Suanjing", dated prior to 300 BC, and the "Nine Chapters on the Mathematical Art", multiplication calculations were written out in words, although the early Chinese mathematicians employed Rod calculus involving place value addition, subtraction, multiplication and division. The Chinese were already using a decimal multiplication table since the Warring States period.
The modern method of multiplication based on the Hindu–Arabic numeral system was first described by Brahmagupta. Brahmagupta gave rules for addition, subtraction, multiplication and division. Henry Burchard Fine, then professor of Mathematics at Princeton University, wrote the following:
These place value decimal arithmetic algorithms were introduced to Arab countries by Al Khwarizmi in the early 9th century, and popularized in the Western world by Fibonacci in the 13th century.
Grid method multiplication, or the box method, is used in primary schools in England and Wales and in some areas of the United States to help teach an understanding of how multiple-digit multiplication works. An example of multiplying 34 by 13 would be to lay the numbers out in a grid like:
and then add the entries.
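The grid procedure can be sketched directly: split each number into its place-value parts, multiply every pair of parts to fill the grid, then add all the entries (illustrative helper names):

```python
def grid_multiply(a, b):
    """Grid/box method: split each number into place-value parts,
    multiply every pair of parts, then add all the partial products."""
    def parts(n):
        # e.g. 34 -> [4, 30]; 13 -> [3, 10]
        return [int(d) * 10 ** i for i, d in enumerate(reversed(str(n)))]
    grid = [[pa * pb for pb in parts(b)] for pa in parts(a)]
    return sum(sum(row) for row in grid)

# 34 x 13: the grid entries are 4*3, 4*10, 30*3, 30*10 = 12, 40, 90, 300
print(grid_multiply(34, 13))  # 442
```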
The classical method of multiplying two "n"-digit numbers requires "n"² digit multiplications. Multiplication algorithms have been designed that reduce the computation time considerably when multiplying large numbers. Methods based on the discrete Fourier transform reduce the computational complexity to "O"("n" log "n" log log "n"). Recently, the log log "n" factor has been replaced by a function that increases much more slowly, although it is still not constant (as one might hope).
In March 2019, David Harvey and Joris van der Hoeven submitted an article presenting an integer multiplication algorithm with a claimed complexity of "O"("n" log "n"). The algorithm, also based on the fast Fourier transform, is conjectured to be asymptotically optimal. The algorithm is not considered practically useful, as its advantages only appear when multiplying extremely large numbers (having an astronomically large number of bits).
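As a concrete example of a subquadratic method (much simpler than the Fourier-transform algorithms discussed above, and not the Harvey and van der Hoeven algorithm), Karatsuba's algorithm replaces four half-size multiplications with three, giving roughly "n"^1.585 digit operations. A minimal sketch for non-negative integers:

```python
def karatsuba(x, y):
    """Karatsuba multiplication of non-negative integers: three
    recursive half-size products instead of four."""
    if x < 10 or y < 10:
        return x * y               # base case: single-digit product
    m = max(len(str(x)), len(str(y))) // 2
    high_x, low_x = divmod(x, 10 ** m)
    high_y, low_y = divmod(y, 10 ** m)
    z0 = karatsuba(low_x, low_y)
    z2 = karatsuba(high_x, high_y)
    # the middle term reuses z0 and z2, saving one multiplication
    z1 = karatsuba(low_x + high_x, low_y + high_y) - z0 - z2
    return z2 * 10 ** (2 * m) + z1 * 10 ** m + z0

print(karatsuba(1234, 5678))  # 7006652
```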
One can only meaningfully add or subtract quantities of the same type, but quantities of different types can be multiplied or divided. Four bags with three marbles each can be thought of as: [4 bags] × [3 marbles per bag] = 12 marbles.
When two measurements are multiplied together the product is of a type depending on the types of the measurements. The general theory is given by dimensional analysis. This analysis is routinely applied in physics but has also found applications in finance.
A common example is that multiplying speed by time gives distance; for instance, 50 kilometers per hour × 3 hours = 150 kilometers.
In this case, the hour units cancel out, leaving only kilometer units.
Other examples:
The product of a sequence of factors can be written with the product symbol, which derives from the capital letter Π (pi) in the Greek alphabet. Unicode position U+220F (∏) contains a glyph for denoting such a product, distinct from U+03A0 (Π), the letter. The meaning of this notation is given by:
that is
The subscript gives the symbol for a bound variable ("i" in this case), called the "index of multiplication" together with its lower bound ("1"), whereas the superscript (here "4") gives its upper bound. The lower and upper bound are expressions denoting integers. The factors of the product are obtained by taking the expression following the product operator, with successive integer values substituted for the index of multiplication, starting from the lower bound and incremented by 1 up to and including the upper bound. So, for example:
More generally, the notation is defined as
where "m" and "n" are integers or expressions that evaluate to integers. In case "m" = "n", the value of the product is the same as that of the single factor "x""m". If "m" > "n", the product is an empty product, whose value is 1, regardless of the expression for the factors.
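The Π notation translates directly into a loop over the index of multiplication, including the empty-product convention (the helper name is illustrative):

```python
import math

def product(f, m, n):
    """Pi-notation product of f(i) for i from m to n inclusive;
    empty (and equal to 1) when n < m."""
    result = 1                      # the empty product is 1 by convention
    for i in range(m, n + 1):
        result *= f(i)
    return result

print(product(lambda i: i, 1, 4))   # 1*2*3*4 = 24
print(product(lambda i: i, 5, 4))   # empty product -> 1
print(math.prod(range(1, 5)))       # built-in equivalent, also 24
```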
One may also consider products of infinitely many terms; these are called infinite products. Notationally, we would replace "n" above by the lemniscate ∞. The product of such a series is defined as the limit of the product of the first "n" terms, as "n" grows without bound. That is, by definition,
One can similarly replace "m" with negative infinity, and define:
provided both limits exist.
For the real and complex numbers, which include, for example, the natural numbers, integers, and fractions, multiplication has certain properties:
Other mathematical systems that include a multiplication operation may not have all these properties. For example, multiplication is not, in general, commutative for matrices and quaternions.
In the book "Arithmetices principia, nova methodo exposita", Giuseppe Peano proposed axioms for arithmetic based on his axioms for natural numbers. Peano arithmetic has two axioms for multiplication:
Here "S"("y") represents the successor of "y", or the natural number that "follows" "y". The various properties like associativity can be proved from these and the other axioms of Peano arithmetic including induction. For instance "S"(0), denoted by 1, is a multiplicative identity because
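The two Peano axioms for multiplication ("x" × 0 = 0 and "x" × "S"("y") = "x" × "y" + "x") can be transcribed into a recursive sketch, using Python integers only as stand-ins for Peano naturals:

```python
def succ(y):
    """S(y): the successor of y."""
    return y + 1

def add(x, y):
    """Peano addition: x + 0 = x;  x + S(y) = S(x + y)."""
    return x if y == 0 else succ(add(x, y - 1))

def mul(x, y):
    """Peano multiplication: x * 0 = 0;  x * S(y) = x * y + x."""
    return 0 if y == 0 else add(mul(x, y - 1), x)

print(mul(3, 4))  # 12
print(mul(1, 9))  # S(0) acts as the multiplicative identity: 9
```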
The axioms for integers typically define them as equivalence classes of ordered pairs of natural numbers. The model is based on treating ("x","y") as equivalent to "x" − "y" when "x" and "y" are treated as integers. Thus both (0,1) and (1,2) are equivalent to −1. The multiplication axiom for integers defined this way is
The rule that −1 × −1 = 1 can then be deduced from
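The pair construction can be made concrete. The standard multiplication rule on pairs is ("x"₁,"y"₁) × ("x"₂,"y"₂) = ("x"₁"x"₂ + "y"₁"y"₂, "x"₁"y"₂ + "y"₁"x"₂); the following illustrative sketch checks that the representatives of −1 multiply to a representative of 1:

```python
def int_multiply(p, q):
    """Multiply integers represented as pairs of naturals, (x, y) ~ x - y."""
    (x1, y1), (x2, y2) = p, q
    return (x1 * x2 + y1 * y2, x1 * y2 + y1 * x2)

def value(p):
    """The integer a pair represents."""
    x, y = p
    return x - y

# (0,1) and (1,2) both represent -1; their product represents 1,
# which is the rule -1 * -1 = 1.
print(value(int_multiply((0, 1), (1, 2))))  # 1
```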
Multiplication is extended in a similar way to rational numbers and then to real numbers.
The product of non-negative integers can be defined with set theory using cardinal numbers or the Peano axioms. See below for how to extend this to multiplying arbitrary integers, and then arbitrary rational numbers. The product of real numbers is defined in terms of products of rational numbers; see construction of the real numbers.
There are many sets that, under the operation of multiplication, satisfy the axioms that define group structure. These axioms are closure, associativity, and the inclusion of an identity element and inverses.
A simple example is the set of non-zero rational numbers. Here we have identity 1, as opposed to groups under addition where the identity is typically 0. Note that with the rationals, we must exclude zero because, under multiplication, it does not have an inverse: there is no rational number that can be multiplied by zero to result in 1. In this example we have an abelian group, but that is not always the case.
To see this, look at the set of invertible square matrices of a given dimension, over a given field. Now it is straightforward to verify closure, associativity, and inclusion of identity (the identity matrix) and inverses. However, matrix multiplication is not commutative, therefore this group is nonabelian.
Another fact of note is that the integers under multiplication do not form a group, even if zero is excluded. This is easily seen by the nonexistence of an inverse for all elements other than 1 and −1.
Multiplication in group theory is typically notated either by a dot or by juxtaposition (the omission of an operation symbol between elements). So multiplying element "a" by element "b" could be notated "a" · "b" or "ab". When referring to a group via the indication of the set and operation, the dot is used; e.g., our first example could be indicated by (ℚ ∖ {0}, ·).
Numbers can "count" (3 apples), "order" (the 3rd apple), or "measure" (3.5 feet high); as the history of mathematics has progressed from counting on our fingers to modelling quantum mechanics, multiplication has been generalized to more complicated and abstract types of numbers, and to things that are not numbers (such as matrices) or do not look much like numbers (such as quaternions).
When multiplication is repeated, the resulting operation is known as exponentiation. For instance, the product of three factors of two (2×2×2) is "two raised to the third power", and is denoted by 2³, a two with a superscript three. In this example, the number two is the base, and three is the exponent. In general, the exponent (or superscript) indicates how many times the base appears in the expression, so that the expression
indicates that "n" copies of the base "a" are to be multiplied together. This notation can be used whenever multiplication is known to be power associative.
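Repeated multiplication can be written out directly (an illustrative sketch for "n" ≥ 1; Python's built-in `**` operator computes the same thing):

```python
def power(a, n):
    """Multiply n copies of the base a together (n >= 1)."""
    result = a
    for _ in range(n - 1):
        result *= a
    return result

print(power(2, 3))  # 2*2*2 = 8, i.e. "two raised to the third power"
print(2 ** 3)       # built-in exponentiation gives the same result
```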
|
https://en.wikipedia.org/wiki?curid=20845
|
Malt
Malt is germinated cereal grain that has been dried in a process known as "malting". The grain is made to germinate by soaking in water and is then halted from germinating further by drying with hot air.
Malting grain develops the enzymes (α-amylase, β-amylase) required for modifying the grains' starches into various types of sugar, including monosaccharide glucose, disaccharide maltose, trisaccharide maltotriose, and higher sugars called maltodextrines.
It also develops other enzymes, such as proteases, that break down the proteins in the grain into forms that can be used by yeast. The point at which the malting process is stopped affects the starch to enzyme ratio and partly converted starch becomes fermentable sugars.
Malt also contains small amounts of other sugars, such as sucrose and fructose, which are not products of starch modification but which are already in the grain. Further conversion to fermentable sugars is achieved during the mashing process.
Malted grain is used to make beer, whisky, malted milk, malt vinegar, confections such as Maltesers and Whoppers, flavored drinks such as Horlicks, Ovaltine, and Milo, and some baked goods, such as malt loaf, bagels, and rich tea biscuits. Malted grain that has been ground into a coarse meal is known as "sweet meal".
Various cereals are malted, though barley is the most common. A high-protein form of malted barley is often a label-listed ingredient in blended flours typically used in the manufacture of yeast breads and other baked goods.
The term "malt" (see Wiktionary entry ) refers to several products of the process: the grains to which this process has been applied, for example malted barley; the sugar, heavy in maltose, derived from such grains, such as the baker's malt used in various cereals; or a product based on malted milk, similar to a malted milkshake (i.e., "malts").
Malted grains have probably been used as an ingredient of beer since ancient times, for example in Egypt (Ancient Egyptian cuisine), Sumer, and China.
In Persian countries, a sweet paste made entirely from germinated wheat is called Samanū in Iran, Samanak, or Sümölök, and is prepared for Nowruz (the Persian new year celebration) in a large pot (like a kazan). A plate or bowl of Samanu is a traditional component of the Haft sin table, symbolising affluence. Traditionally, women hold a special party to prepare it during the night, cooking it from late in the evening until daylight while singing related songs. In Tajikistan and Afghanistan they sing: "Samanak dar Jūsh u mā Kafcha zanēm - Dīgarān dar Khwāb u mā Dafcha zanēm" (meaning: "Samanak is boiling and we are stirring it; others are asleep and we are playing daf"). In modern times, making samanu can be a family gathering. It originally comes from the Great Persian Empire.
Mämmi, or Easter Porridge, is a traditional Finnish Lenten food. Cooked from rye malt and flour, mämmi has a great resemblance (in recipe, color, and taste) to Samanū. Today, this product is available in shops from February until Easter. A (non-representative) survey in 2013 showed that almost no one cooks mämmi at home in modern-day Finland.
Malting is the process of converting barley or other cereal grains into malt for use in brewing, distilling, or in foods; it takes place in a maltings, sometimes called a malthouse, or on a malting floor. The cereal is spread out on the malting floor in a shallow layer.
A "maltings" is typically a long, single-story building with a floor that slopes slightly from one end of the building to the other. Floor maltings began to be phased out in the 1940s in favor of "pneumatic plants" where large industrial fans are used to blow air through the germinating grain beds and to pass hot air through the malt being kilned. Like floor maltings, these pneumatic plants use batch processes, but of considerably greater size, typically 100 ton batches compared with the 20 ton batches of floor maltings.
The largest malting operation in the world is Malteurop, which operates in 14 countries.
Barley is the most commonly malted grain, in part because of its high content of enzymes, though wheat, rye, oats, rice, and corn are also used. Also very important is the retention of the grain's husk, even after threshing, unlike the bare seeds of threshed wheat or rye. This protects the growing acrospire (developing plant embryo) from damage during malting, which can easily lead to mold growth; it also allows the mash of converted grain to create a filter bed during lautering (see brewing).
As all grains sprout, natural enzymes within the grain break down the starch the grain is composed of into simpler sugars which taste sweet and are easier for yeast to use as growth food. Malt with active enzymes is called "diastatic malt". Malt with inactive enzymes is called "nondiastatic malt". The enzymes are deactivated by heating the malt.
Malt is often divided into two categories by brewers: base malts and specialty malts.
Base malts have enough diastatic power to convert their own starch and usually that of some amount of starch from unmalted grain, called adjuncts.
Specialty malts have little diastatic power, but provide flavor, color, or "body" (viscosity) to the finished beer. Specialty caramel or crystal malts have been subjected to heat treatment to convert their starches to sugars nonenzymatically. Within these categories is a variety of types distinguished largely by the kilning temperature (see mash ingredients).
In addition, malts are distinguished by the two major cultivar types of barley used for malting, two-row and six-row. The most common varieties of barley used for malting in the United States from 2009 to 2013 are two-row AC Metcalfe and Conrad and six-row Tradition and Lacey cultivars.
Malt extract, also known as extract of malt, is a sweet, treacly substance used as a dietary supplement. It was popular in the first half of the 20th century as a nutritional enhancer for the children of the British urban working class, whose diet was often deficient in vitamins and minerals. Children were given cod liver oil for the same reason, but it proved so unpalatable that it was combined with extract of malt to produce "Malt and Cod-Liver Oil."
The 1907 "British Pharmaceutical Codex"'s instructions for making nutritional extract of malt do not include a mashout at the end of extraction, and include the use of lower mash temperatures than is typical with modern beer-brewing practices. The "Codex" indicates that diastatic activity is to be preserved by the use of temperatures not exceeding .
Malt extract is frequently used in the brewing of beer. Its production begins by germinating barley grain in a process known as "malting", immersing barley in water to encourage the grain to sprout, then drying the barley to halt the progress when the sprouting begins. The drying step stops the sprouting, but the enzymes remain active due to the low temperatures used in base malt production. In one before-and-after comparison, malting decreased barley's extractable starch content by about 7% on a dry matter basis and turned that portion into various other carbohydrates.
In the next step, brewers use a process called mashing to extract the sugars. Brewers warm cracked malt in temperature-modulated water, activating the enzymes, which cleave more of the malt's remaining starch into various sugars, the largest percentage of which is maltose. Modern beer mashing practices typically include high enough temperatures at mash-out to deactivate remaining enzymes, thus it is no longer diastatic. The liquid produced from this, wort, is then concentrated by using heat or a vacuum procedure to evaporate water from the mixture. The concentrated wort is called malt extract.
Liquid malt extract (LME) is a thick syrup and is used for a variety of purposes, such as baking and brewing. It is also sold in jars as a consumer product.
The LME may be further dried to produce dry malt extract (DME), which is crystalline in form similar to common sugar.
Brewers have the option of using a liquid (LME) or dry (DME) form of it. Each has its pros and cons, so the choice is dependent solely on the individual brewer's preferences. Some brewers choose to work only with LME because they feel it works best for the result they wish to achieve. Also, it requires one fewer processing step, so it is appealing to those favoring the purest form of product available. However, it is very sticky and, therefore, messier to work with and has a shorter shelf life. Some feel the results are just as good with DME.
A new encapsulating technology permits the production of malt granules. Malt granules are the dried liquid extract from malt used in the brewing or distilling process.
Scientists aim to discover what happens inside barley grains as they become malted to help plant breeders produce better malting barley for food and beverage products. The United States Agricultural Research Service scientists are interested in specialized enzymes called serine-class proteases that digest beta-amylases, which convert carbohydrates into "simple sugars" during the sprouting process. The enzyme also breaks down stored proteins into their amino acid derivatives. The balance of proteins and carbohydrates broken down by the enzyme affect the malt's flavor.
|
https://en.wikipedia.org/wiki?curid=20853
|
Masonry
Masonry is the building of structures from individual units, which are often laid in and bound together by mortar; the term "masonry" can also refer to the units themselves. The common materials of masonry construction are brick, building stone such as marble, granite, and limestone, cast stone, concrete block, glass block, and adobe. Masonry is generally a highly durable form of construction. However, the materials used, the quality of the mortar and workmanship, and the pattern in which the units are assembled can substantially affect the durability of the overall masonry construction. A person who constructs masonry is called a mason or bricklayer. These are both classified as construction trades.
Masonry is commonly used for walls and buildings. Brick and concrete block are the most common types of masonry in use in industrialized nations and may be either weight-bearing or a veneer. Concrete blocks, especially those with hollow cores, offer various possibilities in masonry construction. They generally provide great compressive strength and are best suited to structures with light transverse loading when the cores remain unfilled. Filling some or all of the cores with concrete or concrete with steel reinforcement (typically rebar) offers much greater tensile and lateral strength to structures. Masonry workers are typically paid hourly depending on their position.
Masonry has high compressive strength under vertical loads but has a low tensile strength (against twisting or stretching) unless reinforced. The tensile strength of masonry walls can be increased by thickening the wall, or by building masonry "piers" (vertical columns or ribs) at intervals. Where practical, steel reinforcements such as windposts can be added.
A masonry veneer wall consists of masonry units, usually clay-based bricks, installed on one or both sides of a structurally independent wall usually constructed of wood or masonry. In this context, the brick masonry is primarily decorative, not structural. The brick veneer is generally connected to the structural wall by brick ties (metal strips that are attached to the structural wall, as well as the mortar joints of the brick veneer). There is typically an air gap between the brick veneer and the structural wall. As clay-based brick is usually not completely waterproof, the structural wall will often have a water-resistant surface (usually tar paper) and weep holes can be left at the base of the brick veneer to drain moisture that accumulates inside the air gap. Concrete blocks, real and cultured stones, and veneer adobe are sometimes used in a very similar veneer fashion.
Most insulated buildings that utilize concrete block, brick, adobe, stone, veneers or some combination thereof feature interior insulation in the form of fiberglass batts between wooden wall studs or in the form of rigid insulation boards covered with plaster or drywall. In most climates this insulation is much more effective on the exterior of the wall, allowing the building interior to take advantage of the aforementioned thermal mass of the masonry. This technique does, however, require some sort of weather-resistant exterior surface over the insulation and, consequently, is generally more expensive.
The strength of a masonry wall is not entirely dependent on the bond between the building material and the mortar; the friction between the interlocking blocks of masonry is often strong enough to provide a great deal of strength on its own. The blocks sometimes have grooves or other surface features added to enhance this interlocking, and some "dry set" masonry structures forgo mortar altogether.
Solid brickwork is made of two or more wythes of bricks with the units running horizontally (called "stretcher" bricks) bound together with bricks running perpendicular to the wall (called "header" bricks). Each row of bricks is known as a course. The pattern of headers and stretchers employed gives rise to different 'bonds' such as the common bond (with every sixth course composed of headers), the English bond, and the Flemish bond (with alternating stretcher and header bricks present on every course). Bonds can differ in strength and in insulating ability. Vertically staggered bonds tend to be somewhat stronger and less prone to major cracking than a non-staggered bond.
The wide selection of brick styles and types generally available in industrialized nations allows much variety in the appearance of the final product. In buildings built during the 1950s-1970s, a high degree of uniformity of brick and accuracy in masonry was typical. In the period since then this style was thought to be too sterile, so attempts were made to emulate older, rougher work. Some brick surfaces are made to look particularly rustic by including "burnt" bricks, which have a darker color or an irregular shape. Others may use antique salvage bricks, or new bricks may be artificially aged by applying various surface treatments, such as tumbling. The attempts at rusticity of the late 20th century have been carried forward by masons specializing in a free, artistic style, where the courses are intentionally "not" straight, instead weaving to form more organic impressions.
A crinkle-crankle wall is a brick wall that follows a serpentine path, rather than a straight line. This type of wall is more resistant to toppling than a straight wall; so much so that it may be made of a single wythe of unreinforced brick and so despite its longer length may be more economical than a straight wall.
Blocks of cinder concrete ("cinder blocks" or "breezeblocks"), ordinary concrete ("concrete blocks"), or hollow tile are generically known as Concrete Masonry Units (CMUs). They usually are much larger than ordinary bricks and so are much faster to lay for a wall of a given size. Furthermore, cinder and concrete blocks typically have much lower water absorption rates than brick. They often are used as the structural core for veneered brick masonry or are used alone for the walls of factories, garages, and other industrial-style buildings where such appearance is acceptable or desirable. Such blocks often receive a stucco surface for decoration. Surface-bonding cement, which contains synthetic fibers for reinforcement, is sometimes used in this application and can impart extra strength to a block wall. Surface-bonding cement is often pre-colored and can be stained or painted thus resulting in a finished stucco-like surface.
The primary structural advantage of concrete blocks in comparison to smaller clay-based bricks is that a CMU wall can be reinforced by filling the block voids with concrete with or without steel rebar. Generally, certain voids are designated for filling and reinforcement, particularly at corners, wall-ends, and openings while other voids are left empty. This increases wall strength and stability more economically than filling and reinforcing all voids. Typically, structures made of CMUs will have the top course of blocks in the walls filled with concrete and tied together with steel reinforcement to form a bond beam. Bond beams are often a requirement of modern building codes and controls. Another type of steel reinforcement referred to as ladder-reinforcement, can also be embedded in horizontal mortar joints of concrete block walls. The introduction of steel reinforcement generally results in a CMU wall having much greater lateral and tensile strength than unreinforced walls.
"Architectural masonry is the evolvement of standard concrete masonry blocks into aesthetically pleasing concrete masonry units (CMUs)." CMUs can be manufactured to provide a variety of surface appearances. They can be colored during manufacturing or stained or painted after installation. They can be split as part of the manufacturing process, giving the blocks a rough face replicating the appearance of natural stone, such as brownstone. CMUs may also be scored, ribbed, sandblasted, polished, striated (raked or brushed), include decorative aggregates, be allowed to slump in a controlled fashion during curing, or include several of these techniques in their manufacture to provide a decorative appearance.
"Glazed concrete masonry units are manufactured by bonding a permanent colored facing (typically composed of polyester resins, silica sand and various other chemicals) to a concrete masonry unit, providing a smooth impervious surface."
Glass block or glass brick are blocks made from glass and provide a translucent to clear vision through the block.
Stone blocks used in masonry can be dressed or rough, though in both examples: corners, door and window jambs, and similar areas are usually dressed.
Stonemasonry utilizing dressed stones is known as ashlar masonry, whereas masonry using irregularly shaped stones is known as rubble masonry. Both rubble and ashlar masonry can be laid in coursed rows of even height through the careful selection or cutting of stones, but a great deal of stone masonry is uncoursed.
Gabions are baskets, usually now of zinc-protected steel (galvanized steel) that are filled with fractured stone of medium size. These will act as a single unit and are stacked with setbacks to form a revetment or retaining wall. They have the advantage of being well drained, flexible, and resistant to flood, water flow from above, frost damage, and soil flow. Their expected useful life is only as long as the wire they are composed of and if used in severe climates (such as shore-side in a salt water environment) must be made of appropriate corrosion-resistant wire. Most modern gabions are rectangular.
Earlier gabions were often cylindrical wicker baskets, open at both ends, used usually for temporary, often military, construction.
Similar work can be done with finer aggregates using cellular confinement.
Masonry walls have an endothermic effect from their hydrates: chemically bound water, unbound moisture from the concrete block, and the poured concrete if the hollow cores inside the blocks are filled. Masonry can withstand temperatures of up to 1,000 °F and can withstand direct exposure to fire for up to 4 hours. In addition, concrete masonry keeps fires contained to their room of origin 93% of the time. For those reasons, concrete masonry units hold the highest fire-class flame spread classification, Class A.
Masonry buildings can also be built to increase safety by reducing fire damage, such as the use of fire cuts during construction.
From the point of view of material modeling, masonry is a special material with extreme mechanical properties (a very high ratio between strength in compression and in tension), so that applied loads do not diffuse as they do in elastic bodies, but tend to percolate along lines of high stiffness.
|
https://en.wikipedia.org/wiki?curid=20857
|
Mickey Mouse
Mickey Mouse is a cartoon character and the mascot of The Walt Disney Company. He was created by Walt Disney and Ub Iwerks at the Walt Disney Studios in 1928. An anthropomorphic mouse who typically wears red shorts, large yellow shoes, and white gloves, Mickey is one of the world's most recognizable characters.
Created as a replacement for a prior Disney character, Oswald the Lucky Rabbit, Mickey first appeared in the short "Plane Crazy", debuting publicly in the short film "Steamboat Willie" (1928), one of the first sound cartoons. He went on to appear in over 130 films, including "The Band Concert" (1935), "Brave Little Tailor" (1938), and "Fantasia" (1940). Mickey appeared primarily in short films, but also occasionally in feature-length films. Ten of Mickey's cartoons were nominated for the Academy Award for Best Animated Short Film, one of which, "Lend a Paw", won the award in 1942. In 1978, Mickey became the first cartoon character to have a star on the Hollywood Walk of Fame.
Beginning in 1930, Mickey has also been featured extensively as a comic strip character. The "Mickey Mouse" comic strip, drawn primarily by Floyd Gottfredson, ran for 45 years. Mickey has also appeared in comic books such as Disney Italy's "Topolino", "MM – Mickey Mouse Mystery Magazine", and "Wizards of Mickey", and in television series such as "The Mickey Mouse Club" (1955–1996) and others. He also appears in other media such as video games as well as merchandising and is a meetable character at the Disney parks.
Mickey generally appears alongside his girlfriend Minnie Mouse, his pet dog Pluto, his friends Donald Duck and Goofy, and his nemesis Pete, among others (see Mickey Mouse universe). Though originally characterized as a cheeky lovable rogue, Mickey was rebranded over time as a nice guy, usually seen as an honest and bodacious hero. In 2009, Disney began to rebrand the character again by putting less emphasis on his friendly, well-meaning persona and reintroducing the more menacing and stubborn sides of his personality, beginning with the video game "Epic Mickey".
Mickey Mouse was created as a replacement for Oswald the Lucky Rabbit, an earlier cartoon character that was created by the Disney studio but owned by Universal Pictures. Charles Mintz served as a middleman producer between Disney and Universal through his company, Winkler Pictures, for the series of cartoons starring Oswald. Ongoing conflicts between Disney and Mintz and the revelation that several animators from the Disney studio would eventually leave to work for Mintz's company ultimately resulted in Disney cutting ties with Oswald. Among the few people who stayed at the Disney studio were animator Ub Iwerks, apprentice artist Les Clark, and Wilfred Jackson. On his train ride home from New York, Walt brainstormed ideas for a new cartoon character.
Mickey Mouse was conceived in secret while Disney produced the final Oswald cartoons he contractually owed Mintz. Disney asked Ub Iwerks to start drawing up new character ideas. Iwerks tried sketches of various animals, such as dogs and cats, but none of these appealed to Disney. A female cow and male horse were also rejected. They would later turn up as Clarabelle Cow and Horace Horsecollar. A male frog was also rejected. It would later show up in Iwerks' own "Flip the Frog" series. Walt Disney got the inspiration for Mickey Mouse from a tame mouse at his desk at Laugh-O-Gram Studio in Kansas City, Missouri. In 1925, Hugh Harman drew some sketches of mice around a photograph of Walt Disney. These inspired Ub Iwerks to create a new mouse character for Disney. "Mortimer Mouse" had been Disney's original name for the character before his wife, Lillian, convinced him to change it, and ultimately Mickey Mouse came to be. The actor Mickey Rooney claimed that, during his Mickey McGuire days, he met cartoonist Walt Disney at the Warner Brothers studio, and that Disney was inspired to name Mickey Mouse after him. This claim, however, has been debunked by Disney historian Jim Korkis, since at the time of Mickey Mouse's development, Disney Studios had been located on Hyperion Avenue for several years, and Walt Disney never kept an office or other working space at Warner Brothers, having no professional relationship with Warner Brothers.
Iwerks was the main animator for the first short that would star Mickey and reportedly spent six weeks working on it. In fact, Iwerks was the main animator for every Disney short released in 1928 and 1929. Hugh Harman and Rudolf Ising also assisted Disney during those years. They had already signed their contracts with Charles Mintz, but he was still in the process of forming his new studio and so for the time being they were still employed by Disney. This short would be the last they animated under this somewhat awkward situation.
Mickey was first seen in a test screening of the cartoon short "Plane Crazy", on May 15, 1928, but it failed to impress the audience and Walt could not find a distributor for the short. Walt went on to produce a second Mickey short, "The Gallopin' Gaucho", which was also not released for lack of a distributor.
"Steamboat Willie" was first released on November 18, 1928, in New York. It was co-directed by Walt Disney and Ub Iwerks. Iwerks again served as the head animator, assisted by Johnny Cannon, Les Clark, Wilfred Jackson and Dick Lundy. This short was intended as a parody of Buster Keaton's "Steamboat Bill, Jr.", first released on May 12 of the same year. Although it was the third Mickey cartoon produced, it was the first to find a distributor, and thus is considered by The Disney Company as Mickey's debut. "Willie" featured changes to Mickey's appearance (in particular, simplifying his eyes to large dots) that established his look for later cartoons and in numerous Walt Disney films.
The cartoon was not the first to feature a soundtrack connected to the action. Fleischer Studios, headed by brothers Dave and Max Fleischer, had already released a number of sound cartoons using the DeForest system in the mid-1920s. However, these cartoons did not keep the sound synchronized throughout the film. For "Willie", Disney had the sound recorded with a click track that kept the musicians on the beat. This precise timing is apparent during the "Turkey in the Straw" sequence when Mickey's actions exactly match the accompanying instruments. Animation historians have long debated who served as the composer for the film's original music. This role has been variously attributed to Wilfred Jackson, Carl Stalling and Bert Lewis, but identification remains uncertain. Walt Disney himself provided the voices of both Mickey and Minnie and would remain the source of Mickey's voice through 1946 for theatrical cartoons. Jimmy MacDonald took over the role in 1946, but Walt provided Mickey's voice again from 1955 to 1959 for "The Mickey Mouse Club" television series on ABC.
Audiences at the time of "Steamboat Willie"'s release were reportedly impressed by the use of sound for comedic purposes. Sound films or "talkies" were still considered innovative. The first feature-length movie with dialogue sequences, "The Jazz Singer" starring Al Jolson, was released on October 6, 1927. Within a year of its success, most United States movie theaters had installed sound film equipment. Walt Disney apparently intended to take advantage of this new trend and, arguably, managed to succeed. Most other cartoon studios were still producing silent products and so were unable to effectively act as competition to Disney. As a result, Mickey would soon become the most prominent animated character of the time. Walt Disney soon worked on adding sound to both "Plane Crazy" and "The Gallopin' Gaucho" (which had originally been silent releases) and their new release added to Mickey's success and popularity. A fourth Mickey short, "The Barn Dance", was also put into production; however, Mickey does not actually speak until "The Karnival Kid" in 1929 when his first spoken words were "Hot dogs, Hot dogs!" After "Steamboat Willie" was released, Mickey became a close competitor to Felix the Cat, and his popularity would grow as he was continuously featured in sound cartoons. By 1929, Felix would lose popularity among theater audiences, and Pat Sullivan decided to produce all future Felix cartoons in sound as a result. Unfortunately, audiences did not respond well to Felix's transition to sound and by 1930, Felix had faded from the screen.
In Mickey's early films he was often characterized not as a hero, but as an ineffective young suitor to Minnie Mouse. "The Barn Dance" (March 14, 1929) is the first time in which Mickey is turned down by Minnie in favor of Pete.
"The Opry House" (March 28, 1929) was the first time in which Mickey wore his white gloves. Mickey wears them in almost all of his subsequent appearances and many other characters followed suit. The three lines on the back of Mickey's gloves represent darts in the gloves' fabric extending from between the digits of the hand, typical of glove design of the era.
"When the Cat's Away" (April 18, 1929), essentially a remake of the "Alice Comedy", "Alice Rattled by Rats", was an unusual appearance for Mickey. Although Mickey and Minnie still maintained their anthropomorphic characteristics, they were depicted as the size of regular mice and living with a community of many other mice as pests in a home. Mickey and Minnie would later appear the size of regular humans in their own setting. In appearances with real humans, Mickey has been shown to be about two to three feet high. The next Mickey short was also unusual. "The Barnyard Battle" (April 25, 1929) was the only film to depict Mickey as a soldier and also the first to place him in combat. "The Karnival Kid" (1929) was the first time Mickey spoke. Before this he had only whistled, laughed, and grunted. His first words were "Hot dogs! Hot dogs!" said while trying to sell hot dogs at a carnival. "Mickey's Follies" (1929) introduced the song "Minnie's Yoo-Hoo" which would become the theme song for "Mickey Mouse" films for the next several years. The "Minnie's Yoo-Hoo" song sequence was also later reused with different background animation as its own special short shown only at the commencement of 1930s theater-based Mickey Mouse Clubs. Mickey's dog Pluto first appeared as Mickey's pet in "The Moose Hunt" (1931) after previously appearing as Minnie's dog "Rover" in "The Picnic" (1930).
"The Cactus Kid" (April 11, 1930) was the last film to be animated by Ub Iwerks at Disney. Shortly before the release of the film, Iwerks left to start his own studio, bankrolled by Disney's then-distributor Pat Powers. Powers and Disney had a falling out over money due Disney from the distribution deal. It was in response to losing the right to distribute Disney's cartoons that Powers made the deal with Iwerks, who had long harbored a desire to head his own studio. The departure is considered a turning point in Mickey's career, as well as that of Walt Disney. Walt lost the man who served as his closest colleague and confidant since 1919. Mickey lost the man responsible for his original design and for the direction or animation of several of the shorts released till this point. Advertising for the early Mickey Mouse cartoons credited them as "A Walt Disney Comic, drawn by Ub Iwerks". Later Disney Company reissues of the early cartoons tend to credit Walt Disney alone.
Disney and his remaining staff continued the production of the Mickey series, and he was eventually able to find a number of animators to replace Iwerks. As the Great Depression progressed and Felix the Cat faded from the movie screen, Mickey's popularity would rise, and by 1932 The Mickey Mouse Club would have one million members. At the 5th Academy Awards in 1932, Mickey received his first Academy Award nomination, for "Mickey's Orphans" (1931). Walt Disney also received an honorary Academy Award for the creation of Mickey Mouse. Despite being eclipsed by the "Silly Symphony" short "Three Little Pigs" in 1933, Mickey maintained great popularity among theater audiences until 1935, when polls showed that Popeye was more popular than Mickey. By 1934, Mickey merchandise was earning $600,000 a year. In 1935, Disney began to phase out the Mickey Mouse Clubs due to administration problems.
About this time, story artists at Disney were finding it increasingly difficult to write material for Mickey. As he had developed into a role model for children, they were limited in the types of gags they could make. This led to Mickey taking more of a secondary role in some of his next films allowing for more emphasis on other characters. In "Orphan's Benefit" (August 11, 1934) Mickey first appeared with Donald Duck who had been introduced earlier that year in the "Silly Symphony" series. The tempestuous duck would provide Disney with seemingly endless story ideas and would remain a recurring character in Mickey's cartoons.
Mickey first appeared animated in color in "Parade of the Award Nominees" in 1932; however, the film strip was created for the 5th Academy Awards ceremony and was not released to the public. Mickey's official first color film came in 1935 with "The Band Concert", which was produced in Technicolor. Here Mickey conducts the "William Tell Overture", but the band is swept up by a tornado. It is said that conductor Arturo Toscanini so loved this short that, upon first seeing it, he asked the projectionist to run it again. In 1994, "The Band Concert" was voted the third-greatest cartoon of all time in a poll of animation professionals. By colorizing and partially redesigning Mickey, Walt put Mickey back on top once again, and Mickey reached a popularity he had never attained before as audiences found him more appealing. Also in 1935, Walt received a special award from the League of Nations for creating Mickey.
However, by 1938, the more manic Donald Duck would surpass the passive Mickey, resulting in a redesign of the mouse between 1938 and 1940 that put Mickey at the peak of his popularity. The second half of the 1930s saw the character Goofy reintroduced as a series regular. Together, Mickey, Donald Duck, and Goofy would go on several adventures together. Several of the comic trio's films are among Mickey's most critically acclaimed, including "Mickey's Fire Brigade" (1935), "Moose Hunters" (1937), "Clock Cleaners" (1937), "Lonesome Ghosts" (1937), "Boat Builders" (1938), and "Mickey's Trailer" (1938). Also during this era, Mickey would star in "Brave Little Tailor" (1938), an adaptation of "The Valiant Little Tailor", which was nominated for an Academy Award.
Mickey was redesigned by animator Fred Moore, and the new look was first seen in "The Pointer" (1939). Instead of having solid black eyes, Mickey was given white eyes with pupils, a Caucasian-toned face, and a pear-shaped body. In the 1940s, he changed once more in "The Little Whirlwind", where he used his trademark pants for the last time in decades, lost his tail, and got more realistic ears that changed with perspective and a different body anatomy. But this change would only last for a short period before he returned to the design from "The Pointer", with the exception of his pants. In his final theatrical cartoons in the 1950s, he was given eyebrows, which were removed in the more recent cartoons.
In 1940, Mickey appeared in his first feature-length film, "Fantasia". His screen role as The Sorcerer's Apprentice, set to the symphonic poem of the same name by Paul Dukas, is perhaps the most famous segment of the film and one of Mickey's most iconic roles. The segment features no dialogue at all, only the music. The apprentice (Mickey), not willing to do his chores, puts on the sorcerer's magic hat after the sorcerer goes to bed and casts a spell on a broom, which causes the broom to come to life and perform the most tiring chore—filling up a deep well using two buckets of water. When the well eventually overflows, Mickey finds himself unable to control the broom, leading to a near-flood. After the segment ends, Mickey is seen in silhouette shaking hands with Leopold Stokowski, who conducts all the music heard in "Fantasia". Mickey has often been pictured in the red robe and blue sorcerer's hat in merchandising. The outfit was also featured in the climax of Fantasmic!, an attraction at the Disney theme parks.
After 1940, Mickey's popularity would decline until his 1955 re-emergence as a daily children's television personality. Despite this, the character continued to appear regularly in animated shorts until 1943 (winning his only competitive Academy Award—with canine companion Pluto—for a short subject, "Lend a Paw") and again from 1946 to 1952.
The last regular installment of the "Mickey Mouse" film series came in 1953 with "The Simple Things" in which Mickey and Pluto go fishing and are pestered by a flock of seagulls.
In the 1950s, Mickey became more known for his appearances on television, particularly with "The Mickey Mouse Club". Many of his theatrical cartoon shorts were rereleased on television series such as "Ink & Paint Club", various forms of the Walt Disney anthology television series, and on home video. Mickey returned to theatrical animation in 1983 with "Mickey's Christmas Carol", an adaptation of Charles Dickens' "A Christmas Carol" in which Mickey played Bob Cratchit. This was followed up in 1990 with "The Prince and the Pauper".
Throughout the decades, Mickey Mouse competed with Warner Bros.' Bugs Bunny for animated popularity. But in 1988, the two rivals finally shared screen time in the Robert Zemeckis Disney/Amblin film "Who Framed Roger Rabbit". Disney and Warner signed an agreement stating that each character had the same amount of screen time in the scene.
Similar to his animated inclusion into a live-action film on "Roger Rabbit", Mickey made a featured cameo appearance in the 1990 television special "The Muppets at Walt Disney World" where he met Kermit the Frog. The two are established in the story as having been old friends. The Muppets have otherwise spoofed and referenced Mickey over a dozen times since the 1970s. Eventually, The Muppets were purchased by the Walt Disney Company in 2004.
His most recent theatrical cartoon short was 2013's "Get a Horse!" which was preceded by 1995's "Runaway Brain", while from 1999 to 2004, he appeared in direct-to-video features like "Mickey's Once Upon a Christmas", "" and the computer-animated "Mickey's Twice Upon a Christmas".
Many television series have centered on Mickey, such as the ABC shows "Mickey Mouse Works" (1999–2000), "Disney's House of Mouse" (2001–2003), Disney Channel's "Mickey Mouse Clubhouse" (2006–2016), and "Mickey and the Roadster Racers" (2017–). Prior to all these, Mickey was also featured as an unseen character in the "Bonkers" episode "You Oughta Be In Toons".
Mickey has recently been announced to star in two films. One is being based on the Magic Kingdom theme park at the Walt Disney World Resort, while the other is a film idea pitched by Walt Disney Animation Studios veteran Burny Mattinson centering on Mickey, Donald, and Goofy.
Since June 28, 2013, Disney Channel has been airing new 3-minute "Mickey Mouse" shorts, with animator Paul Rudish at the helm, incorporating elements of Mickey's late twenties-early thirties look with a contemporary twist. The creative team behind the 2017 "DuckTales" reboot had hoped to have Mickey Mouse in the series, but this idea was rejected by Disney executives. However, this didn't stop them from including, in the season two finale, a watermelon shaped like Mickey Mouse that Donald Duck made and used as a ventriloquist dummy while stranded on a deserted island, even perfectly replicating Mickey's voice (supplied by Chris Diamantopoulos).
In August 2018, ABC television announced a two-hour prime time special, "Mickey's 90th Spectacular", in honor of Mickey's 90th birthday. The program featured never-before-seen short videos and several celebrities sharing their memories of Mickey Mouse and performing Disney songs in his honor. The show was taped at the Shrine Auditorium in Los Angeles, produced and directed by Don Mischer, and broadcast on November 4, 2018. On November 18, 2018, a 90th anniversary event for the character was celebrated around the world. In December 2019, both Mickey and Minnie served as special co-hosts of "Wheel of Fortune" for two weeks while Vanna White served as the main host during Pat Sajak's absence.
Mickey first appeared in comics after he had appeared in 15 commercially successful animated shorts and was easily recognized by the public. Walt Disney was approached by King Features Syndicate with an offer to license Mickey and his supporting characters for use in a comic strip. Disney accepted, and "Mickey Mouse" made its first appearance on January 13, 1930. The plot was credited to Disney himself, the art to Ub Iwerks, and the inking to Win Smith. The first week or so of the strip featured a loose adaptation of "Plane Crazy". Minnie soon became the first addition to the cast. The strips first released between January 13, 1930, and March 31, 1930, have occasionally been reprinted in comic book form under the collective title "Lost on a Desert Island". Animation historian Jim Korkis notes, "After the eighteenth strip, Iwerks left and his inker, Win Smith, continued drawing the gag-a-day format..."
In early 1930, after Iwerks' departure, Disney was at first content to continue scripting the Mickey Mouse comic strip, assigning the art to Win Smith. However, Disney's focus had always been on animation, and Smith was soon assigned the scripting as well. Smith was apparently discontented at the prospect of having to script, draw, and ink a series by himself, as evidenced by his sudden resignation.
Disney then searched for a replacement among the remaining staff of the Studio. He selected Floyd Gottfredson, a recently hired employee. At the time Gottfredson was reportedly eager to work in animation and somewhat reluctant to accept his new assignment. Disney had to assure him the assignment was only temporary and that he would eventually return to animation. Gottfredson accepted and ended up holding this "temporary" assignment from May 5, 1930, to November 15, 1975.
Walt Disney's last script for the strip appeared May 17, 1930. Gottfredson's first task was to finish the storyline Disney had started on April 1, 1930. The storyline was completed on September 20, 1930, and later reprinted in comic book form as "Mickey Mouse in Death Valley". This early adventure expanded the cast of the strip which to this point only included Mickey and Minnie. Among the characters who had their first comic strip appearances in this story were Clarabelle Cow, Horace Horsecollar, and Black Pete as well as the debuts of corrupted lawyer Sylvester Shyster and Minnie's uncle Mortimer Mouse. The Death Valley narrative was followed by "Mr. Slicker and the Egg Robbers", first printed between September 22 and December 26, 1930, which introduced Marcus Mouse and his wife as Minnie's parents.
Starting with these two early comic strip stories, Mickey's versions in animation and comics are considered to have diverged from each other. While Disney and his cartoon shorts would continue to focus on comedy, the comic strip effectively combined comedy and adventure. This adventurous version of Mickey would continue to appear in comic strips and later comic books throughout the 20th and into the 21st century.
Floyd Gottfredson left his mark with stories such as "Mickey Mouse Joins the Foreign Legion" (1936) and "The Gleam" (1942). He also created the Phantom Blot, Eega Beeva, Morty and Ferdie, Captain Churchmouse, and Butch. Besides Gottfredson, artists for the strip over the years included Roman Arambula, Rick Hoover, Manuel Gonzales, Carson Van Osten, Jim Engel, Bill Wright, Ted Thwailes and Daan Jippes; writers included Ted Osborne, Merrill De Maris, Bill Walsh, Dick Shaw, Roy Williams, Del Connell, and Floyd Norman.
The next artist to leave his mark on the character was Paul Murry in Dell Comics. His first Mickey tale appeared in 1950, but Mickey did not become Murry's specialty until his first serial for "Walt Disney's Comics and Stories" in 1953 ("The Last Resort"). In the same period, Romano Scarpa in Italy, working for the magazine "Topolino", began to revitalize Mickey in stories that brought back the Phantom Blot and Eega Beeva along with new creations such as the Atomo Bleep-Bleep. While the stories at Western Publishing during the Silver Age emphasized Mickey as a detective in the style of Sherlock Holmes, in the modern era several editors and creators have consciously undertaken to depict a more vigorous Mickey in the mold of the classic Gottfredson adventures. This renaissance has been spearheaded by Byron Erickson, David Gerstein, Noel Van Horn, Michael T. Gilbert and César Ferioli.
In Europe, Mickey Mouse became the main attraction of a number of comics magazines, the most famous being "Topolino" in Italy from 1932 onward, "Le Journal de Mickey" in France from 1934 onward, "Don Miki" in Spain and the Greek "Miky Maous".
Mickey was the main character of the series "MM Mickey Mouse Mystery Magazine", published in Italy from 1999 to 2001.
In 2006, he appeared in the Italian fantasy comic saga "Wizards of Mickey".
In 1958, Mickey Mouse was introduced to the Arab world through a comic book called "Sameer". He became very popular in Egypt and got a comic book bearing his name. Mickey's comics in Egypt were licensed by Disney and published from 1959 by "Dar Al-Hilal", and they were a success; however, Dar Al-Hilal stopped publication in 2003 because of problems with Disney. The comics were re-released by "Nahdat Masr" in 2004, and the first issues sold out in less than eight hours.
Throughout the earlier years, Mickey's design bore heavy resemblance to Oswald, save for the ears, nose, and tail. Ub Iwerks designed Mickey's body out of circles in order to make the character simple to animate. Disney employees John Hench and Marc Davis believed that this design was part of Mickey's success as it made him more dynamic and appealing to audiences.
Mickey's circular design is most noticeable in his ears. In animation in the 1940s, Mickey's ears were animated in a more realistic perspective. Later, they were drawn to always appear circular no matter which way Mickey was facing. This made Mickey easily recognizable to audiences and made his ears an unofficial personal trademark. The circular rule later created a dilemma for toy creators who had to recreate a three-dimensional Mickey.
In 1938, animator Fred Moore redesigned Mickey's body away from its circular design to a pear-shaped design. Colleague Ward Kimball praised Moore for being the first animator to break from Mickey's "rubber hose, round circle" design. Although Moore himself was nervous at first about changing Mickey, Walt Disney liked the new design and told Moore "that's the way I want Mickey to be drawn from now on."
Each of Mickey's hands has only three fingers and a thumb. Disney said that this was both an artistic and financial decision, explaining "Artistically five digits are too many for a mouse. His hand would look like a bunch of bananas. Financially, not having an extra finger in each of 45,000 drawings that make up a six and one-half minute short has saved the Studio millions." In the film "The Opry House" (1929), Mickey was first given white gloves as a way of contrasting his naturally black hands against his black body. The use of white gloves would prove to be an influential design for cartoon characters, particularly with later Disney characters, but also with non-Disney characters such as Bugs Bunny, Woody Woodpecker, Mighty Mouse, and Mario.
Mickey's eyes, as drawn in "Plane Crazy" and "The Gallopin' Gaucho", were large and white with black outlines. In "Steamboat Willie," the bottom portion of the black outlines was removed, although the upper edges still contrasted with his head. Mickey's eyes were later re-imagined as only consisting of the small black dots which were originally his pupils, while what were the upper edges of his eyes became a hairline. This is evident only when Mickey blinks. Fred Moore later redesigned the eyes to be small white eyes with pupils and gave his face a Caucasian skin tone instead of plain white. This new Mickey first appeared in 1938 on the cover of a party program, and in animation the following year with the release of "The Pointer". Mickey is sometimes given eyebrows as seen in "The Simple Things" (1953) and in the comic strip, although he does not have eyebrows in his subsequent appearances.
Some aspects of Mickey's early appearance, particularly the gloves and facial characteristics, evolved from blackface caricatures used in minstrel shows.
Besides Mickey's gloves and shoes, he typically wears only a pair of shorts with two large buttons in the front. Before Mickey was seen regularly in color animation, Mickey's shorts were either red or a dull blue-green. With the advent of Mickey's color films, the shorts were always red. When Mickey is not wearing his red shorts, he is often still wearing red clothing such as a red bandmaster coat ("The Band Concert", "The Mickey Mouse Club"), red overalls ("Clock Cleaners", "Boat Builders"), a red cloak ("Fantasia", "Fun and Fancy Free"), a red coat ("Squatter's Rights", "Mickey's Christmas Carol"), or a red shirt ("Mickey Down Under", "The Simple Things").
A large part of Mickey's screen persona is his famously shy, falsetto voice. From 1928 onward, Mickey was voiced by Walt Disney himself, a task in which Disney took great personal pride. Composer Carl W. Stalling was the very first person to provide lines for Mickey in the 1929 short "The Karnival Kid", and J. Donald Wilson and Joe Twerp provided the voice in some 1938 broadcasts of "The Mickey Mouse Theater of the Air", although Disney remained Mickey's official voice during this period. However, by 1946, Disney was becoming too busy with running the studio to do regular voice work which meant he could not do Mickey's voice on a regular basis anymore. It is also speculated that his cigarette habit had damaged his voice over the years. After recording the "Mickey and the Beanstalk" section of "Fun and Fancy Free", Mickey's voice was handed over to veteran Disney musician and actor Jimmy MacDonald. Walt would reprise Mickey's voice occasionally until his passing in 1966, such as in the introductions to the original 1955–1959 run of "The Mickey Mouse Club" TV series, the "Fourth Anniversary Show" episode of the Walt Disney's Disneyland TV series that aired on September 11, 1957 and the "Disneyland USA at Radio City Music Hall" show from 1962.
MacDonald voiced Mickey in most of the remaining theatrical shorts and for various television and publicity projects up until his retirement in 1977. However, other actors would occasionally play the role during this era. Clarence Nash, the voice of Donald Duck, provided the voice in some of Mickey's later theatrical shorts, such as "R'coon Dawg" and "Pluto's Party". Stan Freberg voiced Mickey in the Freberg-produced record "Mickey Mouse's Birthday Party". Alan Young voiced Mickey in the Disneyland record album "An Adaptation of Dickens' Christmas Carol, Performed by The Walt Disney Players" in 1974.
The 1983 short film "Mickey's Christmas Carol" marked the theatrical debut of Wayne Allwine as Mickey Mouse, who was the official voice of Mickey from 1977 until his death in 2009. Allwine once recounted something MacDonald had told him about voicing Mickey: "The main piece of advice that Jim gave me about Mickey helped me keep things in perspective. He said, 'Just remember kid, you're only filling in for the boss.' And that's the way he treated doing Mickey for years and years. From Walt, and now from Jimmy." In 1991, Allwine married Russi Taylor, the voice of Minnie Mouse from 1986 until her death in 2019.
Les Perkins did the voice of Mickey in two TV specials, "Down and Out with Donald Duck" and "DTV Valentine", in the mid-1980s. Peter Renaday voiced Mickey in the 1980s Disney albums "Yankee Doodle Mickey" and "Mickey Mouse Splashdance". He also provided his voice for "The Talking Mickey Mouse" toy in 1986. Quinton Flynn briefly filled in for Allwine as the voice of Mickey in a few episodes of the first season of "Mickey Mouse Works" whenever Allwine was unavailable to record.
Bret Iwan, a former Hallmark greeting card artist, is the current voice of Mickey. Iwan was originally cast as an understudy for Allwine due to the latter's declining health; however, Allwine died before Iwan could meet him and Iwan became the new official voice of the character. Iwan's early recordings in 2009 included work for the Disney Cruise Line, Mickey toys, the Disney theme parks and the Disney on Ice: Celebrations! ice show. He directly replaced Allwine as Mickey for the "Kingdom Hearts" video game series and the TV series "Mickey Mouse Clubhouse". His first video game voice-over of Mickey Mouse can be heard in "". Iwan also became the first voice actor to portray Mickey during Disney's rebranding of the character, providing the vocal effects of Mickey in "Epic Mickey" as well as his voice in "" and the remake of "Castle of Illusion".
Despite Iwan being Mickey's primary voice actor, the character's voice is provided by Chris Diamantopoulos in the 2013 animated series and the 2017 "DuckTales" reboot (in the form of a watermelon that Donald uses as a ventriloquist dummy) as the producers were looking for a voice closer to Walt Disney's portrayal of the character in order to match the vintage look of that series.
Since his early years, Mickey Mouse has been licensed by Disney to appear on many different kinds of merchandise. Mickey was produced as plush toys and figurines, and Mickey's image has graced almost everything from T-shirts to lunchboxes. Largely responsible for Disney merchandising in the 1930s was Kay Kamen (1892–1949) who was called a "stickler for quality." Kamen was recognized by The Walt Disney Company as having a significant part in Mickey's rise to stardom and was named a Disney Legend in 1998. At the time of his 80th-anniversary celebration in 2008, "Time" declared Mickey Mouse one of the world's most recognized characters, even when compared against Santa Claus. Disney officials have stated that 98% of children aged 3–11 around the world are at least aware of the character.
As the official Walt Disney mascot, Mickey has played a central role in the Disney parks since the opening of Disneyland in 1955. As with other characters, Mickey is often portrayed by a non-speaking costumed actor. In this form, he has participated in ceremonies and countless parades. A popular activity with guests is getting to meet and pose for photographs with the mouse. As of the presidency of Barack Obama (who jokingly referred to him as "a world leader who has bigger ears than me") Mickey has met every U.S. President since Harry Truman, with the exception of Lyndon B. Johnson.
Mickey also features in several specific attractions at the Disney parks. Mickey's Toontown (Disneyland and Tokyo Disneyland) is a themed land which is a recreation of Mickey's neighborhood. Buildings are built in a cartoon style and guests can visit Mickey or Minnie's houses, Donald Duck's boat, or Goofy's garage. This is a common place to meet the characters.
Mickey's PhilharMagic (Magic Kingdom, Tokyo Disneyland, Hong Kong Disneyland) is a 4D film which features Mickey in the familiar role of symphony conductor. At Main Street Cinema several of Mickey's short films are shown on a rotating basis; the sixth film is always "Steamboat Willie". Mickey plays a central role in "Fantasmic!" (Disneyland Resort, Disney's Hollywood Studios) a live nighttime show which famously features Mickey in his role as the Sorcerer's Apprentice. Mickey was also a central character in the now-defunct Mickey Mouse Revue (Magic Kingdom, Tokyo Disneyland) which was an indoor show featuring animatronic characters. Mickey's face currently graces the Mickey's Fun Wheel at Disney California Adventure Park, where a figure of him also stands on top of Silly Symphony Swings.
In addition to Mickey's overt presence in the parks, numerous images of him are also subtly included in sometimes unexpected places. This phenomenon is known as "Hidden Mickey", involving hidden images in Disney films, theme parks, and merchandise.
Like many popular characters, Mickey has starred in many video games, including "Mickey Mousecapade" on the Nintendo Entertainment System, "", "Mickey's Ultimate Challenge", and "Disney's Magical Quest" on the Super Nintendo Entertainment System, "Castle of Illusion Starring Mickey Mouse" on the Mega Drive/Genesis, "" on the Game Boy, and many others. In the 2000s, the "Disney's Magical Quest" series was ported to the Game Boy Advance, while Mickey made his sixth-generation debut in "Disney's Magical Mirror Starring Mickey Mouse", a Nintendo GameCube title aimed at younger audiences. Mickey plays a major role in the "Kingdom Hearts" series, as the king of Disney Castle and aide to the protagonist, Sora. King Mickey wields the Keyblade, a weapon in the form of a key that has the power to open any lock and combat darkness. "Epic Mickey", featuring a darker version of the Disney universe, was released in 2010 for the Wii. The game is part of an effort by The Walt Disney Company to re-brand the Mickey Mouse character by moving away from his current squeaky clean image and reintroducing the mischievous side of his personality.
Mickey was most famously featured on wristwatches and alarm clocks, typically utilizing his hands as the actual hands on the face of the clock. The first Mickey Mouse watches were manufactured in 1933 by the Ingersoll Watch Company. The seconds were indicated by a turning disk below Mickey. The first Mickey watch was sold at the Century of Progress exposition in Chicago in 1933 for $3.75. Mickey Mouse watches have been sold by other companies and designers throughout the years, including Timex, Elgin, Helbros, Bradley, Lorus, and Gérald Genta. The fictional character Robert Langdon from Dan Brown's novels was said to wear a Mickey Mouse watch as a reminder "to stay young at heart."
In 1989, Milton Bradley released the electronic talking game titled "Mickey Says", with three modes featuring Mickey Mouse as its host. Mickey also appeared in other toys and games, including "The Talking Mickey Mouse", released by Worlds of Wonder.
Fisher-Price has recently produced a line of talking animatronic Mickey dolls including "Dance Star Mickey" (2010) and "Rock Star Mickey" (2011).
In total, approximately 40% of Disney's revenues for consumer products are derived from Mickey Mouse merchandise, with revenues peaking in 1997.
In the United States, protest votes are often made in order to indicate dissatisfaction with the slate of candidates presented on a particular ballot or to highlight the inadequacies of a particular voting procedure. Since most states' electoral systems do not provide for blank balloting or a choice of "None of the Above", most protest votes take the form of a clearly non-serious candidate's name entered as a write-in vote. Mickey Mouse is often selected for this purpose. As an election supervisor in Georgia observed, "If [Mickey Mouse] doesn’t get votes in our election, it’s a bad election." The earliest known mention of Mickey Mouse as a write-in candidate dates back to the 1932 New York City mayoral elections.
Mickey Mouse's name has also been known to appear fraudulently on voter registration lists, such as in the 2008 U.S. Presidential Election.
"Mickey Mouse" is a slang expression meaning small-time, amateurish or trivial. In the United Kingdom and Ireland, it also means poor quality or counterfeit, while in parts of Australia it can mean excellent or very good (rhyming slang for "grouse").
Mickey Mouse's global fame has made him both a symbol of The Walt Disney Company and of the United States itself. For this reason, Mickey has been used frequently in anti-American satire, such as the infamous underground cartoon "Mickey Mouse in Vietnam" (1969). There have been numerous parodies of Mickey Mouse, such as the 2-page parody "Mickey Rodent" by Will Elder (published in "Mad" #19, 1955) in which the mouse walks around unshaven and jails Donald Duck out of jealousy over the duck's larger popularity. The grotesque Rat Fink character was created by Ed "Big Daddy" Roth over his hatred of Mickey Mouse. In "The Simpsons Movie", Bart Simpson puts a black bra on his head to mimic Mickey Mouse and says: "I'm the mascot of an evil corporation!" On the Comedy Central series "South Park", Mickey is depicted as the sadistic, greedy, foul-mouthed boss of The Walt Disney Company, only interested in money. He also appears briefly with Donald Duck in the comic "Squeak the Mouse" by the Italian cartoonist Massimo Mattioli. Horst Rosenthal created a comic book, "Mickey au Camp de Gurs" ("Mickey Mouse in the Gurs Internment Camp") while detained in the Gurs internment camp during the Second World War; he added "Publié Sans Autorisation de Walt Disney" ("Published without Walt Disney's Permission") to the front cover.
In the 1969 parody novel "Bored of the Rings", Mickey Mouse is satirized as Dickey Dragon.
In the fifth episode of the Japanese anime, "Pop Team Epic", Popuko, one of the main characters, attempts an impression of Mickey, but does so poorly.
Like all major Disney characters, Mickey Mouse is not only copyrighted but also trademarked; trademark protection lasts in perpetuity as long as the mark continues to be used commercially by its owner. So, whether or not a particular Disney cartoon goes into the public domain, the characters themselves may not be used as trademarks without authorization.
Because of the Copyright Term Extension Act of the United States (sometimes called the 'Mickey Mouse Protection Act' because of extensive lobbying by the Disney corporation) and similar legislation within the European Union and other jurisdictions where copyright terms have been extended, works such as the early Mickey Mouse cartoons will remain under copyright until at least 2023. However, some copyright scholars argue that Disney's copyright on the earliest version of the character may be invalid due to ambiguity in the copyright notice for "Steamboat Willie".
The Walt Disney Company has become well known for zealously protecting its trademark on the Mickey Mouse character, whose likeness is closely associated with the company. In 1989, Disney threatened legal action against three daycare centers in the Orlando, Florida region (where Walt Disney World is a dominant employer) for having Mickey Mouse and other Disney characters painted on their walls. The characters were removed, and the newly opened rival Universal Studios Florida allowed the centers to use its own cartoon characters, with its blessing, to build community goodwill.
In 1971, a group of underground cartoonists calling themselves the Air Pirates, after a group of villains from early Mickey Mouse films, produced a comic called "Air Pirates Funnies". In the first issue, cartoonist Dan O'Neill depicted Mickey and Minnie Mouse engaging in explicit sexual behavior and consuming drugs. As O'Neill explained, "The air pirates were...some sort of bizarre concept to steal the air, pirate the air, steal the media...Since we were cartoonists, the logical thing was Disney." Rather than change the appearance or name of the character, which O'Neill felt would dilute the parody, the mouse depicted in "Air Pirates Funnies" looks like and is named "Mickey Mouse". Disney sued for copyright infringement, and after a series of appeals, O'Neill eventually lost and was ordered to pay Disney $1.9 million. The outcome of the case remains controversial among free-speech advocates. New York Law School professor Edward Samuels said, "[The Air Pirates] set parody back twenty years."
There have been multiple attempts to argue that certain versions of Mickey Mouse are in fact in the public domain. In the 1980s, archivist George S. Brown attempted to recreate and sell cels from the 1933 short "The Mad Doctor", on the theory that they were in the public domain because Disney had failed to renew the copyright as required by current law. However, Disney successfully sued Brown to prevent such sale, arguing that the lapse in copyright for "The Mad Doctor" did not put Mickey Mouse in the public domain because of the copyright in the earlier films. Brown attempted to appeal, noting imperfections in the earlier copyright claims, but the court dismissed his argument as untimely.
In 1999, Lauren Vanpelt, a law student at Arizona State University, wrote a paper making a similar argument. Vanpelt points out that copyright law at the time required that a copyright notice specify the year of the copyright and the copyright owner's name. The title cards to the early Mickey Mouse films "Steamboat Willie", "Plane Crazy", and "Gallopin' Gaucho" do not clearly identify the copyright owner, and also misidentify the copyright year. However, Vanpelt notes that copyright notices in other early films may have been done correctly, which could make Mickey Mouse "protected as a component part of the larger copyrighted films".
A 2003 article by Douglas A. Hedenkamp in the "Virginia Sports and Entertainment Law Journal" analyzed Vanpelt's arguments and concluded that she was likely correct. Hedenkamp provided additional arguments, and identified some errors in Vanpelt's paper, but still found that due to imperfections in the copyright notice on the title cards, Walt Disney forfeited his copyright in Mickey Mouse. He concluded: "The forfeiture occurred at the moment of publication, and the law of that time was clear: publication without proper notice irrevocably forfeited copyright protection."
Disney threatened to sue Hedenkamp for slander of title, but did not follow through. The claims in Vanpelt and Hedenkamp's articles have not been tested in court.
In 1930, the German Board of Film Censors prohibited any presentations of the Mickey Mouse cartoon "The Barnyard Battle" (1929). The animated short, which features the mouse as a kepi-wearing soldier fighting cat enemies in German-style helmets, was viewed by censors as a negative portrayal of Germany. It was claimed by the board that the film would "reawaken the latest anti-German feeling existing abroad since the War". The "Barnyard Battle" incident did not incite wider anti-Mickey sentiment in Germany in 1930; however, after Adolf Hitler came to power several years later, the Nazi regime unambiguously propagandized against Disney. A mid-1930s German newspaper article read:
Mickey Mouse is the most miserable ideal ever revealed...Healthy emotions tell every independent young man and every honorable youth that the dirty and filth-covered vermin, the greatest bacteria carrier in the animal kingdom, cannot be the ideal type of animal...Away with Jewish brutalization of the people! Down with Mickey Mouse! Wear the Swastika Cross!
American cartoonist and writer Art Spiegelman would later use this quote on the opening page of the second volume of his graphic novel "Maus".
In 1935 Romanian authorities also banned Mickey Mouse films from cinemas, purportedly fearing that children would be "scared to see a ten-foot mouse in the movie theatre". In 1938, based on the Ministry of Popular Culture's recommendation that a reform was necessary "to raise children in the firm and imperialist spirit of the Fascist revolution", the Italian Government banned foreign children's literature except Mickey; Disney characters were exempted from the decree for the "acknowledged artistic merit" of Disney's work. In fact, Mussolini's children were fond of Mickey Mouse, and managed to delay his ban as long as possible. In 1942, after Italy declared war on the United States, the fascist government immediately forced Italian publishers to stop printing any Disney stories. Mickey's stories were replaced by the adventures of "Tuffolino", a new human character created by Federico Pedrocchi (script) and Pier Lorenzo De Vita (art). After the downfall of Italy's fascist government in 1945, the ban was removed.
Mickey has been announced to appear in two films. One is a live-action/CGI hybrid film based on the Magic Kingdom theme park at the Walt Disney World Resort, while the other is a film idea pitched by Walt Disney Animation Studios veteran Burny Mattinson centering on Mickey, Donald, and Goofy.
Mickey Mouse has received ten nominations for the Academy Award for Best Animated Short Film. These are "Mickey's Orphans" (1931), "Building a Building" (1933), "Brave Little Tailor" (1938), "The Pointer" (1939), "Lend a Paw" (1941), "Squatter's Rights" (1946), "Mickey and the Seal" (1948), "Mickey's Christmas Carol" (1983), "Runaway Brain" (1995), and "Get a Horse!" (2013). Among these, "Lend a Paw" was the only film to actually win the award. Additionally, in 1932 Walt Disney received an honorary Academy Award in recognition of Mickey's creation and popularity.
In 1994, four of Mickey's cartoons were included in the book "The 50 Greatest Cartoons" which listed the greatest cartoons of all time as voted by members of the animation field. The films were "The Band Concert" (#3), "Steamboat Willie" (#13), "Brave Little Tailor" (#26), and "Clock Cleaners" (#27).
On November 18, 1978, in honor of his 50th anniversary, Mickey became the first cartoon character to have a star on the Hollywood Walk of Fame. The star is located on 6925 Hollywood Blvd.
Melbourne (Australia) runs the annual Moomba festival street procession and appointed Mickey Mouse as their "King of Moomba" (1977). Although immensely popular with children, there was controversy with the appointment: some Melburnians wanted a 'home-grown' choice, e.g. Blinky Bill; when it was revealed that Patricia O'Carroll (from Disneyland's Disney on Parade show) was performing the mouse, Australian newspapers reported "Mickey Mouse is really a girl!"
Mickey was the Grand Marshal of the Tournament of Roses Parade on New Year's Day 2005. He was the first cartoon character to receive the honor and only the second fictional character after Kermit the Frog in 1996.
Meher Baba
Meher Baba (born Merwan Sheriar Irani; 25 February 1894 – 31 January 1969) was an Indian spiritual master who claimed he was an Avatar — God in human form.
Merwan Sheriar Irani was born in 1894 in Poona, India, to Irani Zoroastrian parents. His spiritual transformation began when he was 19 years old and lasted for seven years. During this time he contacted five spiritual teachers before beginning his own mission and gathering his own disciples in early 1922, at the age of 27.
From 10 July 1925 to the end of his life, Meher Baba maintained silence, communicating by means of an alphabet board or by unique hand gestures. With his "mandali" (circle of disciples), he spent long periods in seclusion, during which time he often fasted. He also traveled widely, held public gatherings, and engaged in works of charity with lepers and the poor.
In 1931, Meher Baba made the first of many visits to the West, where he attracted followers. Throughout most of the 1940s, Meher Baba worked with a category of spiritual aspirants called "masts", who he said are entranced or spellbound by internal spiritual experiences. Starting in 1949, along with selected "mandali", he traveled incognito about India in an enigmatic and still largely unexplained period he called the "New Life".
After being injured as a passenger in two serious automobile accidents, one near Prague, Oklahoma in the United States in 1952 and one in India in 1956, his ability to walk became severely limited. In 1962, he invited his Western followers to India for a mass "darshan" called "The East–West Gathering". Concerned by an increasing use of LSD and other psychedelic drugs, in 1966 Baba stated that they did not convey real benefits. Despite deteriorating health, he continued what he called his "Universal Work", which included fasting and seclusion, until his death on 31 January 1969. His "samadhi" (shrine/tomb) in Meherabad, India, has become a place of international pilgrimage.
Meher Baba gave numerous teachings on the cause and purpose of life, including teaching reincarnation and that the phenomenal world is an illusion. He taught that the Universe is imagination, that God is what really exists, and that each soul is really God passing through imagination to realize individually his own divinity. In addition he gave practical advice for the aspirant who wishes to attain God-realization and thereby escape the wheel of births and deaths. He also taught about the concept of Perfect Masters, the Avatar, and those on the various stages of the spiritual path that he called involution. His most important teachings are recorded in his principal books "Discourses" and "God Speaks".
His legacy includes the Avatar Meher Baba Charitable Trust he established in India, a handful of centres for information and pilgrimage, as well as an influence on pop-culture artists and the introduction of common expressions such as "Don't Worry, Be Happy." Meher Baba's silence has remained a mysterious issue among some of his followers.
Meher Baba was an Irani born in Poona, India, to a Zoroastrian family. His given name was Merwan Sheriar Irani. He was the second son of Sheriar Irani, a Persian Zoroastrian who had spent years wandering in search of spiritual experience before settling in Poona (now Pune), and Shireen Irani.
As a boy, he formed the Cosmopolitan Club, which was dedicated to remaining informed in world affairs and donating money to charity. He was a multi-instrumentalist and poet. Fluent in several languages, he was especially fond of the poetry of Hafez, Shakespeare, and Shelley. He later contacted other spiritual figures who, along with the Muslim saint Hazrat Babajan, he said were the five "Perfect Masters" of the age: Tajuddin Baba, Narayan Maharaj, Sai Baba of Shirdi, and Upasni Maharaj.
Upasni Maharaj, he later said, helped him to integrate his mystical experiences with normal consciousness, thus enabling him to function in the world without diminishing his experience of God-realization. In late 1921, at the age of 27, after living for seven years with Upasni, Merwan began to attract a following of his own. His early followers gave him the name Meher Baba (Compassionate Father).
In 1922, Meher Baba and his followers established Manzil-e-Meem (House of the Master) in Bombay (now Mumbai). There, Baba commenced his practice of demanding strict discipline and obedience from his disciples. A year later, Baba and his mandali moved to an area a few miles outside Ahmednagar that he named Meherabad (Garden of Blessing). This ashram would become the center for his work. During the 1920s, Meher Baba opened a school, hospital and dispensary at Meherabad, all three free and open to all castes and faiths.
From 10 July 1925 onward, Meher Baba observed a self-imposed silence that would last forty-four years, until the end of his life. He communicated first by use of chalk and slate, then by an alphabet board, and later via self-styled hand gestures. In January 1927 he gave up writing with pen or pencil as well.
In the 1930s, Meher Baba began a period of extensive world travel and took several trips to Europe and the United States. It was during this period that he established contact with his first close group of Western disciples. He traveled on a Persian passport because he had given up writing as well as speaking and would not sign the forms required by the British government of India.
On his first trip to England in 1931, he traveled on the "SS Rajputana", the same ship that was carrying Mahatma Gandhi, who was sailing to the second Round Table Conference in London. Baba and Gandhi had three meetings on board, one of which lasted for three hours. The British press publicized these meetings, but an aide to Gandhi said, "You may say emphatically that Gandhi never asked Meher Baba for help or for spiritual or other advice."
In the West, Meher Baba met with a number of celebrities and artists, amongst them Gary Cooper, Charles Laughton, Tallulah Bankhead, Boris Karloff, Tom Mix, Maurice Chevalier, and Ernst Lubitsch. On 1 June 1932, Mary Pickford and Douglas Fairbanks, Jr. held a reception for Baba at Pickfair at which he delivered a message to Hollywood. As a result, says Robert S. Ellwood, Meher Baba emerged as "one of the enthusiasms of the '30s".
In 1934, after announcing that he would break his self-imposed silence in the Hollywood Bowl, Baba changed his plans abruptly, boarded the RMS "Empress of Canada", and sailed to Hong Kong without explanation. The Associated Press reported that "Baba had decided to postpone the word-fast breaking until next February because 'conditions are not yet ripe'." He returned to England in 1936 but did not return to the United States again until the early 1950s.
In the late 1930s, Meher Baba invited a group of Western women to join him in India, where he arranged a series of trips throughout India and British Ceylon that became known as the Blue Bus Tours. When the women returned home, many newspapers treated their journey as an occasion for scandal. Time magazine's 1936 review of "God is my Adventure" described the American fascination with the "long-haired, silky-mustached Parsee named Shri Sadgaru [sic] Meher Baba" four years earlier.
In the 1930s and 1940s, Meher Baba worked extensively with masts, persons "intoxicated with God". According to Baba, these individuals are disabled by their enchanting experience of the higher spiritual planes. Although outwardly masts may appear irrational or insane, Baba claimed that their spiritual status was elevated, and that by meeting with them he helped them to move forward spiritually while enlisting their aid in his spiritual work. One of the best known of these masts, known as Mohammed Mast, lived at Meher Baba's encampment at Meherabad until his death in 2003.
During his journey in 1946, he went to Sehwan Sharif to meet a well-known Sufi saint and successor of Lal Shahbaz Qalandar, Murshid Nadir Ali Shah. Baba called him an advanced pilgrim.
In 1949 Baba began an enigmatic period that he called the New Life. Following a series of questions on their readiness to obey even the most difficult of his requests, Baba selected twenty companions to join him in a life of complete "hopelessness and helplessness".
He made provisions for those dependent on him, after which he and his companions otherwise gave up nearly all property and financial responsibilities. They traveled about India incognito while begging for food and carrying out Baba's instructions in accordance with a strict set of "conditions of the New Life". These included absolute acceptance of any circumstance and consistent good cheer in the face of any difficulty. Companions who failed to comply were sent away.
Concerning the New Life Meher Baba wrote:
This New Life is endless, and even after my physical death it will be kept alive by those who live the life of complete renunciation of falsehood, lies, hatred, anger, greed and lust; and who, to accomplish all this, do no lustful actions, do no harm to anyone, do no backbiting, do not seek material possessions or power, who accept no homage, neither covet honor nor shun disgrace, and fear no one and nothing; by those who rely wholly and solely on God, and who love God purely for the sake of loving; who believe in the lovers of God and in the reality of Manifestation, and yet do not expect any spiritual or material reward; who do not let go the hand of Truth, and who, without being upset by calamities, bravely and wholeheartedly face all hardships with one hundred percent cheerfulness, and give no importance to caste, creed and religious ceremonies. This New Life will live by itself eternally, even if there is no one to live it.
Meher Baba ended the New Life in February 1952 and once again began a round of public appearances throughout India and the West.
In the 1950s, Baba established two centers outside of India, namely the Meher Spiritual Center in Myrtle Beach, South Carolina in the United States and Avatar's Abode near Brisbane, Australia. He inaugurated the Meher Spiritual Center in April 1952. On 24 May 1952, en route from the Spiritual Center to Meher Mount in Ojai, California, the car in which he was a passenger was struck head-on near Prague, Oklahoma. He and his companions were thrown from the vehicle and suffered many injuries. Baba's leg was severely broken and he sustained facial injuries which included a broken nose. The injured were treated at Duke Hospital in Durham, North Carolina, after which they returned to Myrtle Beach to recuperate. While recuperating at Youpon Dunes, a home owned by Elizabeth Patterson, in Myrtle Beach, he worked on the charter for a group of Sufis, which he named Sufism Reoriented.
Meher Baba began dictating his major book, "God Speaks, The Theme of Creation and Its Purpose", using an alphabet board in Dehradun, August 1953. In September 1954, Meher Baba gave a men-only sahavas at Meherabad that later became known as the Three Incredible Weeks. During this time Baba issued a declaration, "Meher Baba's Call", wherein he once again affirmed his Avatarhood "irrespective of the doubts and convictions" of others. At the end of this sahavas Meher Baba gave the completed manuscript of his book "God Speaks" to two members of Sufism Reoriented, Ludwig H. Dimpfl and Don E. Stevens, for editing and publication in America. The book was subsequently published by Dodd, Mead and Company the following year.
On 30 September 1954 Meher Baba gave his Final Declaration message, in which he made enigmatic predictions.
In October 1954, Meher Baba discarded his alphabet board and began using a unique set of hand gestures to communicate, which he used for the remainder of his life.
On 2 December 1956, outside Satara, India, the car in which Baba was being driven went out of control and a second serious automobile accident occurred. Baba suffered a fractured pelvis and other severe injuries. Nilu, one of Baba's mandali, was killed. This collision seriously incapacitated Baba. Despite his physicians' predictions to the contrary, Baba began to walk again after great effort, but from that point on he was in constant pain and was severely limited in his ability to move. During his trip to the West in 1958 he often needed to be carried from venue to venue.
In 1956, during his fifth visit to the US, Baba stayed at New York's Hotel Delmonico before traveling to the Meher Center at Myrtle Beach, South Carolina. In July he traveled to Washington, D.C., and received friends and disciples at the home of Ivy Duce, wife of James Terry Duce, the vice-president of the Arabian American Oil Co. He then traveled to Meher Mount at Ojai, California before continuing on to Australia. His final visits to the United States and Australia were made in 1958.
In 1962, Baba gave one of his last public functions, a series of meetings he called "The East-West Gathering." At these meetings, at which his western followers were invited to meet his Indian disciples, Baba gave darshan to many thousands of people despite the physical strain this caused him.
In the mid-1960s Baba became concerned with the drug culture in the West and began correspondences with several Western academics, including Timothy Leary and Richard Alpert, in which he discouraged the use of hallucinogenic drugs for spiritual purposes. In 1966 Baba's responses to questions on drugs were published in a pamphlet titled "God in a Pill?" Meher Baba stated that drug use was spiritually damaging and that if enlightenment were possible through drugs then "God is not worthy of being God". Meher Baba instructed his young Western disciples to spread this message; in doing so, they increased awareness of Meher Baba's teachings. In an interview with Frederick Chapman, a Harvard graduate and Fulbright scholar who met Baba during a year of study in India, Baba described LSD as "harmful physically, mentally and spiritually" and warned that "[its continued use] leads to madness or death".
An anti-drug campaign was initiated by Baba lovers in the United States, Europe and Australia. The campaign was mostly futile, but won Baba new followers, and some of Baba's views found their way into academic debate on the merits and dangers of hallucinogens.
From the East-West Gathering of 1962 onward, Baba's health deteriorated. Despite the physical toll it took on his body, he continued to undergo periods of seclusion and fasting. In late July 1968, Baba completed a particularly taxing period of seclusion and stated that by then his work was "completed 100% to my satisfaction". He was by then using a wheelchair. Within a few months, his condition had worsened and he was bedridden, wracked by muscle spasms that had no clear medical origin. Despite the care of several doctors the spasms worsened.
On 31 January 1969, Meher Baba died at his home in Meherazad. He conveyed by his last gestures, "Do not forget that I am God." In time, his devotees called the anniversary of his death "Amartithi" (deathless day). Meher Baba's body was placed at his samadhi at Meherabad, covered with roses and cooled by ice. His body was kept available to the public for one week before its final burial. Prior to his death, Meher Baba had made extensive preparations for a public darshan program to be held in Poona. His mandali decided to proceed with the arrangements despite the absence of the host. Several thousand attended this "Last Darshan," including many hundreds from the United States, Europe, and Australia.
From 10 July 1925, until his death in 1969, Meher Baba was silent. He communicated first by using an alphabet board and later by unique hand gestures which were interpreted and spoken out by one of his mandali, usually by his disciple Eruch Jessawala. Meher Baba said that his silence was not undertaken as a spiritual exercise but solely in connection with his universal work.
Man's inability to live God's words makes the Avatar's teaching a mockery. Instead of practicing the compassion he taught, man has waged wars in his name. Instead of living the humility, purity, and truth of his words, man has given way to hatred, greed, and violence. Because man has been deaf to the principles and precepts laid down by God in the past, in this present Avataric form, I observe silence.
Meher Baba often indicated that he would "break" his silence by speaking the "Word" in every heart, thereby giving a spiritual push forward to all living things.
When I break My Silence, the impact of My Love will be universal and all life in creation will know, feel and receive of it. It will help every individual to break himself free from his own bondage in his own way. I am the Divine Beloved who loves you more than you can ever love yourself. The breaking of My Silence will help you to help yourself in knowing your real Self.
According to him, the breaking of his silence would be a defining event in the spiritual evolution of the world.
When I speak that Word, I shall lay the foundation for that which is to take place during the next seven hundred years.
On many occasions Meher Baba promised to break his silence with an audible word before he died, often stating a specific time and place when this would occur, but according to all contemporary accounts, Meher Baba remained silent until his death. His failure to break his silence disappointed some of his followers, while others regarded these broken promises as a test of their faith. A number of his followers speculate that "the Word" will yet be "spoken" or that Meher Baba broke his silence in a spiritual rather than a physical way.
Baba, for many years, asked his followers to undertake austerities on 10 July, the anniversary of the day his silence began, such as keeping silence, fasting and praying. In his final Silence Day request to his followers in 1968, he asked only that they keep silent. Many followers continue to celebrate Silence Day by keeping silence in his honor.
Meher Baba's teachings can roughly be divided into two main categories: his metaphysics on the nature of the soul and the Universe, and practical advice for the spiritual aspirant. The two are interrelated. His metaphysics is mostly found in his principal book on the subject, "God Speaks". It contains detailed statements on his cosmology and the purpose of life as well as the progression of the soul, while his elucidations on the practical spiritual life are mostly contained in "Discourses", although it also covers many metaphysical areas mirroring or amplifying "God Speaks".
In "God Speaks", Meher Baba describes the journey of the soul from its original state of unconscious divinity to the ultimate attainment of conscious divinity. The whole journey is a journey of imagination, where the original indivisible state of God imagines becoming countless individualized souls which he likens to bubbles within an ocean of infinite size. Each soul, powered by the desire to become conscious, starts its journey in the most rudimentary form of consciousness. This limitation brings the need of a more developed form to advance it towards an increasingly conscious state. Consciousness grows in relation to the impressions each form is capable of gathering.
According to Baba, each soul pursues conscious divinity by evolving, that is, experiencing itself in a succession of imagined forms through seven "kingdoms" of stone/metal, vegetable, worm, fish, bird, animal, and human. The soul identifies itself with each successive form, becoming thus tied to illusion. During this evolution of forms, the power of thought increases, until in human form thought becomes infinite. Although in human form, the soul is capable of conscious divinity, all the impressions that it has gathered during evolution are illusory ones that create a barrier against the soul knowing itself. For this barrier to be overcome, further births in human form are needed in a process known as reincarnation.
The soul will reach a stage where its previously gathered impressions grow thin or weak enough that it enters a final stage called involution. This stage also requires a series of human births, during which the soul begins an inner journey, by which it realizes its true identity as God. Baba breaks this inner journey of Realization into seven stages he called "planes." The process culminates, at the seventh plane, with God-realization, at which the goal of life for the soul is reached.
The "Discourses" are a collection of explanations and elucidations that Meher Baba has given on many topics that concern the advancement of the spiritual aspirant. Some of the most important topics treated are: sanskaras (mental impressions), Maya (the principle of illusion), the nature of the ego, reincarnation, karma, violence and non-violence, meditation, love, discipleship, and God-realization. His explanations often include stories from the lore of India and the Sufi culture. One such story, the wise man and the ghost, shows the power that superstitious beliefs can have on a person, while another, Majnun and Layla, show how selfless love, even in human relations, can lead one to discipleship.
Thus Meher Baba offers many suggestions that keep one moving towards God-realization. These include putting theory into practice, internally renouncing desires, offering selfless service to humanity or the master, and acting with spontaneity, while avoiding actions that bind one to illusion. Rather than lay out moral rules, Baba offers an understanding of why some actions bind the individual whereas others help towards his emancipation. Many chapters offer a better understanding of the mechanisms by which consciousness gets caught up between the opposites of experience, such as pleasure and pain, good and evil, and point to a way of transcending them.
Baba related that there are 56 incarnate God-realized souls on Earth at any given time. Of these souls there are always five who constitute the five Perfect Masters of their era. When one of the five Perfect Masters dies, according to Baba, another God-realized soul among the 56 would replace him or her at once.
The Avatar, according to Baba, is a special Perfect Master, the first soul to achieve God-realization. This soul, the "original" Perfect Master, or Ancient One, never ceases to incarnate. Baba indicated that this particular soul personifies the state of God called Vishnu in Hinduism and Parvardigar in Sufism, i.e. the sustainer or preserver state of God. The Avatar, in Baba's testimony, appears on Earth every 700–1400 years and is "brought down" into human form by the five Perfect Masters of the time to aid in the process of moving creation in its never-ending journey toward Godhood. Baba claimed that in other ages this role had been fulfilled by Zoroaster, Rama, Krishna, Buddha, Jesus, and Muhammad.
Baba described the Avatar as "a gauge against which man can measure what he is and what he may become. He trues the standard of human values by interpreting them in terms of divinely human life."
The majority of Meher Baba's followers accept his claim of avatarhood, and he is said to be "revered by millions around the world as the Avatar of the age and a God-realized being".
Baba's travels and teachings left a legacy of followers and devotees worldwide.
The Avatar Meher Baba Charitable Trust, established by Meher Baba in 1959, maintains his tomb and pilgrimage facilities, as well as a free school and dispensary, a cataract clinic, and a veterinary clinic. The Trust follows the charter left for it by Meher Baba in his lifetime, but does not act as spiritual authority over groups. Likewise, the Trust does not engage in propaganda, promote creeds or dogmas, or seek converts. Baba discouraged evangelizing, stating, "I need no propaganda or publicity." Rather, he encouraged his followers to "let your life itself be my message of love and truth to others" and to "spread my message of Love and Truth as far and wide as possible". Followers of Meher Baba have no established rituals. Many do, however, perform practices of choice such as pujas, aartis, prayers, music, plays, viewing films of Baba and so forth, but the choice is personal. The primary focus for followers is living a life Meher Baba would approve of, for example, refraining from the use of psychedelic drugs, including marijuana, and trying to remember God with love.
Gatherings of Baba followers are generally informal. Special effort is made to gather together on Amartithi, the anniversary of Baba's death, and on his birthday. Many Baba followers keep silent on 10 July (Silence Day), observing the request Baba frequently made of his followers during his lifetime. Aarti is performed morning and evening at Baba's samadhi in India. Also at Meherabad, his followers maintain Baba's practice of lighting a dhuni fire on the 12th of each month.
Although Baba had initially begun gaining public attention in the West as early as 1932 as the result of contacts with some celebrities of the time and from the rather disillusioned account of Paul Brunton ("A Search in Secret India", 1934), he received further attention after his death through various mentions in western pop-culture.
Pete Townshend of The Who, who became a follower of Baba, dedicated his 1969 rock-opera "Tommy" to Meher Baba in the record's gatefold. The Who's 1971 song "Baba O'Riley" was named in part after Meher Baba, and Townshend recorded several Meher Baba tribute albums including "Happy Birthday", "I Am", "Who Came First", and "With Love". In 1970, Melanie Safka mentioned Baba in the spoken word intro to her song "Lay Down (Candles in the Rain)". Listed as a standalone piece entitled Candles In The Rain, the lyrics are "Meher Baba lives again". Bobby McFerrin's 1988 Grammy Award-winning song "Don't Worry, Be Happy" was inspired by a popular quote of Baba seen in numerous Baba posters and inspirational cards. Concepts of Meher Baba's philosophy, as well as a character based on Baba but unnamed, have also frequently appeared in works of comic book writer J. M. DeMatteis, including "Doctor Fate" and "Seekers into the Mystery".
In 2012, the feature film "Nema Aviona za Zagreb" premiered in the Netherlands with an exclusive interview with Meher Baba filmed in 1967. In the interview Baba explains the difference between God-realization and drug-induced hallucinations and the scene plays a pivotal role in the documentary's narrative.
https://en.wikipedia.org/wiki?curid=20860
Cavity magnetron
The cavity magnetron is a high-powered vacuum tube that generates microwaves using the interaction of a stream of electrons with a magnetic field while moving past a series of open metal cavities (cavity resonators). Electrons pass by the openings to these cavities and cause microwaves to oscillate within, similar to the way a whistle produces a tone when excited by an air stream blown past its opening. The frequency of the microwaves produced, the resonant frequency, is determined by the cavities' physical dimensions. Unlike other vacuum tubes such as a klystron or a traveling-wave tube (TWT), the magnetron cannot function as an amplifier in order to increase the intensity of an applied microwave signal; the magnetron serves solely as an oscillator, generating a microwave signal from direct current electricity supplied to the vacuum tube.
An early form of magnetron was invented by Hans Gerdien of the Siemens corporation in 1910. Another form of magnetron tube, the split-anode magnetron, was invented by Albert Hull of General Electric Research Laboratory in 1920, but it achieved only a frequency of 30 kHz. Similar devices were experimented with by many teams through the 1920s and 1930s. Hans Erich Hollmann filed a patent on a design similar to the modern tube in 1935, but the more frequency-stable klystron was preferred for most German radars during World War II. An important advance was the multi-cavity magnetron, first proposed in 1934 by A. L. Samuel of Bell Telephone Laboratories. However, the first truly successful example was developed by Aleksereff and Malearoff in the USSR in 1936, which achieved 300 watts at 3 GHz (10 cm wavelength).
The cavity magnetron was radically improved by John Randall and Harry Boot at the University of Birmingham, England in 1940. They invented a valve that could produce multi-kilowatt pulses at 10 cm wavelength, an unprecedented achievement. The high power of pulses from the device made centimeter-band radar practical for the Allies of World War II, with shorter wavelength radars allowing detection of smaller objects from smaller antennas. The compact cavity magnetron tube drastically reduced the size of radar sets so that they could be more easily installed in night-fighter aircraft, anti-submarine aircraft and escort ships.
At the same time, Yoji Ito was experimenting with magnetrons in Japan, and proposed a system of collision avoidance using frequency modulation. Only low power output was achieved. Visiting Germany, where he had earlier received his doctorate, Ito learned that the Germans were using pulse modulation at VHF with great success. Back in Japan, he produced a prototype pulse magnetron with 2 kW output in October 1941, which was then widely deployed.
In the post-war era the magnetron was less widely used for radar applications, because the output changes from pulse to pulse, both in frequency and phase. This renders the method unsuitable for pulse-to-pulse comparisons for detecting and removing "clutter" from the radar display. The magnetron remains in use in some radar systems, but has become much more common as a low-cost source for microwave ovens. In this form, over one billion magnetrons are in use today.
In a conventional electron tube (vacuum tube), electrons are emitted from a negatively charged, heated component called the cathode and are attracted to a positively charged component called the anode. The components are normally arranged concentrically, placed within a tubular-shaped container from which all air has been evacuated, so that the electrons can move freely (hence the name "vacuum" tubes, called "valves" by the British).
If a third electrode (called a control grid) is inserted between the cathode and the anode, the flow of electrons between the cathode and anode can be regulated by varying the voltage on this third electrode. This allows the resulting electron tube (called a "triode" because it now has three electrodes) to function as an amplifier, because small variations in the voltage applied to the control grid produce corresponding variations in the much larger current of electrons flowing between the cathode and anode.
The idea of using a grid for control was patented by Lee de Forest, resulting in considerable research into alternate tube designs that would avoid his patents. One concept used a magnetic field instead of an electrical charge to control current flow, leading to the development of the magnetron tube. In this design, the tube was made with two electrodes, typically with the cathode in the form of a metal rod in the center, and the anode as a cylinder around it. The tube was placed between the poles of a horseshoe magnet arranged such that the magnetic field was aligned parallel to the axis of the electrodes.
With no magnetic field present, the tube operates as a diode, with electrons flowing directly from the cathode to the anode. In the presence of the magnetic field, the electrons will experience a force at right angles to their direction of motion, according to the left-hand rule. In this case, the electrons follow a curved path between the cathode and anode. The curvature of the path can be controlled by varying either the magnetic field, using an electromagnet, or by changing the electrical potential between the electrodes.
At very high magnetic field settings the electrons are forced back onto the cathode, preventing current flow. At the opposite extreme, with no field, the electrons are free to flow straight from the cathode to the anode. There is a point between the two extremes, the critical value or Hull cut-off magnetic field (and cut-off voltage), where the electrons just reach the anode. At fields around this point, the device operates similar to a triode. However, magnetic control, due to hysteresis and other effects, results in a slower and less faithful response to control current than electrostatic control using a control grid in a conventional triode (not to mention greater weight and complexity), so magnetrons saw limited use in conventional electronic designs.
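The critical value described here can be estimated for an idealized cylindrical diode, where the standard Hull cut-off relation is B_c = sqrt(8mV/e) / (r_a(1 − r_c²/r_a²)). A minimal sketch, with hypothetical anode voltage and electrode radii (none are given in the text):

```python
import math

E_CHARGE = 1.602176634e-19     # elementary charge (C)
M_ELECTRON = 9.1093837015e-31  # electron mass (kg)

def hull_cutoff_field(v_anode, r_anode, r_cathode):
    """Hull cut-off magnetic field (T) for a cylindrical magnetron diode.

    Above this field, electrons emitted from the cathode curve back
    before reaching the anode and the current is cut off.
    """
    geom = r_anode * (1.0 - (r_cathode / r_anode) ** 2)
    return math.sqrt(8.0 * M_ELECTRON * v_anode / E_CHARGE) / geom

# Hypothetical example: 1 kV anode, 5 mm anode radius, 1 mm cathode radius.
b_c = hull_cutoff_field(1000.0, 5e-3, 1e-3)
print(f"cut-off field ≈ {b_c * 1e3:.1f} mT")  # ≈ 44.4 mT
```

Fields below this value let current flow; fields well above it push electrons back to the cathode, as the paragraph above describes.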
It was noticed that when the magnetron was operating at the critical value, it would emit energy in the radio frequency spectrum. This occurs because a few of the electrons, instead of reaching the anode, continue to circle in the space between the cathode and the anode. Due to an effect now known as cyclotron radiation, these electrons radiate radio frequency energy. The effect is not very efficient. Eventually the electrons hit one of the electrodes, so the number in the circulating state at any given time is a small percentage of the overall current. It was also noticed that the frequency of the radiation depends on the size of the tube, and even early examples were built that produced signals in the microwave region.
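The frequency radiated by these circling electrons is set by the cyclotron frequency, f_c = eB/(2πm), which lands in the microwave band for modest field strengths. A quick check (the 0.1 T field is an illustrative value, not taken from the text):

```python
import math

E_CHARGE = 1.602176634e-19     # elementary charge (C)
M_ELECTRON = 9.1093837015e-31  # electron mass (kg)

def cyclotron_frequency(b_field):
    """Orbit frequency (Hz) of an electron in a uniform magnetic field."""
    return E_CHARGE * b_field / (2.0 * math.pi * M_ELECTRON)

f = cyclotron_frequency(0.1)  # 0.1 T field
print(f"{f / 1e9:.2f} GHz")   # ≈ 2.80 GHz, i.e. centimeter wavelengths
```

Since the frequency scales linearly with B, field strength (together with tube geometry) sets where in the spectrum the device radiates.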
Early conventional tube systems were limited to the high frequency bands, and although very high frequency systems became widely available in the late 1930s, the ultra high frequency and microwave regions were well beyond the ability of conventional circuits. The magnetron was one of the few devices able to generate signals in the microwave band and it was the only one that was able to produce high power at centimeter wavelengths.
The original magnetron was very difficult to keep operating at the critical value, and even then the number of electrons in the circling state at any time was fairly low. This meant that it produced very low-power signals. Nevertheless, as one of the few devices known to create microwaves, interest in the device and potential improvements was widespread.
The first major improvement was the split-anode magnetron, also known as a negative-resistance magnetron. As the name implies, this design used an anode that was split in two—one at each end of the tube—creating two half-cylinders. When both were charged to the same voltage the system worked like the original model. But by slightly altering the voltage of the two plates, the electron's trajectory could be modified so that they would naturally travel towards the lower voltage side. The plates were connected to an oscillator that reversed the relative voltage of the two plates at a given frequency.
At any given instant, the electron will naturally be pushed towards the lower-voltage side of the tube. The electron will then oscillate back and forth as the voltage changes. At the same time, a strong magnetic field is applied, stronger than the critical value in the original design. This would normally cause the electron to circle back to the cathode, but due to the oscillating electrical field, the electron instead follows a looping path that continues toward the anodes.
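The looping path described above is the classic cycloidal motion of a charge in crossed electric and magnetic fields, whose orbit-averaged motion is the E×B drift at speed E/B. A minimal numerical sketch using the standard Boris integrator (the uniform field strengths are illustrative assumptions, not values from the text):

```python
import numpy as np

Q = -1.602176634e-19           # electron charge (C)
M = 9.1093837015e-31           # electron mass (kg)
E = np.array([1e5, 0.0, 0.0])  # electric field (V/m), assumed uniform
B = np.array([0.0, 0.0, 0.05]) # magnetic field (T), along the tube axis

def boris_push(v, dt):
    """One Boris step: half electric kick, magnetic rotation, half kick."""
    v_minus = v + (Q * E / M) * (dt / 2)
    t = (Q * B / M) * (dt / 2)
    s = 2 * t / (1 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)
    return v_plus + (Q * E / M) * (dt / 2)

omega_c = abs(Q) * np.linalg.norm(B) / M   # cyclotron frequency (rad/s)
dt = (2 * np.pi / omega_c) / 200           # 200 steps per gyro-orbit
v = np.zeros(3)                            # electron starts at rest
vys = []
for _ in range(200 * 20):                  # integrate over 20 orbits
    v = boris_push(v, dt)
    vys.append(v[1])

drift = np.mean(vys)                       # orbit-averaged velocity
print(drift, -E[0] / B[2])                 # both ≈ -2.0e6 m/s
```

The electron loops rapidly but its average motion is a steady sideways drift, which is the sense in which the oscillating field steers it toward the anodes.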
Since all of the electrons in the flow experienced this looping motion, the amount of RF energy being radiated was greatly improved. And as the motion occurred at any field level beyond the critical value, it was no longer necessary to carefully tune the fields and voltages, and the overall stability of the device was greatly improved. Unfortunately, the higher field also meant that electrons often circled back to the cathode, depositing their energy on it and causing it to heat up. As this normally causes more electrons to be released, it could sometimes lead to a runaway effect, damaging the device.
The great advance in magnetron design was the resonant cavity magnetron or electron-resonance magnetron, which works on entirely different principles. In this design the oscillation is created by the physical shape of the anode, rather than external circuits or fields.
Mechanically, the cavity magnetron consists of a large, solid cylinder of metal with a hole drilled through the center of the circular face. A wire acting as the cathode is run down the center of this hole, and the metal block itself forms the anode. Around this hole, known as the "interaction space", are a number of similar holes ("resonators") drilled parallel to the interaction space, connected to the interaction space by a short channel. The resulting block looks something like the cylinder on a revolver, with a somewhat larger central hole. (Early models were actually cut using Colt pistol jigs.) Remembering that in an AC circuit the electrons travel along the surface, not the core, of the conductor, the parallel sides of each slot act as a capacitor while the round holes form an inductor: an LC circuit made of solid copper, with the resonant frequency defined entirely by its dimensions.
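The slot-and-hole geometry can be treated as a lumped LC circuit, with the slot as a parallel-plate capacitor, the hole as a one-turn inductor, and f = 1/(2π√(LC)). The sketch below uses invented dimensions purely for illustration; real tubes are dimensioned and tuned empirically:

```python
import math

EPS0 = 8.8541878128e-12  # vacuum permittivity (F/m)
MU0 = 4.0e-7 * math.pi   # vacuum permeability (H/m)

# Hypothetical resonator dimensions (not from the text).
hole_radius = 2.5e-3     # m, radius of the round resonator hole
slot_gap = 1.0e-3        # m, spacing between the slot's parallel walls
slot_depth = 2.0e-3      # m, radial length of the slot
axial_height = 1.0e-2    # m, length of the bore along the tube axis

# Slot modeled as a parallel-plate capacitor, hole as a one-turn solenoid.
c = EPS0 * (slot_depth * axial_height) / slot_gap
l = MU0 * math.pi * hole_radius ** 2 / axial_height
f = 1.0 / (2.0 * math.pi * math.sqrt(l * c))
print(f"resonant frequency ≈ {f / 1e9:.1f} GHz")  # ≈ 7.6 GHz
```

Millimeter-scale holes and slots thus resonate at gigahertz frequencies, which is why the block's machined dimensions alone fix the operating band.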
The magnetic field is set to a value well below the critical, so the electrons follow arcing paths towards the anode. When they strike the anode, they cause it to become negatively charged in that region. As this process is random, some areas will become more or less charged than the areas around them. The anode is constructed of a highly conductive material, almost always copper, so these differences in voltage cause currents to appear to even them out. Since the current has to flow around the outside of the cavity, this process takes time. During that time additional electrons will avoid the hot spots and be deposited further along the anode, as the equalizing current continues to flow around the cavity. This causes an oscillating current to form as the current tries to equalize one spot, then another.
The oscillating currents flowing around the cavities, and their effect on the electron flow within the tube, causes large amounts of microwave radiofrequency energy to be generated in the cavities. The cavities are open on one end, so the entire mechanism forms a single, larger, microwave oscillator. A "tap", normally a wire formed into a loop, extracts microwave energy from one of the cavities. In some systems the tap wire is replaced by an open hole, which allows the microwaves to flow into a waveguide.
As the oscillation takes some time to set up, and is inherently random at the start, subsequent startups will have different output parameters. Phase is almost never preserved, which makes the magnetron difficult to use in phased array systems. Frequency also drifts from pulse to pulse, a more difficult problem for a wider array of radar systems. Neither of these presents a problem for continuous-wave radars, nor for microwave ovens.
All cavity magnetrons consist of a heated cathode placed at a high (continuous or pulsed) negative potential created by a high-voltage, direct-current power supply. The cathode is placed in the center of an evacuated, lobed, circular chamber. A magnetic field parallel to the filament is imposed by a permanent magnet. The magnetic field causes the electrons, attracted to the (relatively) positive outer part of the chamber, to spiral outward in a circular path, a consequence of the Lorentz force. Spaced around the rim of the chamber are cylindrical cavities. Slots are cut along the length of the cavities that open into the central, common cavity space. As electrons sweep past these slots, they induce a high-frequency radio field in each resonant cavity, which in turn causes the electrons to bunch into groups. (This principle of cavity resonator is very similar to blowing a stream of air across the open top of a glass bottle.) A portion of the radio frequency energy is extracted by a short antenna that is connected to a waveguide (a metal tube, usually of rectangular cross section). The waveguide directs the extracted RF energy to the load, which may be a cooking chamber in a microwave oven or a high-gain antenna in the case of radar.
The sizes of the cavities determine the resonant frequency, and thereby the frequency of the emitted microwaves. However, the frequency is not precisely controllable. The operating frequency varies with changes in load impedance, with changes in the supply current, and with the temperature of the tube. This is not a problem in uses such as heating, or in some forms of radar where the receiver can be synchronized with an imprecise magnetron frequency. Where precise frequencies are needed, other devices, such as the klystron are used.
The magnetron is a self-oscillating device requiring no external elements other than a power supply. A well-defined threshold anode voltage must be applied before oscillation will build up; this voltage is a function of the dimensions of the resonant cavity, and the applied magnetic field. In pulsed applications there is a delay of several cycles before the oscillator achieves full peak power, and the build-up of anode voltage must be coordinated with the build-up of oscillator output.
Where there are an even number of cavities, two concentric rings can connect alternate cavity walls to prevent inefficient modes of oscillation. This is called pi-strapping because the two straps lock the phase difference between adjacent cavities at pi radians (180°).
The modern magnetron is a fairly efficient device. In a microwave oven, for instance, a 1.1-kilowatt input will generally create about 700 watts of microwave power, an efficiency of around 65%. (The high-voltage and the properties of the cathode determine the power of a magnetron.) Large S band magnetrons can produce up to 2.5 megawatts peak power with an average power of 3.75 kW. Some large magnetrons are water cooled. The magnetron remains in widespread use in roles which require high power, but where precise control over frequency and phase is unimportant.
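The figures quoted above can be checked directly: the oven example gives 700 W out for a 1.1 kW input, and the S-band tube's average-to-peak power ratio implies a very low pulse duty cycle.

```python
# Microwave-oven efficiency from the quoted input and output powers.
oven_eff = 700.0 / 1100.0
print(f"oven efficiency ≈ {oven_eff:.1%}")      # ≈ 63.6%, i.e. "around 65%"

# Duty cycle implied by 3.75 kW average vs 2.5 MW peak for a pulsed tube.
duty_cycle = 3.75e3 / 2.5e6
print(f"S-band duty cycle ≈ {duty_cycle:.2%}")  # ≈ 0.15%
```

The tiny duty cycle is typical of pulsed radar service: the tube radiates enormous peak power only for brief pulses, keeping the average thermal load manageable.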
In a radar set, the magnetron's waveguide is connected to an antenna. The magnetron is operated with very short pulses of applied voltage, resulting in a short pulse of high-power microwave energy being radiated. As in all primary radar systems, the radiation reflected from a target is analyzed to produce a radar map on a screen.
Several characteristics of the magnetron's output make radar use of the device somewhat problematic. The first of these factors is the magnetron's inherent instability in its transmitter frequency. This instability results not only in frequency shifts from one pulse to the next, but also a frequency shift within an individual transmitted pulse. The second factor is that the energy of the transmitted pulse is spread over a relatively wide frequency spectrum, which requires the receiver to have a correspondingly wide bandwidth. This wide bandwidth allows ambient electrical noise to be accepted into the receiver, thus obscuring somewhat the weak radar echoes, thereby reducing overall receiver signal-to-noise ratio and thus performance. The third factor, depending on application, is the radiation hazard caused by the use of high-power electromagnetic radiation. In some applications, for example, a marine radar mounted on a recreational vessel, a radar with a magnetron output of 2 to 4 kilowatts is often found mounted very near an area occupied by crew or passengers. In practical use these factors have been overcome, or merely accepted, and there are today thousands of magnetron aviation and marine radar units in service. Recent advances in aviation weather-avoidance radar and in marine radar have successfully replaced the magnetron with microwave semiconductor oscillators, which have a narrower output frequency range. These allow a narrower receiver bandwidth to be used, and the higher signal-to-noise ratio in turn allows a lower transmitter power, reducing exposure to EMR.
In microwave ovens, the waveguide leads to a radio-frequency-transparent port into the cooking chamber. As the fixed dimensions of the chamber and its physical closeness to the magnetron would normally create standing wave patterns in the chamber, the pattern is randomized by a motorized fan-like "stirrer" in the waveguide (more often in commercial ovens), or by a turntable that rotates the food (most common in consumer ovens).
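The standing-wave problem follows from the wavelength involved. Household ovens typically operate near 2.45 GHz (a common ISM-band value, not stated above), so nodes and antinodes sit about a half wavelength apart, on the same scale as the food itself:

```python
C_LIGHT = 2.99792458e8  # speed of light (m/s)

f = 2.45e9                   # typical oven magnetron frequency (Hz), assumed
wavelength = C_LIGHT / f
print(f"wavelength ≈ {wavelength * 100:.1f} cm")            # ≈ 12.2 cm
print(f"hot-spot spacing ≈ {wavelength / 2 * 100:.1f} cm")  # ≈ 6.1 cm
```

Hot and cold spots a few centimeters apart are exactly what a stirrer or turntable is there to average out.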
In microwave-excited lighting systems, such as a sulfur lamp, a magnetron provides the microwave field that is passed through a waveguide to the lighting cavity containing the light-emitting substance (e.g., sulfur, metal halides, etc.). Although efficient, these lamps are much more complex than other methods of lighting and therefore not commonly used.
More modern variants use HEMTs or GaN-on-SiC power semiconductors to generate the microwaves, which are substantially less complex and can be adjusted to maximize light output using a PID system.
https://en.wikipedia.org/wiki?curid=20861
Manorialism
Manorialism or seignorialism was an organizing principle of rural economies which vested legal and economic power in a lord of the manor. If the core of feudalism is defined as a set of legal and military relationships among nobles, manorialism extended this system to the legal and economic relationships between nobles and peasants. (Manorialism is sometimes included in the definition of feudalism.) Each lord of the manor was supported economically from his own direct landholding in a manor (sometimes called a fief), and from the obligatory contributions of a legally subject part of the peasant population under his jurisdiction and that of his manorial court. These obligations could be payable in several ways, in labor (the French term "corvée" is conventionally applied), in kind or in coin.
Manorialism originated in the Roman villa system of the Late Roman Empire, and was widely practiced in medieval western and parts of central Europe as well as China. An essential element of feudal society, manorialism was slowly replaced by the advent of a money-based market economy and new forms of agrarian contract.
In examining the origins of the monastic cloister, Walter Horn found that "as a manorial entity the Carolingian monastery ... differed little from the fabric of a feudal estate, save that the corporate community of men for whose sustenance this organization was maintained consisted of monks who served God in chant and spent much of their time in reading and writing."
Manorialism died slowly and piecemeal, along with its most vivid feature in the landscape, the open field system. It outlasted serfdom in the sense that it continued with freehold laborers. As an economic system, it outlasted feudalism, according to Andrew Jones, because "it could maintain a warrior, but it could equally well maintain a capitalist landlord. It could be self-sufficient, yield produce for the market, or it could yield a money rent." The last feudal dues in France were abolished at the French Revolution. In parts of eastern Germany, the "Rittergut" manors of Junkers remained until World War II. In Quebec, the last feudal rents were paid in 1970 under the modified provisions of the "Seigniorial Dues Abolition Act" of 1935.
The term is most often used with reference to medieval Western Europe. Antecedents of the system can be traced to the rural economy of the later Roman Empire (Dominate). With a declining birthrate and population, labor was the key factor of production. Successive administrations tried to stabilise the imperial economy by freezing the social structure into place: sons were to succeed their fathers in their trade, councilors were forbidden to resign, and "coloni", the cultivators of land, were not to move from the land they were attached to. The workers of the land were on their way to becoming serfs.
Several factors conspired to merge the status of former slaves and former free farmers into a dependent class of such "coloni": it was possible to be described as "servus et colonus", "both slave and "colonus"". Laws of Constantine I around 325 both reinforced the semi-servile status of the "coloni" and limited their rights to sue in the courts; the "Codex Theodosianus" promulgated under Theodosius II extended these restrictions. The legal status of "adscripti", "bound to the soil", contrasted with barbarian "foederati", who were permitted to settle within the imperial boundaries, remaining subject to their own traditional law.
As the Germanic kingdoms succeeded Roman authority in the West in the fifth century, Roman landlords were often simply replaced by Germanic ones, with little change to the underlying situation or displacement of populations.
The process of rural self-sufficiency was given an abrupt boost in the eighth century, when normal trade in the Mediterranean Sea was disrupted. The thesis put forward by Henri Pirenne supposes that the Arab conquests forced the medieval economy into even greater ruralization and gave rise to the classic feudal pattern of varying degrees of servile peasantry underpinning a hierarchy of localised power centers.
The word derives from traditional inherited divisions of the countryside, reassigned as local jurisdictions known as manors or seigneuries; each manor being subject to a lord (French "seigneur"), usually holding his position in return for undertakings offered to a higher lord (see Feudalism). The lord held a manorial court, governed by public law and local custom. Not all territorial seigneurs were secular; bishops and abbots also held lands that entailed similar obligations.
By extension, the word "manor" is sometimes used in England to mean any home area or territory in which authority is held, often in a police or criminal context.
In the generic plan of a medieval manor from "Shepherd's Historical Atlas", the strips of individually worked land in the open field system are immediately apparent. In this plan, the manor house is set slightly apart from the village, but equally often the village grew up around the forecourt of the manor, formerly walled, while the manor lands stretched away outside, as still may be seen at Petworth House. As concerns for privacy increased in the 18th century, manor houses were often located a farther distance from the village. For example, when a grand new house was required by the new owner of Harlaxton Manor, Lincolnshire, in the 1830s, the site of the existing manor house at the edge of its village was abandoned for a new one, isolated in its park, with the village out of view.
In an agrarian society, the conditions of land tenure underlie all social or economic factors. There were two legal systems of pre-manorial landholding. One, the most common, was the system of holding land "allodially" in full outright ownership. The other was a use of "precaria" or benefices, in which land was held conditionally (the root of the English word "precarious").
To these two systems, the Carolingian monarchs added a third, the "aprisio", which linked manorialism with feudalism. The "aprisio" made its first appearance in Charlemagne's province of Septimania in the south of France, when Charlemagne had to settle the Visigothic refugees who had fled with his retreating forces after the failure of his Zaragoza expedition of 778. He solved this problem by allotting "desert" tracts of uncultivated land belonging to the royal "fisc" under the direct control of the emperor. These "aprisio" holdings entailed specific conditions; the earliest specific "aprisio" grant that has been identified was at Fontjoncouse, near Narbonne. In former Roman settlements, a system of villas, dating from Late Antiquity, was inherited by the medieval world.
Manors each consisted of up to three classes of land: the lord's demesne, dependent (villein) holdings, and free peasant land.
Additional sources of income for the lord included charges for use of his mill, bakery or wine-press, or for the right to hunt or to let pigs feed in his woodland, as well as court revenues and single payments on each change of tenant. On the other side of the account, manorial administration involved significant expenses, perhaps a reason why smaller manors tended to rely less on villein tenure.
Dependent holdings were held nominally by arrangement of lord and tenant, but tenure became in practice almost universally hereditary, with a payment made to the lord on each succession of another member of the family. Villein land could not be abandoned, at least until demographic and economic circumstances made flight a viable proposition; nor could it be passed to a third party without the lord's permission, and the customary payment.
Although not free, villeins were by no means in the same position as slaves: they enjoyed legal rights, subject to local custom, and had recourse to the law subject to court charges, which were an additional source of manorial income. Sub-letting of villein holdings was common, and labour on the demesne might be commuted into an additional money payment, as happened increasingly from the 13th century.
This description of a manor house at Chingford, Essex in England was recorded in a document for the Chapter of St Paul's Cathedral when it was granted to Robert Le Moyne in 1265:
Like feudalism which, together with manorialism, formed the legal and organizational framework of feudal society, manorial structures were not uniform or coordinated. In the later Middle Ages, areas of incomplete or non-existent manorialization persisted while the manorial economy underwent substantial development with changing economic conditions.
Not all manors contained all three classes of land. Typically, demesne accounted for roughly a third of the arable area, and villein holdings rather more; but some manors consisted solely of demesne, others solely of peasant holdings. The proportion of unfree and free tenures could likewise vary greatly, with more or less reliance on wage labour for agricultural work on the demesne.
The proportion of the cultivated area in demesne tended to be greater in smaller manors, while the share of villein land was greater in large manors, providing the lord of the latter with a larger supply of obligatory labour for demesne work. The proportion of free tenements was generally less variable, but tended to be somewhat greater on the smaller manors.
Manors varied similarly in their geographical arrangement: most did not coincide with a single village, but rather consisted of parts of two or more villages, most of which also contained parts of at least one other manor. This situation sometimes led to replacement by cash payments, or their equivalents in kind, of the demesne labour obligations of those peasants living furthest from the lord's estate.
As with peasant plots, the demesne was not a single territorial unit, but consisted rather of a central house with neighbouring land and estate buildings, plus strips dispersed through the manor alongside free and villein ones: in addition, the lord might lease free tenements belonging to neighbouring manors, as well as holding other manors some distance away to provide a greater range of produce.
Nor were manors held necessarily by lay lords rendering military service (or again, cash in lieu) to their superior: a substantial share (estimated by value at 17% in England in 1086) belonged directly to the king, and a greater proportion (rather more than a quarter) were held by bishoprics and monasteries. Ecclesiastical manors tended to be larger, with a significantly greater villein area than neighbouring lay manors.
The effect of circumstances on manorial economy is complex and at times contradictory: upland conditions tended to preserve peasant freedoms (livestock husbandry in particular being less labour-intensive and therefore less demanding of villein services); on the other hand, some upland areas of Europe showed some of the most oppressive manorial conditions, while lowland eastern England is credited with an exceptionally large free peasantry, in part a legacy of Scandinavian settlement.
Similarly, the spread of money economy stimulated the replacement of labour services by money payments, but the growth of the money supply and resulting inflation after 1170 initially led nobles to take back leased estates and to re-impose labour dues as the value of fixed cash payments declined in real terms.
Margaret Mitchell
Margaret Munnerlyn Mitchell (November 8, 1900 – August 16, 1949) was an American novelist and journalist. Mitchell wrote only one novel published during her lifetime, the American Civil War-era novel "Gone with the Wind", for which she won the National Book Award for Most Distinguished Novel of 1936 and the Pulitzer Prize for Fiction in 1937. Long after her death, a collection of Mitchell's girlhood writings and a novella she wrote as a teenager, titled "Lost Laysen", were published. A collection of newspaper articles written by Mitchell for "The Atlanta Journal" was also republished in book form.
Margaret Mitchell was a Southerner and a native and lifelong resident of Georgia. She was born in 1900 into a wealthy and politically prominent family. Her father, Eugene Muse Mitchell, was an attorney, and her mother, Mary Isabel "May Belle" (or "Maybelle") Stephens, was a suffragist. She had two brothers, Russell Stephens Mitchell, who died in infancy in 1894, and Alexander Stephens Mitchell, born in 1896.
Mitchell's family on her father's side were descendants of Thomas Mitchell, originally of Aberdeenshire, Scotland, who settled in Wilkes County, Georgia in 1777, and served in the American Revolutionary War. Her grandfather, Russell Crawford Mitchell, of Atlanta, enlisted in the Confederate States Army on June 24, 1861, and served in Hood's Texas Brigade. He was severely wounded at the Battle of Sharpsburg, demoted for "inefficiency," and detailed as a nurse in Atlanta. After the Civil War, he made a large fortune supplying lumber for the rapid rebuilding of Atlanta. Russell Mitchell had thirteen children from two wives; the eldest was Eugene, who graduated from the University of Georgia Law School.
Mitchell's maternal great-grandfather, Philip Fitzgerald, emigrated from Ireland and eventually settled on a slaveholding plantation near Jonesboro, Georgia, where he had one son and seven daughters with his wife, Elenor. Mitchell's grandparents, married in 1863, were Annie Fitzgerald and John Stephens; he had also emigrated from Ireland and became a Captain in the Confederate States Army. John Stephens was a prosperous real estate developer after the Civil War and one of the founders of the Gate City Street Railroad (1881), a mule-drawn Atlanta trolley system. John and Annie Stephens had twelve children together; the seventh child was May Belle Stephens, who married Eugene Mitchell. May Belle Stephens had studied at the Bellevue Convent in Quebec and completed her education at the Atlanta Female Institute.
The "Atlanta Constitution" reported that May Belle Stephens and Eugene Mitchell were married at the Jackson Street mansion of the bride's parents on November 8, 1892:
the maid of honor, Miss Annie Stephens, was as pretty as a French pastel, in a directoire costume of yellow satin with a long coat of green velvet sleeves, and a vest of gold brocade...The bride was a fair vision of youthful loveliness in her robe of exquisite ivory white and satin...her slippers were white satin wrought with pearls...an elegant supper was served. The dining room was decked in white and green, illuminated with numberless candles in silver candelabras...The bride's gift from her father was an elegant house and lot...At 11 o'clock Mrs. Mitchell donned a pretty going-away gown of green English cloth with its jaunty velvet hat to match and bid goodbye to her friends.
Margaret Mitchell spent her early childhood on Jackson Hill, east of downtown Atlanta. Her family lived near her maternal grandmother, Annie Stephens, in a Victorian house painted bright red with yellow trim. Mrs. Stephens had been a widow for several years prior to Margaret's birth; Captain John Stephens died in 1896. After his death, she inherited property on Jackson Street where Margaret's family lived.
Grandmother Annie Stephens was quite a character, both vulgar and tyrannical. After her father, Philip Fitzgerald, died, she gained control of his money and splurged on her younger daughters, including Margaret's mother, sending them to finishing school in the North. There they learned that Irish Americans were not treated as the equals of other immigrants, and that it was considered shameful to be the daughter of an Irishman. Margaret's relationship with her grandmother became quarrelsome in later years as she entered adulthood. For Margaret, however, her grandmother was a great source of "eye-witness information" about the Civil War and Reconstruction in Atlanta prior to her death in 1934.
When Mitchell was about three years old, her dress caught fire on an iron grate. She was unharmed, but the accident was traumatic for her mother, who, fearing it would happen again, began dressing her in boys' pants. She was nicknamed "Jimmy", after a character in the comic strip "Little Jimmy". Her brother insisted she would have to be a boy named Jimmy to play with him, and having no sisters to play with, Mitchell said she was a boy named Jimmy until she was fourteen.
Stephens Mitchell said his sister was a tomboy who would happily play with dolls occasionally, and she liked to ride her Texas plains pony. As a little girl, Mitchell went riding every afternoon with a Confederate veteran and a young lady of "beau-age". She was raised in an era when children were "seen and not heard" and was not allowed to express her personality by running and screaming on Sunday afternoons while her family was visiting relatives.
Mitchell learned the gritty details of specific battles from these visits with aging Confederate soldiers. But she did not learn that the South had actually lost the war until she was ten years old: "I heard everything in the world except that the Confederates lost the war. When I was ten years old, it was a violent shock to learn that General Lee had been defeated. I didn't believe it when I first heard it and I was indignant. I still find it hard to believe, so strong are childhood impressions."
Her mother would swat her with a hairbrush or a slipper as a form of discipline.
The evening May Belle Mitchell took her daughter to a women's suffrage rally led by Carrie Chapman Catt, she was "hissing blood-curdling threats" to make her behave. Her daughter sat on a platform wearing a Votes-for-Women banner, blowing kisses to the gentlemen, while her mother gave an impassioned speech. Mitchell was nineteen years old when the Nineteenth Amendment, which gave women the right to vote, was ratified.
May Belle Mitchell was president of the Atlanta Woman's Suffrage League (1915), chairwoman of press publicity for the Georgia Mothers' Congress and Parent Teacher Association, a member of the Pioneer Society, the Atlanta Woman's Club, and several church and literary societies.
Mitchell's father was not in favor of corporal punishment in school. During his tenure as president of the educational board (1911–1912), corporal punishment in the public schools was abolished. Reportedly, Eugene Mitchell received a whipping on the first day he attended school and the mental impression of the thrashing lasted far longer than the physical marks.
Jackson Hill was an old, affluent part of the city. At the bottom of Jackson Hill was an area of African American homes and businesses called "Darktown". The mayhem of the Atlanta Race Riot occurred over four days in September 1906 when Mitchell was five years old. Local newspapers alleged that several white women had been assaulted by black men, prompting an angry mob of 10,000 to assemble in the streets.
Eugene Mitchell went to bed early the night the rioting began, but was awakened by the sounds of gunshots. The following morning, as he later wrote to his wife, he learned "16 negroes had been killed and a multitude had been injured" and that rioters "killed or tried to kill every Negro they saw." As the rioting continued, rumors ran wild that Negroes would burn Jackson Hill. At his daughter's suggestion, Eugene Mitchell, who did not own a gun, stood guard with a sword.
Though she and her family were unharmed, Mitchell recalled twenty years later the terror she felt during the riot. Mitchell grew up in a Southern culture where the threat of black on white rape incited mob violence, and in this world, white Georgians lived in fear of the "black beast rapist".
Soon after the riot, the Mitchell family decided to move away from Jackson Hill. In 1912, they moved to the east side of Peachtree Street just north of Seventeenth Street in Atlanta. Past the nearest neighbor's house was forest and beyond it the Chattahoochee River. Mitchell's former Jackson Hill home was destroyed in the Great Atlanta Fire of 1917.
While "the South" exists as a geographical region of the United States, it is also said to exist as "a place of the imagination" of writers. An image of "the South" was fixed in Mitchell's imagination when at six years old her mother took her on a buggy tour through ruined plantations and "Sherman's sentinels", the brick and stone chimneys that remained after William Tecumseh Sherman's "March and torch" through Georgia. Mitchell would later recall what her mother had said to her:
She talked about the world those people had lived in, such a secure world, and how it had exploded beneath them. And she told me that my world was going to explode under me, someday, and God help me if I didn't have some weapon to meet the new world.
From an imagination cultivated in her youth, Margaret Mitchell's defensive weapon would become her writing.
Mitchell said she heard Civil War stories from her relatives when she was growing up:
On Sunday afternoons when we went calling on the older generation of relatives, those who had been active in the Sixties, I sat on the bony knees of veterans and the fat slippery laps of great aunts and heard them talk.
On summer vacations, she visited her maternal great-aunts, Mary Ellen ("Mamie") Fitzgerald and Sarah ("Sis") Fitzgerald, who still lived at her great-grandparents' plantation home in Jonesboro. Mamie had been twenty-one years old and Sis was thirteen when the Civil War began.
An avid reader, young Margaret read "boys' stories" by G.A. Henty, the Tom Swift series, and the Rover Boys series by Edward Stratemeyer. Her mother read Mary Johnston's novels to her before she could read, and they both wept reading Johnston's "The Long Roll" (1911) and "Cease Firing" (1912). Between the "scream of shells, the mighty onrush of charges, the grim and grisly aftermath of war", "Cease Firing" is a romance novel involving the courtship of a Confederate soldier and a Louisiana plantation belle, with Civil War illustrations by N. C. Wyeth. She also read the plays of William Shakespeare and novels by Charles Dickens and Sir Walter Scott. Mitchell's two favorite children's books were by author Edith Nesbit: "Five Children and It" (1902) and "The Phoenix and the Carpet" (1904); she kept both on her bookshelf even as an adult and gave them as gifts. Another author whom Mitchell read as a teenager, and who had a major impact on her understanding of the Civil War and Reconstruction, was Thomas Dixon. Dixon's popular trilogy of novels, "The Leopard's Spots: A Romance of the White Man's Burden" (1902), "The Clansman" (1905) and "The Traitor: A Story of the Rise and Fall of the Invisible Empire" (1907), all depicted in vivid terms a white South victimized during Reconstruction by Northern carpetbaggers and freed slaves, with an especial emphasis upon Reconstruction as a nightmarish time when black men ran amok, raping white women with impunity. As a teenager, Mitchell liked Dixon's books so much that she organized the local children to put on dramatizations of them. The picture the white supremacist Dixon drew of Reconstruction is now rejected as inaccurate, but such was the memory of the past at the time that it was widely believed by white Americans. In a letter to Dixon dated August 10, 1936, Mitchell wrote: "I was practically raised on your books, and love them very much."
An imaginative and precocious writer, Margaret Mitchell began with stories about animals, then progressed to fairy tales and adventure stories. She fashioned book covers for her stories, bound the tablet paper pages together and added her own artwork. At age eleven she gave a name to her publishing enterprise: "Urchin Publishing Co." Later her stories were written in notebooks. May Belle Mitchell kept her daughter's stories in white enamel bread boxes, and several boxes of her stories were stored in the house by the time Margaret went off to college.
"Margaret" is a character riding a galloping pony in "The Little Pioneers", and plays "Cowboys and Indians" in "When We Were Shipwrecked".
Romantic love and honor emerged as themes of abiding interest for Mitchell in "The Knight and the Lady" (ca. 1909), in which a "good knight" and a "bad knight" duel for the hand of the lady. In "The Arrow Brave and the Deer Maiden" (ca. 1913), a half-white Indian brave, Jack, must withstand the pain inflicted upon him to uphold his honor and win the girl. The same themes were treated with increasing artistry in "Lost Laysen", the novella Mitchell wrote as a teenager in 1916, and, with much greater sophistication, in Mitchell's last known novel, "Gone with the Wind", which she began in 1926.
In her pre-teens, Mitchell also wrote stories set in foreign locations, such as "The Greaser" (1913), a cowboy story set in Mexico. In 1913 she wrote two stories with Civil War settings; one includes her notation that "237 pages are in this book".
While the Great War carried on in Europe (1914–1918), Margaret Mitchell attended Atlanta's Washington Seminary (now The Westminster Schools), a "fashionable" private girls' school with an enrollment of over 300 students. She was very active in the Drama Club. Mitchell played the male characters: Nick Bottom in Shakespeare's "A Midsummer Night's Dream" and Launcelot Gobbo in Shakespeare's "The Merchant of Venice", among others. She wrote a play about snobbish college girls that she acted in as well. She also joined the Literary Club and had two stories published in the yearbook: "Little Sister" and "Sergeant Terry". Ten-year-old "Peggy" is the heroine in "Little Sister". She hears her older sister being raped and shoots the rapist:
Coldly, dispassionately she viewed him, the chill steel of the gun giving her confidence. She must not miss now—she would not miss—and she did not.
Mitchell received encouragement from her English teacher, Mrs. Paisley, who recognized her writing talent. A demanding teacher, Paisley told her she had ability if she worked hard and would not be careless in constructing sentences. A sentence, she said, must be "complete, concise and coherent".
Mitchell read the books of Thomas Dixon, Jr., and in 1916, when the silent film, "The Birth of a Nation", was showing in Atlanta, she dramatized Dixon's "The Traitor: A Story of the Fall of the Invisible Empire" (1907). As both playwright and actress, she took the role of Steve Hoyle. For the production, she made a Ku Klux Klan costume from a white crepe dress and wore a boy's wig. (Note: Dixon rewrote "The Traitor" as "The Black Hood" (1924) and Steve Hoyle was renamed George Wilkes.)
During her years at Washington Seminary, Mitchell's brother, Stephens, was away studying at Harvard College (1915–1917), and he left in May 1917 to enlist in the army, about a month after the U.S. declared war on Germany. He set sail for France in April 1918, participated in engagements in the Lagny and Marbache sectors, then returned to Georgia in October as a training instructor. While Margaret and her mother were in New York in September 1918 preparing for Margaret to attend college, Stephens wired his father that he was safe after his ship had been torpedoed en route to New York from France.
Stephens Mitchell thought college was the "ruination of girls". However, May Belle Mitchell placed a high value on education for women and she wanted her daughter's future accomplishments to come from using her mind. She saw education as Margaret's weapon and "the key to survival". The classical college education she desired for her daughter was one that was on par with men's colleges, and this type of education was available only at northern schools. Her mother chose Smith College in Northampton, Massachusetts for Margaret because she considered it to be the best women's college in the United States.
Upon graduating from Washington Seminary in June 1918, Mitchell fell in love with a Harvard graduate, a young army lieutenant, Clifford West Henry, who was chief bayonet instructor at Camp Gordon from May 10 until the time he set sail for France on July 17. Henry was "slightly effeminate", "ineffectual", and "rather effete-looking" with "homosexual tendencies", according to biographer Anne Edwards. Before departing for France, he gave Mitchell an engagement ring.
On September 14, while she was enrolled at Smith College, Henry was mortally wounded in action in France and died on October 17. As Henry waited in the Verdun trenches, shortly before being wounded, he composed a poem on a leaf torn from his field notebook, found later among his effects. The last stanza of Lieutenant Clifford W. Henry's poem follows:
Henry repeatedly advanced in front of the platoon he commanded, drawing machine-gun fire so that the German nests could be located and wiped out by his men. Although he was wounded in the leg in this effort, his death was the result of shrapnel wounds from an air bomb dropped by a German plane. He was awarded the French "Croix de guerre avec palme" for his acts of heroism, and from the President of the United States, Commander in Chief of the United States Armed Forces, he was presented with the Distinguished Service Cross and an Oak Leaf Cluster in lieu of a second Distinguished Service Cross.
Clifford Henry was the great love of Margaret Mitchell's life, according to her brother. In a letter to a friend (A. Edee, March 26, 1920), Mitchell wrote of Clifford that she had a "memory of a love that had in it no trace of physical passion".
Mitchell had vague aspirations of a career in psychiatry, but her future was derailed by an event that killed over fifty million people worldwide, the 1918 flu pandemic. On January 25, 1919, her mother, May Belle Mitchell, succumbed to pneumonia from the "Spanish flu". Mitchell arrived home from college a day after her mother had died. Knowing her death was imminent, May Belle Mitchell wrote her daughter a brief letter and advised her:
Give of yourself with both hands and overflowing heart, but give only the excess after you have lived your own life.
An average student at Smith College, Mitchell did not excel in any area of academics. She held a low estimation of her writing abilities. Even though her English professor had praised her work, she felt the praise was undue. After finishing her freshman year at Smith, Mitchell returned to Atlanta to take over the household for her father and never returned to college. In October 1919, while regaining her strength after an appendectomy, she confided to a friend that giving up college and her dreams of a "journalistic career" to keep house and take her mother's place in society meant "giving up all the worthwhile things that counted for—nothing!"
Margaret began using the name "Peggy" at Washington Seminary, and the abbreviated form "Peg" at Smith College when she found an icon for herself in the mythological winged horse, "Pegasus", that inspires poets. Peggy made her Atlanta society debut in the 1920 winter season. In the "gin and jazz style" of the times, she did her "flapping" in the 1920s. At a 1921 Atlanta debutante charity ball, she performed an Apache dance. The dance included a kiss with her male partner that shocked Atlanta "high society". The Apache and the Tango were scandalous dances for their elements of eroticism, the latter popularized in a 1921 silent film, "The Four Horsemen of the Apocalypse", that made its lead actor, Rudolph Valentino, a sex symbol for his ability to Tango.
Mitchell was, in her own words, an "unscrupulous flirt". She found herself engaged to five men, but maintained that she neither lied to nor misled any of them. A local gossip columnist, who wrote under the name Polly Peachtree, described Mitchell's love life in a 1922 column:
...she has in her brief life, perhaps, had more men really, truly 'dead in love' with her, more honest-to-goodness suitors than almost any other girl in Atlanta.
In April 1922, Mitchell was seeing two men almost daily: one was Berrien ("Red") Kinnard Upshaw (March 10, 1901 – January 13, 1949), whom she is thought to have met in 1917 at a dance hosted by the parents of one of her friends, and the other, Upshaw's roommate and friend, John Robert Marsh (October 6, 1895 – March 5, 1952), a copy editor from Kentucky who worked for the Associated Press. Upshaw was an Atlanta boy, a few months younger than Mitchell, whose family moved to Raleigh, North Carolina in 1916. In 1919 he was appointed to the United States Naval Academy, but resigned for academic deficiencies on January 5, 1920. He was readmitted in May, then 19 years old, and spent two months at sea before resigning a second time on September 1, 1920. Unsuccessful in his educational pursuits and with no job, in 1922 Upshaw earned money bootlegging alcohol out of the Georgia mountains.
Although her family disapproved, Peggy and Red married on September 2, 1922; the best man at their wedding was John Marsh, who would become her second husband. The couple resided at the Mitchell home with her father. By December the marriage to Upshaw had dissolved and he left. Mitchell suffered physical and emotional abuse, the result of Upshaw's alcoholism and violent temper. Upshaw agreed to an uncontested divorce after John Marsh gave him a loan and Mitchell agreed not to press assault charges against him. Upshaw and Mitchell were divorced on October 16, 1924.
On July 4, 1925, 24-year-old Margaret Mitchell and 29-year-old John Marsh were married in the Unitarian-Universalist Church. The Marshes made their home at the Crescent Apartments in Atlanta, taking occupancy of Apt. 1, which they affectionately named "The Dump" (now the Margaret Mitchell House and Museum).
While still legally married to Upshaw and needing income for herself, Mitchell got a job writing feature articles for "The Atlanta Journal Sunday Magazine". She received almost no encouragement from her family or "society" to pursue a career in journalism, and had no prior newspaper experience. Medora Field Perkerson, who hired Mitchell, said:
There had been some skepticism on the Atlanta Journal Magazine staff when Peggy came to work as a reporter. Debutantes slept late in those days and didn't go in for jobs.
Her first story, "Atlanta Girl Sees Italian Revolution", by Margaret Mitchell Upshaw, appeared on December 31, 1922. She wrote on a wide range of topics, from fashions to Confederate generals and King Tut. In an article that appeared on July 1, 1923, "Valentino Declares He Isn't a Sheik", she interviewed celebrity actor Rudolph Valentino, referring to him as "Sheik" from his film role. Less thrilled by his looks than his "chief charm", his "low, husky voice with a soft, sibilant accent", she described his face as "swarthy":
His face was swarthy, so brown that his white teeth flashed in startling contrast to his skin; his eyes—tired, bored, but courteous.
Mitchell was quite thrilled when Valentino took her in his arms and carried her inside from the rooftop of the Georgian Terrace Hotel.
Many of her stories were vividly descriptive. In an article titled, "Bridesmaid of Eighty-Seven Recalls Mittie Roosevelt's Wedding", she wrote of a white-columned mansion in which lived the last surviving bridesmaid at Theodore Roosevelt's mother's wedding:
The tall white columns glimpsed through the dark green of cedar foliage, the wide veranda encircling the house, the stately silence engendered by the century-old oaks evoke memories of Thomas Nelson Page's "On Virginia". The atmosphere of dignity, ease, and courtesy that was the soul of the Old South breathes from this old mansion...
In another article, "Georgia's Empress and Women Soldiers", she wrote short sketches of four notable Georgia women. One was the first woman to serve in the United States Senate, Rebecca Latimer Felton, a suffragist who held white supremacist views. The other women were: Nancy Hart, Lucy Mathilda Kenny (also known as Private Bill Thompson of the Confederate States Army) and Mary Musgrove. The article generated mail and controversy from her readers. Mitchell received criticism for depicting "strong women who did not fit the accepted standards of femininity."
Mitchell's journalism career, which began in 1922, came to an end less than four years later; her last article appeared on May 9, 1926. Several months after marrying John Marsh, Mitchell quit due to an ankle injury that would not heal properly and chose to become a full-time wife. During the time Mitchell worked for the "Atlanta Journal", she wrote 129 feature articles, 85 news stories, and several book reviews.
Mitchell began collecting erotica from book shops in New York City while in her twenties. The newlywed Marshes and their social group were interested in "all forms of sexual expression". Mitchell discussed her interest in "dirty" book shops and sexually explicit prose in letters to a friend, Harvey Smith, who noted that her favorite reads were "Fanny Hill" and "The Perfumed Garden".
Mitchell developed an appreciation for the works of Southern writer James Branch Cabell, and his 1919 classic, "Jurgen, A Comedy of Justice". She read books about sexology and took particular interest in the case studies of Havelock Ellis, a British physician who studied human sexuality. During this period in which Mitchell was reading pornography and sexology, she was also writing "Gone with the Wind".
Mitchell wrote a romance novella, "Lost Laysen", when she was fifteen years old (1916). She gave "Lost Laysen", which she had written in two notebooks, to a boyfriend, Henry Love Angel. He died in 1945 and the novella remained undiscovered among some letters she had written to him until 1994. The novella was published in 1996, eighty years after it was written, and became a "New York Times" Best Seller.
In "Lost Laysen", Mitchell explores the dynamics of three male characters and their relationship to the only female character, Courtenay Ross, a strong-willed American missionary to the South Pacific island of Laysen. The narrator of the tale is Billy Duncan, "a rough, hardened soldier of fortune", who is frequently involved in fights that leave him near death. Courtenay quickly observes Duncan's hard-muscled body as he works shirtless aboard a ship called "Caliban". Courtenay's suitor is Douglas Steele, an athletic man who apparently believes Courtenay is helpless without him. He follows Courtenay to Laysen to protect her from perceived foreign savages. The third male character is the rich, powerful yet villainous Juan Mardo. He leers at Courtenay and makes rude comments of a sexual nature, in Japanese, no less. Mardo provokes Duncan and Steele, and each feels he must defend Courtenay's honor. Ultimately Courtenay defends her own honor rather than submit to shame.
Mitchell's half-breed antagonist, Juan Mardo, lurks in the shadows of the story and has no dialogue. The reader learns of Mardo's evil intentions through Duncan:
They were saying that Juan Mardo had his eye on you—and intended to have you—any way he could get you!
Mardo's desires are similar to those of Rhett Butler in his ardent pursuit of Scarlett O'Hara in Mitchell's epic novel, "Gone with the Wind". Rhett tells Scarlett:
I always intended having you, one way or another.
The "other way" is rape. In "Lost Laysen" the male seducer is replaced with the male rapist.
In Mitchell's teenage years, she is known to have written a 400-page novel about girls in a boarding school, "The Big Four". The novel is thought to be lost; Mitchell destroyed some of her manuscripts herself and others were destroyed after her death.
In the 1920s Mitchell completed a novelette, "Ropa Carmagin", about a Southern white girl who loves a biracial man. Mitchell submitted the manuscript to Macmillan Publishers in 1935 along with her manuscript for "Gone with the Wind". The novelette was rejected; Macmillan thought the story was too short for book form.
In May 1926, after Mitchell had left her job at the "Atlanta Journal" and was recovering at home from her ankle injury, she wrote a society column for the "Sunday Magazine", "Elizabeth Bennet's Gossip", which she continued to write until August. Meanwhile, her husband was growing weary of lugging armloads of books home from the library to keep his wife's mind occupied while she hobbled around the house; he emphatically suggested that she write her own book instead:
For God's sake, Peggy, can't you write a book instead of reading thousands of them?
To aid her in her literary endeavors, John Marsh brought home a Remington Portable No. 3 typewriter (c. 1928). For the next three years Mitchell worked exclusively on writing a Civil War-era novel whose heroine was named Pansy O'Hara (prior to "Gone with the Wind"'s publication Pansy was changed to Scarlett). She used parts of the manuscript to prop up a wobbly couch.
During World War II, Margaret Mitchell was a volunteer for the American Red Cross and she raised money for the war effort by selling war bonds. She was active in Home Defense, sewed hospital gowns and put patches on trousers. Her personal attention, however, was devoted to writing letters to men in uniform—soldiers, sailors, and marines, sending them humor, encouragement, and her sympathy.
The USS "Atlanta" (CL-51), sponsored by Margaret Mitchell, was a United States Navy light cruiser used as an anti-aircraft ship in the Battle of Midway and the Battle of the Eastern Solomons. The ship was heavily damaged during night surface action on November 13, 1942, during the Naval Battle of Guadalcanal, and was subsequently scuttled on orders of her captain, having earned five battle stars and a Presidential Unit Citation as a "heroic example of invincible fighting spirit".
Mitchell sponsored a second light cruiser named after the city of Atlanta, the USS "Atlanta" (CL-104). On February 6, 1944, she christened "Atlanta" in Camden, New Jersey, and the cruiser began fighting operations in May 1945. "Atlanta" was a member of task forces protecting fast carriers, was operating off the coast of Honshū when the Japanese surrendered on August 15, 1945, and earned two battle stars. She was finally sunk during explosive testing off San Clemente Island on October 1, 1970.
Margaret Mitchell was struck by a speeding automobile as she crossed Peachtree Street at 13th Street in Atlanta with her husband, John Marsh, while on her way to see the movie "A Canterbury Tale" on the evening of August 11, 1949. She died at age 48 at Grady Hospital five days later on August 16 without fully regaining consciousness.
The driver, Hugh Gravitt, was an off-duty taxi driver who was driving his personal vehicle when he struck Mitchell. After the collision, Gravitt was arrested for drunken driving and released on a $5,450 bond; he remained free on bond until Mitchell's death.
Gravitt was originally charged with drunken driving, speeding, and driving on the wrong side of the road. He was convicted of involuntary manslaughter in November 1949 and sentenced to 18 months in jail. He served almost 11 months. Gravitt died in 1994 at the age of 73.
Margaret Mitchell was buried at Oakland Cemetery in Atlanta, Georgia. When her husband John died in 1952, he was buried next to his wife.
In 1994, Mitchell was inducted into Georgia Women of Achievement, and into the Georgia Writers Hall of Fame in 2000.
https://en.wikipedia.org/wiki?curid=20864
Metamorphosis
Metamorphosis is a biological process by which an animal physically develops after birth or hatching, involving a conspicuous and relatively abrupt change in the animal's body structure through cell growth and differentiation. Some insects, fish, amphibians, mollusks, crustaceans, cnidarians, echinoderms, and tunicates undergo metamorphosis, which is often accompanied by a change of nutrition source or behavior. Animals can be divided into species that undergo complete metamorphosis ("holometaboly"), incomplete metamorphosis ("hemimetaboly"), or no metamorphosis ("ametaboly").
Scientific usage of the term is technically precise, and it is not applied to general aspects of cell growth, including rapid growth spurts. References to "metamorphosis" in mammals are imprecise and only colloquial, but historically idealist ideas of transformation and monadology, as in Goethe's "Metamorphosis of Plants", have influenced the development of ideas of evolution.
The word "metamorphosis" derives from Greek μεταμόρφωσις, "transformation, transforming", from μετα- ("meta-"), "after", and μορφή ("morphē"), "form".
In insects, growth and metamorphosis are controlled by hormones synthesized by endocrine glands near the front of the body (anterior). Neurosecretory cells in an insect's brain secrete a hormone, the prothoracicotropic hormone (PTTH) that activates prothoracic glands, which secrete a second hormone, usually ecdysone (an ecdysteroid), that induces ecdysis.
PTTH also stimulates the corpora allata, a retrocerebral organ, to produce juvenile hormone, which prevents the development of adult characteristics during ecdysis. In holometabolous insects, molts between larval instars have a high level of juvenile hormone, the moult to the pupal stage has a low level of juvenile hormone, and the final, or imaginal, molt has no juvenile hormone present at all. Experiments on firebugs have shown how juvenile hormone can affect the number of nymph instar stages in hemimetabolous insects.
All three categories of metamorphosis can be found in the diversity of insects, including no metamorphosis ("ametaboly"), incomplete or partial metamorphosis ("hemimetaboly"), and complete metamorphosis ("holometaboly"). While ametabolous insects show very little difference between larval and adult forms (also known as "direct development"), both hemimetabolous and holometabolous insects have significant morphological and behavioral differences between larval and adult forms, the most significant being the inclusion, in holometabolous organisms, of a pupal or resting stage between the larval and adult forms.
In hemimetabolous insects, immature stages are called nymphs. Development proceeds in repeated stages of growth and ecdysis (moulting); these stages are called instars. The juvenile forms closely resemble adults, but are smaller and lack adult features such as wings and genitalia. The size and morphological differences between nymphs in different instars are small, often just differences in body proportions and the number of segments; in later instars, external wing buds form.
In holometabolous insects, immature stages are called larvae and differ markedly from adults. Insects which undergo holometabolism pass through a larval stage, then enter an inactive state called pupa (called a "chrysalis" in butterfly species), and finally emerge as adults.
The earliest insect forms showed direct development (ametabolism), and the evolution of metamorphosis in insects is thought to have fuelled their dramatic radiation. Some early ametabolous "true insects" are still present today, such as bristletails and silverfish. Hemimetabolous insects include cockroaches, grasshoppers, dragonflies, and true bugs. Phylogenetically, all insects in the Pterygota undergo a marked change in form, texture and physical appearance from immature stage to adult. These insects have either hemimetabolous development, undergoing an incomplete or partial metamorphosis, or holometabolous development, undergoing a complete metamorphosis that includes a pupal or resting stage between the larval and adult forms.
A number of hypotheses have been proposed to explain the evolution of holometaboly from hemimetaboly, mostly centering on whether or not the intermediate hemimetabolous forms are homologous to the pupal form of holometabolous insects.
More recently, scientific attention has turned to characterizing the mechanistic basis of metamorphosis in terms of its hormonal control, by characterizing spatial and temporal patterns of hormone expression relative to metamorphosis in a wide range of insects.
According to research from 2008, adult "Manduca sexta" is able to retain behavior learned as a caterpillar. Another caterpillar, the ornate moth caterpillar, is able to carry toxins that it acquires from its diet through metamorphosis and into adulthood, where the toxins still serve for protection against predators.
Many observations published in 2002, and supported in 2013 indicate that programmed cell death plays a considerable role during physiological processes of multicellular organisms, particularly during embryogenesis, and metamorphosis.
Below is the sequence of steps in the metamorphosis of the butterfly:
1 – The larva of a butterfly
2 – The larva spins silk thread and pupates to form the chrysalis
3 – The chrysalis is fully formed
4 – The adult butterfly emerging from the chrysalis
In cephalochordata, metamorphosis is iodothyronine-induced and it could be an ancestral feature of all chordates.
Some fish, both bony fish (Osteichthyes) and jawless fish (Agnatha), undergo metamorphosis. Fish metamorphosis is typically under strong control by the thyroid hormone.
Examples among the non-bony fish include the lamprey. Among the bony fish, mechanisms are varied.
The salmon is diadromous, meaning that it changes from a freshwater to a saltwater lifestyle.
Many species of flatfish begin their life bilaterally symmetrical, with an eye on either side of the body; but one eye moves to join the other side of the fish – which becomes the upper side – in the adult form.
The European eel has a number of metamorphoses, from the larval stage to the leptocephalus stage, then a quick metamorphosis to glass eel at the edge of the continental shelf (eight days for the Japanese eel), two months at the border of fresh and salt water where the glass eel undergoes a quick metamorphosis into elver, then a long stage of growth followed by a more gradual metamorphosis to the migrating phase. In the pre-adult freshwater stage, the eel also has phenotypic plasticity because fish-eating eels develop very wide mandibles, making the head look blunt. Leptocephali are common, occurring in all Elopomorpha (tarpon- and eel-like fish).
Most other bony fish undergo metamorphosis from embryo to larva (fry) and then to the juvenile stage during absorption of the yolk sac, because after that phase the individual needs to be able to feed for itself.
In typical amphibian development, eggs are laid in water and larvae are adapted to an aquatic lifestyle. Frogs, toads, and newts all hatch from the eggs as larvae with external gills; it takes some time before they develop lungs and can breathe air. Afterwards, newt larvae start a predatory lifestyle, while tadpoles mostly scrape food off surfaces with their horny tooth ridges.
Metamorphosis in amphibians is regulated by thyroxin concentration in the blood, which stimulates metamorphosis, and prolactin, which counteracts its effect. Specific events are dependent on threshold values for different tissues. Because most embryonic development is outside the parental body, development is subject to many adaptations due to specific ecological circumstances. For this reason tadpoles can have horny ridges for teeth, whiskers, and fins. They also make use of the lateral line organ. After metamorphosis, these organs become redundant and will be resorbed by controlled cell death, called apoptosis. The amount of adaptation to specific ecological circumstances is remarkable, with many discoveries still being made.
With frogs and toads, the external gills of the newly hatched tadpole are covered with a gill sac after a few days, and lungs are quickly formed. Front legs are formed under the gill sac, and hindlegs are visible a few days later. Following that there is usually a longer stage during which the tadpole lives off a vegetarian diet. Tadpoles use a relatively long, spiral‐shaped gut to digest that diet.
Rapid changes in the body can then be observed as the lifestyle of the frog changes completely. The spiral‐shaped mouth with horny tooth ridges is resorbed together with the spiral gut. The animal develops a big jaw, and its gills disappear along with its gill sac. Eyes and legs grow quickly, a tongue is formed, and all this is accompanied by associated changes in the neural networks (development of stereoscopic vision, loss of the lateral line system, etc.) All this can happen in about a day, so it is truly a metamorphosis. It is not until a few days later that the tail is reabsorbed, due to the higher thyroxin concentrations required for tail resorption.
Salamander development is highly diverse; some species go through a dramatic reorganization when transitioning from aquatic larvae to terrestrial adults, while others, such as the axolotl, display paedomorphosis and never develop into terrestrial adults. Within the genus "Ambystoma", species have evolved to be paedomorphic several times, and paedomorphosis and complete development can both occur in some species.
In newts, there is no true metamorphosis because newt larvae already feed as predators and continue doing so as adults. Newts' gills are never covered by a gill sac and will be resorbed only just before the animal leaves the water. Just as in tadpoles, their lungs are functional early, but newts use them less frequently than tadpoles. Newts often have an aquatic phase in spring and summer, and a land phase in winter. For adaptation to a water phase, prolactin is the required hormone, and for adaptation to the land phase, thyroxin. External gills do not return in subsequent aquatic phases because these are completely absorbed upon leaving the water for the first time.
Basal caecilians such as "Ichthyophis" go through a metamorphosis in which aquatic larva transition into fossorial adults, which involves a loss of the lateral line. More recently diverged caecilians (the Teresomata) do not undergo an ontogenetic niche shift of this sort and are in general fossorial throughout their lives. Thus, most caecilians do not undergo an anuran-like metamorphosis.
https://en.wikipedia.org/wiki?curid=20866
Monoamine oxidase inhibitor
Monoamine oxidase inhibitors (MAOIs) are a class of drugs that inhibit the activity of one or both monoamine oxidase enzymes: monoamine oxidase A (MAO-A) and monoamine oxidase B (MAO-B). They are best known as highly efficacious anti-depressants, as well as effective therapeutic agents for panic disorder and social phobia. They are particularly effective in treatment-resistant depression and atypical depression. They are also used in the treatment of Parkinson's disease and several other disorders.
Reversible inhibitors of monoamine oxidase A (RIMAs) are a subclass of MAOIs that selectively and reversibly inhibit the MAO-A enzyme. RIMAs are used clinically in the treatment of depression and dysthymia. Due to their reversibility, they are safer in single-drug overdose than the older, irreversible MAOIs, and weaker in increasing the monoamines important in depressive disorder. RIMAs have not gained widespread market share in the United States.
New research into MAOIs indicates that much of the concern over their supposed dangerous dietary side effects stems from misconceptions and misinformation, and that they are still underutilized despite demonstrated efficacy. New research also questions the validity of the perceived severity of dietary reactions, which has been based on outdated research. Despite this, many psychiatrists, who have little or no knowledge of and experience with monoamine oxidase inhibitors (and are thus unaware of their significant benefits), still reserve them as a last line of treatment, used only when other classes of antidepressant drugs (for example selective serotonin reuptake inhibitors and tricyclic antidepressants) have failed.
MAOIs have been found to be effective in the treatment of panic disorder with agoraphobia, social phobia, atypical depression or mixed anxiety disorder and depression, bulimia, and post-traumatic stress disorder, as well as borderline personality disorder and obsessive-compulsive disorder (OCD). MAOIs appear to be particularly effective in the management of bipolar depression, according to a recent retrospective analysis. There are reports of MAOI efficacy in OCD, trichotillomania, dysmorphophobia, and avoidant personality disorder, but these reports are from uncontrolled case reports.
MAOIs can also be used in the treatment of Parkinson's disease by targeting MAO-B in particular (therefore affecting dopaminergic neurons), as well as providing an alternative for migraine prophylaxis. Inhibition of both MAO-A and MAO-B is used in the treatment of clinical depression and anxiety.
MAOIs appear to be particularly indicated for outpatients with dysthymia complicated by panic disorder or hysteroid dysphoria.
Newer MAOIs such as selegiline (typically used in the treatment of Parkinson's disease) and the reversible MAOI moclobemide provide a safer alternative and are now sometimes used as first-line therapy.
People taking MAOIs generally need to change their diets to limit or avoid foods and beverages containing tyramine. If large amounts of tyramine are consumed, they may suffer hypertensive crisis, which can be fatal. Examples of foods and beverages with potentially high levels of tyramine include animal liver and fermented substances, such as alcoholic beverages and aged cheeses. Excessive concentrations of tyramine in blood plasma can lead to hypertensive crisis by increasing the release of norepinephrine (NE), which causes blood vessels to constrict by activating alpha-1 adrenergic receptors. Ordinarily, MAO-A would destroy the excess NE; when MAO-A is inhibited, however, NE levels get too high, leading to dangerous increases in blood pressure.
RIMAs are displaced from MAO-A in the presence of tyramine, rather than inhibiting its breakdown in the liver as general MAOIs do. Additionally, MAO-B remains free and continues to metabolize tyramine in the stomach, although this is less significant than the liver action. Thus, RIMAs are unlikely to elicit tyramine-mediated hypertensive crisis; moreover, dietary modifications are not usually necessary when taking a reversible inhibitor of MAO-A (i.e., moclobemide) or low doses of selective MAO-B inhibitors (e.g., selegiline 6 mg/24 hours transdermal patch).
The most significant risk associated with the use of MAOIs is the potential for drug interactions with over-the-counter and prescription medicines, ‘controlled’ drugs or medications, and some dietary supplements (e.g., St. John's wort, tryptophan). It is vital that a doctor supervise such combinations to avoid adverse reactions. For this reason, many users carry an MAOI-card, which lets emergency medical personnel know what drugs to avoid (e.g. adrenaline (epinephrine) dosage should be reduced by 75%, and duration is extended.)
Tryptophan supplements should not be consumed with MAOIs as the potentially fatal serotonin syndrome may result.
MAOIs should not be combined with other psychoactive substances (antidepressants, painkillers, stimulants, including prescribed, OTC and ‘controlled drugs’, etc.) except under expert care. Certain combinations can cause lethal reactions, common examples including SSRIs, tricyclics, MDMA, meperidine, tramadol, and dextromethorphan. Drugs that affect the release or reuptake of epinephrine, norepinephrine, or dopamine typically need to be administered at lower doses due to the resulting potentiated and prolonged effect. MAOIs also interact with tobacco-containing products (e.g. cigarettes) and may potentiate the effects of certain compounds in tobacco. This may be reflected in the difficulty of smoking cessation, as tobacco contains naturally occurring MAOI compounds in addition to the nicotine.
While safer than general MAOIs, RIMAs still possess significant and potentially serious drug interactions with many common drugs; in particular, they can cause serotonin syndrome or hypertensive crisis when combined with almost any antidepressant or stimulant, common migraine medications, certain herbs, or most cold medicines (including decongestants, antihistamines, and cough syrup).
Ocular alpha-2 agonists such as brimonidine and apraclonidine are glaucoma medications which reduce intraocular pressure by decreasing aqueous production. These alpha-2 agonists should not be given with oral MAOIs due to the risk of hypertensive crisis.
Antidepressants including MAOIs have some dependence-producing effects, the most notable one being a discontinuation syndrome, which may be severe especially if MAOIs are discontinued abruptly or too rapidly. The dependence-producing potential of MAOIs or antidepressants in general is not as significant as benzodiazepines, however. Discontinuation symptoms can be managed by a gradual reduction in dosage over a period of weeks, months or years to minimize or prevent withdrawal symptoms.
MAOIs, as with most antidepressant medication, may not alter the course of the disorder in a significant, permanent way, so it is possible that discontinuation can return the patient to the pre-treatment state. This consideration complicates switching between an MAOI and an SSRI, because it is necessary to clear the system completely of one drug before starting another. One physician organization recommends that the dose be tapered down over a minimum of four weeks, followed by a two-week washout period. The result is that a depressed patient will have to bear the depression without chemical help during the drug-free interval. This may be preferable to risking the effects of an interaction between the two drugs.
The MAOIs are infamous for their numerous drug interactions, including the following kinds of substances:
Such substances that can react with MAOIs include:
MAOIs act by inhibiting the activity of monoamine oxidase, thus preventing the breakdown of monoamine neurotransmitters and thereby increasing their availability. There are two isoforms of monoamine oxidase, MAO-A and MAO-B. MAO-A preferentially deaminates serotonin, melatonin, epinephrine, and norepinephrine. MAO-B preferentially deaminates phenethylamine and certain other trace amines; in contrast, MAO-A preferentially deaminates other trace amines, like tyramine, whereas dopamine is equally deaminated by both types.
The early MAOIs covalently bound to the monoamine oxidase enzymes, thus inhibiting them irreversibly; the bound enzyme could not function and thus enzyme activity was blocked until the cell made new enzymes. The enzymes turn over approximately every two weeks. A few newer MAOIs, a notable one being moclobemide, are reversible, meaning that they are able to detach from the enzyme to facilitate usual catabolism of the substrate. The level of inhibition in this way is governed by the concentrations of the substrate and the MAOI.
Harmaline found in "Peganum harmala", "Banisteriopsis caapi", and "Passiflora incarnata" is a reversible inhibitor of monoamine oxidase A (RIMA).
In addition to reversibility, MAOIs differ by their selectivity of the MAO enzyme subtype. Some MAOIs inhibit both MAO-A and MAO-B equally, other MAOIs have been developed to target one over the other.
MAO-A inhibition reduces the breakdown of primarily serotonin, norepinephrine, and dopamine; selective inhibition of MAO-A allows for tyramine to be metabolised via MAO-B. Combining agents that act on serotonin with another serotonin-enhancing agent may result in a potentially fatal interaction called serotonin syndrome. With irreversible and unselective MAO inhibitors (such as the older MAOIs), hypertensive crisis as a result of tyramine food interactions is particularly problematic. Tyramine is broken down by MAO-A and MAO-B, therefore inhibiting this action may result in its excessive build-up, so diet must be monitored for tyramine intake.
MAO-B inhibition reduces the breakdown mainly of dopamine and phenethylamine so there are no dietary restrictions associated with this. MAO-B would also metabolize tyramine, as the only differences between dopamine, phenethylamine, and tyramine are two phenylhydroxyl groups on carbons 3 and 4. The 4-OH would not be a steric hindrance to MAO-B on tyramine. Selegiline is selective for MAO-B at low doses, but non-selective at higher doses.
The history of MAOIs began with the serendipitous discovery that iproniazid was a potent MAO inhibitor. Iproniazid was originally intended for the treatment of tuberculosis; in 1952, its antidepressant properties were discovered when researchers noted that depressed patients given the drug experienced relief of their depression. Subsequent in vitro work led to the discovery that it inhibited MAO, and eventually to the monoamine theory of depression. MAOIs became widely used as antidepressants in the early 1950s. The discovery of the two isoenzymes of MAO has led to the development of selective MAOIs that may have a more favorable side-effect profile.
The older MAOIs' heyday was mostly between the years 1957 and 1970. The initial popularity of the 'classic' non-selective irreversible MAO inhibitors began to wane due to their serious interactions with sympathomimetic drugs and tyramine-containing foods that could lead to dangerous hypertensive emergencies. As a result, medical practitioners' use of these older MAOIs declined. When scientists discovered that there are two different MAO enzymes (MAO-A and MAO-B), they developed selective compounds for MAO-B (for example, selegiline, which is used for Parkinson's disease) to reduce the side-effects and serious interactions. Further improvement occurred with the development of compounds (moclobemide and toloxatone) that not only are selective but cause reversible MAO-A inhibition and a reduction in dietary and drug interactions. Moclobemide was the first reversible inhibitor of MAO-A to enter widespread clinical practice.
A transdermal patch form of the MAOI selegiline, called Emsam, was approved for use in depression by the Food and Drug Administration in the United States on 28 February 2006.
Linezolid is an antibiotic drug with weak, reversible MAO-inhibiting activity.
Methylene blue, the antidote indicated for drug-induced methemoglobinemia, among a plethora of other off-label uses, is a highly potent, reversible MAO inhibitor.
Marketed pharmaceuticals
Naturally occurring RIMAs in plants
Research compounds
https://en.wikipedia.org/wiki?curid=20869
Mycology
Mycology is the branch of biology concerned with the study of fungi, including their genetic and biochemical properties, their taxonomy and their use to humans as a source for tinder, traditional medicine, food, and entheogens, as well as their dangers, such as toxicity or infection.
A biologist specializing in mycology is called a mycologist.
Mycology branches into the field of phytopathology, the study of plant diseases, and the two disciplines remain closely related because the vast majority of plant pathogens are fungi.
Historically, mycology was a branch of botany because, although fungi are evolutionarily more closely related to animals than to plants, this was not recognized until a few decades ago. Pioneer "mycologists" included Elias Magnus Fries, Christian Hendrik Persoon, Anton de Bary, and Lewis David von Schweinitz.
Many fungi produce toxins, antibiotics, and other secondary metabolites. For example, the cosmopolitan (worldwide) genus "Fusarium" and their toxins associated with fatal outbreaks of alimentary toxic aleukia in humans were extensively studied by Abraham Joffe.
Fungi are fundamental for life on earth in their roles as symbionts, e.g. in the form of mycorrhizae, insect symbionts, and lichens. Many fungi are able to break down complex organic biomolecules such as lignin, the more durable component of wood, and pollutants such as xenobiotics, petroleum, and polycyclic aromatic hydrocarbons. By decomposing these molecules, fungi play a critical role in the global carbon cycle.
Fungi and other organisms traditionally recognized as fungi, such as oomycetes and myxomycetes (slime molds), often are economically and socially important, as some cause diseases of animals (such as histoplasmosis) as well as plants (such as Dutch elm disease and Rice blast).
Apart from pathogenic fungi, many fungal species are very important in controlling the plant diseases caused by different pathogens. For example, species of the filamentous fungal genus "Trichoderma" are considered among the most important biological control agents, providing an alternative to chemical-based products for effective crop disease management.
Field meetings to find interesting species of fungi are known as 'forays', after the first such meeting organized by the Woolhope Naturalists' Field Club in 1868 and entitled "A foray among the funguses"["sic"].
Some fungi can cause disease in humans and other animals; the study of pathogenic fungi that infect animals is referred to as medical mycology.
It is believed that humans started collecting mushrooms as food in prehistoric times. Mushrooms were first written about in the works of Euripides (480–406 B.C.). The Greek philosopher Theophrastos of Eresos (371–288 B.C.) was perhaps the first to try to systematically classify plants; mushrooms were considered to be plants missing certain organs. It was later Pliny the Elder (23–79 A.D.) who wrote about truffles in his encyclopedia "Naturalis historia". The word "mycology" comes from the Greek: μύκης ("mukēs"), meaning "fungus", and the suffix ("-logia"), meaning "study".
The Middle Ages saw little advancement in the body of knowledge about fungi. However, the invention of the printing press allowed authors to dispel superstitions and misconceptions about the fungi that had been perpetuated by the classical authors.
The start of the modern age of mycology begins with Pier Antonio Micheli's 1729 publication of "Nova plantarum genera". Published in Florence, this seminal work laid the foundations for the systematic classification of grasses, mosses and fungi. He originated the still-current genus names "Polyporus" P. Micheli and "Tuber" P. Micheli, both dated 1729 (though the descriptions were later deemed invalid under modern rules). Note that when referring to the scientific name of a genus, the author abbreviation can optionally be added afterwards.
The founding nomenclaturist Linnaeus included fungi in his "binomial" naming system of 1753, where each type of organism has a two-word name consisting of the "genus" and the "species" (whereas up to then organisms were often designated with Latin phrases containing many words). He originated the scientific names, still used today, of numerous well-known mushroom taxa, such as "Boletus" L. and "Agaricus" L. In that period fungi were considered to belong to the plant kingdom, and so they find their place in his magnum opus "Species plantarum", but he was much more interested in higher plants and, for instance, he grouped together as genus "Agaricus" all gilled mushrooms which have a stem. There are many thousands of such gilled species, which were later divided into dozens of diverse genera; in its modern usage the genus "Agaricus" only refers to mushrooms closely related to the common shop mushroom, "Agaricus bisporus" (J.E. Lange) Imbach. As an example, Linnaeus gave the name "Agaricus deliciosus" to the saffron milk-cap, but its current name is "Lactarius deliciosus" (L.) Gray. On the other hand, the field mushroom "Agaricus campestris" L. has kept the same name ever since Linnaeus's publication. The English word "agaric" is still used for any gilled mushroom, which corresponds to Linnaeus's sense of the word.
The term "mycology" and the complementary term "mycologist" were first used in 1836 by M.J. Berkeley.
For centuries, certain mushrooms have been documented as a folk medicine in China, Japan, and Russia. Although the use of mushrooms in folk medicine is centered largely on the Asian continent, people in other parts of the world like the Middle East, Poland, and Belarus have been documented using mushrooms for medicinal purposes.
Mushrooms produce large amounts of vitamin D when exposed to ultraviolet (UV) light. Penicillin, ciclosporin, griseofulvin, cephalosporin and psilocybin are examples of drugs that have been isolated from molds or other fungi.
|
https://en.wikipedia.org/wiki?curid=20874
|
Melancholia
Melancholia (from Greek μέλαινα χολή ("melaina chole"), "black bile", "blackness of the bile"; compare also Latin "lugere", "to mourn", Latin "morosus", "self-willed or fastidious", and Old English "wist", "wistful") is a mental condition characterized by extreme depression, bodily complaints, and sometimes hallucinations and delusions. Melancholia is a concept from ancient or pre-modern medicine. Melancholy was one of the four temperaments matching the four humours. In the 19th century, "melancholia" could be physical as well as mental, and melancholic conditions were classified as such by their common cause rather than by their properties. It is the predecessor of the mental health diagnosis of clinical depression and still exists as a subtype for major depression known as melancholic depression.
The name "melancholia" comes from the old medical belief of the four humours: disease or ailment being caused by an imbalance in one or more of the four basic bodily liquids, or humours. Personality types were similarly determined by the dominant humor in a particular person. According to Hippocrates and subsequent tradition, melancholia was caused by an excess of black bile, hence the name, which means "black bile", from Ancient Greek μέλας ("melas"), "dark, black", and χολή ("kholé"), "bile"; a person whose constitution tended to have a preponderance of black bile had a "melancholic" disposition. In the complex elaboration of humorist theory, it was associated with the earth from the Four Elements, the season of autumn, the spleen as the originating organ and cold and dry as related qualities. In astrology it showed the influence of Saturn, hence the related adjective "saturnine".
Melancholia was described as a distinct disease with particular mental and physical symptoms in the 5th and 4th centuries BC. Hippocrates, in his "Aphorisms", characterized all "fears and despondencies, if they last a long time" as being symptomatic of melancholia. Other symptoms mentioned by Hippocrates include poor appetite, abulia, sleeplessness, irritability, and agitation. Despite its early date, the Hippocratic clinical description of melancholia overlaps significantly with the contemporary nosography of depressive syndromes (six of the nine symptoms included in the DSM diagnostic criteria for a major depressive episode).
In the Middle Ages, the humoral, somatic paradigm for understanding sustained sadness lost primacy to the prevailing religious perspective. Sadness came to be seen as a vice (λύπη in the Greek vice list of Evagrius Ponticus, "tristitia vel acidia" in the seven-vice list of Gregorius Magnus). When a patient could not be cured of the disease, the melancholia was thought to be the result of demonic possession.
In his study of French and Burgundian courtly culture, Johan Huizinga noted that "at the close of the Middle Ages, a sombre melancholy weighs on people's souls." In chronicles, poems, sermons, even in legal documents, an immense sadness, a note of despair and a fashionable sense of suffering and deliquescence at the approaching end of times, suffuses court poets and chroniclers alike: Huizinga quotes instances in the ballads of Eustache Deschamps, "monotonous and gloomy variations of the same dismal theme", and in Georges Chastellain's prologue to his Burgundian chronicle, and in the late fifteenth-century poetry of Jean Meschinot. Ideas of reflection and the workings of imagination are blended in the term "merencolie", embodying for contemporaries "a tendency", observes Huizinga, "to identify all serious occupation of the mind with sadness".
Painters were considered by Vasari and other writers to be especially prone to melancholy by the nature of their work, sometimes with good effects for their art in increased sensitivity and use of fantasy. Among those of his contemporaries so characterised by Vasari were Pontormo and Parmigianino, but he does not use the term of Michelangelo, who used it, perhaps not very seriously, of himself. A famous allegorical engraving by Albrecht Dürer is entitled "Melencolia I". This engraving has been interpreted as portraying melancholia as the state of waiting for inspiration to strike, and not necessarily as a depressive affliction. Amongst other allegorical symbols, the picture includes a magic square and a truncated rhombohedron. The image in turn inspired a passage in "The City of Dreadful Night" by James Thomson (B.V.), and, a few years later, a sonnet by Edward Dowden.
The most extended treatment of melancholia comes from Robert Burton, whose "The Anatomy of Melancholy" (1621) treats the subject from both a literary and a medical perspective. Burton wrote in the 17th century that music and dance were critical in treating mental illness, especially melancholia.
In the 10th century, the Persian physician Al-Akhawayni described melancholia as a chronic illness and related it to the brain, which is one of the main aspects of his view of the condition. He described its initial clinical manifestations as "suffering from an unexplained fear, inability to answer questions or providing false answers, self-laughing and self-crying and speaking meaninglessly, yet with no fever".
In the Encyclopédie of Diderot and d'Alembert, the causes of melancholia are stated to be similar to those that cause Mania: "grief, pains of the spirit, passions, as well as all the love and sexual appetites that go unsatisfied."
During the later 16th and early 17th centuries, a curious cultural and literary cult of melancholia arose in England. In an influential 1964 essay in Apollo, art historian Roy Strong traced the origins of this fashionable melancholy to the thought of the popular Neoplatonist and humanist Marsilio Ficino (1433–1499), who replaced the medieval notion of melancholia with something new:
"The Anatomy of Melancholy" ("The Anatomy of Melancholy, What it is: With all the Kinds, Causes, Symptomes, Prognostickes, and Several Cures of it... Philosophically, Medicinally, Historically, Opened and Cut Up") by Burton, was first published in 1621 and remains a defining literary monument to the fashion. Another major English author who wrote extensively of his own melancholic disposition is Sir Thomas Browne, in his "Religio Medici" (1643).
"Night-Thoughts" ("The Complaint: or, Night-Thoughts on Life, Death, & Immortality"), a long poem in blank verse by Edward Young, was published in nine parts (or "nights") between 1742 and 1745, and was hugely popular in several languages. It had a considerable influence on early Romantics in England, France and Germany. William Blake was commissioned to illustrate a later edition.
In the visual arts, this fashionable intellectual melancholy occurs frequently in portraiture of the era, with sitters posed in the form of "the lover, with his crossed arms and floppy hat over his eyes, and the scholar, sitting with his head resting on his hand"—descriptions drawn from the frontispiece to the 1638 edition of Burton's "Anatomy", which shows just such by-then stock characters. These portraits were often set out of doors where Nature provides "the most suitable background for spiritual contemplation" or in a gloomy interior.
In music, the post-Elizabethan cult of melancholia is associated with John Dowland, whose motto was "Semper Dowland, semper dolens" ("Always Dowland, always mourning"). The melancholy man, known to contemporaries as a "malcontent", is epitomized by Shakespeare's Prince Hamlet, the "Melancholy Dane".
A similar phenomenon, though not under the same name, occurred during the German Sturm und Drang movement, with such works as "The Sorrows of Young Werther" by Goethe or in Romanticism with works such as "Ode on Melancholy" by John Keats or in Symbolism with works such as "Isle of the Dead" by Arnold Böcklin. In the 20th century, much of the counterculture of modernism was fueled by comparable alienation and a sense of purposelessness called "anomie"; earlier artistic preoccupation with death has gone under the rubric of memento mori. The medieval condition of acedia ("acedie" in English) and the Romantic Weltschmerz were similar concepts, most likely to affect the intellectual.
|
https://en.wikipedia.org/wiki?curid=20875
|
Mimosa
Mimosa is a genus of about 400 species of herbs and shrubs, in the mimosoid clade of the legume family Fabaceae. The generic name is derived from the Greek word μῖμος ("mimos"), an "actor" or "mime", and the feminine suffix -"osa", "resembling", alluding to its 'sensitive leaves', which seem to 'mimic conscious life'.
Two species in the genus are especially notable. One is "Mimosa pudica", which folds its leaves when touched or exposed to heat. It is native to southern Central and South America but is widely cultivated elsewhere for its curiosity value, both as a houseplant in temperate areas, and outdoors in the tropics. Outdoor cultivation has led to weedy invasion in some areas, notably Hawaii. The other is "Mimosa tenuiflora", which is best known for its use in shamanic ayahuasca brews due to the psychedelic drug dimethyltryptamine found in its root bark.
The taxonomy of the genus "Mimosa" has had a tortuous history, having gone through periods of splitting and lumping, ultimately accumulating over 3,000 names, many of which have either been synonymized under other species or transferred to other genera. In part due to these changing circumscriptions, the name "Mimosa" has also been applied to several other related species with similar pinnate or bipinnate leaves, but are now classified in other genera. The most common examples of this are "Albizia julibrissin" (Persian silk tree) and "Acacia dealbata" (wattle).
Members of this genus are among the few plants capable of rapid movement; examples outside of "Mimosa" include the telegraph plant, "Aldrovanda", some species of "Drosera" and the Venus flytrap. The leaves of the "Mimosa pudica" close quickly when touched. Some mimosas raise their leaves in the day and lower them at night, and experiments done by Jean-Jacques d'Ortous de Mairan on mimosas in 1729 provided the first evidence of biological clocks.
"Mimosa" can be distinguished from the large related genera, "Acacia" and "Albizia", since its flowers have ten or fewer stamens. Botanically, what appears to be a single globular flower is actually a cluster of many individual ones. "Mimosa" contains some level of heptanoic acid.
There are about 400 species including:
The bark, leaves and flowers have been used for centuries in traditional Chinese medicine. The bark is still known as 'Collective Happiness Bark' and is used for cleansing the body's energetic pathways (heart and liver meridians), allegedly providing a spiritual boost for those who take it.
The ancient Mayans also used it regularly to treat injuries and burns.
Despite this, modern research on the plant remains limited, though the powdered bark is commonly used by homeopaths as an anti-inflammatory, antiseptic, cough and cold remedy, and painkiller.
|
https://en.wikipedia.org/wiki?curid=20876
|
Manhattan (cocktail)
A Manhattan is a cocktail made with whiskey, sweet vermouth, and bitters. While rye is the traditional whiskey of choice, other commonly used whiskies include Canadian whisky, bourbon, blended whiskey, and Tennessee whiskey. The cocktail is usually stirred, strained into a cocktail glass, and traditionally garnished with a Luxardo cherry rather than a maraschino cherry. A Manhattan may also be served on the rocks in a lowball glass.
The whiskey-based Manhattan is one of five cocktails named for a New York City borough. It is closely related to the Brooklyn cocktail, which uses dry vermouth and Maraschino liqueur in place of the Manhattan's sweet vermouth, and Amer Picon in place of the Manhattan's angostura bitters.
The Manhattan is one of six basic drinks listed in David A. Embury's 1948 classic "The Fine Art of Mixing Drinks".
Popular history suggests that the drink originated at the Manhattan Club in New York City in the mid-1870s, where it was invented by Dr. Iain Marshall for a banquet hosted by Jennie Jerome (Lady Randolph Churchill, mother of Winston) in honor of presidential candidate Samuel J. Tilden. The success of the banquet made the drink fashionable, later prompting several people to request the drink by referring to the name of the club where it originated: "the "Manhattan" cocktail". However, Lady Randolph was in France at the time and pregnant, so the story is likely apocryphal.
However, there are prior references to various similar cocktail recipes called "Manhattan" and served in the Manhattan area. By one account it was invented in the 1860s by a bartender named Black at a bar on Broadway near Houston Street.
The original "Manhattan cocktail" was a mix of "American Whiskey, Italian Vermouth, and Angostura bitters". During Prohibition (1920–1933) Canadian whisky was primarily used because it was available.
An early record of the cocktail can be found in William Schmidt's "The Flowing Bowl", published in 1891. In it, he details a drink containing 2 dashes of gum (gomme syrup), 2 dashes of bitters, 1 dash of absinthe, a portion of whiskey, and a portion of vermouth.
The same cocktail appears listed as a "Tennessee Cocktail" in "Shake 'em Up!" by V. Elliott and P. Strong: "Two parts of whiskey, one part of Italian Vermouth, and a dash of bitters poured over ice and stirred vigorously."
On the small North Frisian island of Föhr, the Manhattan cocktail is a standard drink at almost every cafe, restaurant, and "get together" of locals. The story goes that many of the people of Föhr emigrated to Manhattan during deep sea fishing trips, took a liking to the drink, and brought it back to Föhr with them. The drink is usually mixed 1 part vermouth to 2 parts whiskey, with a dash of bitters, served ice cold, in an ice cold glass, or with ice and a cherry garnish.
Traditional views insist that a Manhattan be made with American rye whiskey. However it can also be made with bourbon or Canadian whisky. The Manhattan is subject to considerable variation and innovation, and is often a way for the best bartenders to show off their creativity. Some shake the ingredients with ice in a cocktail shaker instead of stirring it, creating a froth on the surface of the drink. Angostura is the classic bitters, but orange bitters or Peychaud's Bitters may be used. Some make their own bitters and syrups, substitute comparable digestifs in place of vermouth, specialize in local or rare whiskeys, or use other exotic ingredients. A lemon peel may be used as garnish. Some add juice from the cherry jar or Maraschino liqueur to the cocktail for additional sweetness and color.
Originally, bitters were considered an integral part of any cocktail, as the ingredient that differentiated a cocktail from a sling. Over time, those definitions of "cocktail" and "sling" have become archaic, as "sling" has fallen out of general use (other than in certain drink names), and "cocktail" can mean any drink that resembles a martini, or simply any mixed drink.
The following are other variations on the classic Manhattan:
|
https://en.wikipedia.org/wiki?curid=20879
|
Mira
Mira, designation Omicron Ceti (ο Ceti, abbreviated Omicron Cet, ο Cet), is a red giant star estimated to be 200–400 light-years from the Sun in the constellation Cetus.
ο Ceti is a binary stellar system, consisting of a variable red giant (Mira A) along with a white dwarf companion (Mira B). Mira A is a pulsating variable star and was the first non-supernova variable star discovered, with the possible exception of Algol. It is the prototype of the Mira variables.
ο Ceti (Latinised to "Omicron Ceti") is the star's Bayer designation. It was named Mira (Latin for 'wonderful' or 'astonishing') by Johannes Hevelius in his "Historiola Mirae Stellae" (1662). In 2016, the International Astronomical Union organized a Working Group on Star Names (WGSN) to catalog and standardize proper names for stars. The WGSN's first bulletin of July 2016 included a table of the first two batches of names approved by the WGSN, which included Mira for this star.
Evidence that the variability of Mira was known in ancient China, Babylon or Greece is at best only circumstantial. What is certain is that the variability of Mira was recorded by the astronomer David Fabricius beginning on August 3, 1596. Observing what he thought was the planet Mercury (later identified as Jupiter), he needed a reference star for comparing positions and picked a previously unremarked third-magnitude star nearby. By August 21, however, it had increased in brightness by one magnitude, then by October had faded from view. Fabricius assumed it was a nova, but then saw it again on February 16, 1609.
In 1638 Johannes Holwarda determined a period of the star's reappearances, eleven months; he is often credited with the discovery of Mira's variability. Johannes Hevelius was observing it at the same time and named it Mira in 1662, for it acted like no other known star. Ismail Bouillaud then estimated its period at 333 days, less than one day off the modern value of 332 days. Bouillaud's measurement may not have been erroneous: Mira is known to vary slightly in period, and may even be slowly changing over time. The star is estimated to be a six-billion-year-old red giant.
There is considerable speculation as to whether Mira had been observed prior to Fabricius. Certainly Algol's history (known for certain as a variable only in 1667, but with legends and such dating back to antiquity showing that it had been observed with suspicion for millennia) suggests that Mira might have been known too. Karl Manitius, a modern translator of Hipparchus' "Commentary on Aratus", has suggested that certain lines from that second-century text may be about Mira. The other pre-telescopic Western catalogs of Ptolemy, al-Sufi, Ulugh Beg, and Tycho Brahe turn up no mentions, even as a regular star. There are three observations from Chinese and Korean archives, in 1596, 1070, and the same year when Hipparchus would have made his observation (134 BC) that are suggestive, but the Chinese practice of pinning down observations no more precisely than within a given Chinese constellation makes it difficult to be sure.
The distance to Mira is uncertain; pre-Hipparcos estimates centered on 220 light-years, while Hipparcos data from the 2007 reduction suggest a distance of 299 light-years, with a margin of error of 11%.
This binary star system consists of a red giant (Mira, designated Mira A) undergoing mass loss and a high temperature white dwarf companion (Mira B) that is accreting mass from the primary. Such an arrangement of stars is known as a symbiotic system and this is the closest such symbiotic pair to the Sun. Examination of this system by the Chandra X-ray Observatory shows a direct mass exchange along a bridge of matter from the primary to the white dwarf. The two stars are currently separated by about 70 astronomical units.
Mira A is currently an asymptotic giant branch (AGB) star, in the thermally pulsing AGB phase. Each pulse lasts a decade or more, and an amount of time on the order of 10,000 years passes between each pulse. With every pulse cycle Mira increases in luminosity and the pulses grow stronger. This is also causing dynamic instability in Mira, resulting in dramatic changes in luminosity and size over shorter, irregular time periods.
The overall shape of Mira A has been observed to change, exhibiting pronounced departures from symmetry. These appear to be caused by bright spots on the surface that evolve their shape on time scales of 3–14 months. Observations of Mira A in the ultraviolet band by the Hubble Space Telescope have shown a plume-like feature pointing toward the companion star.
Mira A is a well-known example of a category of variable stars known as Mira variables, which are named after it. The 6,000 to 7,000 known stars of this class are all red giants whose surfaces pulsate in such a way as to increase and decrease in brightness over periods ranging from about 80 to more than 1,000 days.
In the particular case of Mira, its increases in brightness take it up to about magnitude 3.5 on average, placing it among the brighter stars in the Cetus constellation. Individual cycles vary too; well-attested maxima go as high as magnitude 2.0 in brightness and as low as 4.9, a range almost 15 times in brightness, and there are historical suggestions that the real spread may be three times this or more. Minima range much less, and have historically been between 8.6 and 10.1, a factor of four times in luminosity. The total swing in brightness from absolute maximum to absolute minimum (two events which did not occur on the same cycle) is 1,700 times. Mira emits the vast majority of its radiation in the infrared, and its variability in that band is only about two magnitudes. The shape of its light curve is of an increase over about 100 days, and the return to minimum taking twice as long.
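The brightness ratios quoted above follow from Pogson's relation, under which a difference of Δm magnitudes corresponds to a brightness factor of 10^(0.4·Δm), roughly 2.512 per magnitude. A minimal sketch of the arithmetic, using the magnitudes given in the text:

```python
# Pogson's relation: a magnitude difference dm corresponds to a
# brightness ratio of 10**(0.4 * dm), about 2.512 per magnitude.
def brightness_ratio(m_faint, m_bright):
    return 10 ** (0.4 * (m_faint - m_bright))

# Well-attested maxima, magnitude 2.0 to 4.9: "almost 15 times"
print(round(brightness_ratio(4.9, 2.0), 1))   # 14.5

# Historical minima, 8.6 to 10.1: "a factor of four times"
print(round(brightness_ratio(10.1, 8.6), 1))  # 4.0

# Absolute maximum (2.0) to absolute minimum (10.1): the "1,700 times" swing
print(round(brightness_ratio(10.1, 2.0)))     # 1738
```

Note that the ratio depends only on the magnitude difference, not on the absolute magnitudes themselves.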
Contemporary approximate maxima for Mira:
From northern temperate latitudes, Mira is generally not visible between late March and June due to its proximity to the Sun. This means that at times several years can pass without it appearing as a naked-eye object.
The pulsations of Mira variables cause the star to expand and contract, but also to change its temperature. The temperature is highest slightly after the visual maximum, and lowest slightly before minimum. The photosphere, measured at the Rosseland radius, is smallest just before visual maximum and close to the time of maximum temperature. The largest size is reached slightly before the time of lowest temperature. The bolometric luminosity is proportional to the fourth power of the temperature and the square of the radius, but the radius varies by over 20% and the temperature by less than 10%.
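This sensitivity can be made concrete with the Stefan-Boltzmann scaling, under which L is proportional to R²T⁴. A small sketch, taking the approximate variations quoted above (20% in radius, under 10% in temperature) purely as illustrative inputs:

```python
# Stefan-Boltzmann scaling: L is proportional to R**2 * T**4.
# The 20% and 10% figures are the approximate variations quoted in the
# text, used here for illustration only.
def luminosity_factor(radius_change, temp_change):
    return (1 + radius_change) ** 2 * (1 + temp_change) ** 4

print(round(luminosity_factor(0.20, 0.0), 2))  # 1.44: radius change alone, +44%
print(round(luminosity_factor(0.0, 0.10), 2))  # 1.46: temperature change alone, +46%
```

Because of the fourth-power dependence, a modest temperature change contributes about as much to the luminosity as a much larger radius change.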
In Mira, the highest luminosity occurs close to the time when the star is hottest and smallest. The visual magnitude is determined both by the luminosity and by the proportion of the radiation that occurs at visual wavelengths. Only a small proportion of the radiation is emitted at visual wavelengths and this proportion is very strongly influenced by the temperature (Planck's law). Combined with the overall luminosity changes, this creates the very large visual magnitude variation, with the maximum occurring when the temperature is high.
Infrared VLTI measurements of Mira at phases 0.13, 0.18, 0.26, 0.40, and 0.47, show that the radius varies from at phase 0.13 just after maximum to at phase 0.40 approaching minimum. The temperature at phase 0.13 is and at phase 0.26 about halfway from maximum to minimum. The luminosity is calculated to be at phase 0.13 and at phase 0.26.
The pulsations of Mira have the effect of expanding its photosphere by around 50% compared to a non-pulsating star. In the case of Mira, if it was not pulsating it is modelled to have a radius of only around .
Ultraviolet studies of Mira by NASA's Galaxy Evolution Explorer (GALEX) space telescope have revealed that it sheds a trail of material from the outer envelope, leaving a tail 13 light-years in length, formed over tens of thousands of years. It is thought that a hot bow-wave of compressed plasma/gas is the cause of the tail; the bow-wave is a result of the interaction of the stellar wind from Mira A with gas in interstellar space, through which Mira is moving at an extremely high speed of 130 kilometres/second (291,000 miles per hour). The tail consists of material stripped from the head of the bow-wave, which is also visible in ultraviolet observations. Mira's bow-shock will eventually evolve into a planetary nebula, the form of which will be considerably affected by the motion through the interstellar medium (ISM).
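The quoted tail length is consistent with the star's space velocity. A quick order-of-magnitude check, assuming a round figure of 30,000 years for the "tens of thousands of years" in the text:

```python
# Distance covered at 130 km/s over ~30,000 years, expressed in light-years.
KM_PER_LY = 9.4607e12   # kilometres in one light-year
SEC_PER_YEAR = 3.156e7  # seconds in one (Julian) year

speed_km_s = 130.0
years = 30_000          # assumed round figure, not stated in the source

tail_ly = speed_km_s * SEC_PER_YEAR * years / KM_PER_LY
print(round(tail_ly, 1))  # 13.0
```

So a 13-light-year tail is exactly what ~30,000 years of travel at 130 km/s would produce.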
The companion star was resolved by the Hubble Space Telescope in 1995, when it was 70 astronomical units from the primary; and results were announced in 1997. The HST ultraviolet images and later X-ray images by the Chandra space telescope show a spiral of gas rising off Mira in the direction of Mira B. The companion's orbital period around Mira is approximately 400 years.
In 2007, observations showed a protoplanetary disc around the companion, Mira B. This disc is being accreted from material in the solar wind from Mira and could eventually form new planets. These observations also hinted that the companion was a main sequence star of around 0.7 solar masses and spectral type K, instead of a white dwarf as originally thought. However, in 2010 further research indicated that Mira B is, in fact, a white dwarf.
|
https://en.wikipedia.org/wiki?curid=20880
|
MV Virginian (T-AK 9205)
MV "Virginian" (T-AK 9205), formerly named the MV "Strong Virginian" (T-AKR-9205), is a combination container, heavy lift, and roll-on/roll-off ship. Owned and operated by Sealift Incorporated of Oyster Bay, New York, the ship is one of seventeen container/roll-on/roll-off ships in use by the Military Sealift Command, and one of 28 ships assigned to that organization's Sealift Program Office. The ship was previously known as the MV "Saint Magnus" and the MV "Jolly Indaco".
The ship has one large cargo hold with a tween deck that can be set at three different heights. It has a single 800-ton derrick for heavy-lift use. In addition it has a single traveling gantry crane fitted with dual portal cranes, both of which are rated at 75 metric tons independently, and can be operated together for lifts up to 150 metric tons. For roll-on/roll-off (roro) cargo, the ship has two trailer elevators and roro ramps.
Built as "Saint Magnus" at Bremer Vulkan, Bremen, Germany in 1984, "Virginian" spent her first years in the commercial shipping service. Ironically, the ship that would later be known for carrying military supplies to the Middle East was accidentally hit by an Exocet missile while off-loading commercial cargo in Iraq in 1986. In these early years, the ship was also renamed "Jolly Indaco".
MSC first chartered the ship, then known as MV "Strong Virginian", in 1992. For the next five years, a 500-bed fleet hospital was prepositioned aboard the ship as she carried out a variety of missions for the Department of Defense. Some of its jobs during this time included delivering equipment and supplies to Africa as part of Operation Restore Hope, transporting a bio-safety lab from Inchon, Korea, to Jakarta, Indonesia, and ferrying harbor tugs used by the U.S. Navy from Diego Garcia to Guam and back.
On March 14, 1997, the United States Department of Defense announced a new charter for the "Strong Virginian". This contract, number N00033-97-C-3007, was a $23,592,099 time charter contract from the Military Sealift Command to operator Van Ommeren Shipping (USA), Inc., of Stamford, Connecticut. Under the contract, the "Strong Virginian" was to be used in the prepositioning of United States Army cargo in the Indian Ocean at the island of Diego Garcia. The contract included options which could have brought the cumulative value up to US$47,992,099 and was to expire by March 1999. This contract was competitively procured with 250 proposals solicited and four offers received.
Virginian was chartered again in 1998 and, for the next four years, the ship was used to support the U.S. Army. Virginian delivered combat craft, tugboats and barges and other elements of the Army's port opening packages. These packages are used to give the military access to rarely used ports in areas vital to U.S. military operations. On September 30, 2002, the ship was released from MSC service and returned to its owner.
Sealift Incorporated bought the ship from Van Ommeren Shipping USA, Inc., taking delivery on June 10, 2003. At that point, Sealift renamed the ship the "Virginian". Between November 2002 and May 2006, the "Virginian" completed 21 missions for the U.S. military, delivering cargo covering nearly 30 football fields.
On October 16, 2007 the United States Department of Defense announced that it awarded contract N00033-08-C-5500 to Sealift Incorporated. This was a $10,614,000 firm-fixed-price contract plus reimbursables for the "Virginian". The ship was contracted to carry containers laden with ammunition to support the global war on terrorism and the United States Central Command. The contract includes options, which, if exercised, would bring the cumulative value of this contract to $39,814,000. If options are exercised, work may continue through October 2011. This contract was competitively procured via Federal Business Opportunities and the Military Sealift Command websites, with more than 200 proposals solicited and three offers received. The U.S. Navy’s Military Sealift Command is the contracting authority.
The ship was sold for scrap in August 2012 in Singapore and was recycled in Bangladesh that same month.
|
https://en.wikipedia.org/wiki?curid=20881
|
Mojito
A mojito is a traditional Cuban highball. It is a cocktail that traditionally consists of five ingredients: white rum, sugar (traditionally sugar cane juice), lime juice, soda water, and mint. Its combination of sweetness, citrus, and herbaceous mint flavors is intended to complement the rum, and has made the mojito a popular summer drink.
When preparing a mojito, fresh lime juice is added to sugar (or to simple syrup) and mint leaves. The mixture is then gently mashed with a muddler. The mint leaves should only be bruised to release the essential oils and should not be shredded. Then rum is added and the mixture is briefly stirred to dissolve the sugar and to lift the mint leaves up from the bottom for better presentation. Finally, the drink is topped with crushed ice and sparkling soda water. Mint leaves and lime wedges are used to garnish the glass.
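The per-glass quantities above scale linearly when mixing for a group, with the soda water and crushed ice topped up to taste at the end. A trivial sketch; the quantities below are hypothetical illustrative values, not an official specification:

```python
# Hypothetical single-serving quantities, for illustration only.
BASE_RECIPE = {
    "white rum (ml)": 45,
    "fresh lime juice (ml)": 20,
    "sugar (tsp)": 2,
    "mint leaves": 6,
}

def scale(recipe, servings):
    """Multiply every per-serving quantity for a batch of the given size."""
    return {ingredient: qty * servings for ingredient, qty in recipe.items()}

print(scale(BASE_RECIPE, 4)["white rum (ml)"])  # 180
```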
The mojito is one of the most famous rum-based highballs. There are several versions of the mojito.
Havana, Cuba, is the birthplace of the mojito, although its exact origin is the subject of debate. One story traces the drink to a 16th-century precursor associated with Sir Francis Drake: it was known that the local South American Indians had remedies for various tropical illnesses, so a small boarding party went ashore on Cuba and came back with ingredients for an effective medicine. The ingredients were "aguardiente de caña" (translated as "burning water", a crude form of rum made from sugar cane) mixed with local tropical ingredients: lime, sugarcane juice, and mint. Lime juice on its own would have gone a long way toward preventing scurvy and dysentery, and tafia/rum was soon added as it became widely available to the British (ca. 1650). Mint, lime and sugar were also helpful in hiding the harsh taste of this spirit. While this drink was not called a mojito at this time, it was the original combination of these ingredients.
Some historians contend that African slaves who worked in the Cuban sugar cane fields during the 19th century were instrumental in the cocktail's origin. Guarapo, the sugar cane juice often used in mojitos, was a popular drink among the slaves, who are said to have named it; the drink originally did not contain lime juice.
There are several theories behind the origin of the name "mojito": one such theory holds that the name relates to mojo, a Cuban seasoning made from lime and used to flavour dishes. Another theory is that the name mojito is simply a derivative of "mojadito" (Spanish for "a little wet"), the diminutive of "mojado" ("wet").
The mojito has routinely been presented as a favorite drink of author Ernest Hemingway. It has also often been said that Hemingway made the bar called La Bodeguita del Medio famous when he became one of its regulars and wrote "My mojito in La Bodeguita, My daiquiri in El Floridita" on a wall of the bar. This epigraph, handwritten and signed in his name, persists despite doubts expressed by Hemingway biographers about such patronage and the author's taste for mojitos. La Bodeguita del Medio is better known for its food than its drink.
A survey by an international market research company found that in 2016 the mojito was the most popular cocktail in Britain and France.
Many hotels in Havana also add Angostura bitters to cut the sweetness of the mojito, while icing sugar is often muddled with the mint leaves rather than cane sugar, and many establishments simply use sugar syrup to control sweetness. Many bars in Havana today use lemon juice rather than fresh lime. The "Rose Mojito", a variation containing the rose-flavored spirit Lanique, was first created at the Albert's Schloss bar in Manchester, England. A mojito without alcohol is called a "Virgin Mojito" or "Nojito". The Cojito adds coconut flavor, often through the use of coconut-flavored rum. The South Side is made with gin instead of rum; the South Side Fizz adds seltzer water. A dirty mojito calls for gold rum instead of white rum, and for raw sugar or demerara sugar. Demerara is a light brown, partially refined sugar produced from the first crystallization when processing cane juice into sugar crystals; adding it to a mojito gives the drink a caramel-like flavor. A dark rum mojito simply calls for a dark rum to be used instead of white.
In Mexico, tequila brand Don Julio offers the "Mojito Blanco" by simply replacing rum with tequila.
In Peru, there are mojito variations that are made by adding fruits like grapefruit, called "Mojito de toronja", or with passionfruit, called "Mojito de maracuyá". Many restaurants serve them, and these added ingredients enhance the cocktail and its original flavor. Some other fruits are found in other mojito recipes: pears, raspberries, and oranges. Purees of such fruits may also be used instead of the whole fruit itself.
The strawberry mojito includes muddled strawberries; a further departure along these lines substitutes gin for the light rum and lemon juice for lime juice, and adds tonic.
|
https://en.wikipedia.org/wiki?curid=20883
|
Mohammed Zahir Shah
Mohammed Zahir Shah (15 October 1914 – 23 July 2007) was the last King of Afghanistan, reigning from 8 November 1933 until he was deposed on 17 July 1973. He expanded Afghanistan's diplomatic relations with many countries, including both sides of the Cold War. In the 1950s, Zahir Shah began modernizing the country, culminating in the creation of a new constitution and a constitutional monarchy system. His long reign was marked by a peace in the country that was lost after his overthrow.
While on vacation in Italy, Zahir Shah's regime was overthrown in a "white coup" in 1973 by his cousin and former prime minister, Mohammed Daoud Khan, who established a republic. He remained in exile near Rome until 2002, returning to Afghanistan after the end of the Taliban government. He was given the title Father of the Nation, which he held until his death in 2007.
Zahir Shah was born on 15 October 1914 in a city quarter of Kabul, Afghanistan, called Deh Afghanan ("village of the Afghans"). He was the son of Mohammed Nadir Shah (1883–1933), a senior member of the Mohammadzai royal family and commander-in-chief of the Afghan Army under former king Amanullah Khan, and of Begum Mah Parwar Begum (d. 1941), a Persian-speaking woman. Nadir Shah assumed the throne after the execution of Habibullah Kalakani on 10 October 1929. Mohammed Zahir's father, son of Sardar Mohammad Yusuf Khan, was born in Dehradun, British India, his family having been exiled after the Second Anglo-Afghan War. Nadir Shah was a descendant of Sardar Sultan Mohammed Khan Telai, half-brother of Emir Dost Mohammad Khan. His grandfather Mohammad Yahya Khan (father-in-law of Emir Yaqub Khan) was in charge of the negotiations with the British that resulted in the Treaty of Gandamak. After the British invasion that followed the killing of Sir Louis Cavagnari in 1879, Yaqub Khan, Yahya Khan and his sons, Princes Mohammad Yusuf Khan and Mohammad Asef Khan, were seized by the British and transferred to the British Raj, where they were held until the two princes were invited back to Afghanistan by Emir Abdur Rahman Khan during the last year of his reign (1901). During the reign of Amir Habibullah they received the title of Companions of the King (Musahiban).
Zahir Shah was educated in a special class for princes at Habibia High School in Kabul (founded in 1904 with British assistance), where many subjects were taught in English. For his secondary education he attended the "Amaniya High School" (founded during the reign of King Amanullah with French assistance), where many subjects were taught in French; Nadir Shah renamed the school Esteqlal High School after the fall of King Amanullah. During the winter months (the school year in Kabul ran from 21 March to November), Prince Zahir also studied at an infantry military school. He continued his education in France, where his father had served as a diplomatic envoy, studying at the Pasteur Institute and the University of Montpellier. When he returned to Afghanistan he helped his father and uncles restore order and reassert government control during a period of lawlessness in the country. He was later enrolled at an Infantry School and appointed a privy counsellor. Zahir Shah served in the government positions of deputy war minister and minister of education.
Zahir Khan was proclaimed king (shah) on 8 November 1933 at the age of 19, after the assassination of his father Mohammed Nadir Shah. After his ascension to the throne he was given the regnal title "He who puts his trust in God, follower of the firm religion of Islam". For almost the first thirty years of his reign he did not rule effectively, ceding power to his paternal uncles, Mohammad Hashim Khan and Shah Mahmud Khan, who served successively as prime ministers. This period fostered growth in Afghanistan's relations with the international community: in 1934 Afghanistan joined the League of Nations and received formal recognition from the United States. By the end of the 1930s, agreements on foreign assistance and trade had been reached with many countries, most notably with the Axis powers: Germany, Italy, and Japan.
Zahir Shah provided aid, weapons and Afghan fighters to the Uighur and Kirghiz Muslim rebels who had established the First East Turkestan Republic. The aid was not capable of saving the First East Turkestan Republic, as the Afghan, Uighur and Kirghiz forces were defeated during 1934 by the Chinese Muslim 36th Division (National Revolutionary Army) commanded by General Ma Zhancang at the Battle of Kashgar and Battle of Yarkand. All the Afghan volunteers were killed by the Chinese Muslim troops, who then abolished the First East Turkestan Republic, and reestablished Chinese government control over the area.
Despite close relations with the Axis powers, Zahir Shah refused to take sides during World War II, and Afghanistan remained one of the few countries in the world to stay neutral. From 1944 to 1947, Afghanistan experienced a series of revolts by various tribes. After the end of the Second World War, Zahir Shah recognised the need for the modernisation of Afghanistan and recruited a number of foreign advisers to assist with the process. During this period Afghanistan's first modern university was founded. During his reign a number of potential advances and reforms were derailed as a result of factionalism and political infighting. He also requested financial aid from both the United States and the Soviet Union, and Afghanistan was one of the few countries in the world to receive aid from both Cold War rivals. In a 1969 interview, Zahir Shah said that he was "not a capitalist. But I also don't want socialism. I don't want socialism that would bring about the kind of situation [that exists] in Czechoslovakia. I don't want us to become the servants of Russia or China or the servant of any other place."
Zahir Shah began to govern on his own in 1963, and despite the factionalism and political infighting a new constitution was introduced in 1964, which made Afghanistan a modern democratic state by introducing free elections, a parliament, civil rights, women's rights, and universal suffrage. However, the new system of governance was unstable, and Zahir Shah remained cautious in pursuing reforms. Some groups believe these moves paved the way for the eventual communist takeover in 1978.
At least five low-denomination Afghan pul coins minted during his reign bore the Arabic title المتوكل على الله محمد ظاهر شاه ("Al-Mutawakkil 'ala Allah Muhammad Zahir Shah"), meaning "He who relies upon Allah, Muhammad Zahir Shah". The title "Al-Mutawakkil 'ala Allah" is taken from the Quran, Sura 8, verse 61.
By the time he returned to Afghanistan in 2002, his reign was remembered as a lengthy span of peace.
Amid civilian and military discontent over the regime, in 1973 while Zahir Shah was abroad in Italy his cousin Mohammed Daoud Khan staged a non-violent military coup d'état and established a republican government. As a former Prime Minister, Daoud Khan had been forced to resign by Zahir Shah a decade earlier and felt that Zahir Shah lacked leadership and that the parliamentary system prevented real progressivism. In August 1973, Zahir Shah sent a letter from Rome to Khan in Kabul declaring his abdication, saying he respected "the will of my compatriots" after realizing the people of Afghanistan "with absolute majority welcomed a Republican regime".
Zahir Shah lived in exile in Italy for twenty-nine years in a villa in the affluent community of Olgiata on Via Cassia, north of Rome, where he spent his time playing golf and chess, as well as tending to his garden. He was prohibited from returning to Afghanistan during the late 1970s by the Soviet-assisted Communist government. In 1983 during the Soviet–Afghan War, Zahir Shah was cautiously involved with plans to develop a government in exile. Ultimately these plans failed because he could not reach a consensus with powerful Islamist factions. It has also been reported that Afghanistan, the Soviet Union and India had all tried to persuade Zahir Shah to return as chief of a neutral, possibly interim, administration in Kabul. Both the Soviet Union and the United States sent representatives to meet him, and President Mohammed Najibullah supported Zahir Shah to play a role in a possible interim government in the quest for peace. In May 1990, Zahir Shah issued a long statement through Voice of America and the BBC calling for unity and peace among Afghans, and offering his services. This reportedly led to a spark of interest and approval among the Kabul populace. However, the idea of a revived political role for Zahir Shah was met with hostility by some, notably radical Islamist Gulbuddin Hekmatyar.
In 1991, Zahir Shah survived an attempt on his life by a knife-wielding assassin masquerading as a Portuguese journalist. After the fall of the pro-Soviet government, Zahir Shah was favored by many to return and restore the monarchy to unify the country, as he was acceptable to most factions. However, these efforts were blocked mostly by Pakistan's ISI, which feared his stance on the Durand Line issue. In June 1995, Zahir Shah's former envoy Sardar Wali announced at talks in Islamabad, Pakistan, that Zahir Shah was willing to participate in peace talks to end the Afghan Civil War, but no consensus was ever reached.
On 18 April 2002, at the age of 87 and four months after the end of Taliban rule, Zahir Shah returned to Afghanistan, flown in an Italian military plane, and welcomed at Kabul's airport by Hamid Karzai and other officials. His return was widely welcomed by Afghans, and he was liked by all ethnic groups. There were proposals for a return to the monarchy – Zahir Shah himself let it be known that he would accept whatever responsibility was given him by the Loya Jirga, which he initiated in June 2002. However he was obliged to publicly renounce monarchical leadership at the behest of the United States as many of the delegates to the Loya Jirga were prepared to vote for Zahir Shah and block the U.S.-backed Hamid Karzai. While he was prepared to become chief of state he made it known that it would not necessarily be as monarch: "I will accept the responsibility of head of state if that is what the Loya Jirga demands of me, but I have no intention to restore the monarchy. I do not care about the title of king. The people call me Baba and I prefer this title." Karzai called Zahir Shah a "symbol of unity, a very kind man" and a "fatherly figure."
Hamid Karzai, who was favored by Zahir Shah, became president of Afghanistan after the Loya Jirga. Karzai, from the Pashtun Popalzai clan, provided Zahir Shah's relatives with major jobs in the transitional government. Following the Loya Jirga he was given the title "Father of the Nation" by Karzai, symbolizing his role in Afghanistan's history as a symbol of national unity. This title ended with his death. In August 2002 he relocated back to his old palace after 29 years.
In an October 2002 visit to France, he slipped in a bathroom, bruising his ribs, and on 21 June 2003, while in France for a medical check-up, he broke his femur.
On 3 February 2004, Zahir was flown from Kabul to New Delhi, India, for medical treatment after complaining of an intestinal problem. He was hospitalized for two weeks and remained in New Delhi under observation. On 18 May 2004, he was brought to a hospital in the United Arab Emirates because of nose bleeding caused by heat.
Zahir Shah attended the 7 December 2004 swearing-in of Hamid Karzai as President of Afghanistan. During his final years, he was frail and required a microphone pinned to his collar so that his faint voice could be heard. During January 2007, Zahir was reported to be seriously ill and bedridden.
On 23 July 2007, Zahir Shah died in the compound of the presidential palace in Kabul after prolonged illness. His death was announced on national television by President Karzai, who said "He was the servant of his people, the friend of his people, he was a very kind person, kind hearted. He believed in the rule of the people and in human rights." His funeral was held on 24 July. It began on the premises of the presidential palace, where politicians and dignitaries paid their respects; his coffin was then taken to a mosque before being moved to the royal mausoleum on Maranjan Hill in eastern Kabul.
Zahir Shah was reportedly shy, modest and "soft-spoken". He liked photography, chess, and smoking cigars.
Zahir Shah was fluent in Pashto and in Dari, his "mother tongue", and also spoke English and excellent French.
He married his first cousin Humaira Begum (1918–2002) on 7 November 1931 in Kabul. They had six sons and two daughters:
In January 2009 an article by Ahmad Majidyar of the American Enterprise Institute included one of his grandsons, Mustafa Zahir, on a list of fifteen possible candidates in the 2009 Afghan presidential election. However, Mustafa Zahir did not become a candidate.
Mohammed Zahir Shah used the title "His Majesty" during his reign.
|
https://en.wikipedia.org/wiki?curid=20886
|
Miso
Miso is a traditional Japanese seasoning produced by fermenting soybeans with salt and "kōji" (the fungus "Aspergillus oryzae"), sometimes together with rice, barley, or other ingredients. Typically, "miso" is salty, but its flavor and aroma depend on various factors in the ingredients and fermentation process. Different varieties of miso have been described as salty, sweet, earthy, fruity, and savory.
The origin of the miso of Japan is not completely clear.
In the Kamakura period (1192–1333), a common meal was made up of a bowl of rice, some dried fish, a serving of miso, and a fresh vegetable. Until the Muromachi period (1337 to 1573), miso was made without grinding the soybeans, somewhat like "nattō". In the Muromachi era, Buddhist monks discovered that soybeans could be ground into a paste, spawning new cooking methods using miso to flavor other foods. In medieval times, the word "temaemiso", meaning home-made miso, appeared. Miso production is a relatively simple process, so home-made versions spread throughout Japan. Miso was used as military provisions during the Sengoku period, and making miso was an important economic activity for "daimyō"s of that era.
During the Edo period (1603–1868), miso was also called "hishio" (醤) and "kuki" (豆支) and various types of miso that fit with each local climate and culture emerged throughout Japan.
Today, miso is produced industrially in large quantities, and traditional home-made miso has become a rarity. In recent years, many new types of miso have appeared, including ones with added soup stocks or calcium, ones made with beans other than soy, and ones with reduced salt for health, among other varieties.
The ingredients used to produce miso may include any mix of soybeans, barley, rice, buckwheat, millet, rye, wheat, hemp seed, and cycad, among others. Lately, producers in other countries have also begun selling miso made from chickpeas, corn, azuki beans, amaranth, and quinoa. Fermentation time ranges from as little as five days to several years. The wide variety of Japanese miso is difficult to classify, but is commonly done by grain type, color, taste, and background.
Many regions have their own specific variation on the miso standard. For example, the soybeans used in Sendai miso are much more coarsely mashed than in normal soy miso.
Miso made with rice such as "shinshu" miso (信州味噌) and "shiro" miso (白味噌) are called "kome" miso (米味噌).
The taste, aroma, texture, and appearance of miso all vary by region and season. Other important variables that contribute to the flavor of a particular miso include temperature, duration of fermentation, salt content, variety of "kōji", and fermenting vessel. The most common flavor categories of miso are:
Although white and red ("shiromiso" and "akamiso") are the most common types of miso available, different varieties may be preferred in particular regions of Japan. In the eastern Kantō region that includes Tokyo, the darker brownish "akamiso" is popular, while in the western Kansai region encompassing Osaka, Kyoto, and Kobe, the lighter "shiromiso" is preferred.
A more nuanced breakdown of the flavors is:
The distinct and unique aroma of miso determines its quality. Many reactions occur among the components of miso, primarily the Maillard reaction, a non-enzymatic reaction of an amino group with a reducing sugar. The volatile compounds produced from this reaction give miso its characteristic flavor and aroma. Depending on the microorganism in combination with the variety of soybean or cereal used, many classes of flavor compounds are produced that give rise to the different types of miso. Fermentation products such as furanone compounds, including 4-hydroxy-2(or 5)-ethyl-5(or 2)-methyl-3(2H)-furanone (HEMF) and 4-hydroxy-2,5 dimethyl-3(2H)-furanone (HDMF) are novel flavor compounds of miso. HEMF is especially known for its sweet aroma and is very important for the sensory evaluation of the aroma of rice miso.
The unique sensory properties of miso are complex; however, the key factor involved in the overall quality of the final product is the enzymatic activity of microorganisms. They use the composition of miso (rice, barley, and soybeans) to produce different pigments, flavor and aroma compounds.
Proteolysis of soybean protein produces constituent amino acids that impart an umami taste, which enhances the relatively dull taste of soybean by itself. Soy protein contains a substantial amount of glutamate, the salt of which is known as MSG or monosodium glutamate, a popular ingredient used by food manufacturers to improve the taste of their products. The umami effect of MSG itself is one-dimensional; however, the umami taste of miso is multi-dimensional because of the myriad of different amino acids and fermentation products present.
Barley miso is a traditional farmhouse variety made for personal use. Often called "rural miso", it is more often made with domestic barley than with imported barley. Containing glutamic acid and aromatic compounds such as ferulic acid and vanillic acid, barley miso is distinguished by a characteristic flavor.
Miso's unique properties and flavour profile can be attributed to the compounds produced through the fermentation process. Miso, depending on the variety, consists of a starter culture called koji (麹), soybeans, and usually a grain (either rice, barley, or rye). The miso goes through a two-step process: first the koji is created, and second the koji is combined with the other components and the mixture is left to be enzymatically digested, fermented, and aged.
Koji is produced by introducing the mould "Aspergillus oryzae" onto steamed white rice. This mould culture comes from dried "A. oryzae" spores called "tane-koji" or "starter koji" and is isolated from plant matter (usually rice) and cultivated. In the past, the natural presence of "A. oryzae" spores was relied upon to create koji, but because of the difficulty of producing the culture, tane-koji is added almost exclusively in both industrial and traditional production of miso. Tane-koji is produced much in the same way as koji, but also has a small portion of wood ash added to the mixture which gives important nutrients to the fungus as well as promoting sporulation.
"A. oryzae" is an aerobic fungus and is the most active fermenting agent in koji, as it produces the amylolytic and proteolytic enzymes that are essential to creating the final miso product. Amylolytic enzymes such as amylase aid in the breakdown of starch in the grains to sugar and dextrin, while proteolytic enzymes such as protease catalyze the breakdown of proteins into smaller peptides or amino acids. These both aid in the enzymatic digestion of the mixture of rice and soybeans. Depending on the strain of "A. oryzae", enzymatic composition varies, thereby changing the characteristics of the final miso product. For example, the strain used to create the sweeter white miso would likely produce a higher content of amylolytic enzymes, while comparatively a soybean miso might have a higher content of proteolytic enzymes.
To create optimal conditions for enzymatic production and the growth of "A. oryzae", the koji's environment must be carefully regulated. Temperature, humidity, and oxygen content are all important factors, not only in maximizing mould growth and enzyme production, but in preventing harmful bacteria from growing. Once the koji has reached a desirable flavour profile, it is usually mixed with salt to prevent further fermentation.
Although other strains of fungi have been used to produce koji, "A. oryzae" is the most desirable because of a number of properties, including the fact that it does not produce aflatoxin.
Miso typically comes as a paste in a sealed container requiring refrigeration after opening. Natural miso is a living food containing many beneficial microorganisms such as "Tetragenococcus halophilus" which can be killed by overcooking. For this reason, the miso should be added to soups or other foods being prepared just before they are removed from the heat. Using miso without any cooking may be even better. Outside Japan, a popular practice is to add miso only to foods that have cooled to preserve "kōjikin" cultures in miso. Nonetheless, miso and soy foods play a large role in the Japanese diet, and many cooked miso dishes are popular.
Miso is a part of many Japanese-style meals. It most commonly appears as the main ingredient of miso soup, which is eaten daily by much of the Japanese population. The pairing of plain rice and miso soup is considered a fundamental unit of Japanese cuisine. This pairing is the basis of a traditional Japanese breakfast.
Miso is used in many other types of soup and soup-like dishes, including some kinds of "ramen", "udon", "nabe", and "imoni". Generally, such dishes have the title "miso" prefixed to their name (for example, "miso-udon"), and have a heavier, earthier flavor and aroma compared to other Japanese soups that are not miso-based.
Many traditional confections use a sweet, thick miso glaze, such as "mochidango". Miso-glazed treats are strongly associated with Japanese festivals, although they are available year-round at supermarkets. The consistency of miso glaze ranges from thick and taffy-like to thin and drippy.
Soy miso is used to make a type of pickle called "misozuke". These pickles are typically made from cucumber, daikon, Nappa cabbage, or eggplant, and are sweeter and less salty than the standard Japanese salt pickle.
Other foods with miso as an ingredient include:
Claims that miso is high in vitamin B12 have been contradicted in some studies.
Some experts suggest that miso is a source of "Lactobacillus acidophilus". Miso is relatively high in salt which can contribute to increased blood pressure in the small percentage of the population with sodium-sensitive prehypertension or hypertension.
|
https://en.wikipedia.org/wiki?curid=20889
|
Malcolm III of Scotland
Malcolm III (; c. 26 March 1031 – 13 November 1093 (age 63)) was King of Scots from 1058 to 1093. He was later nicknamed "Canmore" ("ceann mòr", Gaelic for "Great Chief": "ceann" denotes "leader", "head" (of state) and "mòr" denotes "pre-eminent", "great", and "big"). Malcolm's long reign of 35 years preceded the beginning of the Scoto-Norman age. Henry I of England and Eustace III of Boulogne were his sons-in-law, making him the maternal grandfather of Empress Matilda, William Adelin and Matilda of Boulogne. All three of them were prominent in English politics during the 12th century.
Malcolm's kingdom did not extend over the full territory of modern Scotland: the north and west of Scotland remained under Scandinavian rule following the Norse invasions. Malcolm III fought a series of wars against the Kingdom of England, which may have had as its objective the conquest of the English earldom of Northumbria. These wars did not result in any significant advances southward. Malcolm's primary achievement was to continue a lineage that ruled Scotland for many years, although his role as founder of a dynasty has more to do with the propaganda of his youngest son David I and his descendants than with history.
Malcolm's second wife, St. Margaret of Scotland, is Scotland's only royal saint. Malcolm himself had no reputation for piety; with the notable exception of Dunfermline Abbey in Fife, he is not definitely associated with major religious establishments or ecclesiastical reforms.
Malcolm's father Duncan I became king in late 1034, on the death of Malcolm II, Duncan's maternal grandfather and Malcolm's great-grandfather. According to John of Fordun, whose account is the original source of part of William Shakespeare's "Macbeth", Malcolm's mother was a niece of Siward, Earl of Northumbria, but an earlier king-list gives her the Gaelic name Suthen. Other sources claim that either a daughter or niece would have been too young to fit the timeline, thus the likely relative would have been Siward's own sister Sybil, which may have translated into Gaelic as Suthen.
Duncan's reign was not successful and he was killed in battle with the men of Moray, led by Macbeth, on 15 August 1040. Duncan was young at the time of his death, and Malcolm and his brother Donalbane were children. Malcolm's family attempted to overthrow Macbeth in 1045, but Malcolm's grandfather Crínán of Dunkeld was killed in the attempt.
Soon after the death of Duncan his two young sons were sent away for greater safety—exactly where is the subject of debate. According to one version, Malcolm (then aged about nine) was sent to England, and his younger brother Donalbane was sent to the Isles. Based on Fordun's account, it was assumed that Malcolm passed most of Macbeth's seventeen-year reign in the Kingdom of England at the court of Edward the Confessor.
Today's British royal family can trace their family history back to Malcolm III via his daughter Matilda as well as his son David I, an ancestor of Robert the Bruce and thus also the Stewart/Stuart kings.
According to an alternative version, Malcolm's mother took both sons into exile at the court of Thorfinn Sigurdsson, Earl of Orkney, an enemy of Macbeth's family, and perhaps Duncan's kinsman by marriage.
An English invasion in 1054, with Siward, Earl of Northumbria in command, had as its goal the installation of one "Máel Coluim, son of the king of the Cumbrians". This Máel Coluim has traditionally been identified with the later Malcolm III. This interpretation derives from the "Chronicle" attributed to the 14th-century chronicler of Scotland, John of Fordun, as well as from earlier sources such as William of Malmesbury. The latter reported that Macbeth was killed in the battle by Siward, but it is known that Macbeth outlived Siward by two years. A. A. M. Duncan argued in 2002 that, using the "Anglo-Saxon Chronicle" entry as their source, later writers innocently misidentified "Máel Coluim" with the later Scottish king of the same name. Duncan's argument has been supported by several subsequent historians specialising in the era, such as Richard Oram, Dauvit Broun and Alex Woolf. It has also been suggested that Máel Coluim may have been a son of Owain Foel, British king of Strathclyde perhaps by a daughter of Malcolm II, King of Scotland.
In 1057 various chroniclers report the death of Macbeth at Malcolm's hand, on 15 August 1057 at Lumphanan in Aberdeenshire. Macbeth was succeeded by his stepson Lulach, who was crowned at Scone, probably on 8 September 1057. Lulach was killed by Malcolm, "by treachery", near Huntly on 23 April 1058. After this, Malcolm became king, perhaps being inaugurated on 25 April 1058, although only John of Fordun reports this.
If Orderic Vitalis is to be relied upon, one of Malcolm's earliest actions as king was to travel to the court of Edward the Confessor in 1059 to arrange a marriage with Edward's kinswoman Margaret, who had arrived in England two years before from Hungary. If a marriage agreement was made in 1059, it was not kept, and this may explain the Scots invasion of Northumbria in 1061 when Lindisfarne was plundered. Equally, Malcolm's raids in Northumbria may have been related to the disputed "Kingdom of the Cumbrians", reestablished by Earl Siward in 1054, which was under Malcolm's control by 1070.
The "Orkneyinga saga" reports that Malcolm married the widow of Thorfinn Sigurdsson, Ingibiorg, a daughter of Finn Arnesson. Although Ingibiorg is generally assumed to have died shortly before 1070, it is possible that she died much earlier, around 1058. The "Orkneyinga Saga" records that Malcolm and Ingibiorg had a son, Duncan II (Donnchad mac Maíl Coluim), who was later king. Some Medieval commentators, following William of Malmesbury, claimed that Duncan was illegitimate, but this claim is propaganda reflecting the need of Malcolm's descendants by Margaret to undermine the claims of Duncan's descendants, the Meic Uilleim. Malcolm's son Domnall, whose death is reported in 1085, is not mentioned by the author of the "Orkneyinga Saga". He is assumed to have been born to Ingibiorg.
Malcolm's marriage to Ingibiorg secured him peace in the north and west. The "Heimskringla" tells that her father Finn had been an adviser to Harald Hardraade and, after falling out with Harald, was then made an Earl by Sweyn Estridsson, King of Denmark, which may have been another recommendation for the match. Malcolm enjoyed a peaceful relationship with the Earldom of Orkney, ruled jointly by his stepsons, Paul and Erlend Thorfinnsson. The "Orkneyinga Saga" reports strife with Norway but this is probably misplaced as it associates this with Magnus Barefoot, who became king of Norway only in 1093, the year of Malcolm's death.
Although he had given sanctuary to Tostig Godwinson when the Northumbrians drove him out, Malcolm was not directly involved in the ill-fated invasion of England by Harald Hardraade and Tostig in 1066, which ended in defeat and death at the battle of Stamford Bridge. In 1068, he granted asylum to a group of English exiles fleeing from William of Normandy, among them Agatha, widow of Edward the Confessor's nephew Edward the Exile, and her children: Edgar Ætheling and his sisters Margaret and Cristina. They were accompanied by Gospatric, Earl of Northumbria. The exiles were disappointed, however, if they had expected immediate assistance from the Scots.
In 1069 the exiles returned to England, to join a spreading revolt in the north. Even though Gospatric and Siward's son Waltheof submitted by the end of the year, the arrival of a Danish army under Sweyn Estridsson seemed to ensure that William's position remained weak. Malcolm decided on war, and took his army south into Cumbria and across the Pennines, wasting Teesdale and Cleveland then marching north, loaded with loot, to Wearmouth. There Malcolm met Edgar and his family, who were invited to return with him, but did not. As Sweyn had by now been bought off with a large Danegeld, Malcolm took his army home. In reprisal, William sent Gospatric to raid Scotland through Cumbria. In return, the Scots fleet raided the Northumbrian coast where Gospatric's possessions were concentrated. Late in the year, perhaps shipwrecked on their way to a European exile, Edgar and his family again arrived in Scotland, this time to remain. By the end of 1070, Malcolm had married Edgar's sister Margaret (later known as Saint Margaret).
The naming of their children represented a break with the traditional Scots regal names such as Malcolm, Cináed and Áed. The point of naming Margaret's sons—Edward after her father Edward the Exile, Edmund for her grandfather Edmund Ironside, Ethelred for her great-grandfather Ethelred the Unready and Edgar for her great-great-grandfather Edgar and her brother, briefly the elected king, Edgar Ætheling—was unlikely to be missed in England, where William of Normandy's grasp on power was far from secure. Whether the adoption of the classical Alexander for the future Alexander I of Scotland (either for Pope Alexander II or for Alexander the Great) and the biblical David for the future David I of Scotland represented a recognition that William of Normandy would not be easily removed, or was due to the repetition of an Anglo-Saxon royal name—another Edmund had preceded Edgar—is not known. Margaret also gave Malcolm two daughters, Edith, who married Henry I of England, and Mary, who married Eustace III of Boulogne.
In 1072, with the Harrying of the North completed and his position again secure, William of Normandy came north with an army and a fleet. Malcolm met William at Abernethy and, in the words of the "Anglo-Saxon Chronicle" "became his man" and handed over his eldest son Duncan as a hostage and arranged peace between William and Edgar. Accepting the overlordship of the king of the English was no novelty, as previous kings had done so without result. The same was true of Malcolm; his agreement with the English king was followed by further raids into Northumbria, which led to further trouble in the earldom and the killing of Bishop William Walcher at Gateshead. In 1080, William sent his son Robert Curthose north with an army while his brother Odo punished the Northumbrians. Malcolm again made peace, and this time kept it for over a decade.
Malcolm faced little recorded internal opposition, with the exception of Lulach's son Máel Snechtai. In an unusual entry, for the "Anglo-Saxon Chronicle" contains little on Scotland, it says that in 1078:
Whatever provoked this strife, Máel Snechtai survived until 1085.
When William Rufus became king of England after his father's death, Malcolm did not intervene in the rebellions by supporters of Robert Curthose which followed. In 1091, William Rufus confiscated Edgar Ætheling's lands in England, and Edgar fled north to Scotland. In May, Malcolm marched south, not to raid and take slaves and plunder, but to besiege Newcastle, built by Robert Curthose in 1080. This appears to have been an attempt to advance the frontier south from the River Tweed to the River Tees. The threat was enough to bring the English king back from Normandy, where he had been fighting Robert Curthose. In September, learning of William Rufus's approaching army, Malcolm withdrew north and the English followed. Unlike in 1072, Malcolm was prepared to fight, but a peace was arranged by Edgar Ætheling and Robert Curthose whereby Malcolm again acknowledged the overlordship of the English king.
In 1092, the peace began to break down. Based on the idea that the Scots controlled much of modern Cumbria, it had been supposed that William Rufus's new castle at Carlisle and his settlement of English peasants in the surrounds was the cause. It is unlikely that Malcolm controlled Cumbria, and the dispute instead concerned the estates granted to Malcolm by William Rufus's father in 1072 for his maintenance when visiting England. Malcolm sent messengers to discuss the question and William Rufus agreed to a meeting. Malcolm travelled south to Gloucester, stopping at Wilton Abbey to visit his daughter Edith and sister-in-law Cristina. Malcolm arrived there on 24 August 1093 to find that William Rufus refused to negotiate, insisting that the dispute be judged by the English barons. This Malcolm refused to accept, and returned immediately to Scotland.
It does not appear that William Rufus intended to provoke a war, but, as the "Anglo-Saxon Chronicle" reports, war came:
Malcolm was accompanied by Edward, his eldest son by Margaret and probable heir-designate (or tánaiste), and by Edgar. Even by the standards of the time, the ravaging of Northumbria by the Scots was seen as harsh.
While marching north again, Malcolm was ambushed by Robert de Mowbray, Earl of Northumbria, whose lands he had devastated, near Alnwick on 13 November 1093. There he was killed by Arkil Morel, steward of Bamburgh Castle. The conflict became known as the Battle of Alnwick. Edward was mortally wounded in the same fight. Margaret, it is said, died soon after receiving the news of their deaths from Edgar. The Annals of Ulster say:
Malcolm's body was taken to Tynemouth Priory for burial. The king's body was sent north for reburial, in the reign of his son Alexander, at Dunfermline Abbey, or possibly Iona.
On 19 June 1250, following the canonisation of Malcolm's wife Margaret by Pope Innocent IV, Margaret's remains were disinterred and placed in a reliquary. Tradition has it that as the reliquary was carried to the high altar of Dunfermline Abbey, past Malcolm's grave, it became too heavy to move. As a result, Malcolm's remains were also disinterred, and buried next to Margaret beside the altar.
Malcolm and Ingibiorg had three sons:
Malcolm and Margaret had eight children, six sons and two daughters:
Malcolm appears in William Shakespeare's "Macbeth" as Malcolm, son of King Duncan and heir to the throne. He first appears in the second scene, where he and Duncan hear a sergeant's account of how the battle was won thanks to Macbeth; Ross then arrives, and Duncan decides that Macbeth should take the title of Thane of Cawdor. In Act 1.4 Malcolm discusses the execution of the former Thane of Cawdor, after which Macbeth enters and is congratulated on his victory. Malcolm later appears as a guest in Macbeth's castle; when his father is killed, he is suspected of the murder and escapes to England. In Act 4.3 he talks with Macduff about Macbeth, and the two resolve to make war against him. In Act 5.4 he is seen at Dunsinane preparing for war, ordering his troops to hide behind branches and slowly advance towards the castle. In Act 5.8 he watches the fight between Macbeth and Macduff alongside Siward and Ross, and when Macbeth is killed, Malcolm takes over as king.
The married life of Malcolm III and Margaret has been the subject of three historical novels: "A Goodly Pearl" (1905) by Mary H. Debenham, "Malcolm Canmore's Pearl" (1907) by Agnes Grant Hay, and "Sing, Morning Star" (1949) by Jane Oliver. They focus on court life in Dunfermline and on Margaret's role in introducing Anglo-Saxon culture to Scotland. The latter two novels cover events to 1093, ending with Malcolm's death. Malcolm's conflict with William the Conqueror is depicted in the screenplay "Malcolm Son of Duncan" by Laura Ballou, which is part of the reading reference library at the Clan Donnachaidh Society Center in Perthshire, Scotland.
https://en.wikipedia.org/wiki?curid=20892
Maximum transmission unit
In computer networking, the maximum transmission unit (MTU) is the size of the largest protocol data unit (PDU) that can be communicated in a single network layer transaction. The MTU relates to, but is not identical to, the maximum frame size that can be transported on the data link layer, e.g. an Ethernet frame.
A larger MTU is associated with reduced overhead, while smaller MTU values can reduce network delay. In many cases, the MTU is dependent on underlying network capabilities and must be adjusted, manually or automatically, so as not to exceed these capabilities. MTU parameters may appear in association with a communications interface or standard. Some systems may decide the MTU at connect time.
MTUs apply to communications protocols and network layers. The MTU is specified in terms of bytes or octets of the largest PDU that the layer can pass onwards. MTU parameters usually appear in association with a communications interface (NIC, serial port, etc.). Standards (Ethernet, for example) can fix the size of an MTU; or systems (such as point-to-point serial links) may decide MTU at connect time.
Underlying data link and physical layers usually add overhead to the network layer data to be transported, so for a given maximum frame size of a medium one needs to subtract the amount of overhead to calculate that medium's MTU. For example, with Ethernet, the maximum frame size is 1518 bytes, 18 bytes of which are overhead (header and frame check sequence), resulting in an MTU of 1500 bytes.
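The subtraction described above can be sketched in a few lines; the function name is illustrative rather than part of any networking API:

```python
# Illustrative sketch: deriving a medium's MTU from its maximum frame
# size by subtracting link-layer overhead. Constants are the standard
# untagged Ethernet values quoted in the text.
ETHERNET_MAX_FRAME = 1518  # bytes
ETHERNET_OVERHEAD = 18     # 14-byte header + 4-byte frame check sequence

def mtu_from_frame(max_frame: int, overhead: int) -> int:
    """MTU is the maximum frame size minus link-layer overhead."""
    return max_frame - overhead

print(mtu_from_frame(ETHERNET_MAX_FRAME, ETHERNET_OVERHEAD))  # 1500
# An 802.1Q-tagged frame carries 4 extra bytes of both frame size and
# overhead, so the MTU is unchanged:
print(mtu_from_frame(1522, 22))  # 1500
```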
A larger MTU brings greater efficiency because each network packet carries more user data while protocol overheads, such as headers or underlying per-packet delays, remain fixed; the resulting higher efficiency means an improvement in bulk protocol throughput. A larger MTU also requires processing of fewer packets for the same amount of data. In some systems, per-packet-processing can be a critical performance limitation.
However, this gain is not without a downside. Large packets occupy a slow link for more time than a smaller packet, causing greater delays to subsequent packets, and increasing network delay and delay variation. For example, a 1500-byte packet, the largest allowed by Ethernet at the network layer, ties up a 14.4k modem for about one second.
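The modem figure above can be checked with a quick serialization-delay calculation (a sketch with an illustrative function name):

```python
def serialization_delay(packet_bytes: int, link_bps: int) -> float:
    """Time a packet occupies a link: packet size in bits over link rate."""
    return packet_bytes * 8 / link_bps

# A 1500-byte packet on a 14.4 kbit/s modem takes roughly a second:
print(round(serialization_delay(1500, 14_400), 2))  # 0.83
# The same packet on a gigabit link is three orders of magnitude quicker:
print(serialization_delay(1500, 1_000_000_000))
```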
Large packets are also problematic in the presence of communications errors. If no forward error correction is used, corruption of a single bit in a packet requires that the entire packet be retransmitted, which can be costly. At a given bit error rate, larger packets are more susceptible to corruption. Their greater payload makes retransmissions of larger packets take longer. Despite the negative effects on retransmission duration, large packets can still have a net positive effect on end-to-end TCP performance.
The Internet protocol suite was designed to work over many different networking technologies, each of which may use packets of different sizes. While a host will know the MTU of its own interface and possibly that of its peers (from initial handshakes), it will not initially know the lowest MTU in a chain of links to other peers. Another potential problem is that higher-level protocols may create packets larger than even the local link supports.
IPv4 allows fragmentation which divides the datagram into pieces, each small enough to accommodate a specified MTU limitation. This fragmentation process takes place at the internet layer. The fragmented packets are marked so that the IP layer of the destination host knows it should reassemble the packets into the original datagram.
All fragments of a packet must arrive for the packet to be considered received. If the network drops any fragment, the entire packet is lost.
When the number of packets that must be fragmented or the number of fragments is great, fragmentation can cause unreasonable or unnecessary overhead. For example, various tunneling situations may exceed the MTU by very little as they add just a header's worth of data. The addition is small, but each packet now has to be sent in two fragments, the second of which carries very little payload. The same amount of payload is being moved, but every intermediate router has to forward twice as many packets.
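The tunneling effect described above can be sketched as follows. This is an illustrative simplification of IPv4 fragmentation: it assumes a fixed 20-byte header with no options, and only counts fragments.

```python
import math

def ipv4_fragments(datagram_len: int, mtu: int, header: int = 20) -> int:
    """Number of fragments an IPv4 datagram needs on a link with the
    given MTU. Fragment payloads must be multiples of 8 bytes."""
    payload = datagram_len - header
    per_fragment = (mtu - header) // 8 * 8  # largest 8-byte-aligned payload
    return max(1, math.ceil(payload / per_fragment))

# A full-size 1500-byte datagram fits a 1500-byte MTU untouched:
print(ipv4_fragments(1500, 1500))  # 1
# Add a 20-byte tunnel header and the same data now needs two packets,
# the second carrying only a sliver of payload:
print(ipv4_fragments(1520, 1500))  # 2
```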
The Internet Protocol requires that hosts must be able to process IP datagrams of at least 576 bytes (for IPv4) or 1280 bytes (for IPv6). However, this does not preclude link layers with an MTU smaller than this minimum MTU from conveying IP data. For example, according to IPv6's specification, if a particular link layer cannot deliver an IP datagram of 1280 bytes in a single frame, then the link layer must provide its own fragmentation and reassembly mechanism, separate from the IP fragmentation mechanism, to ensure that a 1280-byte IP datagram can be delivered, intact, to the IP layer.
In the context of Internet Protocol, MTU refers to the maximum size of an IP packet that can be transmitted without fragmentation over a given medium. The size of an IP packet includes IP headers but excludes headers from the link layer. In the case of an Ethernet frame this adds an overhead of 18 bytes, or 22 bytes with an IEEE 802.1Q tag for VLAN tagging or class of service.
The MTU should not be confused with the minimum datagram size that all hosts must be prepared to accept. This is 576 bytes for IPv4 and 1280 bytes for IPv6.
The IP MTU and Ethernet maximum frame size are configured separately. In Ethernet switch configuration, MTU may refer to Ethernet maximum frame size. In Ethernet-based routers, MTU normally refers to the IP MTU. If jumbo frames are allowed in a network, the IP MTU should also be adjusted upwards to take advantage of this.
Since the IP packet is carried by an Ethernet frame, the Ethernet frame has to be larger than the IP packet. With the normal untagged Ethernet frame overhead of 18 bytes, the Ethernet maximum frame size is 1518 bytes. If a 1500 byte IP packet is to be carried over a tagged Ethernet connection, the Ethernet frame maximum size needs to be 1522 due to the larger size of an 802.1Q tagged frame. 802.3ac increases the standard Ethernet maximum frame size to accommodate this.
The Internet Protocol defines the "path MTU" of an Internet transmission path as the smallest MTU supported by any of the hops on the path between a source and destination. Put another way, the path MTU is the largest packet size that can traverse this path without suffering fragmentation.
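The definition translates directly into code (a trivial sketch; the hop MTU values are hypothetical):

```python
def path_mtu(link_mtus: list[int]) -> int:
    """Path MTU is the smallest MTU of any hop between source and
    destination."""
    return min(link_mtus)

# e.g. a route crossing a PPPoE link (MTU 1492) between Ethernet segments:
print(path_mtu([1500, 1492, 1500]))  # 1492
```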
Standard Ethernet supports an MTU of 1500 bytes, and Ethernet implementations supporting jumbo frames allow for an MTU of up to 9000 bytes. However, border protocols like PPPoE will reduce this. Path MTU Discovery exposes the difference between the MTU seen by Ethernet end-nodes and the path MTU.
Unfortunately, increasing numbers of networks drop ICMP traffic (for example, to prevent denial-of-service attacks), which prevents path MTU discovery from working. Packetization Layer Path MTU Discovery describes a Path MTU Discovery technique that responds more robustly to ICMP filtering. In an IP network, the path from the source address to the destination address may change in response to various events (load-balancing, congestion, outages, etc.), and this could result in the path MTU changing (sometimes repeatedly) during a transmission, which may introduce further packet drops before the host finds a new reliable MTU.
A failure of Path MTU Discovery carries the possible result of making some sites behind badly configured firewalls unreachable. A connection with mismatched MTU may work for low-volume data but fail as soon as a host sends a large block of data. For example, with Internet Relay Chat a connecting client might see the initial messages up to and including the initial ping (sent by the server as an anti-spoofing measure), but get no response after that. This is because the large set of welcome messages sent at that point are packets that exceed the path MTU. One can possibly work around this, depending on which part of the network one controls; for example one can change the MSS (maximum segment size) in the initial packet that sets up the TCP connection at one's firewall.
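The MSS workaround mentioned above rests on simple arithmetic: a TCP segment plus its IP and TCP headers must fit within the path MTU. A sketch, assuming option-free 20-byte IPv4 and TCP headers:

```python
def tcp_mss(path_mtu: int, ip_header: int = 20, tcp_header: int = 20) -> int:
    """Largest TCP payload that fits in one unfragmented packet on the
    path: the path MTU minus IPv4 and TCP headers (without options)."""
    return path_mtu - ip_header - tcp_header

print(tcp_mss(1500))  # 1460, the usual Ethernet-derived MSS
print(tcp_mss(1492))  # 1452, typical for a PPPoE path
```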
MTU is sometimes used to describe the maximum PDU sizes in communication layers other than the network layer.
The transmission of a packet on a physical network segment that is larger than the segment's MTU is known as jabber. This is almost always caused by faulty devices. Network switches and some repeater hubs have a built-in capability to detect when a device is jabbering.
https://en.wikipedia.org/wiki?curid=20894
MV Buffalo Soldier (T-AK-9301)
MV "Buffalo Soldier" (T-AK-9301) is a roll-on/roll-off ship, formerly of the French Government Line (now merged into CMA CGM). She was sold and reflagged US, renamed to honor Buffalo Soldiers, and chartered by the United States Navy Military Sealift Command as a Maritime Prepositioning ship serving at Diego Garcia laden with U.S. Air Force munitions. She is self-sustaining, that is, she can unload herself, an asset in harbors with little or no infrastructure. Her 120-long-ton capacity roll-on/roll-off ramp accommodates tracked and wheeled vehicles of every description. While she is not currently in service with MSC, ships with her general characteristics are designated "Buffalo Soldier" class, fleet designation AK 2222.
https://en.wikipedia.org/wiki?curid=20895
Mount Baker
Mount Baker, also known as Koma Kulshan or simply Kulshan, is an active, glaciated, andesitic stratovolcano in the Cascade Volcanic Arc and the North Cascades of Washington in the United States. Mount Baker has the second-most thermally active crater in the Cascade Range after Mount Saint Helens. East of the city of Bellingham, Whatcom County, Mount Baker is the youngest volcano in the Mount Baker volcanic field. While volcanism has persisted here for some 1.5 million years, the current glaciated cone is likely no more than 140,000 years old, and possibly no older than 80,000–90,000 years. Older volcanic edifices have mostly eroded away due to glaciation.
After Mount Rainier, Mount Baker is the most heavily glaciated of the Cascade Range volcanoes; the volume of snow and ice on Mount Baker is greater than that of all the other Cascade volcanoes (except Rainier) combined. It is also one of the snowiest places in the world; in 1999, Mount Baker Ski Area, located to the northeast, set the world record for recorded snowfall in a single season.
Mt. Baker is the third-highest mountain in Washington and the fifth-highest in the Cascade Range, if Little Tahoma Peak, a subpeak of Mount Rainier, and Shastina, a subpeak of Mount Shasta, are not counted. Located in the Mount Baker Wilderness, it is visible from much of Greater Victoria, Nanaimo, and Greater Vancouver in British Columbia, and to the south, from Seattle (and on clear days Tacoma) in Washington.
Indigenous peoples have known the mountain for thousands of years, but the first written record of it is from the Spanish explorer Gonzalo Lopez de Haro, who mapped it in 1790 as "Great Mount Carmel". The explorer George Vancouver renamed the mountain for 3rd Lieutenant Joseph Baker of HMS "Discovery", who saw it on April 30, 1792.
Mount Baker was well known to the indigenous peoples of the Pacific Northwest. Indigenous names for the mountain include Koma Kulshan or Kulshan ("white sentinel" or "puncture wound", i.e. "crater"); Quck Sam-ik ("white mountain"); Kobah ("white sentinel"); and "Tukullum" or "Nahcullum" (in the language of the unidentified "Koma tribe").
In 1790, Manuel Quimper of the Spanish Navy set sail from Nootka, a temporary settlement on Vancouver Island, with orders to explore the newly discovered Strait of Juan de Fuca. Accompanying Quimper was first-pilot Gonzalo Lopez de Haro, who drew detailed charts during the six-week expedition. Although Quimper's journal of the voyage does not refer to the mountain, one of Haro's manuscript charts includes a sketch of Mount Baker. The Spanish named the snowy volcano "Great Mount Carmel", as it reminded them of the white-clad monks of the Carmelite Monastery.
The British explorer George Vancouver left England a year later. His mission was to survey the northwest coast of America. Vancouver and his crew reached the Pacific Northwest coast in 1792. While anchored in Dungeness Bay on the south shore of the Strait of Juan de Fuca, Joseph Baker made an observation of Mount Baker, which Vancouver recorded in his journal:
Six years later, the official narrative of this voyage was published, including the first printed reference to the mountain. By the mid-1850s, Mount Baker was a well-known feature on the horizon to the explorers and fur traders who traveled in the Puget Sound region. Isaac I. Stevens, the first governor of Washington Territory, wrote about Mount Baker in 1853:
Edmund Thomas Coleman, an Englishman who resided in Victoria, British Columbia, Canada and a veteran of the Alps, made the first attempt to ascend the mountain in 1866. He chose a route via the Skagit River, but was forced to turn back when local Native Americans refused him passage.
Later that same year, Coleman recruited Whatcom County settlers Edward Eldridge, John Bennett, and John Tennant to aid him in his second attempt to scale the mountain. After approaching via the North Fork of the Nooksack River, the party navigated through what is now known as Coleman Glacier and ascended to within several hundred feet of the summit before turning back in the face of an "overhanging cornice of ice" and threatening weather. Coleman later returned to the mountain after two years. At 4:00 pm on August 17, 1868, Coleman, Eldridge, Tennant and two new companions (David Ogilvy and Thomas Stratton) scaled the summit via the Middle Fork Nooksack River, Marmot Ridge, Coleman Glacier, and the north margin of the Roman Wall.
The present-day cone of Mount Baker is relatively young; it is perhaps less than 100,000 years old. The volcano sits atop a similar older volcanic cone called Black Buttes, which was active between 500,000 and 300,000 years ago. Much of Mount Baker's earlier geological record eroded away during the last ice age (which culminated 15,000–20,000 years ago), by thick ice sheets that filled the valleys and surrounded the volcano. In the last 14,000 years, the area around the mountain has been largely ice-free, but the mountain itself remains heavily covered with snow and ice.
Isolated ridges of lava and hydrothermally altered rock, especially in the area of Sherman Crater, are exposed between glaciers on the upper flanks of the volcano; the lower flanks are steep and heavily vegetated. Volcanic rocks of Mount Baker and Black Buttes rest on a foundation of non-volcanic rocks.
Deposits recording the last 14,000 years at Mount Baker indicate that Mount Baker has not had highly explosive eruptions like those of other volcanoes in the Cascade Volcanic Arc, such as Mount St. Helens, Glacier Peak, or the Mount Meager massif, nor has it erupted frequently. During this period, four episodes of magmatic eruptive activity have been recently recognized.
Magmatic eruptions have produced tephra, pyroclastic flows, and lava flows from summit vents and the Schriebers Meadow Cone. The most destructive and most frequent events at Mount Baker have been lahars or debris flows and debris avalanches; many, if not most, of these were not related to magmatic eruptions, but may have been induced by magma intrusion, steam eruptions, earthquakes, gravitational instability, or possibly even heavy rainfall.
Research beginning in the late 1990s shows that Mount Baker is the youngest of several volcanic centers in the area and one of the youngest volcanoes in the Cascade Range. The Pliocene Hannegan caldera is preserved northeast of Mount Baker. Volcanic activity in the Mount Baker volcanic field began more than one million years ago, but many of the earliest lava and tephra deposits have been removed by glacial erosion. The pale-colored rocks northeast of the modern volcano mark the site of the ancient (1.15 million years old) Kulshan caldera, which collapsed after an enormous ash eruption one million years ago. Subsequently, eruptions in the Mount Baker area have produced cones and lava flows of andesite, the rock that constitutes much of other Cascade Range volcanoes such as Rainier, Adams, and Hood. From about 900,000 years ago to the present, numerous andesitic volcanic centers in the area have formed and been worn away by glacial erosion. The largest of these cones is the Black Buttes edifice, active between 500,000 and 300,000 years ago and formerly bigger than today's Mount Baker.
Mount Baker was built from stacks of lava and volcanic breccia prior to the end of the last glacial period, which ended about 15,000 years ago. Two craters are on the mountain. Ice-filled Carmelo Crater is under the summit ice dome; this crater is the source of the last cone-building eruptions.
The highest point of Mount Baker, Grant Peak, on the exposed southeast rim of Carmelo Crater, is a small pile of andesitic scoria lying on top of a stack of lava flows just below. Carmelo Crater is deeply dissected on its south side by the younger Sherman Crater. This crater is south of the summit, and its ice-covered floor is below the summit ice dome. It is the site of all Holocene eruptive activity. Hundreds of fumaroles vent volcanic gases.
Lava flows from the summit vent erupted between 30,000 and 10,000 years ago, and during the final stages of edifice construction, blocky pyroclastic flows entered the volcano's southeastern drainages. An eruption from Sherman Crater 6,600 years ago erupted a blanket of ash that extended more than to the east. Today, sulfurous gases reach the surface via two fumarole pathways: Dorr Fumarole, northeast of the summit, and Sherman Crater, south of the summit. Both are sites of hydrothermal alteration, converting lavas to weak, white-to-yellow clays; sulfur is a common mineral around these fumaroles. At Sherman Crater, collapses of this weakened rock generated lahars in the 1840s.
Around 6,600 years ago, a series of discrete events culminated in the largest tephra-producing eruption in postglacial time at Mount Baker. This is the last episode of undoubted magmatic activity preserved in the geologic record. First, the largest collapse in the history of the volcano occurred from the Roman Wall and transformed into a lahar that was over deep in the upper reaches of the Middle Fork of the Nooksack River. It was at least deep downstream from the volcano. At that time, the Nooksack River is believed to have drained north into the Fraser River; this lahar is unlikely to have reached Bellingham Bay. Next, a small hydrovolcanic eruption occurred at Sherman Crater, triggering a second collapse of the flank just east of the Roman Wall. That collapse also became a lahar that mainly followed the course of the first lahar for at least , and also spilled into tributaries of the Baker River. Finally, an eruption cloud deposited ash as far as downwind to the northeast and east.
Several eruptions occurred from Sherman Crater during the 19th century; they were witnessed from the Bellingham area. A possible eruption was seen in June 1792 during the Spanish expedition of Dionisio Alcalá Galiano and Cayetano Valdés. Their report read, in part:
In 1843, explorers reported a widespread layer of newly fallen rock fragments "like a snowfall" and that the forest was "on fire for miles around". These fires were unlikely to have been caused by ashfall, however, as charred material is not found with deposits of this fine-grained volcanic ash, which was almost certainly cooled in the atmosphere before falling. Rivers south of the volcano were reportedly clogged with ash, and Native Americans reported that many salmon perished. Reports of flooding on the Skagit River from the eruption are, however, probably greatly exaggerated. A short time later, two collapses of the east side of Sherman Crater produced two lahars, the first and larger of which flowed into the natural Baker Lake, increasing its level by at least . The location of the 19th-century lake is now covered by waters of the modern dam-impounded Baker Lake. Similar but lower-level hydrovolcanic activity at Sherman Crater continued intermittently for several decades afterward. On 26 November 1860, passengers who were traveling by steamer from New Westminster to Victoria reported that Mount Baker was "puffing out large volumes of smoke, which upon breaking, rolled down the snow-covered sides of the mountain, forming a pleasing effect of light and shade." In 1891, about of rock fell producing a lahar that traveled more than and covered .
Activity in the 20th century decreased from the 19th century. Numerous small debris avalanches fell from Sherman Peak and descended the Boulder Glacier; a large one occurred on July 27, 2007. In early March 1975, a dramatic increase in fumarolic activity and snow melt in the Sherman Crater area raised concern that an eruption might be imminent. Heat flow increased more than tenfold. Additional monitoring equipment was installed and several geophysical surveys were conducted to try to detect the movement of magma. The increased thermal activity prompted public officials and Puget Power to temporarily close public access to the popular Baker Lake recreation area and to lower the reservoir's water level by . If those actions had not been taken, significant avalanches of debris from the Sherman Crater area could have swept directly into the reservoir, triggering a disastrous wave that could have caused human fatalities and damage to the reservoir. Other than the increased heat flow, few anomalies were recorded during the geophysical surveys, nor were any other precursory activities observed that would indicate that magma was moving up into the volcano. Several small lahars formed from material ejected onto the surrounding glaciers and acidic water was discharged into Baker Lake for many months.
Activity gradually declined over the next two years, but stabilized at a higher level than before 1975. The increased level of fumarolic activity has continued at Mount Baker since 1975, but no other changes suggest that magma movement is involved.
A considerable amount of research has been done at Mount Baker over the past decade, and it is now among the most-studied of the Cascade volcanoes. Recent and ongoing projects include gravimetric and GPS-based geodetic monitoring, fumarole gas sampling, tephra distribution mapping, new interpretations of the Schriebers Meadow lava flow, and hazards analyses. Mapping of Carmelo and Sherman craters, and interpretations of the eruptive history, continue, as well. The Mount Baker Volcano Research Center maintains an online archive of abstracts of this work, and an extensive references list, as well as photos.
Eleven named glaciers descend from Mount Baker. Two additional glaciers (Hadley Glacier and Sholes Glacier) descend from lower slopes detached from the main glacial mass. The Coleman Glacier is the largest; it has a surface area of . The other large glaciers—which have areas greater than —are Roosevelt Glacier, Mazama Glacier, Park Glacier, Boulder Glacier, Easton Glacier, and Deming Glacier. All retreated during the first half of the 20th century, advanced from 1950 to 1975, and have been retreating with increasing rapidity since 1980.
Mount Baker is drained on the north by streams that flow into the North Fork Nooksack River, on the west by the Middle Fork Nooksack River, and on the southeast and east by tributaries of the Baker River. Lake Shannon and Baker Lake are the largest nearby bodies of water, formed by two dams on the Baker River.
Two ammunition ships of the United States Navy (traditionally named for volcanoes) have been named after the mountain. The first was USS "Mount Baker" (AE-4), which was commissioned from 1941 to 1947 and from 1951 to 1969. In 1972, the Navy commissioned USS "Mount Baker" (AE-34). She was decommissioned in 1996 and placed in service with the Military Sealift Command as USNS "Mount Baker" (T-AE-34). She was scrapped in 2012.
https://en.wikipedia.org/wiki?curid=20897
Strategic sealift ships
Strategic sealift ships are part of the United States Military Sealift Command's (MSC) prepositioning program. There are currently 49 ships in the program, strategically positioned around the world to support the Army, Navy, Air Force, Marine Corps and Defense Logistics Agency. Most are named after Medal of Honor recipients from the service they support. The ships are assigned to two Military Prepositioning Ship (MPS) squadrons located in the Indian Ocean at Diego Garcia and in the Western Pacific Ocean at Guam and Saipan.
The MPS ships in each squadron have sufficient equipment, supplies and ammunition to support a Marine Air-Ground Task Force for 30 days. The MPS ships are self-sustaining, with cranes to unload at sea or pierside. MSC chartered the first two ship classes in the MPS role (the "Corporal Louis J. Hauge Jr." and "Sergeant Matej Kocak" classes) from civilian shipping lines and converted them. Later ships were purpose-built.
The "Sergeant Matej Kocak" class, the second class of MPS ships chartered by MSC, was also lengthened amidships and gained a helicopter deck during conversion. These ships, delivered to MSC in the mid-1980s, were built at Sun Shipbuilding & Drydock Co., Chester, Pennsylvania, and converted at National Steel and Shipbuilding Company, San Diego. They were previously owned by Waterman Steamship Corporation but were recently sold to MSC and are now operated by Keystone Shipping Company.
The "2nd Lieutenant John P. Bobo"-class ships are new-construction ships delivered to MSC in the mid-1980s from General Dynamics' Quincy Shipbuilding Division, Quincy, Massachusetts. They were owned by American Overseas Marine (AMSEA) but have recently been sold to MSC and are now operated by Crowley Technical Management.
A class of LMSRs (large, medium-speed roll-on/roll-off ships) was built at National Steel and Shipbuilding Company in San Diego.
The following are part of the National Defense Reserve Fleet but have been activated and are pre-positioned.
Dedicated to USMC aviation logistics support
Named for Medal of Honor recipient Louis J. Hauge Jr., USMC, the "Corporal Louis J. Hauge Jr." class is the original class of MPS ships chartered by Military Sealift Command. The five ships are Maersk Line ships converted by Bethlehem Steel. During conversion, the ships were lengthened amidships and gained a helicopter landing pad, among other changes. They have since been returned to Maersk for commercial use and are no longer part of the MPS program.
https://en.wikipedia.org/wiki?curid=20898
Mathias Rust
Mathias Rust (born 1 June 1968) is a German aviator known for his illegal flight that ended with a landing near Red Square in Moscow on 28 May 1987. An amateur pilot, the then-teenager flew from Helsinki, Finland, to Moscow, being tracked several times by Soviet air defence and interceptor aircraft. The Soviet fighters did not receive permission to shoot him down, and his aeroplane was mistaken for a friendly aircraft several times. He landed on Bolshoy Moskvoretsky Bridge, next to Red Square near the Kremlin in the capital of the Soviet Union.
Rust said he wanted to create an "imaginary bridge" to the East, and that his flight was intended to reduce tension and suspicion between the two Cold War sides.
Rust's flight through a supposedly impenetrable air defence system had a great effect on the Soviet military and led to the dismissal of many senior officers, including Minister of Defence Marshal of the Soviet Union Sergei Sokolov and the Commander-in-Chief of the Soviet Air Defence Forces, former World War II fighter ace pilot Chief Marshal Alexander Koldunov. The incident aided Mikhail Gorbachev in the implementation of his reforms, by allowing him to dismiss numerous military officials opposed to his policies. Rust was sentenced to four years in prison for violation of border crossing and air traffic regulations, and for provoking an emergency situation upon his landing. After 14 months in prison, he was pardoned by Andrei Gromyko, the Chairman of the Presidium of the Supreme Soviet, and released.
Rust, aged 18, was an inexperienced pilot, with about 50 hours of flying experience at the time of his flight. On 13 May 1987, Rust left Uetersen near Hamburg and his home town Wedel in his rented Reims Cessna F172P D-ECJB, which was modified by removing some of the seats and replacing them with auxiliary fuel tanks. He spent the next two weeks travelling across northern Europe, visiting the Faroe Islands, spending a week in Iceland, and then visiting Bergen on his way back. He was later quoted as saying that he had the idea of attempting to reach Moscow even before the departure, and he saw the trip to Iceland (where he visited Hofdi House, the site of unsuccessful talks between the United States and the Soviet Union in October 1986) as a way to test his piloting skills.
On 28 May 1987, Rust refuelled at Helsinki-Malmi Airport. He told air traffic control that he was going to Stockholm, and took off at 12:21. Immediately after his final communication with traffic control, he turned his plane to the east near Nummela. Air traffic controllers tried to contact him as he was moving around the busy Helsinki–Moscow route, but Rust turned off all his communications equipment.
Rust disappeared from the Finnish air traffic radar near Espoo. Control personnel presumed an emergency and a rescue effort was organized, including a Finnish Border Guard patrol boat. They found an oil patch near Sipoo where Rust had disappeared from radar observation, and conducted an underwater search but did not find anything.
Rust crossed the Baltic coastline over Estonia and turned towards Moscow. At 14:29 he appeared on Soviet Air Defence Forces (PVO) radar and, after failure to reply to an IFF signal, was assigned combat number 8255. Three SAM battalions of 54th Air Defence Corps tracked him for some time, but failed to obtain permission to launch missiles at him. All air defences were brought to readiness and two interceptors were sent to investigate. At 14:48, near Gdov, MiG-23 pilot Senior Lieutenant A. Puchnin observed a white sport plane similar to a Yakovlev Yak-12 and asked for permission to engage, but was denied.
The fighters lost contact with Rust soon after this. While they were being directed back to him, he disappeared from radar near Staraya Russa. West German magazine "Bunte" speculated that he might have landed there for some time, noting that he changed his clothes during his flight and that he took too much time to fly to Moscow considering his plane's speed and the weather conditions.
Air defence re-established contact with Rust's plane several times but confusion followed all of these events. The PVO system had shortly before been divided into several districts, which simplified management but created additional overhead for tracking officers at the districts' borders. The local air regiment near Pskov was on maneuvers and, due to inexperienced pilots' tendency to forget correct IFF designator settings, local control officers assigned all traffic in the area friendly status, including Rust.
Near Torzhok there was a similar situation: increased air traffic had been created by a rescue effort for an air crash the previous day. Rust, flying a slow propeller-driven aircraft, was confused with one of the helicopters taking part in the rescue. He was spotted several more times and twice given false friendly recognition. His aircraft was eventually dismissed as a domestic training plane defying regulations, and was assigned the lowest priority by air defence.
Around 19:00, Rust appeared above Moscow. He had initially intended to land inside the Kremlin, but he reasoned that landing there, hidden by the Kremlin walls, would have allowed the KGB to arrest him and deny the incident. He therefore changed his landing spot to Red Square. Heavy pedestrian traffic did not allow him to land there either, so after circling the square once more, he was able to land on Bolshoy Moskvoretsky Bridge by St. Basil's Cathedral. A later inquiry found that trolleybus wires normally strung over the bridge—which would have prevented his landing there—had been removed for maintenance that morning, and were replaced the next day. After taxiing past the cathedral, he stopped about from the square, where he was greeted by curious passersby and asked for autographs. When asked where he was from, he replied "Germany", making the bystanders think he was from East Germany; but when he said West Germany, they were surprised. A British doctor videotaped Rust circling over Red Square and landing on the bridge. Rust was arrested two hours later.
Rust's trial began in Moscow on 2 September 1987. He was sentenced to four years in a general-regime labour camp for hooliganism, for disregard of aviation laws, and for breaching the Soviet border. He was never transferred to a labour camp, and instead served his time at the high-security Lefortovo temporary detention facility in Moscow. Two months later, U.S. President Ronald Reagan and Gorbachev agreed to sign a treaty to eliminate intermediate-range nuclear weapons in Europe, and the Supreme Soviet ordered Rust to be released in August 1988 as a goodwill gesture to the West.
Rust's return to Germany on 3 August 1988 was accompanied by huge media attention, but he did not talk to the assembled journalists; his family had sold the exclusive rights to the story to the German magazine "Stern" for DM100,000. He reported that he had been treated well in the Soviet prison. Journalists described him as "psychologically unstable and unworldly in a dangerous manner".
William E. Odom, former director of the U.S. National Security Agency and author of "The Collapse of the Soviet Military", says that Rust's flight irreparably damaged the reputation of the Soviet military. This enabled Gorbachev to remove many of the strongest opponents to his reforms. Minister of Defence Sergei Sokolov and the head of the Soviet Air Defence Forces Alexander Koldunov were dismissed along with hundreds of other officers. This was the biggest turnover in the Soviet military since Stalin's purges 50 years earlier.
Rust's rented Reims Cessna F172P (serial #F17202087), registered "D-ECJB", was sold to Japan where it was exhibited for several years. In 2008 it was returned to Germany and was placed in the Deutsches Technikmuseum in Berlin.
While doing his obligatory community service ("Zivildienst") in a West German hospital in 1989, Rust stabbed a female co-worker who had rejected him. The victim barely survived. He was convicted of injuring her and sentenced to two and a half years in prison, but was released after 15 months. Since then he has lived a fragmented life, describing himself as a "bit of an oddball". After his release, he converted to Hinduism in 1996 to become engaged to the daughter of an Indian tea merchant. In 2001, he was convicted of stealing a cashmere pullover and ordered to pay a fine of DM10,000, which was later reduced to DM600. A further brush with the law came in 2005, when he was convicted of fraud and had to pay €1,500 for stolen goods. In 2009, Rust described himself as a professional poker player. Most recently, in 2012, he described himself as an analyst at a Zurich-based investment bank.
In October 2015, "The Hindu" published an interview with Mathias Rust to mark the 25th anniversary of German reunification. Mathias Rust surmised that institutional failures in Western countries to preserve moral standards and uphold the primacy of democratic ideals was creating mistrust between peoples and governments. Pointing to the genesis of a New Cold War between Russia and the Western powers, Mathias Rust suggested that India should tread with caution and avoid entanglement: "India will be better served if it follows a policy of neutrality while interacting with EU member countries as the big European powers at present are following the foreign policy of the U.S. unquestioningly". He claimed: "Governments have been dominated by the corporate entities and citizens have ceased to matter in public policy".
Because Rust's flight seemed like a blow to the authority of the Soviet regime, it was the source of numerous jokes and urban legends. For a while after the incident, Red Square was jokingly referred to by Muscovites as Sheremetyevo-3 (Sheremetyevo-1 and -2 being the two terminals at Moscow's main international airport). At the end of 1987, the police radio code used by law enforcement officers in Moscow was allegedly updated to include a code for an aircraft landing.
Following the 20th anniversary of his flight on 28 May 2007, the international media interviewed Rust about the flight and its aftermath.
"The Washington Post" and "Bild" both have online editions of their interviews. The most comprehensive televised interview available online is produced by the Danish Broadcasting Corporation. In their interview "Rust in Red Square", recorded in May 2007, Rust gives a full account of the flight in English.
https://en.wikipedia.org/wiki?curid=20900
Malware
Malware (a portmanteau of "malicious software") is any software intentionally designed to cause damage to a computer, server, client, or computer network (by contrast, software that causes "unintentional" harm due to some deficiency is typically described as a software bug). A wide variety of types of malware exist, including computer viruses, worms, Trojan horses, ransomware, spyware, adware, rogue software, and scareware.
Programs are also considered malware if they secretly act against the interests of the computer user. For example, at one point Sony BMG music compact discs silently installed a rootkit on purchasers' computers that was intended to prevent illicit copying, but that also reported on users' listening habits and unintentionally created extra security vulnerabilities.
A range of antivirus software, firewalls and other strategies are used to help protect against the introduction of malware, to help detect it if it is already present, and to recover from malware-associated malicious activity and attacks.
Many early infectious programs, including the first Internet Worm, were written as experiments or pranks. Today, malware is used by both black hat hackers and governments, to steal personal, financial, or business information.
Malware is sometimes used broadly against government or corporate websites to gather guarded information, or to disrupt their operation in general. However, malware can be used against individuals to gain information such as personal identification numbers or details, bank or credit card numbers, and passwords.
Since the rise of widespread broadband Internet access, malicious software has more frequently been designed for profit. Since 2003, the majority of widespread viruses and worms have been designed to take control of users' computers for illicit purposes. Infected "zombie computers" can be used to send email spam, to host contraband data such as child pornography, or to engage in distributed denial-of-service attacks as a form of extortion.
Programs designed to monitor users' web browsing, display unsolicited advertisements, or redirect affiliate marketing revenues are called spyware. Spyware programs do not spread like viruses; instead they are generally installed by exploiting security holes. They can also be hidden and packaged together with unrelated user-installed software. The Sony BMG rootkit was intended to prevent illicit copying, but it also reported on users' listening habits and unintentionally created extra security vulnerabilities.
Ransomware affects an infected computer system in some way and demands payment to bring it back to its normal state. There are two main variations: locker ransomware, which simply locks down a computer system without encrypting its contents, and crypto ransomware, which locks down the system and encrypts its contents. For example, programs such as CryptoLocker encrypt files securely and only decrypt them on payment of a substantial sum of money.
Some malware is used to generate money by click fraud, making it appear that the computer user has clicked an advertising link on a site, generating a payment from the advertiser. It was estimated in 2012 that about 60 to 70% of all active malware used some kind of click fraud, and 22% of all ad-clicks were fraudulent.
In addition to criminal money-making, malware can be used for sabotage, often for political motives. Stuxnet, for example, was designed to disrupt very specific industrial equipment. There have been politically motivated attacks which spread over and shut down large computer networks, including massive deletion of files and corruption of master boot records, described as "computer killing." Such attacks were made on Sony Pictures Entertainment (25 November 2014, using malware known as Shamoon or W32.Disttrack) and Saudi Aramco (August 2012).
The best-known types of malware, viruses and worms, are known for the manner in which they spread, rather than any specific types of behavior. A computer virus is software that embeds itself in some other executable software (including the operating system itself) on the target system without the user's knowledge and consent; when it is run, the virus spreads to other executables. By contrast, a "worm" is stand-alone malware that "actively" transmits itself over a network to infect other computers. These definitions lead to the observation that a virus requires the user to run infected software or an infected operating system for the virus to spread, whereas a worm spreads itself.
These categories are not mutually exclusive, so malware may use multiple techniques. This section only applies to malware designed to operate undetected, not sabotage and ransomware.
A computer virus is software usually hidden within another seemingly innocuous program that can produce copies of itself and insert them into other programs or files, and that usually performs a harmful action (such as destroying data). An example of this is a PE infection, a technique, usually used to spread malware, that inserts extra data or executable code into PE files.
Lock-screens, or screen lockers, are a type of "cyber police" ransomware that blocks the screen on Windows or Android devices with a false accusation of harvesting illegal content, trying to scare victims into paying a fee.
Jisut and SLocker impact Android devices more than other lock-screens, with Jisut making up nearly 60 percent of all Android ransomware detections.
A Trojan horse is a harmful program that misrepresents itself to masquerade as a regular, benign program or utility in order to persuade a victim to install it. A Trojan horse usually carries a hidden destructive function that is activated when the application is started. The term is derived from the Ancient Greek story of the Trojan horse used to invade the city of Troy by stealth.
Trojan horses are generally spread by some form of social engineering, for example, where a user is duped into executing an e-mail attachment disguised to be unsuspicious, (e.g., a routine form to be filled in), or by drive-by download. Although their payload can be anything, many modern forms act as a backdoor, contacting a controller which can then have unauthorized access to the affected computer. While Trojan horses and backdoors are not easily detectable by themselves, computers may appear to run slower due to heavy processor or network usage.
Unlike computer viruses and worms, Trojan horses generally do not attempt to inject themselves into other files or otherwise propagate themselves.
In spring 2017, Mac users were hit by a new version of the Proton remote access trojan (RAT) trained to extract password data from various sources, such as browser auto-fill data, the macOS keychain, and password vaults.
Once malicious software is installed on a system, it is essential that it stays concealed, to avoid detection. Software packages known as "rootkits" allow this concealment, by modifying the host's operating system so that the malware is hidden from the user. Rootkits can prevent a harmful process from being visible in the system's list of processes, or keep its files from being read.
Some types of harmful software contain routines to evade identification and/or removal attempts, not merely to hide themselves. An early example of this behavior is recorded in the Jargon File tale of a pair of programs infesting a Xerox CP-V time sharing system:
A backdoor is a method of bypassing normal authentication procedures, usually over a connection to a network such as the Internet. Once a system has been compromised, one or more backdoors may be installed in order to allow access in the future, invisibly to the user.
The idea has often been suggested that computer manufacturers preinstall backdoors on their systems to provide technical support for customers, but this has never been reliably verified. It was reported in 2014 that US government agencies had been diverting computers purchased by those considered "targets" to secret workshops where software or hardware permitting remote access by the agency was installed, considered to be among the most productive operations to obtain access to networks around the world. Backdoors may be installed by Trojan horses, worms, implants, or other methods.
Since the beginning of 2015, a sizable portion of malware has been utilizing a combination of many techniques designed to avoid detection and analysis, from the most common to the least common:
An increasingly common technique (2015) is adware that uses stolen certificates to disable anti-malware and virus protection; technical remedies are available to deal with the adware.
Nowadays, one of the most sophisticated and stealthy ways of evasion is to use information hiding techniques, namely stegomalware. A survey on stegomalware was published by Cabaj et al. in 2018.
Another type of evasion technique is fileless malware, or Advanced Volatile Threats (AVTs). Fileless malware does not require a file to operate. It runs within memory and utilizes existing system tools to carry out malicious acts. Because there are no files on the system, there are no executable files for antivirus and forensic tools to analyze, making such malware nearly impossible to detect. The only way to detect fileless malware is to catch it operating in real time. Recently, these types of attacks have become more frequent, with a 432% increase in 2017; they made up 35% of attacks in 2018. Such attacks are not easy to perform but are becoming more prevalent with the help of exploit kits.
Malware exploits security defects (security bugs or vulnerabilities) in the design of the operating system, in applications (such as browsers, e.g. older versions of Microsoft Internet Explorer supported by Windows XP), or in vulnerable versions of browser plugins such as Adobe Flash Player, Adobe Acrobat or Reader, or Java SE. Sometimes even installing new versions of such plugins does not automatically uninstall old versions. Security advisories from plug-in providers announce security-related updates. Common vulnerabilities are assigned CVE IDs and listed in the US National Vulnerability Database. Secunia PSI is an example of software, free for personal use, that will check a PC for vulnerable out-of-date software, and attempt to update it.
Malware authors target bugs, or loopholes, to exploit. A common method is exploitation of a buffer overrun vulnerability, where software designed to store data in a specified region of memory does not prevent more data than the buffer can accommodate being supplied. Malware may provide data that overflows the buffer, with malicious executable code or data after the end; when this payload is accessed it does what the attacker, not the legitimate software, determines.
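The mechanics of a buffer overrun can be illustrated with a small, purely educational simulation (not real exploit code): a fixed-size buffer sits next to other data in simulated "memory", and an unchecked copy of an oversized input overwrites the neighbouring value, which here stands in for a saved return address. All names are illustrative.

```python
# Educational simulation of a buffer overrun: "memory" is a plain list in
# which an 8-slot buffer is immediately followed by a slot standing in
# for a saved return address.

BUFFER_SIZE = 8

def unchecked_copy(memory, data):
    """Copies data into the buffer with no bounds check (the bug)."""
    for i, byte in enumerate(data):
        memory[i] = byte          # may write past index BUFFER_SIZE - 1

def checked_copy(memory, data):
    """Copies at most BUFFER_SIZE items (the fix)."""
    for i, byte in enumerate(data[:BUFFER_SIZE]):
        memory[i] = byte

def make_memory():
    # buffer (8 slots) followed by the "return address" slot
    return [0] * BUFFER_SIZE + ["RETURN_ADDR"]

oversized = list(b"AAAAAAAA") + ["ATTACKER_ADDR"]   # 9 items, one too many

mem = make_memory()
unchecked_copy(mem, oversized)
print(mem[BUFFER_SIZE])   # prints ATTACKER_ADDR: the slot was overwritten

mem = make_memory()
checked_copy(mem, oversized)
print(mem[BUFFER_SIZE])   # prints RETURN_ADDR: the bounds check held
```

In a real exploit the overwritten value would redirect execution to attacker-supplied code; the defence is the same as in the fixed version above, namely never copying more data than the destination can hold.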
Early PCs had to be booted from floppy disks. When built-in hard drives became common, the operating system was normally started from them, but it was possible to boot from another boot device if available, such as a floppy disk, CD-ROM, DVD-ROM, USB flash drive or network. It was common to configure the computer to boot from one of these devices when available. Normally none would be available; the user would intentionally insert, say, a CD into the optical drive to boot the computer in some special way, for example, to install an operating system. Even without booting, computers can be configured to execute software on some media as soon as they become available, e.g. to autorun a CD or USB device when inserted.
Malware distributors would trick the user into booting or running from an infected device or medium. For example, a virus could make an infected computer add autorunnable code to any USB stick plugged into it. Anyone who then attached the stick to another computer set to autorun from USB would in turn become infected, and also pass on the infection in the same way. More generally, any device that plugs into a USB port - even lights, fans, speakers, toys, or peripherals such as a digital microscope - can be used to spread malware. Devices can be infected during manufacturing or supply if quality control is inadequate.
This form of infection can largely be avoided by setting up computers by default to boot from the internal hard drive, if available, and not to autorun from devices. Intentional booting from another device is always possible by pressing certain keys during boot.
Older email software would automatically open HTML email containing potentially malicious JavaScript code. Users may also execute disguised malicious email attachments. The "2018 Data Breach Investigations Report" by Verizon, cited by CSO Online, states that emails are the primary method of malware delivery, accounting for 92% of malware delivery around the world.
In computing, privilege refers to how much a user or program is allowed to modify a system. In poorly designed computer systems, both users and programs can be assigned more privileges than they should have, and malware can take advantage of this in two ways: through overprivileged users and through overprivileged code.
Some systems allow all users to modify their internal structures, and such users today would be considered over-privileged users. This was the standard operating procedure for early microcomputer and home computer systems, where there was no distinction between an "administrator" or "root", and a regular user of the system. In some systems, non-administrator users are over-privileged by design, in the sense that they are allowed to modify internal structures of the system. In some environments, users are over-privileged because they have been inappropriately granted administrator or equivalent status.
Some systems allow code executed by a user to access all rights of that user, which is known as over-privileged code. This was also standard operating procedure for early microcomputer and home computer systems. Malware, running as over-privileged code, can use this privilege to subvert the system. Almost all currently popular operating systems, and also many scripting applications, grant code too many privileges, usually in the sense that when a user executes code, the system allows that code all rights of that user. This makes users vulnerable to malware in the form of e-mail attachments, which may or may not be disguised.
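The over-privileged-code problem can be shown in miniature within a single language: `eval` runs attacker-supplied text with all the rights of the calling user, while a deliberately restricted evaluator such as Python's `ast.literal_eval` accepts only data literals, an application of least privilege at the language level. The payload string is a harmless stand-in for attacker input.

```python
import ast

untrusted = "__import__('os').getpid()"   # stands in for attacker input

# Over-privileged: eval executes arbitrary code with the user's full rights.
print(eval(untrusted) > 0)   # the attacker's expression actually ran

# Least privilege: literal_eval accepts only Python literals, so the
# same payload is rejected instead of executed.
try:
    ast.literal_eval(untrusted)
    print("executed")
except ValueError:
    print("rejected")
```

The same principle applies at the operating-system level: code should run with the minimum rights needed for its task, so that a compromise yields as little as possible.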
As malware attacks become more frequent, attention has begun to shift from viruses and spyware protection, to malware protection, and programs that have been specifically developed to combat malware. (Other preventive and recovery measures, such as backup and recovery methods, are mentioned in the computer virus article).
A specific component of anti-virus and anti-malware software, commonly referred to as an on-access or real-time scanner, hooks deep into the operating system's core or kernel and functions in a manner similar to how certain malware itself would attempt to operate, though with the user's informed permission for protecting the system. Any time the operating system accesses a file, the on-access scanner checks if the file is a 'legitimate' file or not. If the file is identified as malware by the scanner, the access operation will be stopped, the file will be dealt with by the scanner in a pre-defined way (how the anti-virus program was configured during/post installation), and the user will be notified. This may have a considerable performance impact on the operating system, though the degree of impact is dependent on how well the scanner was programmed. The goal is to stop any operations the malware may attempt on the system before they occur, including activities which might exploit bugs or trigger unexpected operating system behavior.
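The core check an on-access scanner performs can be sketched as a signature lookup: before a file's contents are handed to the caller, they are hashed and compared against a database of known-malware digests. This is a toy sketch — the signature value is made up, and a real scanner hooks the kernel's file-access path and uses far richer signatures and heuristics rather than wrapping a read function.

```python
import hashlib

# Hypothetical signature database: SHA-256 digests of known-bad content.
KNOWN_BAD_SHA256 = {
    hashlib.sha256(b"pretend-malware-sample").hexdigest(),
}

def scan_bytes(data: bytes) -> bool:
    """Returns True if the content matches a known-malware signature."""
    return hashlib.sha256(data).hexdigest() in KNOWN_BAD_SHA256

def guarded_read(path: str) -> bytes:
    """Simplified on-access check: scan a file's bytes before returning them."""
    with open(path, "rb") as f:
        data = f.read()
    if scan_bytes(data):
        raise PermissionError(f"{path}: access blocked (known signature)")
    return data

print(scan_bytes(b"pretend-malware-sample"))   # True  -> access would be blocked
print(scan_bytes(b"harmless document"))        # False -> access proceeds
```

Exact-hash matching is the weakest form of signature (any one-byte change defeats it), which is why production scanners add fuzzy signatures and behavioural heuristics on top.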
Anti-malware programs can combat malware in two ways:
Real-time protection from malware works identically to real-time antivirus protection: the software scans disk files at download time, and blocks the activity of components known to represent malware. In some cases, it may also intercept attempts to install start-up items or to modify browser settings. Because many malware components are installed as a result of browser exploits or user error, using security software (some of which are anti-malware, though many are not) to "sandbox" browsers (essentially isolate the browser from the computer and hence any malware induced change) can also be effective in helping to restrict any damage done.
Examples of Microsoft Windows antivirus and anti-malware software include the optional Microsoft Security Essentials (for Windows XP, Vista, and Windows 7) for real-time protection, the Windows Malicious Software Removal Tool (now included with Windows (Security) Updates on "Patch Tuesday", the second Tuesday of each month), and Windows Defender (an optional download in the case of Windows XP, incorporating MSE functionality in the case of Windows 8 and later). Additionally, several capable antivirus software programs are available for free download from the Internet (usually restricted to non-commercial use). Tests found some free programs to be competitive with commercial ones. Microsoft's System File Checker can be used to check for and repair corrupted system files.
Some viruses disable System Restore and other important Windows tools such as Task Manager and Command Prompt. Many such viruses can be removed by rebooting the computer, entering Windows safe mode with networking, and then using system tools or Microsoft Safety Scanner.
Hardware implants can be of any type, so there can be no general way to detect them.
As malware also harms the compromised websites (by damaging their reputation, getting them blacklisted in search engines, etc.), some websites offer vulnerability scanning.
Such scans check the website, detect malware, may note outdated software, and may report known security issues.
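As an illustrative sketch only (the patterns and version banners below are hypothetical examples, not a real scanner's threat feed), such a scan might match fetched page content against known-malicious markers and outdated-software banners:

```python
import re

# Hypothetical indicators for demonstration purposes only.
MALWARE_PATTERNS = [
    re.compile(r"eval\(unescape\("),                      # obfuscated JS injection
    re.compile(r"<iframe[^>]+display\s*:\s*none", re.I),  # hidden iframe
]
OUTDATED_BANNERS = {
    "WordPress 4.": "WordPress 4.x is past end-of-life; upgrade",
}

def scan_page(html):
    """Return a list of findings for one page of fetched HTML."""
    findings = []
    for pat in MALWARE_PATTERNS:
        if pat.search(html):
            findings.append(f"possible malware: matched {pat.pattern!r}")
    for banner, advice in OUTDATED_BANNERS.items():
        if banner in html:
            findings.append(f"outdated software: {advice}")
    return findings
```

A real service would crawl many pages, compare software versions against vulnerability databases, and check blacklist status, but the report structure is similar.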
As a last resort, computers can be protected from malware, and infected computers can be prevented from disseminating trusted information, by imposing an "air gap" (i.e. completely disconnecting them from all other networks). However, malware can still cross the air gap in some situations. For example, removable media can carry malware across the gap.
"AirHopper", "BitWhisper", "GSMem" and "Fansmitter" are four techniques introduced by researchers that can leak data from air-gapped computers using electromagnetic, thermal and acoustic emissions.
Grayware is a term applied to unwanted applications or files that are not classified as malware, but can worsen the performance of computers and may cause security risks.
It describes applications that behave in an annoying or undesirable manner, and yet are less serious or troublesome than malware. Grayware encompasses spyware, adware, fraudulent dialers, joke programs, remote access tools and other unwanted programs that may harm the performance of computers or cause inconvenience. The term came into use around 2004.
Another term, potentially unwanted program (PUP) or potentially unwanted application (PUA), refers to applications that would be considered unwanted despite often having been downloaded by the user, possibly after failing to read a download agreement. PUPs include spyware, adware, and fraudulent dialers. Many security products classify unauthorised key generators as grayware, although they frequently carry true malware in addition to their ostensible purpose.
Software maker Malwarebytes lists several criteria for classifying a program as a PUP. Some types of adware (using stolen certificates) turn off anti-malware and virus protection; technical remedies are available.
Before Internet access became widespread, viruses spread on personal computers by infecting executable programs or boot sectors of floppy disks. By inserting a copy of itself into the machine code instructions in these programs or boot sectors, a virus causes itself to be run whenever the program is run or the disk is booted. Early computer viruses were written for the Apple II and Macintosh, but they became more widespread with the dominance of the IBM PC and MS-DOS. The first IBM PC virus in the "wild" was a boot sector virus dubbed (c)Brain, created in 1986 by the Farooq Alvi brothers in Pakistan. Executable-infecting viruses depended on users exchanging software or bootable floppies and thumb drives, so they spread rapidly in computer hobbyist circles.
The first worms, network-borne infectious programs, originated not on personal computers, but on multitasking Unix systems. The first well-known worm was the Internet Worm of 1988, which infected SunOS and VAX BSD systems. Unlike a virus, this worm did not insert itself into other programs. Instead, it exploited security holes (vulnerabilities) in network server programs and started itself running as a separate process. This same behavior is used by today's worms as well.
With the rise of the Microsoft Windows platform in the 1990s, and the flexible macros of its applications, it became possible to write infectious code in the macro language of Microsoft Word and similar programs. These "macro viruses" infect documents and templates rather than applications (executables), but rely on the fact that macros in a Word document are a form of executable code.
The notion of a self-reproducing computer program can be traced back to initial theories about the operation of complex automata. John von Neumann showed that in theory a program could reproduce itself. This constituted a plausibility result in computability theory. Fred Cohen experimented with computer viruses, confirmed von Neumann's postulate, and investigated other properties of malware such as detectability and self-obfuscation using rudimentary encryption. His 1987 doctoral dissertation was on the subject of computer viruses. The use of cryptographic technology as part of a virus's payload, exploiting it for attack purposes, was initiated and investigated from the mid-1990s, and includes initial ransomware and evasion ideas.
https://en.wikipedia.org/wiki?curid=20901
Muttiah Muralitharan
Deshabandu Muttiah Muralitharan (also spelt Muralidaran; born 17 April 1972) is a Sri Lankan cricket coach and former cricketer. Averaging over six wickets per Test, Muralitharan is one of the most successful bowlers ever.
Muralitharan's career was beset by controversy over his bowling action for much of his international career. Due to an unusual hyperextension of his congenitally bent arm during delivery, his bowling action was called into question on a number of occasions by umpires and sections of the cricket community. After biomechanical analysis under simulated playing conditions, Muralitharan's action was cleared by the International Cricket Council, first in 1996 and again in 1999.
Muralitharan held the number one spot in the International Cricket Council's player rankings for Test bowlers for a record period of 1,711 days spanning 214 Test matches. Muralitharan holds the world record for the most wickets in both Test and one-day cricket. He became the highest wicket-taker in Test cricket when he overtook the previous record-holder Shane Warne on 3 December 2007. Muralitharan had previously held the record when he surpassed Courtney Walsh's 519 wickets in 2004, but he suffered a shoulder injury later that year and was overtaken by Warne. Muralitharan took the wicket of Gautam Gambhir on 5 February 2009 in Colombo to surpass Wasim Akram's ODI record of 502 wickets. He retired from Test cricket in 2010, registering his 800th and final wicket on 22 July 2010 from his final ball in his last Test match.
Muralitharan was rated the greatest Test match bowler ever by "Wisden Cricketers' Almanack" in 2002. In 2017, he became the only Sri Lankan to be inducted into the ICC Cricket Hall of Fame. He won the Ada Derana Sri Lankan of the Year in 2017. He was the sixth international franchise player signed to the Caribbean Premier League and the first Sri Lankan player to be named to the new Twenty20 tournament.
Muralitharan was born on 17 April 1972 to a Hill Country Tamil Hindu family in Kandy, Sri Lanka, the eldest of the four sons of Sinnasamy Muttiah and Lakshmi. His father runs a successful biscuit-making business. Muralitharan's paternal grandfather Periyasamy Sinasamy came from South India to work in the tea plantations of central Sri Lanka in 1920. Sinasamy later returned to the country of his birth with his daughters and settled in Tiruchirapalli, Tamil Nadu, India. However, his sons, including Muralitharan's father Muttiah, remained in Sri Lanka.
When he was nine years old, Muralitharan was sent to St. Anthony's College, Kandy, a private school run by Benedictine monks. He began his cricketing career as a medium pace bowler but on the advice of his school coach, Sunil Fernando, he took up off-spin when he was fourteen years old. He soon impressed and went on to play for four years in the school First XI. In those days he played as an all-rounder and batted in the middle order. In his final two seasons at St Anthony's College he took over one hundred wickets and in 1990/1 was named as the 'Bata Schoolboy Cricketer of the Year'.
After leaving school he joined Tamil Union Cricket and Athletic Club and was selected for the Sri Lanka A tour of England in 1991. He played in five games but failed to capture a single wicket. On his return to Sri Lanka he impressed against Allan Border's Australian team in a practice game and then went on to make his Test debut at R. Premadasa Stadium in the Second Test match of the series.
When his grandfather died at the age of 104 in July 2004, Muralitharan returned home from a tour of India to attend his funeral. Periyasamy Sinasamy's wish to see Muralitharan claim the world record for the most Test wickets had been realised (when he passed the record set by Courtney Walsh), but not his desire to live to see his grandson married. Muralitharan's grandmother had died one month earlier at the age of 97. Muralitharan's manager, Kushil Gunasekera, stated that "Murali's family is closely knit and united. They respect traditional values. The late grandfather enjoyed a great relationship with Murali."
Muralitharan married Madhimalar Ramamurthy, a Chennai native, in 2005. Madhimalar is the daughter of the late Dr S. Ramamurthy of Malar Hospitals and his wife Dr Nithya Ramamurthy. Their first child, Naren, was born in January 2006.
Muttiah Muralitharan holds Overseas Citizenship of India (OCI) and he does not need a visa for travelling to India. According to his manager, Kushil Gunasekera, Muralitharan qualifies for this status because his family originates from India. Muttiah announced on 3 April 2011 that he was retiring from all sport.
Even though his name was widely romanised as Muralitharan from the start of his career, he prefers the spelling Muralidaran. The different spellings have arisen because the Tamil letter த can be pronounced as both 't' and 'd' depending on its place in a word. It is often transliterated as 'th' to distinguish it from another letter, ட, which is a retroflex 't' or 'd'. In 2007, when Cricket Australia decided to unveil the new Warne-Muralidaran Trophy, to be contested between Australia and Sri Lanka, Muralitharan was requested to clarify how his name should be spelt. Cricket Australia spokesman Peter Young confirmed that "the spelling he's given is Muralidaran".
The first-day cover involving Muralitharan bears an official seal captioned as "The highest wicket taker in Test cricket, MUTHIAH MURALIDARAN, First Day of Issue 03.12.2007, Camp Post Office, Asgiriya International Cricket Stadium, Kandy".
The name Muralitharan is derived from "murali dhar" (Devanagari: मुरली धर), meaning "the bearer of the flute", a synonym for Lord Krishna, a deity in Hinduism who is said to play upon his bamboo flute while looking after cattle. Variations of this Sanskrit name, spelt 'Muralidharan' and 'Muraleedharan', are common among Tamil and Malayali Hindus.
In domestic cricket, Muralitharan played for two first-class Sri Lankan sides, Tamil Union Cricket and Athletic Club in the Premier Trophy and Central Province in the Provincial Championship. His record is exceptional – 234 wickets at 14.51 runs in 46 matches.
He also played county cricket in England, mainly for Lancashire (1999, 2001, 2005 and 2007) where he appeared in twenty-eight first-class games for the club. He played five first class games for Kent during the 2003 season. His bowling record in English domestic cricket is also exceptional – 236 wickets at 15.62 runs in 33 matches. Despite his efforts, he was never on a title winning first-class domestic team in either the Premier Trophy or the County Championship. He was unusual amongst his contemporaries in that he played in more Test matches than other first-class games (116 Tests and 99 other first class matches as of 2007). Muralitharan had been signed by Gloucestershire in 2011 to play in T20 matches, and he also renewed his T20 contract with Gloucestershire in 2012, but did not stay for the 2013 season.
Muralitharan was contracted to represent Bengal in the 2008–09 Ranji Trophy tournament. He was expected to play about four matches in the tournament's second division – the Plate League.
In February 2008, Muralitharan was slated to play Twenty20 cricket for the Chennai Super Kings in the Indian Premier League (IPL). He was bought for $600,000 by India Cements, the Chennai franchisee of the IPL, through a bidding process. The Chennai Super Kings were the runners up in the inaugural edition of the IPL, losing to the Rajasthan Royals in the final. Muralitharan captured 11 wickets in 15 games, at an economy rate of 6.96 an over. In 2010, in the third season of IPL, Muralitharan was part of the Chennai Super Kings side that won the IPL championship. Muralitharan also remained the side's leading wicket-taker after all the three tournaments.
At the 2011 IPL Player Auction Muralitharan was bought by Kochi Tuskers Kerala for US$1.1 million.
In the 2012 season Muralitharan moved to Royal Challengers Bangalore where he took 14 wickets in 9 games and had an average economy rate of 6.38. He played for Royal Challengers Bangalore from 2012 to 2014 and he later decided to retire from the IPL in 2014.
In 2015, Muralitharan was appointed as the bowling coach and mentor of the IPL team Sunrisers Hyderabad.
Muttiah Muralitharan signed for the Melbourne Renegades to play Twenty20 cricket in the Big Bash League, in 2012. He stated, "I wanted to play one season in Australia and the opportunity from the Melbourne Renegades was there so I took it with both hands."
Muralitharan is the first wrist-spinning off-spinner in the history of the game.
He bowls marathon spells, yet he is usually on the attack. His unique bowling action begins with a short run-up and culminates in an open-chested, extremely wristy release from a partly supinated forearm, which led Allan Border to mistake him for a leg-spinner early in his career. Aside from his stock delivery, the off-break, of which he claimed to have two variations (during a recorded television 'doosra' demonstration with Mark Nicholas on Channel 4 in 2004), his main deliveries are a fast topspinner, which lands on the seam and usually goes straight on, and the doosra, a surprise delivery which turns from leg to off (the opposite direction of his stock delivery) with no easily discernible change of action. Additionally, he would occasionally use one of his several unnamed novelties. His super-flexible wrist makes him especially potent and guarantees him turn on any surface.
From his debut in 1992, Muralitharan took 800 Test wickets and over 500 One Day International wickets, becoming the first player to take 1,000 wickets combined in the two main forms of international cricket.
On 28 August 1992 at the age of 20, Muralitharan made his debut against Australia at the Khettarama Stadium and claimed 3 for 141. Craig McDermott was his first Test wicket. His freakish action and his angular run-up showed that this was no run-of-the-mill spinner. During his first Test, there was one dismissal which convinced many of Muralitharan's special powers. Tom Moody's leg-stump was dislodged when he shouldered arms to a delivery that pitched at least two feet outside the off-stump.
The youthful Muralitharan went from strength to strength, playing a major part in Sri Lanka's back-to-back Test victories against England and New Zealand in 1992–93. It was at this point in his career that he struck a close bond with his leader, mentor and one time business partner, the authoritative captain Arjuna Ranatunga. This relationship formed the bedrock of his success and meant that there were few doubts about his status as the team's sole wicket-taker. Ranatunga was thoroughly convinced that Muralitharan's precocious talent would signal a new era in Sri Lanka's short Test history.
In August 1993 at Moratuwa, Muralitharan captured 5 for 104 in South Africa's first innings, his first five-wicket haul in Tests. His wickets included Kepler Wessels, Hansie Cronje and Jonty Rhodes.
Muralitharan continued to baffle batsmen outside the shores of Sri Lanka, irrespective of the team's performance. In Sri Lanka's humiliating drubbing at the hands of India in 1993–94, where all three Tests were innings defeats, Muralitharan was the sole success, with 12 wickets in the rubber. His perseverance in the face of some astronomical scores by the fearsome quartet of Mohammed Azharuddin, Sachin Tendulkar, Navjot Sidhu and Vinod Kambli was in sharp contrast to the submission with which his teammates played the series.
It was in New Zealand in March 1995 that Muralitharan displayed his qualities as a match-winner on any surface. In Sri Lanka's first triumph on foreign soil, Muralitharan confused the crease-bound New Zealanders on a grassy pitch in Dunedin. The Sri Lankan manager Duleep Mendis' claim that Muralitharan can turn the ball on concrete was confirmed. On the eve of his tour of Pakistan later that year, doubts were cast on his ability to trouble subcontinental batsmen. By taking 19 wickets in the series and delivering a historic 2–1 victory, the off-spinner silenced the doubters. The Pakistanis, who had negotiated Warne's leg-breaks in the previous home series, were never at ease against him.
Prior to the eventful Boxing Day Test of 1995, Muralitharan had captured 80 wickets in 22 Tests at an unflattering average of 32.74. Even at that point in his career he was the leading wicket taker for Sri Lanka having gone past Rumesh Ratnayake's aggregate of 73 wickets.
During the second Test between Sri Lanka and Australia at the Melbourne Cricket Ground on Boxing Day 1995, Australian umpire Darrell Hair called Sri Lankan spinner Muttiah Muralitharan for throwing in front of a crowd of 55,239. The off-spinner was no-balled seven times in three overs by Hair, who believed the then 23-year-old was bending his arm and straightening it in the process of delivery; an illegal action in cricket.
Muralitharan had bowled two overs before lunch from umpire Steve Dunne's (the Members') end of the ground, with umpire Hair at square leg, and these passed without incident. He then took up the attack from umpire Hair's (southern) end. Muralitharan's third over was a maiden, with all deliveries again passed as legitimate, but in his fourth Hair no-balled him twice for throwing, on the fourth and sixth balls. The umpire called him three more times in his fifth over, on the second, fourth and sixth balls. While the bowler stood with his hands on his hips, perplexed, the five calls provoked an immediate response from the Sri Lankan captain Arjuna Ranatunga, who left the field to take advice from his team management. He returned and continued with Muralitharan, who was called two more times in his sixth over, on the second and sixth balls. Ranatunga then removed the bowler from the attack, although he later reintroduced him at umpire Dunne's end. Although Hair reports in his book, "Decision Maker", that at the end of the tea break he stated that he would call Muralitharan no matter which end he bowled from, he did not do so. Muralitharan completed another twelve overs without further no-balls and, after bowling Mark Waugh, finished the day with figures of 18–3–58–1.
After being no-balled Muralitharan bowled a further 32 overs from umpire Steve Dunne's end without protest from either Dunne or Hair, at square leg. The Sri Lankan camp was outraged after the incident, but the ICC defended Hair, outlining a list of steps they had taken in the past to determine, without result, the legitimacy of Muralitharan's action. By calling Muralitharan from the bowlers' end Hair overrode what is normally regarded as the authority of the square leg umpire in adjudicating on throwing. Dunne would have had to break convention to support his partner.
At the end of the match the Sri Lankans requested from the ICC permission to confer with Hair to find out exactly how to remedy the problem with their bowler. Despite the game's controlling body agreeing to it, the Australian Cricket Board vetoed it on the grounds that it might lead to umpires being quizzed by teams after every game and meant that the throwing controversy would continue into the World Series Cup during the coming week. The Sri Lankans were disappointed they did not get an explanation and decided they would continue playing their bowler in matches not umpired by Hair and wanted to know whether other umpires would support or reject Hair's judgement.
Muralitharan's action was cleared by the ICC after biomechanical analysis at the University of Western Australia and at the Hong Kong University of Science & Technology in 1996. They concluded that his action created the 'optical illusion of throwing'.
On 16 March 1997, Muralitharan became the first Sri Lankan to reach 100 test wickets, when he dismissed Stephen Fleming in the second innings of the Hamilton Test.
In January 1998, Muralitharan took his first ten-wicket haul against Zimbabwe in the first test at Kandy. Sri Lanka won by eight wickets and Muralitharan had figures of 12 for 117.
In August that same year Muralitharan produced his career-best Test match figures of 16 for 220 in the one-off Test against England. In England's second innings Muralitharan bowled a marathon 54.2 overs to pick up 9 for 65; the other wicket was a run out. Ben Hollioake became his 200th Test wicket. Sri Lanka won by ten wickets, their first Test victory in England. After breaking the world record for the most Test wickets in 2007, Muralitharan commented that his 1998 performance at the Oval against England was his career highlight. He stated, "Everyone thought I was a good bowler then and I didn't look back from there."
Playing his 58th test, Muralitharan claimed his 300th test wicket when he dismissed Shaun Pollock in the First Test in Durban, in December 2000. Only Dennis Lillee reached the milestone faster, in his 56th test.
On 4 January 2002 in Kandy, Muralitharan might have finished with the best-ever figures for a single innings, but after he had claimed nine wickets against Zimbabwe, Russel Arnold dropped a catch at short leg.
He missed out on the tenth when Chaminda Vaas dismissed Henry Olonga, caught behind amid stifled appeals. Muralitharan followed up his 9 for 51 in the first innings with 4 for 64 in the second, equalling Richard Hadlee's record of 10 ten-wicket match hauls, but needing 15 fewer Tests to do so.
On 15 January 2002 playing in his 72nd test, Muralitharan became the fastest and youngest to reach the 400-wicket landmark when he bowled Olonga in the third Test in Galle.
On 16 March 2004 Muralitharan became the fastest and the youngest bowler to reach 500 wickets during the second test between Sri Lanka and Australia played in Kandy. In his 87th test, he bowled Kasprowicz to claim his 500th victim just four days after Warne reached the landmark on the fifth day of the First Test between the two teams at Galle. Warne took 108 tests to reach 500. Muralitharan took 4–48 on the first day of the second Test as Australia were skittled for 120 in the first innings.
In May 2004, Muralitharan overtook West Indian Courtney Walsh's record of 519 Test match wickets to become the highest wicket-taker; Zimbabwe's Mluleki Nkala became his 520th Test scalp. Muralitharan held the record until Shane Warne claimed it in October 2004, surpassing Muralitharan's mark of 532 wickets by dismissing India's Irfan Pathan. Warne said he enjoyed his duel with Muralitharan, who was sidelined following shoulder surgery at the time.
After an outstanding year Muralitharan was adjudged as the Wisden Leading Cricketer in the World in 2006. In six Tests, he took 60 wickets. He took ten in each of four successive matches, the second time he performed such a feat. The opponents for his 60-wicket haul were England away, South Africa at home and New Zealand away: serious opposition. In all, Muralitharan took 90 wickets in 11 Tests in the calendar year.
For his performances in 2006, he was named in the World Test XI by ICC and Cricinfo.
In July 2007, Muttiah Muralitharan became the second bowler after Australia's Shane Warne to capture 700 Test wickets. The off-spinner reached the landmark when he had Bangladesh's last man Syed Rasel caught in the deep by Farveez Maharoof on the fourth day of the third and final Test at the Asgiriya stadium in Kandy. The dismissal signalled Sri Lanka's victory by an innings and 193 runs to give the host a 3–0 sweep of the series. Muralitharan finished with six wickets in each innings to claim 10 wickets or more in a Test for the 20th time. However, he was unable to pass Warne's record of 708 wickets when Sri Lanka toured Australia in November 2007, capturing just four wickets in two Test matches.
Muralitharan reclaimed the record for most Test wickets during the first Test against England at Kandy on 3 December 2007. The spinner bowled England's Paul Collingwood to claim his 709th Test victim, overtaking Shane Warne in the process. Muralitharan reached the mark in his 116th Test – 29 fewer than Warne – and had conceded only 21.77 runs per wicket compared to the Australian's 25.41. This was Muralitharan's 61st five-wicket haul. Warne believed that Muralitharan would take "1,000 wickets" before he retired. Former record holder Courtney Walsh also opined that this would be possible if Muralitharan retained his hunger for wickets. Muralitharan himself believed there was a possibility that he would reach this milestone. For his performances in 2007, he was named in the World Test XI by the ICC and Cricinfo.
In July 2008, Muralitharan and Ajantha Mendis stopped India's strong batting as Sri Lanka won the first Test by a record innings and 239 runs in Colombo. Muralitharan finished the match with 11 wickets for 110, as India were shot out for 138 in their second innings after conceding a lead of 377 on the fourth day. He was well supported by debutant Ajantha Mendis, an unorthodox spinner with plenty of variation, who took eight wickets in his debut match.
Muralitharan believed the emergence of Mendis would help prolong his own career. Muralitharan, 36, and 23-year-old Mendis formed a formidable partnership in the first Test thrashing of India, taking 19 of the 20 wickets between them. "If he keeps performing this way, he will definitely take a lot of wickets in international cricket. Now that he has come, I think I can play Test cricket a few more years. Bowling 50 overs in a Test innings is very hard. Now if I bowl only 30–35 and he bowls more than me, the job will get easier for me."
For his performances in 2008, he was named in the World Test XI by ICC.
In July 2007, Muralitharan achieved a career peak Test Bowling Rating of 920, based on the LG ICC Player Rankings. This is the highest ever rating achieved by a spin bowler in Test cricket. This also puts him in fourth place in the LG ICC Best-Ever Test bowling ratings.
Muralitharan has the unique distinction of getting 10 or more wickets in a match against all other nine Test playing nations as well as capturing over 50 wickets against each of them. He also obtained 7 or more wickets in an innings against five nations, namely England, India, South Africa, West Indies and Zimbabwe (refer to table above). Muttiah Muralitharan also took at least five five-fors against all the other nine Test sides.
He currently holds the highest wickets/match ratio (6.1) for any bowler with over 200 Test wickets and also represented Sri Lanka in 118 Tests of the 175 that they have played (67.4%).
Against teams excluding Bangladesh and Zimbabwe, Muralitharan took 624 wickets in 108 Tests. By comparison, excluding his matches against Bangladesh and Zimbabwe, Warne took 691 wickets in 142 tests. Murali's average of 24.05 is slightly superior to Warne's career average of 25.41. Muralitharan won 18 Man of the Match awards in Test cricket.
During Muralitharan's playing days, the ICC Future Tours Programme denied Sri Lanka and several other teams a level playing field. As a consequence, Muralitharan never toured South Africa after December 2002 and never played a Test at the spin-friendly Sydney Cricket Ground.
Another way to compare Muralitharan's bowling record against other successful international bowlers is their career record away from home. Muralitharan received criticism that he enjoyed great success on home soil, taking wickets on pitches that are more spin-friendly than other international pitches. An analysis of his Test record in matches played outside Sri Lanka shows that in 52 matches he took 278 wickets at an average of 26.24 runs per wicket, with a strike rate of 60.1 balls per wicket. His spin-bowling rival Shane Warne retired with a slightly superior 'away' record of 362 wickets from 73 matches, at an average of 25.50 and a strike rate of 56.7. Because of the variables of Test cricket, such as the grounds and opposition played, comparing the quality of top-level players is difficult and subjective. However, it is clear that Muralitharan did much better playing at home against Test minnows Zimbabwe and Bangladesh, averaging less than 16 runs per wicket.
Cricinfo's statistics editor S Rajesh concluded that the decade 2000–2009 was the best 10-year period for Test batsmen since the 1940s. Muralitharan was clearly the leading Test wicket-taker during this period, capturing 565 wickets at 20.97 in spite of the dominance of bat over ball. Shane Warne captured 357 wickets at an average of 25.17 during the decade. Of spinners with over 100 Test wickets, only John Briggs (17.75), Jim Laker (21.24), Bill O'Reilly (22.59) and Clarrie Grimmett (24.21) have sub-25.00 bowling averages.
Muralitharan was on the winning side on 54 of the 133 test matches he played. In those games he captured a total of 438 wickets (8.1 wickets per match), at an outstanding average of 16.18 per wicket and a strike rate of 42.7.
Muralitharan took 795 wickets for his country Sri Lanka in 132 tests. The next most wickets for Sri Lanka in these 132 Tests was Chaminda Vaas' 309 – less than 40% of the spinner's pile. No one else managed 100. Collectively Sri Lankan bowlers tallied 1968 wickets across that span, of which Muralitharan accounted for 40.4%. Among the 24 other Sri Lankans who took more than 10 of those wickets, only Lasith Malinga did so at a better strike rate (52.3) than Muralitharan's 54.9 – and the latter bowled rather more overs, 6657.1 of them to be precise.
On 12 August 1993 Muralitharan made his One Day International (ODI) debut against India at the Khettarama Stadium and took 1 for 38 off ten overs. Praveen Amre was his first ODI wicket.
On 27 October 2000 in Sharjah, Muralitharan captured 7 for 30 against India, which were then the best bowling figures in One Day Internationals.
On 9 April 2002 Muralitharan achieved a career peak ODI Bowling Rating of 913, based on the LG ICC Player Rankings. This is the highest ever rating achieved by a spin bowler in One Day Internationals. This also puts him in fourth place in the LG ICC Best-Ever ODI bowling ratings.
In 2006, Muralitharan had the second (now third) highest number of runs (99) hit off him in a One Day International Innings. The Australians, especially Adam Gilchrist, attacked Muralitharan's bowling more than usual that day. Yet, for his performances in 2006, he was named in the World ODI XI by the ICC. Muralitharan does not have a great record against the Australians in ODIs and this was proved again as he was ineffective in the finals of the 2007 World Cup; his chief tormentor again being Gilchrist. Yet, for his performances in 2007, he was named in the World ODI XI by the ICC. He was named in the 'Team of the Tournament' by Cricinfo for the 2007 World Cup.
Muralitharan played in five Cricket World Cup tournaments, in 1996, 1999, 2003, 2007 and 2011. He captured 67 World Cup wickets, second on the all-time list behind Glenn McGrath's 71, and represented Sri Lanka in three World Cup finals. In 1996 Muralitharan was part of Sri Lanka's World Cup winning team that defeated Australia in Lahore, Pakistan. Muralitharan also played in the 2007 World Cup final, when Australia defeated Sri Lanka in Bridgetown, Barbados. He picked up 23 wickets in the 2007 World Cup and finished as the second highest wicket taker in the tournament behind Glenn McGrath. He was part of the 2011 team that lost the World Cup final against India in Mumbai, which was also his farewell match. He was named in the 'Team of the Tournament' for the 2011 World Cup by the ICC.
Muttiah Muralitharan was left out of the Sri Lankan one-day squad to tour the West Indies in April 2008. The chairman of selectors, Ashantha De Mel, clarified the non-selection, stating: "We know he (Muralitharan) can still play in the next World Cup if he is properly looked after, so we want to use him sparingly to preserve him for the big games and the World Cup coming up in the Asian sub-continent where Muralitharan will be a threat."
Muralitharan has the highest number of career wickets in One Day Internationals, having overtaken Wasim Akram in 2009. Akram had taken 502 wickets in 356 matches; Muralitharan equalled that record in his 327th match by dismissing Yuvraj Singh in the third ODI against India in Colombo. He won 13 Man of the Match awards in this form of the game.
An aggressive lower-order batsman who usually batted at No. 11, Muralitharan was known for his tendency to back away to leg and slog. Sometimes, he could be troublesome for bowlers because of his unorthodox and adventurous ways. Once, in a Test match against England, while playing Alex Tudor, he moved back towards his leg stump trying to hook the ball and ended up lying on the ground sideways after the shot. He was infamously run out in a match against New Zealand when he left his crease to congratulate Kumar Sangakkara, who had just scored a single to reach his century; the New Zealand fielder had not yet returned the ball to the wicketkeeper, so the ball was still in play. His highest Test score of 67 came against India at Kandy in 2001, including three sixes and five fours. He made valuable scores on occasion, including 30 runs (5 fours) against England at the Oval in 1998, 38 runs (4 fours, 1 six) against England at Galle in 2003, 43 runs (5 fours, 3 sixes) against Australia at Kandy in 2004, 36 runs against the West Indies at Colombo in 2005, and his highest-ever ODI score, 33 not out (4 fours and 2 sixes off 16 balls) against Bangladesh in the final of the 2009 Tri-Series in Bangladesh. In the latter match, Muralitharan's effort, which included three fours and a six off one over, played a key role in Sri Lanka winning the match and series after the first eight overs saw them reduced to 6 for 5, the lowest score ever recorded in an ODI at the fall of the fifth wicket. Muralitharan has a strike rate close to 70 in Test cricket and scored over 55% of his Test runs in fours and sixes.
Muralitharan, together with Chaminda Vaas, holds the record for the highest 10th wicket partnership in Tests for Sri Lanka. The pair put on 79 runs for the last wicket at the Asgiriya Stadium against Australia in March 2004. Muralitharan also holds the record for scoring most runs in Test cricket while batting at the number 11 position.
Muralitharan currently holds the record for the most ducks (dismissals for zero) in international cricket (Tests, ODIs and Twenty20s), with a total of 59.
Muralitharan voiced his frustration at routinely being heckled by Australian crowds who accused him of throwing – one common jeer directed at him was "No ball!". Following then Australian Prime Minister John Howard's 2004 statement that Muralitharan was a "chucker", Muralitharan indicated that he would skip future tours to Australia.
Tom Moody, the former Sri Lanka coach and former Australian Test cricketer, said he was embarrassed by the derogatory reaction and negative attention directed towards Muttiah Muralitharan by Australian crowds. Moody stated that "As an Australian when I have been with the Sri Lankan team in Australia, or playing against them in the World Cup, it's the only situation we find in the whole of the cricketing world where we have this disgraceful slant on a cricketer".
During the 2008 CB series in Australia, some members of the Sri Lankan contingent including Muralitharan, were the target of an egg throwing incident in Hobart. The Sri Lankan cricket selector Don Anurasiri was hit by an egg, while Muralitharan and two others were verbally abused by a car-load of people as they were walking from a restaurant back to the hotel.
Due to the incident taking place at night, it is unclear whether Muralitharan was indeed the target of the culprits. Even though the Australian coach of the Sri Lankan team, Trevor Bayliss, down-played the incident as "a non-event", Cricket Australia tightened security around the team. In response to this episode Muralitharan was quoted as saying "When you come to Australia, you expect such incidents".
At the conclusion of Muralitharan's test career cricket writer Rahul Bhattacharya summed up Muralitharan's trials thus:
"Murali is described often as a fox. This seems right. Unlike hedgehog bowlers who pursue one big idea, Murali, like a fox, had many ways of pursuit. Like a fox he did not hunt in a pack. Like a fox he was himself cruelly hunted for sport in some parts of the world. Fox hunting was banned a few years ago in England, but is still legal in Australia."
On 7 July 2010, Muttiah Muralitharan formally announced his retirement from Test cricket at a media briefing in Colombo. He confirmed that the first Test match against India, due to commence in July 2010, would be his last, but indicated that he was willing to play One-Day Internationals if it was considered necessary leading up to the 2011 World Cup, which Sri Lanka co-hosted. He identified Sri Lanka's World Cup win of 1996 as his greatest moment as a cricketer. He also stated that there were some regrets during his 19-year playing career: "Not winning Test matches in South Africa, Australia and India are regrets. But I am sure we will win very soon."
At the start of his last match, Muralitharan was eight short of 800 wickets. At the fall of the ninth wicket of India's second innings, Muralitharan still needed one wicket to reach the milestone. After 90 minutes of resistance, Muralitharan dismissed the last Indian batsman, Pragyan Ojha, on the last delivery of the final over of his Test career. In doing so he became the only bowler to reach 800 wickets in Test cricket. Sri Lanka won the match by 10 wickets, the seventh time they had done so and the second time they had done it against India.
In late 2010, Muralitharan announced his retirement from international cricket after the 2011 Cricket World Cup, co-hosted by Bangladesh, India and Sri Lanka, saying: "This World Cup will be my last outing. I am retiring totally from international cricket thereafter. My time is up. I've signed up to play for two years in IPL." His final ODI appearance on Sri Lankan soil came in the semi-final against New Zealand, where Muralitharan took the wicket of Scott Styris with his last delivery. His last ODI was the World Cup final against India at Mumbai, which Sri Lanka lost; Muralitharan went wicketless.
In July 2014, he played for the Rest of the World side in the Bicentenary Celebration match at Lord's.
Muralitharan is the bowling coach of Sunrisers Hyderabad, a role he has held for the past two years. During his tenure, Sunrisers Hyderabad emerged as IPL champions in 2016. He has also been appointed head coach of the Thiruvallur Veerans for the second edition of the TNPL.
In 2014, Muralitharan joined the Australian national team as a coaching consultant for the Test series against Pakistan in the United Arab Emirates. On 11 March 2014, he was appointed as the spin bowling consultant for the Cricket Association of Bengal. The tenure started with the players in a four-day camp beginning on 15 March 2014.
He was again called up by the Australian team prior to Australia's tour of Sri Lanka in 2015. Despite his presence as a consultant, Australia failed to win any of the three Test matches, losing the series 3–0. Muralitharan's role with the Australian team generated controversy in Sri Lanka and within Sri Lanka Cricket, and he traded verbal blows with the Sri Lanka team manager following an altercation. The head of SLC, Thilanga Sumathipala, warned Muralitharan against coaching the Australian side, the team whose crowds had put the most pressure on him in the past over his bowling action. Muralitharan replied that being asked to coach the team that had once been against him, in order to play against Sri Lanka, was a big victory in his career.
Muttiah Muralitharan holds a number of world records, and several firsts:
Tests – most wickets taken bowled, most taken stumped (47) and most taken caught & bowled (35, jointly with Anil Kumble). Bowled by Muralitharan (b Muralitharan) is the most common dismissal in Test cricket (excluding run outs).
In 2002, Wisden carried out a statistical analysis of all Test matches in an effort to rate the greatest cricketers in history, and Muralitharan was ranked as the best Test bowler of all time. However, two years earlier, Muralitharan was not named as one of the five Wisden Cricketers of the Century. Former Australian captain Steve Waugh called him "the Don Bradman of bowling".
Muralitharan was selected as the Wisden Leading Cricketer in the World in 2000 and in 2006.
On 15 November 2007, the Warne-Muralidaran Trophy was unveiled named after the two leading wicket-takers in Test cricket, Shane Warne and Muralitharan. The trophy displays images of the two spin bowlers' hands each holding a cricket ball. This trophy will be contested between Australia and Sri Lanka in all future Test series.
On 3 December 2007, just hours after Muttiah Muralitharan became Test cricket's leading Test wicket-taker, Marylebone Cricket Club (MCC) announced it had unveiled a portrait of the Sri Lanka off-spinner at Lord's. On the same day the Philatelic Bureau of the Department of Posts in Sri Lanka issued a circular stamp with a denomination of Rs. 5 to mark the world record set by Muttiah Muralitharan. The circular design was meant to denote the cricket ball.
Australian musician Alston Koch provoked worldwide interest when he recorded the only official tribute song to Muralitharan. The song was even mentioned on the BBC's Test match Special. The Muralitharan Song video was also released after he broke the world record.
On 10 January 2008, the Parliament of Sri Lanka felicitated Muttiah Muralitharan for his world record breaking feat of being the highest wicket taker in Test cricket.
This was the first time that a sportsman had been honoured in the country's Supreme Legislature.
The Central Provincial Council in Kandy has renamed the International Cricket Stadium in Pallekele after Muttiah Muralitharan.
Throughout much of his international career, Muralitharan's action was suspected of contravening the laws of the game by the straightening of his bowling arm during delivery. Although he was cited three times, subsequent biomechanical testing led the ICC to clear him of the charge and permit him to continue bowling.
Biomechanical testing conducted on four occasions fueled debate as to whether his action was in fact illegal or actually an illusion created by his allegedly unique ability to generate extra movement both at the shoulder as well as the wrist, which enables him to bowl the doosra without straightening the elbow.
Initial concerns as to whether Muralitharan's action contravened the laws of the game by straightening his bowling arm during delivery broke into open controversy after Australian umpire Darrell Hair called a "no-ball" for an illegal action seven times during the Boxing Day Test match in Melbourne, Australia, in 1995. Sir Donald Bradman, universally regarded as the greatest batsman in history, was later quoted as saying it was the "worst example of umpiring that [he had] witnessed, and against everything the game stands for. Clearly Murali does not throw the ball".
Ten days later, on 5 January 1996, Sri Lanka played the West Indies in the seventh ODI of the triangular World Series competition in Brisbane. Umpire Ross Emerson, officiating in his debut international match, no-balled Muralitharan three times in his first over, twice in his second and twice in his third. It was an identical tally to that called by Hair on Boxing Day, and (like Hair) Emerson made his calls from the bowler's end while his partner stood silent. The main difference was that several of the no-balls were for leg-breaks instead of the bowler's normal off-breaks.
In February 1996, just before the World Cup, Muralitharan underwent biomechanical analysis at the Hong Kong University of Science and Technology under the supervision of Prof. Ravindra Goonetilleke, who declared his action legal in the conditions tested, citing a congenital defect in Muralitharan's arm which makes him incapable of fully straightening the arm but gives the appearance of doing so. Although under the original Laws a delivery could be illegal even without the arm being fully straightened, it was concluded that his action created the 'optical illusion of throwing'. Based on this evidence, the ICC gave Muralitharan clearance to continue bowling.
Doubts about Muralitharan's action persisted, however. On the 1998–99 tour to Australia he was once again called for throwing by Ross Emerson during a One Day International against England at the Adelaide Oval in Australia. The Sri Lankan team almost abandoned the match, but after instructions from the President of the Board of Control for Cricket in Sri Lanka, the game resumed. The Sri Lankan captain at the time, Arjuna Ranatunga, was later fined and given a suspended ban from the game as a result. It later emerged that at the time of this match Emerson was on sick leave from his non-cricket job due to a stress-related illness, and he stood down for the rest of the series. Muralitharan was sent for further tests in Perth and England and was cleared again. At no stage was Muralitharan requested by the ICC to change or remodel his action. Up to this point in his career (1999), Muralitharan primarily bowled two types of deliveries, the off-break and the topspinner. He had not yet mastered the doosra.
Muralitharan continued bowling, taking his 500th Test wicket in the second Test against Australia in Kandy in March 2004. At the end of the series his doosra delivery was officially called into question by match referee Chris Broad. At the University of Western Australia (Department of Human Movement and Exercise Science), three-dimensional kinematic measurements of Muralitharan's bowling arm were taken using an optical motion capture system while he bowled his doosra. His mean elbow extension angle for the doosra delivery was 14°, which was subsequently reduced to a mean of 10.2° after remedial training at the University. The finding reported to the ICC by the University of Western Australia's study was that Muralitharan's doosra contravened the established ICC elbow extension limit of 5° for spinners.
Under the original throwing Laws of Cricket, the umpires officiating were under an obligation to call "no-ball" to a delivery that they were not entirely happy was absolutely fair. This Law gave the umpires absolutely no discretion. In 2000, the Laws were changed to put an allowable figure of straightening of 5° for spinners, 7.5° for medium pacers and 10° for fast bowlers in an attempt to more clearly define what was legal. But these figures proved difficult to enforce due to umpires being unable to discern actual amounts of straightening and the differentiation between the three different allowable figures. Testing in Test match conditions is not currently possible "when the identification of elbow and shoulder joint centres in on-field data collection, where a shirt is worn, also involves large errors. In a match the ability to differentiate anatomical movements such as 'elbow extension' by digitising segment end-points, particularly if you have segment rotations, is extremely difficult and prone to error. This is certainly the case with spin bowlers. It is therefore not surprising that laboratory testing is preferred, particularly for spin bowlers, where an appropriate pitch length and run-up can be structured. This is clearly the only way to test players, where data would be able to withstand scientific and therefore legal scrutiny."
An extensive ICC study, the results of which were released in November 2004, was conducted to investigate the "chucking issue". A laboratory kinematic analysis of 42 non-Test playing bowlers done by Ferdinands and Kersting (2004) established that the 5° limit for slow and spin bowlers was particularly impractical.
Due to the overwhelming scientific findings, researchers recommended that a flat rate of 15° tolerable elbow extension be used to define a preliminary demarcation point between bowling and throwing. A panel of former Test players consisting of Aravinda de Silva, Angus Fraser, Michael Holding, Tony Lewis, Tim May and the ICC's Dave Richardson, with the assistance of several biomechanical experts, stated that 99% of all bowlers in the history of cricket straighten their arms when bowling. Only one player tested (part-time bowler Ramnaresh Sarwan) reportedly did not transgress the pre-2000 rules. Many of these reports have, controversially, not been published, and as such the 99% figure has yet to be proved. In fact, Muralitharan stirred up controversy when he said during an interview with a Melbourne radio station that Jason Gillespie, Glenn McGrath and Brett Lee flexed their arms by 12, 13 and 14–15 degrees respectively, although it is unclear where Muralitharan obtained these figures. Muralitharan was censured by the Sri Lankan Cricket Board for these comments.
The ICC Executive was asked to ratify the panel's recommendations at the ICC's Annual General Meeting in February 2005. Based on the recommendations the ICC issued a new guideline (which was effective from 2005) allowing for extensions or hyperextensions of up to 15 degrees for all types of bowlers, thus deeming Muralitharan's doosra to be legal.
Explaining why the maximum level of 15 degrees was arrived at, panel member Angus Fraser stated "That is the number which biomechanics says that it (straightening) becomes visible. It is difficult for the naked eye to see less than 15 degrees in a bowler's action. We found when the biceps reached the shoulder the amount of bend was around 165 degrees. Very few bowlers can get to 180 degrees because the joint doesn't allow that. ... but once you go further than 15 degrees you get into an area which is starting to give you an unfair advantage and you are breaking the law".
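The evolution of the tolerance rules described above — per-category limits of 5°, 7.5° and 10° introduced in 2000, replaced by a flat 15° threshold from 2005 — can be sketched as a simple legality check. This is an illustrative sketch only; the function and constant names are invented and do not come from any ICC document.

```python
# Illustrative sketch of the ICC elbow-extension tolerances described in the
# text. Names are invented for illustration; the degree limits are as stated.

PRE_2005_LIMITS = {"spinner": 5.0, "medium": 7.5, "fast": 10.0}
POST_2005_LIMIT = 15.0  # flat tolerance for all bowler types from 2005

def delivery_legal(elbow_extension_deg, bowler_type="spinner", post_2005=True):
    """Return True if the measured elbow extension is within tolerance."""
    limit = POST_2005_LIMIT if post_2005 else PRE_2005_LIMITS[bowler_type]
    return elbow_extension_deg <= limit

# Muralitharan's doosra measured a mean of 14 degrees, later reduced to 10.2:
print(delivery_legal(14.0, "spinner", post_2005=False))  # False under the 5-degree rule
print(delivery_legal(14.0, "spinner", post_2005=True))   # True under the 15-degree rule
```

Under the pre-2005 rules even the remediated 10.2° doosra would have exceeded the spinners' limit, which is why the move to a uniform 15° threshold was decisive for its legality.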
The original decision to disallow the doosra bowling action was widely hailed as justifiable on account of being scientifically based. A team of Australian scientists from the University of South Australia therefore conducted independent research, drawing on modern artificial intelligence and biomechanics, into the controversy arising from the doosra. The University of South Australia study, led by Prof. Mahinda Pathegama with contributions from Prof. Ozdemir Gol, Prof. J. Mazumdar, Prof. Tony Worsley and Prof. Lakmi Jain, scrutinised the previous studies closely, since techniques from their fields of expertise had been employed in the assessments used as the basis for decision-making. Dave Richardson, general manager of the ICC, stated that "the ICC is currently reviewing the Law on throwing and the ICC regulations and the study done by Prof. Mahinda Pathegama with UniSA scientists is a valuable source of information in this regard." Reporting their findings to the ICC in relation to the Muralitharan tests, the team analysed in depth issues such as pitfalls in image interpretation when using 2D images for 3D modelling, as compared with modern techniques in artificial intelligence and biomechanics, and the biomechanical assessment of the doosra bowling action. Pathegama et al. (2004) further report on disagreements over measurement accuracy in the Murali Report, analysing the motion tracking system used and discussing cognitive aspects, evidence of errors in anthropometric assessment and movement tracking, lateral inhibition in response tracking, psycho-physiological aspects of post-assessments, angular measurement errors, skin-marker-induced errors, geometrics- and physics-based 3D modelling, and the approach to on-field assessment.
The Muralitharan Report produced by the University of Western Australia considered the 1999 Richards study to evaluate the error margin. The University of South Australia study led by Prof. Mahinda Pathegama argued that the Richards study relied on a rigid aluminium bar that rotated only in the horizontal plane to establish that error margin. Pathegama's report stated that "in view of the system used in the test itself yielding considerable error even with a rigid aluminum bar (an "accuracy level of approximately 4 degrees" as stated in the Murali Report), it stands to reason that the error margin would be considerably larger when tracking skin markers on a spin bowler's moving upper limb by this same system".
In an interview, Vincent Barnes argued that Bruce Elliott, the UWA professor who is also the ICC biomechanist, had made an interesting discovery in his dealings with finger spinners: "He said he had found that a lot of bowlers from the subcontinent could bowl the doosra legally, but not Caucasian bowlers."
On 2 February 2006, Muralitharan underwent a fourth round of biomechanical testing at the University of Western Australia. There had been criticism that the previous round of tests in July 2004 did not replicate match conditions because of the slower bowling speeds used in the laboratory. The results showed that his average elbow extension while bowling the 'doosra' delivery was 12.2 degrees, and 12.9 degrees for his off-break.
In July 2004 Muralitharan was filmed in England bowling with an arm brace on. The film was shown on Britain's Channel 4 during the Test against England in 2004.
Initially, Muralitharan bowled three balls – the off-spinner, the top-spinner and the doosra – as he would in a match. He then bowled the same three balls wearing a brace made from steel bars set into strong resin. The brace had been moulded to his right arm, is approximately 46 centimetres long and weighs just under 1 kilogram.
TV presenter Mark Nicholas, who tried the brace himself, confirmed that "There is no way an arm can be bent, or flexed, when it is in this brace." All three balls behaved in the same way as when bowled without the brace. They were not bowled quite as fast, because the weight of the brace restricts the speed of Muralitharan's shoulder rotation, but the spin was still there.
With the brace on, there still appeared to be a jerk in his action. When studying the film at varying speeds, it still appeared as if he straightened his arm, even though the brace makes it impossible to do so. His unique shoulder rotation and amazing wrist action seem to create the illusion that he straightens his arm.
The off-spinner said the exercise was to convince a sceptical public rather than sway an ICC investigation into bowling actions launched after he was reported by match referee Chris Broad for his doosra delivery in March 2004, the third time action was taken on his bowling. In an interview for the August 2004 edition of Wisden Asia Cricket, Muralitharan stated: "I think it will prove a point to those who had said that it was physically impossible to bowl a ball that turned the other way. I proved that it was possible to bowl the doosra without bending the arm."
In 2004 at the R Premadasa Stadium in Colombo, Muralitharan voluntarily performed a series of tests with live video cameras. Michael Slater and Ravi Shastri witnessed it all unfold. Muralitharan once again showed he could bowl all his deliveries, including the doosra, with an arm brace that prevents any straightening of his elbow. Orthopaedic specialist Dr Mandeep Dillon stated that Muralitharan's unusual ability to generate extra movement both at the shoulder as well as the wrist enables him to bowl the doosra without straightening the elbow.
Two vocal critics of Muralitharan's action have been former Test cricketers Dean Jones of Australia and Bishan Bedi, the former Indian captain. Jones later admitted to being wrong in his assessment of Murali after witnessing first-hand Murali bowling with an arm brace on.
Michael Holding, the former West Indian fast bowler, was also a critic of Muralitharan, but withdrew his criticisms in light of the tests carried out. Holding had been quoted as being in "110% agreement" with Bedi, who likened Murali's action to a "javelin throw" and more recently compared it to that of a "shot putter". Following the ICC study, as a member of the panel that conducted it, Holding stated: "The scientific evidence is overwhelming ... When bowlers who to the naked eye look to have pure actions are thoroughly analysed with the sophisticated technology now in place, they are likely to be shown as straightening their arm by 11 and in some cases 12 degrees. Under a strict interpretation of the Law, these players are breaking the rules. The game needs to deal with this reality and make its judgment as to how it accommodates this fact."
In May 2002, Adam Gilchrist, speaking at a Carlton (Australian) Football Club luncheon, claimed Muralitharan's action did not comply with the laws of cricket. The Melbourne-based Age newspaper quoted Gilchrist as saying: "Yeah, I think he does (chuck), and I say that because, if you read the laws of the game, there's no doubt in my mind that he and many others throughout cricket history have." These comments were made before the doosra controversy, despite Muralitharan's action having been cleared by the ICC in both 1996 and 1999. For his comment Gilchrist was reprimanded by the Australian Cricket Board (ACB) and found guilty of breaching ACB rules concerning "detrimental public comment".
During the 2006 tour of New Zealand another Muralitharan critic, former New Zealand captain and cricket commentator Martin Crowe, called for Muralitharan's doosra to be monitored more closely, asserting that his action seemed to deteriorate during a match. Earlier that year, when delivering the Cowdrey lecture at Lord's, Crowe had demanded zero tolerance instead of 15 degrees for throwing and specifically branded Muttiah Muralitharan a chucker. In response to Crowe's criticism, ICC general manager Dave Richardson stated that the scientific evidence presented by biomechanists Professor Bruce Elliot, Dr Paul Hurrion and Mr Marc Portus was overwhelming, and clarified: "Some bowlers, even those not suspected of having flawed actions, were found likely to be straightening their arms by 11 or 12 degrees. And at the same time, some bowlers that may appear to be throwing may be hyper-extending or bowl with permanently bent elbows. Under a strict interpretation of the law, they were breaking the rules – but if we ruled out every bowler that did that then there would be no bowlers left."
Since 1999 there have been a number of scientific research publications discussing Muralitharan's bowling action as well as the need to define the legality of a bowling action using biomechanical concepts. This research contributed directly to the official acceptance of Muralitharan's bowling action and convinced the ICC to redefine the bowling laws in cricket.
The key publications are listed below:
Muralitharan, along with his manager Kushil Gunasekara, established the charitable organisation Foundation of Goodness in the early 2000s. The organisation is committed to the wellbeing of the Seenigama region (in southern Sri Lanka) and supports local communities through a range of projects across areas including children's needs, education and training, health care and psycho-social support, housing, livelihoods, sport and the environment. Murali's Seenigama project raised funds from cricketers and administrators in England and Australia. Canadian pop-star Bryan Adams donated a swimming pool.
Muralitharan also planned to build a second sports complex for war-displaced civilians in Mankulam, a town located 300 kilometres north of Colombo. The two-year, one-million-dollar project aims to build a sports centre, a school, English and IT training centres and an elders' home. While the sports complex remains the main project, the Foundation of Goodness also plans to help educate children, youth and adults. English cricketer Sir Ian Botham visited Mankulam with Muralitharan and, addressing the media in Colombo on 27 March 2011, said he would consider a walk from Point Pedro (the extreme northern tip of Sri Lanka) to Dondra Head (the extreme southern tip) to raise funds for the project.
In June 2004, Muralitharan also joined the United Nations World Food Program as an ambassador to fight hunger among school children.
When the 2004 Indian Ocean earthquake devastated Sri Lanka on 26 December 2004, Muralitharan contributed to the relief programs. He himself narrowly escaped death, arriving 20 minutes late at Seenigama, where he was to give away prizes at one of the charity projects he worked on. While international agencies were bringing food in by air, Muralitharan paid for and organised three convoys of ten trucks each to assist in the distribution. He persuaded those who could to donate clothes, and supervised the delivery himself.
During the rehabilitation efforts in the tsunami's aftermath, cement was in short supply. Muralitharan promptly signed an endorsement deal with Lafarge, a global cement giant, structured as a straight barter: cement would be supplied to the Foundation of Goodness in exchange for Muralitharan's promotional work. In the first three years after the tsunami, the foundation raised substantial funds to help survivors, and built homes, schools, sports facilities and computer centres.
On 1 August 2015, Muralitharan and fellow Sri Lankan cricketer Tillakaratne Dilshan were appointed by President of Sri Lanka Maithripala Sirisena as the Brand Ambassadors for the Presidential Task Force to combat kidney disease.
In July 2019, it was announced that a biopic would be made with Tamil actor Vijay Sethupathi portraying Muralitharan. The film is set to be produced by actor Rana Daggubati under his banner Suresh Productions. MS Sripathy is on board as the director. The film is expected to be released by December 2019.
|
https://en.wikipedia.org/wiki?curid=20905
|
Mole Day
Mole Day is an unofficial holiday celebrated among chemists, chemistry students and chemistry enthusiasts on October 23, between 6:02 a.m. and 6:02 p.m., making the date and time 6:02 10/23 in American notation; this corresponds to Avogadro's number, approximately 6.02×10^23, which defines the number of particles in one mole of substance.
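Mole Day's timing references Avogadro's number, the number of elementary entities in one mole of a substance. A small illustrative sketch (the function name is invented) shows the arithmetic behind the date:

```python
# Illustrative: why 6:02 on 10/23 marks Mole Day. Avogadro's constant,
# about 6.022 x 10^23 particles per mole, maps onto the time/date 6:02 10/23.
AVOGADRO = 6.02214076e23  # particles per mole (exact value, 2019 SI definition)

def particles(moles):
    """Number of elementary entities in a given amount of substance."""
    return moles * AVOGADRO

print(f"{particles(0.5):.3e}")  # half a mole of molecules -> 3.011e+23
```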
|
https://en.wikipedia.org/wiki?curid=20906
|
Motörhead
Motörhead () were an English rock band formed in June 1975 by bassist, singer, and songwriter Ian "Lemmy" Kilmister, who was the sole constant member, guitarist Larry Wallis and drummer Lucas Fox. The band are often considered a precursor to the new wave of British heavy metal, which re-energised heavy metal in the late 1970s and early 1980s. Though several guitarists and drummers have played in Motörhead, most of their best-selling albums and singles feature the work of Phil "Philthy Animal" Taylor on drums and "Fast" Eddie Clarke on guitars.
Motörhead released 22 studio albums, 10 live recordings, 12 compilation albums, and five EPs over a career spanning 40 years. Usually a power trio, they had particular success in the early 1980s with several successful singles in the UK Top 40 chart. The albums "Overkill", "Bomber", "Ace of Spades", and particularly the live album "No Sleep 'til Hammersmith" cemented Motörhead's reputation as a top-tier rock band. The band are ranked number 26 on VH1's "100 Greatest Artists of Hard Rock". As of 2016, they have sold more than 15 million albums worldwide.
Motörhead are typically classified as heavy metal, and their fusion of punk rock into the genre helped to pioneer speed metal and thrash metal. Their lyrics typically covered such topics as war, good versus evil, abuse of power, promiscuous sex, substance abuse, and, most famously, gambling, the latter theme being the focus of their hit song "Ace of Spades".
Motörhead have been credited with being part of, and influencing, numerous musical scenes, thrash metal and speed metal especially. From the mid-1970s onward, however, Lemmy insisted that they were a rock and roll band. He said that while they had more in common with punk bands, Motörhead's own distinctive sound has been embraced in both punk and metal scenes.
Lemmy died on 28 December 2015 from cardiac arrhythmia and congestive heart failure, after being diagnosed with prostate cancer. The day after his death, drummer Mikkey Dee and guitarist Phil Campbell both confirmed that Motörhead had disbanded. By 2018, all three members of Motörhead's classic lineup (Lemmy, Taylor and Clarke) had died.
Lemmy was dismissed from Hawkwind in May 1975 after being arrested in Canada for drug possession; he said the band dismissed him for "doing the wrong drugs". Now on his own, Lemmy decided to form a new band called Motörhead, a name inspired by the final song he had written for Hawkwind.
Lemmy wanted the music to be "fast and vicious, just like the MC5". His stated aim was to "concentrate on very basic music: loud, fast, city, raucous, arrogant, paranoid, speedfreak rock n roll ... it will be so loud that if we move in next door to you, your lawn will die". He recruited guitarist Larry Wallis (formerly of Pink Fairies) on the recommendation of Mick Farren, based on Wallis' work with Steve Peregrin Took's band Shagrat, and Lucas Fox on drums. According to Lemmy, the band's first practice was at the now defunct Sound Management rehearsal studios, in Kings Road, Chelsea in 1975. Sound Management leased the basement area of furniture store The Furniture Cave, located in adjacent Lots Road. Kilmister has said they used to steal equipment, as the band was short on gear. Their first engagement was supporting Greenslade at The Roundhouse, London on 20 July 1975. On 19 October, having played 10 gigs, they became the supporting act to Blue Öyster Cult at the Hammersmith Odeon.
The band were contracted to United Artists by Andrew Lauder, the A&R man for Lemmy's previous band, Hawkwind. They recorded sessions at Rockfield Studios in Monmouth with producer Dave Edmunds, during which Fox proved to be unreliable and was replaced by drummer Phil "Philthy Animal" Taylor, a casual acquaintance of Lemmy's. Their record label was dissatisfied with the material and refused to release it, although it was subsequently issued as "On Parole" in 1979 after the band had established some success.
In March 1976, deciding that two guitarists were required, the band auditioned an acquaintance of drummer Taylor's named "Fast" Eddie Clarke. Wallis, who was continuing to tour with a reformed Pink Fairies, quit immediately after the auditions and Clarke remained as the sole guitarist. This trio of Lemmy/Clarke/Taylor is today regarded as the "classic" Motörhead line-up. In December, the band recorded the "Leaving Here" single for Stiff Records, but United Artists intervened to prevent its general release as the band were still under contract to them, despite the label's refusal to issue their debut album. Initial reactions to the band had been unfavourable; they won a poll for "the best worst band in the world" in the music magazine "NME".
By April 1977, living in squats and with little recognition, Taylor and Clarke decided to quit the band, and after some debate, they agreed to do a farewell show at the Marquee Club in London. Lemmy had become acquainted with Ted Carroll from Chiswick Records and asked him to bring a mobile studio to the show to record it for posterity. Carroll was unable to get the mobile unit to the Marquee Club, but showed up backstage after the engagement and offered them two days at Escape Studios with producer Speedy Keen to record a single. The band took the chance, and instead of recording a single they laid down 11 unfinished tracks. Carroll gave them a few more days at Olympic Studios to finish the vocals and the band completed 13 tracks for release as an album. Chiswick issued the single "Motorhead" in June, followed by the album "Motörhead" in August, which spent one week in the UK Albums Chart at number 43. The band toured the UK supporting Hawkwind in June, then from late July they commenced the "Beyond The Threshold Of Pain Tour" with The Count Bishops.
In August, Tony Secunda took over the management of the band, but the line-up became so unstable that by March 1978 Clarke and Taylor had formed, and were performing with, The Muggers alongside Speedy Keen and Billy Rath.
In July 1978, the band returned to the management of Douglas Smith, who secured a one-off singles deal with Bronze Records. The resulting "Louie Louie" single was issued in September peaking at number 68 on the UK Singles Chart, and the band toured the UK to promote it, recorded a BBC Radio 1 "John Peel in session" on 18 September (these tracks were later issued on the 2005 "BBC Live & In-Session" album), and appeared for the first time on BBC Television's "Top of the Pops" on 25 October. Chiswick capitalised on this new level of success by re-issuing the debut album "Motörhead" on white vinyl through EMI Records.
The single's success led to Bronze extending their contract, and put the band back into the studio to record an album, this time with producer Jimmy Miller at Roundhouse Studios. A hint of what the band had recorded for the album came on 9 March 1979 when the band played "Overkill" on "Top of the Pops" to support the release of the single ahead of the "Overkill" album, which was released on 24 March. It became Motörhead's first album to break into the top 40 of the UK Albums chart, reaching number 24, with the single reaching number 39 on the UK Singles Chart. These releases were followed by the "Overkill" UK tour which began on 23 March. A subsequent single was released in June, coupling the album track "No Class" as the A-side with the previously unreleased song "Like a Nightmare" on the B-side. It fared worse than both the album and previous single but reached number 61 on the UK singles chart.
During July and August, except for a break to appear at the Reading Festival, the band were working on their next album, "Bomber". Released on 27 October, it reached number 12 on the UK Albums Chart. On 1 December, it was followed by the "Bomber" single, which reached number 34 on the UK Singles Chart. The "Bomber" Europe and UK tour followed, with support from Saxon. The stage show featured a spectacular aircraft bomber-shaped lighting rig. During the "Bomber" tour, United Artists put together tapes recorded during the Rockfield Studios sessions in 1975–1976 and released them as the album "On Parole", which peaked at number 65 on the UK Albums Chart in December.
On 8 May 1980, while the band were on tour in Europe, Bronze released "The Golden Years", which sold better than any of their previous releases, reaching number eight on the UK Singles Chart. The band had, however, preferred the title "Flying Tonight", in reference to the "Bomber" lighting rig. On 20 August, the band performed a 40-minute filmed slot, along with a 20-minute set by Girlschool, live at the Nottingham Theatre Royal for the "Rockstage" programme, broadcast on UK television by ATV on 4 April 1981.
During August and September 1980, the band were at Jackson's Studios in Rickmansworth, recording with producer Vic Maile. The "Ace of Spades" single was released on 27 October 1980 as a preview of the "Ace of Spades" album, which followed on 8 November. The single reached No. 15 and the album reached No. 4 on the UK charts. Bronze celebrated its gold record status by pressing a limited edition of the album in gold vinyl.
Motörhead made an appearance on "Top of the Pops" in November that year with "Ace of Spades", and between 22 October and 29 November the band were on their "Ace Up Your Sleeve" UK tour with support from Girlschool and Vardis, and also made an appearance as guests on the ITV children's show "Tiswas" on 8 November. The "Arizona desert-style" pictures used on the album sleeve and tour booklet cover were taken during a photo session at a sandpit in Barnet. "Ace of Spades", considered to be the definitive Motörhead anthem, "put a choke on the English music charts and proved to all that a band could succeed without sacrificing its blunt power and speed".
To coincide with the "Ace of Spades" release, Big Beat, who had inherited the Chiswick catalogue, put together four unused tracks from the Escape Studios sessions in 1977 and released them as "Beer Drinkers and Hell Raisers", which reached No. 43 on the UK Singles Chart in November.
The band had more chart hits in 1981 with the releases "St. Valentine's Day Massacre" EP, their collaboration with Girlschool which reached No. 5 on the UK Singles Chart in February; the live version of "Motorhead", which reached No. 6 on the UK Singles Chart in July; and the album it was taken from, "No Sleep 'til Hammersmith", which reached No. 1 on the UK Albums Chart in June. During March 1981, the band had been touring Europe, and in the final week of the month they conducted the "Short Sharp, Pain in the Neck" UK tour from which the recordings for "No Sleep 'til Hammersmith" were made.
From April through to July, the band toured North America for the first time as guests of Blizzard of Ozz, an early incarnation of Ozzy Osbourne's band, but were still able to make an appearance on "Top of the Pops" on 9 July to promote the live "Motorhead" single. In October the band recorded tracks at BBC's Maida Vale studio for the David Jensen show broadcast on 6 October. The band commenced a European tour on 20 November, supported by Tank, after which Clarke produced Tank's debut album "Filth Hounds of Hades" at Ramport Studios in December and January.
Between 26 and 28 January 1982, the band started recording their self-produced new album at Ramport Studios, before moving onto Morgan Studios to continue the sessions throughout February. On 3 April the single "Iron Fist" was released, reaching No. 29 on the UK Singles Chart, followed by the parent album "Iron Fist", released on 17 April and peaking at No. 6 on the UK Albums Chart. They were the last releases to feature the Lemmy, Clarke, Taylor line-up, though the line-up continued to perform in the "Iron Fist" UK tour between 17 March and 12 April, and the band's first headlining North America tour from 12 May until Clarke's last engagement at the New York Palladium on 14 May.
Clarke left as a consequence of the band recording "Stand By Your Man", a cover version of the Tammy Wynette classic, in collaboration with Wendy O. Williams and the Plasmatics. Clarke felt that the song compromised the band's principles, refused to play on the recording and resigned, later forming his own band, Fastway. Lemmy and Taylor made numerous telephone calls to find a guitarist, including one to Brian Robertson, formerly with Thin Lizzy, who was recording a solo album in Canada. He agreed to help out and complete the tour with them. Robertson signed a one-album deal resulting in 1983's "Another Perfect Day" and the two singles from it, "Shine" and "I Got Mine".
In June and July the band played five dates in Japan, and from mid-October until mid-November they toured Europe. From late May until early July, the band conducted the "Another Perfect Tour", followed by an American tour between July and August, and another European tour in October and November. Robertson began to cause friction in the band as a result of his on-stage attire, consisting of shorts and ballet shoes, and with his pointblank refusal to play the old standards that every Motörhead audience expected to hear. This led to an amicable agreement that Robertson would leave, playing his last engagement with the band at the Berlin Metropol on 11 November.
After Robertson's departure in 1983, the band were sent tapes from all over the world from potential guitarists. The group returned to the concept of dual lead guitars by hiring unknowns Würzel and Phil Campbell (formerly of Persian Risk). In February 1984, the Lemmy, Campbell, Würzel, and Taylor line-up recorded "Ace of Spades" for the "Bambi" episode in the British television series, "The Young Ones". Scenes of the band playing are interspersed with the characters' antics as they rush to the railway station, in a parody of The Beatles' comedy film "A Hard Day's Night". Taylor quit the band after that recording, causing Lemmy to quip: "Did I leave them or did they leave me?". Before joining Motörhead, Phil Campbell had met former Saxon drummer Pete Gill, and the trio decided to call him to see if he would like to visit London. The try-outs went well and Gill was hired.
Bronze Records thought the new line-up would not make the grade and decided to "nail down the lid" on the group with a compilation album. When Lemmy found out, he took over the project, selecting tracks, providing sleeve notes and insisting that Motörhead record four brand new tracks to go at the end of each side of the album. During the sessions between 19 and 25 May 1984 at Britannia Row Studios, London, the band recorded six tracks for the single's B-side and the album. The single "Killed by Death" was released on 1 September and reached No. 51 in the UK Singles Chart; the double album "No Remorse", released on 15 September, reached silver disc status and No. 14 in the UK Albums Chart.
The band were involved in a court case with Bronze over the next two years, believing that their releases were not being promoted properly, and the record company banned them from the recording studio. The band looked to more touring for income; Australia and New Zealand in late July to late August, a brief tour of Hungary in September, and the "No Remorse" "Death on the Road" tour between 24 October and 7 November. On 26 October the band made a live appearance on the British Channel 4 music programme The Tube, performing "Killed By Death", "Steal Your Face" (over which the programme's end-credits were played) and the unbroadcast "Overkill", before going on to their next engagement that evening. From 19 November to 15 December the band toured America with Canadian speed metal band Exciter and Danish heavy metal band Mercyful Fate and from 26 to 30 December performed five shows in Germany.
On 5 April 1985, ITV broadcast four songs that were recorded after the band went off air on their earlier appearance on "The Tube" programme. A week later the band, dressed in tuxedos, played four songs on the live Channel 4 music show "ECT" (Extra-Celestial Transmission). To celebrate the band's 10th anniversary, two shows were arranged at Hammersmith Odeon on 28 and 29 June; the second show was filmed and later released as "The Birthday Party". From early June until early August the band were on their 'It Never Gets Dark' tour of Sweden and Norway, and an American tour followed from mid-November until late December.
From 26 March to 3 April 1986, the band toured Germany, the Netherlands and Denmark on their "Easter Metal Blast" and in June, played two dates in Bologna and Milan in Italy. The court case with Bronze was finally settled in the band's favour. The band's management instigated their own label, GWR. Recording took place in Master Rock Studios, London and the single "Deaf Forever" was released on 5 July as a taster for the "Orgasmatron" album, which was released on 9 August. On the same day as the release of the album, Lemmy and Würzel were interviewed by Andy Kershaw on the BBC Radio 1 "Saturday Live" show and "Orgasmatron" and "Deaf Forever" were played. The single reached No. 67 and the album reached No. 21 in the UK charts.
On 16 August, the band played at the Monsters of Rock at Castle Donington and was recorded by BBC Radio 1 for a future "Friday Rock Show" broadcast. The performance closed with a flypast by a couple of Second World War German aircraft. Also that day Lemmy was filmed giving his views on spoof metal act "Bad News" for inclusion in a Peter Richardson Comic Strip film entitled "More Bad News" since the band featuring Rik Mayall, Peter Richardson, Nigel Planer and Adrian Edmondson were also performing at Donington. In September the band conducted their "Orgasmatron" tour in Great Britain, supported by fledgling act Zodiac Mindwarp and the Love Reaction. In October they toured America and in December were in Germany.
In 1987, during the filming of "Eat the Rich" – in which Lemmy was taking a starring role alongside well-known comedy actors such as Robbie Coltrane, Kathy Burke, the regulars from The Comic Strip ensemble, and various other musician cameo appearances – Gill left the band and Taylor returned to appear in the band's cameo as "In House Club Band" alongside Würzel and Campbell. The band wrote "Eat the Rich" especially for the film, its soundtrack featured tracks from "Orgasmatron" and Würzel's solo single "Bess". The band's second album for GWR was "Rock 'n' Roll", released on 5 September, after a tight work schedule in the studio. While having some popular tracks and using "Eat the Rich" as its second track, the band commented that the album was virtually "nailed together".
On 2 July 1988 Motörhead were one of the performers at the Giants of Rock Festival in Hämeenlinna, Finland. The tracks were released as "No Sleep at All" on 15 October. A single from the album was planned with the band wanting "Traitor" as the A-side, but "Ace of Spades" was chosen instead. When the band noticed the change, they refused to allow the single to be distributed to the shops, and it was withdrawn and became available only on the "No Sleep at All" tour and through the "Motörheadbangers" fan club. While they continued to play live shows during 1989 and 1990, Motörhead once again felt unhappy with their career, and a court case with GWR followed, which was not resolved until mid-1990.
With the court case resolved, Motörhead signed to Epic/WTG and spent the last half of 1990 recording a new album and single in Los Angeles. Just prior to the album sessions, the band's former manager, Doug Smith, released the recording of the band's 1986 10th-anniversary show against the band's wishes, despite having been told that they did not want it released. In the studio the band recorded four songs with producer Ed Stasium before deciding he had to go.
When Lemmy listened to one of the mixes of "Going to Brazil", he asked Stasium to turn up four of the tracks, and on doing so heard claves and tambourines that Stasium had added without the band's knowledge. Stasium was fired and Peter Solley was hired as producer. According to Stasium, however, Lemmy's drug and alcohol intake had far exceeded the limits of his patience, so he quit. The single "The One to Sing the Blues", issued on 24 December 1990 (7" and CD) and 5 January 1991 (12"), was followed by the album "1916" on 21 January. The single, issued in 7", cassette, shaped picture disc, 12" and CD formats, reached No. 45 in the UK Singles Chart; the album reached No. 24 in the UK Albums Chart.
The band conducted their "It Serves You Right" tour of Britain in February, the "Lights Out Over Europe" tour followed, lasting until early April, when the band returned to Britain to play another six venues. In June the band played five dates in Japan and five dates in Australia and New Zealand. Between July and August, they played across the United States with Judas Priest, Alice Cooper, Metal Church and opener Dangerous Toys on the "Operation Rock 'n' Roll" tour. The band finished the year with six dates in Germany during December.
On 28 March 1992, the band played what would turn out to be Taylor's last engagement, at Irvine Meadows in Irvine, California. The band had been urging Lemmy to get rid of their manager, Doug Banker, for some time; after an unsolicited visit from Todd Singerman, who insisted he should manage them despite never having managed a band before, the band met with Singerman and took him on, firing Banker. In the midst of this, the band were recording an album at Music Grinder Studios in east Hollywood during the 1992 Los Angeles riots. Three drummers participated in the making of the "March ör Die" album: Phil Taylor, who was fired because he did not learn the drum tracks on the song "I Ain't No Nice Guy"; Tommy Aldridge, who recorded most of the material on the album; and Mikkey Dee, who recorded "Hellraiser", a song originally written by Lemmy for Ozzy Osbourne's "No More Tears" album. "March ör Die" features guest appearances by Ozzy Osbourne and Slash.
Lemmy had known Mikkey Dee from the time when King Diamond had toured with Motörhead. He had asked Dee to become Motörhead's drummer before, but Dee had declined due to his commitment to King Diamond. On this occasion, Dee was available and met the band to try out. Playing the song "Hellraiser" first, Lemmy thought "he was very good immediately. It was obvious that it was going to work." After recording "Hellraiser" and "Hell on Earth" in the studio, Dee's first engagement with Motörhead was on 30 August at the Saratoga Performing Arts Center. The new line-up then went on tour, playing dates with Ozzy Osbourne, Skew Siskin and Exodus. On 27 September, the band played at the Los Angeles Coliseum with Metallica and Guns N' Roses. The band toured Argentina and Brazil during October and conducted the "Bombers and Eagles in '92" tour of Europe with Saxon throughout December.
Motörhead played two dates at the Arena Obras Sanitarias in Buenos Aires in April 1993 and toured Europe from early June until early July, returning to the United States to play one show at the New York Ritz on 14 August. A new producer was sought for the band's next album, and eventually Howard Benson, who would produce the band's next four albums, was chosen. The band recorded at A&M Studios and Prime Time Studios in Hollywood, and the resulting album, "Bastards", was released on 29 November 1993. The single "Don't Let Daddy Kiss Me" included the song "Born to Raise Hell", which also appeared on the album; the latter was later re-recorded with vocals from Ice-T and Ugly Kid Joe frontman Whitfield Crane for the soundtrack of the film "Airheads" (in which Lemmy made a cameo appearance) and released as a single in its own right. Although "Bastards" received airtime, the record company ZYX Music would not pay for promotional copies, so the band sent out copies themselves. A further tour of Europe was made throughout December that year.
In February and March 1994, Motörhead toured the United States with Black Sabbath and Morbid Angel. In April the band resumed their tour of the States until early May, playing an engagement with the Ramones on 14 May at the Estadio Velez in Buenos Aires, attracting a crowd of 50,000 people. The band toured Japan in late May and Europe in June, August and December.
The band's 1995 touring schedule began in Europe in late April. In June, they went on a second tour with Black Sabbath, this time supported by Tiamat, until the band succumbed to influenza and headed back to Los Angeles and Cherokee Studios in Hollywood, where they were to record an album. During the sessions it became clear that Würzel was not extending himself, and he left the band after the recording. The title track from the album, "Sacrifice", was later used in the film "Tromeo and Juliet", in which Lemmy appears as the narrator. The band decided to continue as a three-man line-up, and a tour of Europe was performed throughout October and the first two days of November, with a three-day tour of South America the week after. Lemmy celebrated his 50th birthday later that year with the band at the Whisky a Go Go in Los Angeles; Metallica played at the event under the name "The Lemmys".
In 1996, the band began touring the States in early January and played 30 venues up to 15 February; a seven-date tour of Europe in June and July was followed by two engagements in South America during August. A tour of the United States with Belladonna and Speedball began with two shows (Los Angeles & Hollywood) in early October 1996 and concluded in Washington on 4 December. During this time the band had recorded "Overnight Sensation", at Ocean Studio and Track House Recording Studio. The album was released on 15 October, the first official album of the band as a three-piece since "Another Perfect Day" and the best distributed album the band had had for years. The band concluded the year's touring with 13 dates in Germany.
During 1997, the band toured extensively, beginning with the first leg of the "Overnight Sensation" tour in Europe on 12 January at the London Astoria, where the guest musicians were Todd Campbell, Phil Campbell's son, on "Ace of Spades" and "Fast" Eddie Clarke for "Overkill". The European leg lasted until March and was followed by four dates in Japan, from late May to 1 June, and an American tour with W.A.S.P. throughout the rest of June. In August, three dates in Europe were followed by seven dates in Britain, which ended with a show at the Brixton Academy on 25 October, where the guest musician was Paul Inder, Lemmy's son, for "Ace of Spades". A further four dates in October in Russia concluded the year 1997.
Lemmy recalled that the touring was going particularly well, with some countries like Argentina and Japan putting the band in larger venues, and the English promoters discovered that "they could turn a nice profit with Motörhead shows". In his opinion, the three-piece line-up was performing excellently and it was high time they made another live record. The band did eventually, but made another studio album first, "Snake Bite Love", recorded in various studios and released on 3 March 1998.
The band joined with Judas Priest at the Los Angeles Universal Amphitheatre on 3 April, to begin their "Snake Bite Love" tour. On 21 May, Motörhead were recorded at The Docks in Hamburg. The tracks from this performance were later released as "Everything Louder Than Everyone Else". The band were invited to join the Ozzfest Tour and played dates across the States during early July until early August and were in Europe from early October until late November. The British leg of the tour was dubbed the "No Speak With Forked Tongue" tour and included support bands Groop Dogdrill, Radiator and Psycho Squad, which was fronted by Phil Campbell's son Todd.
In 1999 Motörhead toured the States between 20 April and 2 June, before going to Karo Studios in Brackel, Germany to record their next album, "We Are Motörhead", which was released in May the following year. While the album sessions took place, the band played venues around Europe, the first of which was at the Fila Forum in Assago, near Milan, where Metallica's James Hetfield joined the band on stage to play "Overkill". In October and early November, the band toured the States with Nashville Pussy. Throughout the rest of November, the band conducted their European "Monsters of the Millennium" tour with Manowar, Dio and Lion's Share, ending the millennium with two shows at the London Astoria. The two shows were billed under the "Kerrang!" "X-Fest" banner; at the first show the band were supported by Backyard Babies, and at the second show guest vocals on "Born to Raise Hell" were provided by Skin from Skunk Anansie and Nina C. Alice from Skew Siskin, while Ace from Skunk Anansie played "Overkill" with the band.
In May 2000, the release of "We Are Motörhead" and the single from it, a cover of the Sex Pistols' "God Save the Queen", coincided with the start of the band's "We Are Motörhead" tour across South and North America during May and June, with a further nine shows across Europe in July. Shows in the United States and France were followed by the release of a double-disc compilation album, "The Best Of", on 26 August.
Four dates in Japan preceded the band's 25th anniversary concert on 22 October at the Brixton Academy in London, where guest appearances were made by "Fast" Eddie Clarke, Brian May, Doro Pesch, Whitfield Crane, Ace, Paul Inder and Todd Campbell. The show also featured the return of the Bomber lighting rig. The event was filmed and released the following year as the "25 & Alive Boneshaker" DVD, and the CD of the show, "Live at Brixton Academy", was released two years after that. Lemmy states the reason for the DVD as wanting "to record it for the posterity or whatever it is. I nodded off through the 10th anniversary, we never did anything on the 20th, so the 25th made sense."
A tour of Western and Eastern Europe followed the anniversary concert, taking the band through October, November and December. The schedule for the Eastern European tour was brutal, involving two 18-hour drives back-to-back and little time off; at the Warsaw venue the band did not arrive until 11 o'clock, and the crew were still loading into the venue at one in the morning while the fans waited.
After taking a month off, the band began working on a new album at Chuck Reid's house in the Hollywood Hills. This album, "Hammered", was released the following year. On 1 April 2001, the band gave a one-song performance for Triple H's entrance at WrestleMania X-Seven at the Reliant Astrodome in Houston. The second leg of the "We Are Motörhead" tour began in May in Ireland, moving across to the United Kingdom. In Manchester, the band were supported by Goldblade, and by Pure Rubbish at the two London shows. The second London show also included Backyard Babies and Paul Inder, who was guest musician for "Killed By Death". Between June and August, Motörhead played at a number of rock festivals in Europe, including the Graspop Metal Meeting in Belgium, the Quart Festival in Norway, and the Wacken Open Air on 4 August, where four songs were recorded for the "25 & Alive Boneshaker" DVD. The band returned to the States for a seven-show tour between late September and early October.
In April 2002, a DVD of some of Motörhead's performances from the 1970s and 1980s along with some stock footage of the band was released as "The Best of Motörhead". Two weeks earlier, the "Hammered" album was released and supported by the "Hammered" tour, which kicked off in the States at around the same time. The United States dates continued until late May, and a European leg followed between June and August. In October, the band played five dates in Great Britain with Anthrax, Skew Siskin and Psycho Squad. The final venue was the Wembley Arena in London, where instead of Psycho Squad, the band were supported by Hawkwind, with Lemmy performing "Silver Machine" on stage with them. Throughout the rest of October and better part of November, the band were on a European tour with Anthrax.
In April and May 2003, the band continued to promote the "Hammered" album in the States; on the three dates Phil Campbell had to miss following his mother's death, Todd Youth stood in for him. Between late May and mid-July the band played seven dates at summer festivals in Europe, and from late July until the end of August they toured the United States with Iron Maiden and Dio. On 1 September 2003, the band returned to Hollywood's Whisky a Go Go club for their Hollywood Rock Walk of Fame induction, and on 7 October a comprehensive five-disc collection of the band's recordings covering 1975–2002 was released as "Stone Deaf Forever!". During October, the band performed a tour of Great Britain with The Wildhearts and Young Heart Attack, played seven shows across Belgium, the Netherlands and Spain between 21 and 28 October, and from late November until early December toured Germany and Switzerland with Skew Siskin and Mustasch. On 9 December, the previously recorded "Live at Brixton Academy" album was released.
Motörhead performed an invitation-only concert at the Royal Opera House in Covent Garden, London on 22 February 2004; at summer festivals in South America during May; and in Europe during June, July and August. They had already spent time in the studio working on "Inferno", which was released on 22 June and followed by the "Inferno" tour of Ireland with Class of Zero for three dates. Joined by Sepultura, the tour moved on to Great Britain. Part of the London show at the Hammersmith Apollo was filmed for TV, with Gene Simmons introducing the extra opening act The Class, made up of schoolchildren from his Channel 4 series "Rock School"; Würzel guested on "Overkill". The band continued the tour with Sepultura across Europe through the rest of November and December. At the show in Magdeburg, Germany on 4 December, Motörhead joined Sepultura on stage during their support slot to play "Orgasmatron", in celebration of Sepultura's 20th anniversary. The show on 7 December at the Philipshalle in Düsseldorf was recorded and later released as the "Stage Fright" DVD.
Motörhead picked up their first Grammy at the 2005 awards, in the Best Metal Performance category, for their cover of Metallica's "Whiplash" on "". "They've managed to get the knife in," Lemmy grumbled. "It was only a mercy fuck – it was our 30th anniversary. If they gave us a Grammy for one of our albums or songs, it would mean something."
From March until early May 2005, the band toured the United States, and in June and August were on the "30th Anniversary" tour in Europe. On 22 August, they were the subject of an hour-long documentary, "Live Fast, Die Old", aired on Channel 4 as part of "The Other Side" series of documentaries, filmed by new and established directors.
On 20 September, a compilation containing the band's appearances on BBC Radio 1 and a concert recording from the Paris Theatre, London, was released as "BBC Live & In-Session". In October, the band toured Europe with Mondo Generator before returning to Britain to tour with In Flames and Girlschool in October and November. During the show at the Brixton Academy on 19 November, Lemmy joined Girlschool on stage to play "Please Don't Touch". Motörhead finished the year's tours in December, with two engagements in New Zealand and five in Australia with Mötley Crüe. Also in 2005, Motörhead played at Böhse Onkelz's Vaya Con Tioz farewell festival at the Lausitzring. In 2006, the band performed a four-date House of Blues tour in the States in March with Meldrum and from June until early August played at European open-air festivals with some indoor headlining shows. On 28 October, the band performed at The Rock Freakers Ball in Kansas City before heading off to tour Great Britain with Clutch and Crucified Barbara.
During that tour, "Kiss of Death" was released on 29 August 2006 via Sanctuary Records, with a video for "Be My Baby". The tour ended on 25 November at the Brixton Academy, where Phil Campbell played on "Killed By Death" during Crucified Barbara's support set. Twelve shows in Europe with Meldrum took them through the end of November to early December, the first two shows also featuring Skew Siskin.
In November, the band agreed to a sponsorship deal with the Greenbank B under-10s football team from North Hykeham, Lincoln, putting the band's name, as well as War-Pig, on the team's shirts; the under-10s run out to "Ace of Spades". Lemmy is an old friend of Gary Weight, the team's manager; Weight "sent an email off to them and they came back and said it was a great idea" and hoped the deal would draw inspired performances from his team. On 25 April 2007, the band played at the Poliedro de Caracas in Caracas, Venezuela, and on 29 April at the Fundição Progresso, Rio de Janeiro. In June, Motörhead played an engagement at the Royal Festival Hall as part of Jarvis Cocker's Meltdown. On 26 February 2008, "No Sleep 'til Hammersmith" was reissued again as a two-disc CD.
From March through to June 2008, the band convened in Los Angeles with producer Cameron Webb to begin work on their 19th album "Motörizer". Mikkey Dee's drum tracks were recorded at Dave Grohl's studio. "Motörizer" was released on 26 August. It does not feature artwork by Joe Petagno, the artist who designed many of their classic covers. In June 2008 the band performed on the main stage of the Download festival. Between 6 and 31 August, Motörhead joined Judas Priest, Heaven & Hell and Testament on the Metal Masters Tour. On 20 August the band played at the Roseland Ballroom, New York, as part of "The Volcom Tour 2008", which continued with The Misfits, Airbourne, Valient Thorr and Year Long Disaster at House of Blues, Anaheim, California on 2 September, playing a further thirteen dates. The band concluded the tour without the supporting bands, playing one more show at the Roseland Ballroom on 20 September, and the final engagement, at The Stone Pony, Asbury Park, New Jersey on 21 September.
On 30 September, Reuters reported that Neverdie Studios had signed a deal with Lemmy and Motörhead to develop and market Lemmy's Castle and Motorhead Stadium inside the virtual world of Entropia Universe, an online universe. The year's touring ended with a 34-date tour of Europe with a variety of support bands including Danko Jones, Saxon, Witchcraft, and Airbourne.
On 6 March 2009, the band played in the Middle East for the first time, at the annual Dubai Desert Rock Festival. On 1 April Motörhead were reported to have entered into a two-year sponsorship deal with UK Roller Derby team the Lincolnshire Bombers Roller Girls. That September, noted drummer Matt Sorum filled in for Mikkey Dee for a U.S. tour. "I was absolutely blown away and was very honoured to get the call," Sorum said. "You know what I love about Lemmy? He's always on time. We go on stage, no delays. Being in bands where you have to wait around for a couple of hours fucks you up."
In November 2009, the band were supported by NWOBHM veterans Sweet Savage on the Irish leg of the tour (30 years after first sharing the stage together) and punk and goth rock legends The Damned on the UK leg of their world tour. On The Damned's official website, Captain Sensible said: "Ha ha ... we're working with Lemmy again, are we? Excellent! He's the real deal, the absolute antithesis to all that the likes of Simon Cowell stand for. And for that we should all be grateful. This tour will be a celebration of all things rock 'n' roll ... pity the poor roadies is all I can say!"
In a November 2009 interview with ABORT Magazine's E.S. Day, Lemmy said that Motörhead would enter the studio in February 2010 "to rehearse, write and record" their 20th studio album, to be released by the end of the year. The album was recorded with Cameron Webb and Welsh producer Romesh Dodangoda in Longwave Studio, Cardiff. In an interview with Hungarian television in July 2010, drummer Mikkey Dee announced that the album was finished, with 11 tracks. The album's name was said to be "The Wörld Is Yours". On 3 November 2010, Future plc, a UK media company, announced that Motörhead were to release "The Wörld is Yours" via an exclusive publishing deal with "Classic Rock" magazine on 14 December 2010. The standard CD release of "The Wörld is Yours" would go on sale on 17 January 2011, through Motörhead's own label, Motörhead Music.
To coincide with the release of their upcoming album, Motörhead embarked on a 35th Anniversary UK tour, from 8–28 November 2010, and a European tour from 30 November 2010 – 19 December 2010. They also took their tour to the Americas in 2011. In October, the band recorded a slow blues version of their longtime hit "Ace of Spades" for a TV spot for Kronenbourg beer. On 5 December the single "Get Back in Line" was released, followed by the release of a video for the single on 6 December. In December, Mikkey Dee stated to French journalists that Motörhead are planning to release a box-set with several DVDs in 2011. He did not give any details but said that it will come in a "beautiful package including many surprises".
On 17 January 2011, it was announced that Motörhead would be part of the Sonisphere Festival in Knebworth. In August 2011, they headlined the Brutal Assault open-air festival in the Czech Republic. On 2 March 2011 Motörhead performed on "Late Night with Jimmy Fallon".
On 9 July 2011, former guitarist Würzel died of a heart attack. In celebration of 35 years' touring, in late 2011 the band released the live DVD "The Wörld Is Ours – Vol 1 – Everywhere Further Than Everyplace Else", including performances at the O2 Apollo Manchester, Best Buy Theater, New York City and Teatro Caupolicán, Santiago de Chile. On 19 December 2011, it was announced that Motörhead would play at the German festivals Rock am Ring and Rock im Park in Nürburgring and Nuremberg respectively in June 2012. On 12 January 2012, it was announced that Motörhead were touring the US and Canada in early 2012, along with three other metal bands: Megadeth, Volbeat and Lacuna Coil. The Gigantour took place from 26 January to 28 February 2012, but Motörhead missed the final four shows because Lemmy had a combination of an upper respiratory viral infection and voice strain, resulting in severe laryngitis. Lemmy wrote on Facebook, "I'm giving my voice a good rest", hoping he would recover soon to play at the Mayhem Festival, which was held from 30 June to 5 August 2012. Motörhead also took part on 23 June in the Rock-A-Field Luxembourg Open Air Festival in Roeser.
In an April 2012 interview with Classic Rock Revisited, Lemmy was asked if Motörhead were planning to make a follow-up to "The Wörld Is Yours". He replied, "We have not started writing any songs yet but we will. We put out an album out every two years. I will continue to do that as long as I can afford an amp." On 28 June 2012, Lemmy told Auburn Reporter that Motörhead will release their next album in 2013 and they had written "about 6 songs so far." On 23 October 2012, Lemmy told Billboard.com that the band had planned to enter the studio in January to begin recording the album for a mid-2013 release. On 28 February 2013, it was announced that Motörhead had begun recording their new album. Motörhead released the live DVD "The Wörld Is Ours – Vol. 2 – Anyplace Crazy As Anywhere Else" in September 2012. On 18 June 2013, the new album's title was revealed to be "Aftershock".
In mid-November 2013, Motörhead were due to embark on a European tour alongside Saxon, followed by a tour of Germany and Scandinavia lasting until mid-December 2013, but the dates were postponed and rescheduled for February and March 2014 due to Lemmy's health problems. In January 2014, Motörhead cancelled the rescheduled February and March dates as well, as Lemmy had yet to fully recover from diabetes-related health problems. The same month, the band was confirmed for the Coachella Festival, held across two weekends in spring 2014 (12–14 and 19–21 April) in Indio, California; their dates were later revealed to be 13 and 20 April 2014. In February 2014, Motörhead confirmed a summer 2014 tour with eight European dates (from 24 June to 10 August) in France (two dates), Switzerland, Italy, Germany (two dates), Russia and Ukraine. In March 2014, the band announced a Los Angeles date on 11 April 2014 at Club Nokia; two further dates, on 17 and 18 April 2014 in Las Vegas (the Pearl) and San Francisco (the Warfield) respectively, were later added. Also in March 2014, Motörhead announced that they, Megadeth and Anthrax would perform from 22 to 26 September 2014 on the first annual Motörhead's Motörboat cruise aboard the Carnival Ecstasy (self-proclaimed "The Loudest Boat in the World"), due to sail from Miami and visit the ports of Key West and the island of Cozumel, just off Mexico's Yucatán Peninsula.
In a September 2014 interview on Full Metal Jackie, Lemmy stated that Motörhead would "probably" enter the studio in January 2015 to start work on their 22nd studio album for a tentative late-2015 release. On 25 February 2015, Motörhead officially confirmed that they were in the studio in Los Angeles recording the new album with longtime producer Cameron Webb. On 27 May 2015, the band released teasers on their Facebook page bearing the Roman numeral "XXXX". On 4 June the new album, "Bad Magic" (which would be their last), was launched for pre-order on Amazon, revealing its title and cover art, which also shows the "XXXX", coinciding with the band's 40th anniversary. The album was released on 28 August 2015. The band performed at the UK's Glastonbury Festival in June 2015. Their final UK gig was at the Eden Project on 27 June 2015.
While touring the album as the "40th Anniversary Tour", Motörhead had to cut short their Salt Lake City show on 27 August 2015 due to Lemmy's breathing problems, the result of altitude sickness in the Rocky Mountains, and then cancel their Denver Riot Fest set on 28 August 2015 outright. The tour picked up again on 1 September 2015 at Emo's in Austin, Texas (moved from the Cedar Park Center), but the group were again forced to abandon their set after three songs and to cancel the subsequent shows, from the 2 September 2015 show in San Antonio, Texas through the 5 September 2015 show in Houston, Texas.
Despite the ongoing health issues that forced Motörhead to cut short or cancel several US shows, Lemmy was able to bounce back in time for the trio's annual Motörboat heavy metal cruise from Miami to the Bahamas, which ran from 28 September through 2 October 2015 and included performances by bands such as Slayer, Anthrax, Exodus, Suicidal Tendencies and Corrosion of Conformity. Motörhead performed two full (identical) sets, on 30 September and 1 October 2015.
Motörhead continued the "40th Anniversary Tour" in Europe in November and December. They played concerts in Germany, Sweden, Norway and Finland. Their final concert was in Berlin, Germany on 11 December 2015. After Lemmy's death, drummer Mikkey Dee spoke in an interview about him: "He was terribly gaunt. He spent all his energy on stage and afterwards he was very, very tired. It's incredible that he could even play, that he could finish the Europe tour. It was only 20 days ago. Unbelievable." The "40th Anniversary Tour" was planned to continue in January 2016 in the band's home country the UK, the first concert would have been in Newcastle on 23 January 2016.
On 28 December 2015, Lemmy died, four days after celebrating his 70th birthday. He was the second Motörhead member to die in 2015, following Phil Taylor the previous month. The band posted the following message on Facebook:
The following day, drummer Mikkey Dee confirmed that Motörhead would not continue, stating, "Motörhead is over, of course. Lemmy was Motörhead. We won't be doing any more tours or anything. And there won't be any more records. But the brand survives, and Lemmy lives on in the hearts of everyone." Two days after Lemmy's death, guitarist Phil Campbell also stated that "Motörhead is no longer".
A few days later, the band's long-time manager Todd Singerman told the press that Lemmy had experienced chest pains two days after his 70th birthday party (held at the Whisky a Go Go) and had visited the emergency room, but was released the next day. However, Singerman was concerned because Lemmy's speech was "getting bad" and took him for a brain scan. On 26 December the doctor came to Lemmy's apartment, "brought the results and told us all that he has two to six months to live". Lemmy reacted calmly. "He took it better than all of us", said Singerman. "His only comment was, 'Oh, only two months, huh?' The doctor goes, 'Yeah, Lem, I don't want to bullshit you. It's bad, and there's nothing anyone can do. I would be lying to you if I told you there was a chance.'" Plans were made to treat Lemmy at home. A video game console at the Rainbow Bar and Grill that Lemmy loved to play was brought to his apartment. On 28 December 2015, he spent hours on the console, and Rainbow owner Mikael Maglieri paid a visit. Lemmy died in his sleep later that day.
An autopsy on Kilmister showed that the causes of death were prostate cancer, a cardiac arrhythmia, and congestive heart failure.
Initially planned for 27 May 2016, "Clean Your Clock", a Motörhead archive live album containing material recorded at the 20 and 21 November 2015 shows at the Zenith in Munich, was released by UDR Music on 10 June 2016. On 1 September 2017, Motörhead released "Under Cöver", a covers album featuring covers from throughout Motörhead's history, along with covers previously only found on tribute albums, and new recordings.
Former Motörhead guitarist "Fast" Eddie Clarke died on 10 January 2018 after a battle with pneumonia at the age of 67, making him the last member of the band's classic lineup (following Taylor and Lemmy) to die.
Original Motörhead guitarist Larry Wallis died on 19 September 2019 at age 70 from an unknown cause.
In a biography of the band, senior editor for AllMusic, Stephen Thomas Erlewine, wrote: "Motörhead's overwhelmingly loud and fast style of heavy metal was one of the most groundbreaking styles the genre had to offer in the late '70s" and though "Motörhead wasn't punk rock ... they were the first metal band to harness that energy and, in the process, they created speed metal and thrash metal." Whether they created these genres might be subject to debate, but Motörhead were unquestionably influential.
Although Motörhead is often considered a heavy metal band, Lemmy always described Motörhead's music as simply "rock and roll". In 2011, he said: "We were not heavy metal. We were a rock 'n' roll band. Still are. Everyone always describes us as heavy metal even when I tell them otherwise. Why won't people listen?" In 2014, he reiterated to "Der Spiegel" that he did not particularly like heavy metal.
Lemmy had stated that he generally felt more kinship with punk rockers than with heavy metal bands: Motörhead had engagements with fellow Brits, The Damned, with whom he played bass on a handful of late 1970s engagements, as well as having penned the song "R.A.M.O.N.E.S." as a tribute to the Ramones. Motörhead, Lemmy stated, have more in common aesthetically with The Damned than Black Sabbath, and nothing whatsoever in common with Judas Priest. Lemmy said he felt little kinship with the speed metal bands Motörhead have inspired:
The "NME" stated that their brief solos were just long enough "... to open another bottle of beer", while a 1977 "Stereo Review" commented that "they know they're like animals, and they don't want to appear any other way. In view of the many ugly frogs in heavy metal who think they are God's gift to womankind these Quasimodos even seem charming in their own way". Motörhead's approach has not changed drastically over the band's career, though this is a deliberate choice: erstwhile Motörhead drummer Phil "Philthy Animal" Taylor said that rock icons like Chuck Berry and Little Richard never drastically altered their style, and, like them, Motörhead preferred to play what they enjoyed and did best. This fondness for the first decade of rock and roll (mid-1950s to mid-1960s) is also reflected in some of Motörhead's occasional cover songs from that era.
Lemmy often played power chords in his basslines. When asked about whether he had begun as a rhythm guitarist, he stated:
The name "Motörhead" is a reference to users of the drug amphetamine. The band's distinctive fanged-face logo, with its oversized boar's tusks, chains, and spikes, was created by artist Joe Petagno in 1977 for the cover of the "Motörhead" album and has appeared in many variations on covers of ensuing albums. The fanged face has been referred to variously as "War-Pig" and "Snaggletooth". The band's name is usually printed in a lowercase form of blackletter. The umlaut character ö is possibly derived from the similar "heavy metal umlaut" in the name of their 1975 acquaintances Blue Öyster Cult. However, this umlaut does not alter the pronunciation of the band's name. When asked if Germans pronounced the band "Motuuuurhead", Lemmy answered "No, they don't. I only put it in there to look mean".
Snaggletooth is the fanged face that serves as the symbol of Motörhead. Artist Joe Petagno drew it in 1977 for the cover of the band's debut album (with designer Phil Smee who turned it into a negative and did the lettering to complete the logo), having met Lemmy while doing some work with Hawkwind. Petagno stated:
Eddie Clarke was less sure about the imagery to begin with:
It has remained a symbol of Motörhead throughout the years, with Petagno creating many variations of Snaggletooth, or War-Pig as some have called it, for the covers of ensuing albums. Only two of the original covers for Motörhead's 22 studio albums do not feature any variation of War-Pig: "On Parole" and "Overnight Sensation". "On Parole" was never sanctioned by the band and was in any case reissued with a black Snaggletooth on a white background. Phil is wearing a Snaggletooth badge on the cover of "Ace of Spades". The cover of "Iron Fist" depicts a metal gauntlet wearing four skull-shaped rings, one of which is Snaggletooth, while the rear of the album sleeve shows a fully detailed 3-D metal sculpture of the symbol. Originally the Snaggletooth design included a swastika on one of the helmet's spikes; this was painted out on later re-releases of the albums on CD.
On 21 September 2007, Petagno announced that "there will be no more "HEADS" from my hand", citing irreconcilable differences between himself and the band's current management, Singerman Entertainment. Petagno stated:
In reply, Lemmy stated:
Motörhead are well known in the professional wrestling world for performing wrestler Triple H's entrance music, "The Game", which he has used as his entrance music since January 2001. In addition to the song playing whenever Triple H appears on WWE programming such as "Raw" or "SmackDown", and at other pay-per-view wrestling events, the band have performed the song live for him at WrestleMania X-Seven and WrestleMania 21. Their song "Rock Out" was also used as the theme song of the WWE pay-per-view Unforgiven in 2008. Motörhead also provided the entrance music for Triple H's faction Evolution, entitled "Line in the Sand". "The Game" was released on both the American version of the "Hammered" and "WWF The Music, Vol. 5" albums, and "Line in the Sand" was released on the "" album. Motörhead have since performed a new entrance track for Triple H, entitled "King of Kings", which made its debut at WrestleMania 22. Triple H has also introduced the band in concert. Lemmy inspired Triple H's facial hair, and Triple H spoke at Lemmy's funeral.
Classic Rock Roll of Honour Awards
Echo Awards
Grammy Awards
Kerrang Awards
Metal Hammer Awards (Germany)
Metal Hammer Golden Gods Awards (UK)
Revolver Music Awards
Rock and Roll Hall of Fame
https://en.wikipedia.org/wiki?curid=20907
Multiverse
The multiverse is a hypothetical group of multiple universes. Together, these universes comprise everything that exists: the entirety of space, time, matter, energy, information, and the physical laws and constants that describe them. The different universes within the multiverse are called "parallel universes," "other universes," "alternate universes," or "many worlds."
Early recorded examples of the idea of infinite worlds existed in the philosophy of ancient Greek atomism, which proposed that infinite parallel worlds arose from the collision of atoms. In the third century B.C., the philosopher Chrysippus suggested that the world eternally expired and regenerated, effectively suggesting the existence of multiple universes across time. The concept of multiple universes became more defined in the Middle Ages. Fakhr al-Din al-Razi of the Islamic Golden Age explored the probable existence of the multiverse based on general principles of the Quran, which states that Allah is "Lord of the Worlds".
In Dublin in 1952, Erwin Schrödinger gave a lecture in which he jocularly warned his audience that what he was about to say might "seem lunatic". He said that when his equations seemed to describe several different histories, these were "not alternatives, but all really happen simultaneously". This sort of duality is called "superposition."
The American philosopher and psychologist William James used the term "multiverse" in 1895, but in a different context. The term was first used in fiction and in its current physics context by Michael Moorcock in his 1963 SF Adventures novella "The Sundered Worlds" (part of his Eternal Champion series).
Multiple universes have been hypothesized in cosmology, physics, astronomy, religion, philosophy, transpersonal psychology, music and all kinds of literature, particularly in science fiction, comic books and fantasy. In these contexts, parallel universes are also called "alternate universes", "quantum universes", "interpenetrating dimensions", "parallel dimensions", "parallel worlds", "parallel realities", "quantum realities", "alternate realities", "alternate timelines", "alternate dimensions" and "dimensional planes".
The physics community has debated the various multiverse theories over time. Prominent physicists are divided about whether any other universes exist outside of our own.
Some physicists say the multiverse is not a legitimate topic of scientific inquiry. Concerns have been raised about whether attempts to exempt the multiverse from experimental verification could erode public confidence in science and ultimately damage the study of fundamental physics. Some have argued that the multiverse is a philosophical notion rather than a scientific hypothesis because it cannot be empirically falsified. The ability to disprove a theory by means of scientific experiment has always been part of the accepted scientific method. Paul Steinhardt has famously argued that no experiment can rule out a theory if the theory provides for all possible outcomes.
In 2007, Nobel laureate Steven Weinberg suggested that if the multiverse existed, "the hope of finding a rational explanation for the precise values of quark masses and other constants of the standard model that we observe in our Big Bang is doomed, for their values would be an accident of the particular part of the multiverse in which we live."
Around 2010, scientists such as Stephen M. Feeney analyzed Wilkinson Microwave Anisotropy Probe (WMAP) data and claimed to find evidence suggesting that our universe collided with other (parallel) universes in the distant past. However, a more thorough analysis of data from the WMAP and from the Planck satellite, which has a resolution three times higher than WMAP, did not reveal any statistically significant evidence of such a bubble universe collision. In addition, there was no evidence of any gravitational pull of other universes on ours.
Modern proponents of one or more of the multiverse hypotheses include Hugh Everett, Don Page, Brian Greene, Max Tegmark, Alan Guth, Andrei Linde, Michio Kaku, David Deutsch, Leonard Susskind, Alexander Vilenkin, Yasunori Nomura, Raj Pathria, Laura Mersini-Houghton, Neil deGrasse Tyson, Sean Carroll and Stephen Hawking.
Scientists who are generally skeptical of the multiverse hypothesis include: David Gross, Paul Steinhardt, Anna Ijjas, Abraham Loeb, David Spergel, Neil Turok, Viatcheslav Mukhanov, Michael S. Turner, Roger Penrose, George Ellis, Joe Silk,
Carlo Rovelli, Adam Frank, Marcelo Gleiser, Jim Baggott and Paul Davies.
In his 2003 "New York Times" opinion piece, "A Brief History of the Multiverse", author and cosmologist Paul Davies offered a variety of arguments that multiverse theories are non-scientific:
George Ellis, writing in August 2011, provided a criticism of the multiverse, and pointed out that it is not a traditional scientific theory. He accepts that the multiverse is thought to exist far beyond the cosmological horizon, and emphasized that it is theorized to be so far away that it is unlikely any evidence will ever be found. Ellis also explained that some theorists do not believe the lack of empirical testability and falsifiability is a major concern, but he is opposed to that line of thinking:
Ellis says that scientists have proposed the idea of the multiverse as a way of explaining the nature of existence. He points out that it ultimately leaves those questions unresolved because it is a metaphysical issue that cannot be resolved by empirical science. He argues that observational testing is at the core of science and should not be abandoned:
Max Tegmark and Brian Greene have devised classification schemes for the various theoretical types of multiverses and universes that they might comprise.
Cosmologist Max Tegmark has provided a taxonomy of universes beyond the familiar observable universe. The four levels of Tegmark's classification are arranged such that subsequent levels can be understood to encompass and expand upon previous levels. They are briefly described below.
A prediction of cosmic inflation is the existence of an infinite ergodic universe, which, being infinite, must contain Hubble volumes realizing all initial conditions.
Accordingly, an infinite universe will contain an infinite number of Hubble volumes, all having the same physical laws and physical constants. In regard to configurations such as the distribution of matter, almost all will differ from our Hubble volume. However, because there are infinitely many, far beyond the cosmological horizon, there will eventually be Hubble volumes with similar, and even identical, configurations. Tegmark estimates that an identical volume to ours should be about 10^(10^115) meters away from us.
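The scale of that estimate can be checked with a back-of-the-envelope calculation. As a toy model (an assumption of this sketch, not Tegmark's full argument), suppose a Hubble volume can be coarse-grained into roughly 10^115 proton-sized cells, each independently occupied or empty; working with logarithms keeps the double exponential tractable:

```python
import math

# Toy assumption (illustration only): ~1e115 proton-sized cells per
# Hubble volume, each independently occupied or empty.
N = 1e115

# The number of distinct configurations is then at most 2**N.
# Work in log10 to keep the double exponential manageable:
# log10(2**N) = N * log10(2) ≈ 3e114.
log10_configs = N * math.log10(2)
print(f"log10(number of configurations) ≈ {log10_configs:.2e}")

# Scanning outward, a repeat of any given configuration is expected within
# on the order of 2**N Hubble volumes — a distance whose log10 is itself
# of order 1e114 to 1e115, i.e. roughly 10^(10^115) meters.
```

The point is only that a double exponential of this form is insensitive to the details: whether the exponent comes out as 3×10^114 or 10^115, the distance is of order 10^(10^115) meters.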
Given infinite space, there would, in fact, be an infinite number of Hubble volumes identical to ours in the universe. This follows directly from the cosmological principle, wherein it is assumed that our Hubble volume is not special or unique.
In the eternal inflation theory, which is a variant of the cosmic inflation theory, the multiverse or space as a whole is stretching and will continue doing so forever, but some regions of space stop stretching and form distinct bubbles (like gas pockets in a loaf of rising bread). Such bubbles are embryonic level I multiverses.
Different bubbles may experience different spontaneous symmetry breaking, which results in different properties, such as different physical constants.
Level II also includes John Archibald Wheeler's oscillatory universe theory and Lee Smolin's fecund universes theory.
Hugh Everett III's many-worlds interpretation (MWI) is one of several mainstream interpretations of quantum mechanics.
In brief, one aspect of quantum mechanics is that certain observations cannot be predicted absolutely. Instead, there is a range of possible observations, each with a different probability. According to the MWI, each of these possible observations corresponds to a different universe. Suppose a six-sided die is thrown and that the result of the throw corresponds to a quantum mechanics observable. All six possible ways the die can fall correspond to six different universes.
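The die analogy can be sketched as toy bookkeeping code (illustrative only; the function name and uniform weights are assumptions of the example, not physics): each possible outcome spawns its own branch, and the branch weights always sum to one:

```python
from fractions import Fraction

def branch(worlds, outcomes):
    """Split every existing world into one child world per possible outcome,
    multiplying probabilities along each branch (toy MWI bookkeeping)."""
    return [
        (history + (o,), p * w)
        for history, p in worlds
        for o, w in outcomes
    ]

# One fair six-sided die: a single root world splits into six branches.
die = [(face, Fraction(1, 6)) for face in range(1, 7)]
worlds = branch([((), Fraction(1))], die)

print(len(worlds))                # 6 branches, one per outcome
print(sum(p for _, p in worlds))  # total probability stays 1

# A second throw splits each of the six worlds again -> 36 worlds.
worlds = branch(worlds, die)
print(len(worlds))                # 36
```

Using exact `Fraction` weights makes it easy to see that branching never creates or destroys probability, only redistributes it across worlds.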
Tegmark argues that a Level III multiverse does not contain more possibilities in the Hubble volume than a Level I or Level II multiverse. In effect, all the different "worlds" created by "splits" in a Level III multiverse with the same physical constants can be found in some Hubble volume in a Level I multiverse. Tegmark writes that, "The only difference between Level I and Level III is where your doppelgängers reside. In Level I they live elsewhere in good old three-dimensional space. In Level III they live on another quantum branch in infinite-dimensional Hilbert space."
Similarly, all Level II bubble universes with different physical constants can, in effect, be found as "worlds" created by "splits" at the moment of spontaneous symmetry breaking in a Level III multiverse. According to Yasunori Nomura, Raphael Bousso, and Leonard Susskind, this is because global spacetime appearing in the (eternally) inflating multiverse is a redundant concept. This implies that the multiverses of Levels I, II, and III are, in fact, the same thing. This hypothesis is referred to as "Multiverse = Quantum Many Worlds". According to Yasunori Nomura, this quantum multiverse is static, and time is a simple illusion.
Related to the "many-worlds" idea are Richard Feynman's "multiple histories" interpretation and H. Dieter Zeh's "many-minds" interpretation.
The ultimate mathematical universe hypothesis is Tegmark's own hypothesis.
This level considers all universes that can be described by different mathematical structures to be equally real.
Tegmark writes:
He argues that this "implies that any conceivable parallel universe theory can be described at Level IV" and "subsumes all other ensembles, therefore brings closure to the hierarchy of multiverses, and there cannot be, say, a Level V."
Jürgen Schmidhuber, however, says that the set of mathematical structures is not even well-defined and that it admits only universe representations describable by constructive mathematics—that is, computer programs.
Schmidhuber explicitly includes universe representations describable by non-halting programs whose output bits converge after finite time, although the convergence time itself may not be predictable by a halting program, due to the undecidability of the halting problem. He also explicitly discusses the more restricted ensemble of quickly computable universes.
The American theoretical physicist and string theorist Brian Greene discussed nine types of multiverses:
In several theories, there is a series of infinite, self-sustaining cycles (for example, an eternity of Big Bangs, Big Crunches, and/or Big Freezes).
A multiverse of a somewhat different kind has been envisaged within string theory and its higher-dimensional extension, M-theory.
These theories require the presence of 10 or 11 spacetime dimensions respectively. The extra six or seven dimensions may either be compactified on a very small scale, or our universe may simply be localized on a dynamical (3+1)-dimensional object, a D3-brane. This opens up the possibility that there are other branes which could support other universes.
Black-hole cosmology is a cosmological model in which the observable universe is the interior of a black hole existing as one of possibly many universes inside a larger universe. This includes the theory of white holes, which are on the opposite side of space-time.
The concept of other universes has been proposed to explain how our own universe appears to be fine-tuned for conscious life as we experience it.
If there were a large (possibly infinite) number of universes, each with possibly different physical laws (or different fundamental physical constants), then some of these universes (even if very few) would have the combination of laws and fundamental parameters that are suitable for the development of matter, astronomical structures, elemental diversity, stars, and planets that can exist long enough for life to emerge and evolve.
The weak anthropic principle could then be applied to conclude that we (as conscious beings) would only exist in one of those few universes that happened to be finely tuned, permitting the existence of life with developed consciousness. Thus, while the probability might be extremely small that any particular universe would have the requisite conditions for life (as we understand life), those conditions do not require intelligent design as an explanation for the conditions in the Universe that promote our existence in it.
An early form of this reasoning is evident in Arthur Schopenhauer's 1844 work "Von der Nichtigkeit und dem Leiden des Lebens", where he argues that our world must be the worst of all possible worlds, because if it were significantly worse in any respect it could not continue to exist.
Proponents and critics disagree about how to apply Occam's razor. Critics argue that to postulate an almost infinite number of unobservable universes, just to explain our own universe, is contrary to Occam's razor. However, proponents argue that in terms of Kolmogorov complexity the proposed multiverse is simpler than a single idiosyncratic universe.
For example, multiverse proponent Max Tegmark argues:
Possible worlds are a way of explaining probability and hypothetical statements. Some philosophers, such as David Lewis, believe that all possible worlds exist and that they are just as real as the world we live in (a position known as modal realism).
https://en.wikipedia.org/wiki?curid=20911
Molotov cocktail
A Molotov cocktail, also known as a petrol bomb, gasoline bomb (United States), bottle bomb, poor man's grenade, Molotovin koktaili (Finnish), polttopullo (Finnish), fire bomb (not to be confused with an actual fire bomb), fire bottle or just Molotov, sometimes shortened as Molly, is a generic name used for a variety of bottle-based improvised incendiary weapons.
Due to the relative ease of production, Molotov cocktails have been used by street criminals, rioters, criminal gangs, urban guerrillas, terrorists, irregular soldiers, or even regular soldiers short on equivalent military-issue weapons. They are primarily intended to ignite rather than completely destroy targets, and are often used just as much to cause chaos as to actually do damage, sometimes being used as weapons during riots and mass protest.
The name "Molotov cocktail" was coined by the Finns during the Winter War, called "Molotovin koktaili" in Finnish. The name was a pejorative reference to Soviet foreign minister Vyacheslav Molotov, who was one of the architects of the Molotov–Ribbentrop Pact signed in late August 1939. The pact with Nazi Germany, which established that the U.S.S.R. would not intervene in the war, was widely mocked by the Finns.
The name's origin came from the propaganda Molotov produced during the Winter War, mainly his declaration on Soviet state radio that bombing missions over Finland were actually airborne humanitarian food deliveries for their starving neighbours. As a result, the Finns sarcastically dubbed the Soviet cluster bombs "Molotov bread baskets" in reference to Molotov's propaganda broadcasts. When the hand-held bottle firebomb was developed to attack Soviet tanks, the Finns called it the "Molotov cocktail", as "a drink to go with his food parcels".
A Molotov cocktail is a breakable glass bottle containing a flammable substance such as petrol, alcohol, or a napalm-like mixture, with some motor oil added, and usually a source of ignition such as a burning cloth wick held in place by the bottle's stopper. The wick is usually soaked in alcohol or kerosene, rather than petrol.
In action, the wick is lit and the bottle hurled at a target such as a vehicle or fortification. When the bottle smashes on impact, the ensuing cloud of fuel droplets and vapour is ignited by the attached wick, causing an immediate fireball followed by spreading flames as the remainder of the fuel is consumed.
Other flammable liquids such as diesel fuel, methanol, turpentine, jet fuel, and isopropyl alcohol have been used in place of, or combined with petrol. Thickening agents such as solvents, foam polystyrene, baking soda, petroleum jelly, tar, strips of tyre tubing, nitrocellulose, XPS foam, motor oil, rubber cement, detergent and dish soap have been added to promote adhesion of the burning liquid and to create clouds of thick, choking smoke.
In addition, toxic substances are also known to be added to the mixture to create a suffocating or poisonous gas when the device ignites, effectively turning the Molotov cocktail into a makeshift chemical weapon. These include bleach, chlorine, various strong acids, and pesticides, among others.
Improvised incendiary devices of this type were used in warfare for the first time in the Spanish Civil War between July 1936 and April 1939, before they became known as "Molotov cocktails". In 1936, General Francisco Franco ordered Spanish Nationalist forces to use the weapon against Soviet T-26 tanks supporting the Spanish Republicans in a failed assault on the Nationalist stronghold of Seseña, near Toledo, south of Madrid. After that, both sides used simple petrol bombs or petrol-soaked blankets with some success. Tom Wintringham, a veteran of the International Brigades, later publicised his recommended method of using them:
The Battle of Khalkhin Gol, a border conflict of 1939 ostensibly between Mongolia and Manchukuo, saw heavy fighting between Japanese and Soviet forces. Short of anti-tank equipment, Japanese infantry attacked Soviet tanks with gasoline-filled bottles. Japanese infantrymen claimed that several hundred Soviet tanks had been destroyed this way, though Soviet loss records do not support this assessment.
On 30 November 1939, the Soviet Union attacked Finland, starting what came to be known as the Winter War. The Finns perfected the design and tactical use of the petrol bomb. The fuel for the Molotov cocktail was refined to a slightly sticky mixture of alcohol, kerosene, tar, and potassium chlorate. Further refinements included the attachment of wind-proof matches or a phial of chemicals that would ignite on breakage, thereby removing the need to pre-ignite the bottle, and leaving the bottle about one-third empty was found to make breaking more likely.
A British War Office report dated June 1940 noted that:
Molotov cocktails were eventually mass-produced by the Alko corporation at its Rajamäki distillery, bundled with matches to light them. Production totalled 450,000 during the Winter War. The original recipe of the Molotov cocktail was a mixture of ethanol, tar and gasoline in a bottle. The bottle had two long pyrotechnic storm matches attached to either side. Before use, one or both of the matches was lit; when the bottle broke on impact, the mixture ignited. The storm matches were found to be safer to use than a burning rag on the mouth of the bottle.
Early in 1940, with the prospect of immediate invasion, the possibilities of the petrol bomb gripped the imagination of the British public. For the layman, the petrol bomb had the benefit of using entirely familiar and available materials, and they were quickly improvised in large numbers, with the intention of using them against enemy tanks.
The Finns had found that they were effective when used in the right way and in sufficient numbers. Although the experience of the Spanish Civil War received more publicity, the more sophisticated petroleum warfare tactics of the Finns were not lost on British commanders. In his 5 June address to LDV leaders, General Ironside said:
Wintringham advised that a tank that was isolated from supporting infantry was potentially vulnerable to men who had the required determination and cunning to get close. Rifles or even a shotgun would be sufficient to persuade the crew to close all the hatches, and then the view from the tank is very limited; a turret-mounted machine gun has a very slow traverse and cannot hope to fend off attackers coming from all directions. Once sufficiently close, it is possible to hide where the tank's gunner cannot see: "The most dangerous distance away from a tank is 200 yards; the safest distance is six inches." Petrol bombs will soon produce a pall of blinding smoke, and a well-placed explosive package or even a stout iron bar in the tracks can immobilise the vehicle, leaving it at the mercy of further petrol bombs – which will suffocate the engine and possibly the crew – or an explosive charge or anti-tank mine.
By August 1940, the War Office produced training instructions for the creation and use of Molotov cocktails. The instructions suggested scoring the bottles vertically with a diamond to ensure breakage and providing fuel-soaked rag, windproof matches or a length of cinema film (made of highly flammable nitrocellulose) as a source of ignition.
On 29 July 1940, manufacturers Albright & Wilson of Oldbury demonstrated to the RAF how their white phosphorus could be used to ignite incendiary bombs. The demonstration involved throwing glass bottles containing a mixture of petrol and phosphorus at pieces of wood and into a hut. On breaking, the phosphorus was exposed to the air and spontaneously ignited; the petrol also burned, resulting in a fierce fire. Because of safety concerns, the RAF was not interested in white phosphorus as a source of ignition, but the idea of a self-igniting petrol bomb took hold. Initially known as an A.W. bomb, it was officially named the No. 76 Grenade, but more commonly known as the SIP (Self-Igniting Phosphorus) grenade. The perfected list of ingredients was white phosphorus, benzene, water and a two-inch strip of raw rubber; all in a half-pint bottle sealed with a crown stopper. Over time, the rubber would slowly dissolve, making the contents slightly sticky, and the mixture would separate into two layers – this was intentional, and the grenade should not be shaken to mix the layers, as this would only delay ignition. When thrown against a hard surface, the glass would shatter and the contents would instantly ignite, liberating choking fumes of phosphorus pentoxide and sulfur dioxide as well as producing a great deal of heat. Strict instructions were issued to store the grenades safely, preferably underwater and certainly never in a house. Mainly issued to the Home Guard as an anti-tank weapon, it was produced in vast numbers; by August 1941 well over 6,000,000 had been manufactured.
There were many who were skeptical about the efficacy of Molotov cocktails and SIP grenades against the more modern German tanks. Weapon designer Stuart Macrae witnessed a trial of the SIP grenade at Farnborough: "There was some concern that, if the tank drivers could not pull up quickly enough and hop out, they were likely to be frizzled to death, but after looking at the bottles they said they would be happy to take a chance." The drivers were proved right: trials on modern British tanks confirmed that Molotov and SIP grenades caused the occupants of the tanks "no inconvenience whatsoever."
Wintringham, though enthusiastic about improvised weapons, cautioned against a reliance on petrol bombs and repeatedly emphasised the importance of using explosive charges.
During the Irish War of Independence, the Irish Republican Army sometimes used sods of turf soaked in paraffin oil to attack British army barracks. Fencing wire was pushed through the sod to make a throwing handle.
The Polish Home Army developed a version which ignited on impact without the need of a wick. Ignition was caused by a reaction between concentrated sulfuric acid mixed with the fuel and a mixture of potassium chlorate and sugar which was crystallized from solution onto a rag attached to the bottle.
The United States Marine Corps developed a version during World War II that used a tube of nitric acid and a lump of metallic sodium to ignite a mixture of petrol and diesel fuel.
Molotov cocktails were reportedly used in the United States for arson attacks on shops and other buildings during the 1992 Los Angeles riots.
During the Second Battle of Fallujah in 2004, U.S. Marines employed Molotov cocktails made with "one part liquid laundry detergent, two parts gas" while clearing houses "when contact is made in a house and the enemy must be burned out". The tactic "was developed in response to the enemy's tactics" of guerrilla warfare and particularly martyrdom tactics which often resulted in U.S. Marine casualties. The cocktail was a less expedient alternative to white phosphorus mortar rounds or propane tanks detonated with C4 (nicknamed the "House Guest"), all of which proved effective at burning out engaged enemy combatants.
Molotov cocktails were also used by protesters and civilian militia in Ukraine during violent outbreaks of the Euromaidan and the 2014 Ukrainian revolution. Protesters during the Ferguson riots used Molotov cocktails.
In Bangladesh, during anti-government protests at the time of the 2014 national election, many buses and cars were targeted with petrol bombs. A number of people burned to death, and many more were injured, during the period 2013–2014 due to petrol bomb attacks.
In the 2019 Hong Kong protests, protesters used Molotov cocktails to defend themselves, attack police, or create roadblocks. Protesters also attacked an MTR station, causing severe damage. A journalist was also hit by a Molotov cocktail during the protests. Protesters likewise used Molotov cocktails to stop the police from storming the universities in Hong Kong and laying siege to them.
During the protests in Venezuela from 2014 and into 2017, protesters had been using Molotov cocktails similar to those used by demonstrators in other countries. As the 2017 Venezuelan protests intensified, demonstrators began using "Puputovs", a play on words of Molotov, with glass devices filled with excrement being thrown at authorities after the PSUV ruling-party official, Jacqueline Faría, mocked protesters who had to crawl through sewage in Caracas' Guaire River to avoid tear gas.
On 8 May, the hashtag #puputov became the top trend on Twitter in Venezuela as reports circulated of authorities vomiting after being drenched in excrement. By 10 May 2017, Venezuelans unrelated to political parties called for a "Marcha de la Mierda", or "March of Shit", essentially establishing puputovs in the arsenal of Venezuelan protesters. A month later, on 4 June 2017, during protests against Donald Trump in Portland, Oregon, protesters began throwing balloons filled with "unknown, foul-smelling liquid" at officers.
As incendiary devices, Molotov cocktails are illegal to manufacture or possess in many regions. In the United States, Molotov cocktails are considered "destructive devices" under the National Firearms Act and regulated by the ATF. Wil Casey Floyd, formerly of Elkhart Lake, Wisconsin, was arrested after throwing Molotov cocktails at Seattle police officers during a protest in May 2016; he pleaded guilty to using the incendiary devices in February 2018.
In Simpson County, Kentucky, 20-year-old Trey Alexander Gwathney-Law attempted to burn down Franklin-Simpson County Middle School with five Molotov cocktails; he was found guilty of making and possessing illegal firearms and was sentenced to 20 years in prison in 2018.
https://en.wikipedia.org/wiki?curid=20922
Matzo
Matzo, matzah, or matza ("matsoh", "matsa"; plural matzot; matzos in Ashkenazi Jewish dialect) is an unleavened flatbread that is part of Jewish cuisine and forms an integral element of the Passover festival, during which "chametz" (leaven and the five grains that, per Jewish law, can become leavened) is forbidden.
As the Torah recounts, God commanded the Jews to create this special unleavened bread. During Passover it is eaten as a flat, cracker-like bread or used in dishes as breadcrumbs and in the traditional matzo ball soup.
Matzo that is kosher for Passover is limited in Ashkenazi tradition to plain matzo made from flour and water. The flour may be whole grain or refined, but must be made from wheat, spelt, barley, rye, or oat. Some Sephardic communities allow matzos containing eggs and fruit juice to be used throughout the holiday.
Passover and non-Passover matzo may be soft or crisp, but the crisp "cracker" type is available commercially in most locations. Soft matzo is made by hand, and is similar to pita.
"Matzo" is mentioned in the Torah several times in relation to The Exodus from Egypt:
There are numerous explanations behind the symbolism of matzo. One is historical: Passover is a commemoration of the exodus from Egypt. The biblical narrative relates that the Israelites left Egypt in such haste they could not wait for their bread dough to rise; the bread, when baked, was matzo. (Exodus 12:39). The other reason for eating matzo is symbolic: On the one hand, matzo symbolizes redemption and freedom, but it is also "lechem oni", "poor man's bread". Thus it serves as a reminder to be humble, and to not forget what life was like in servitude. Also, leaven symbolizes corruption and pride as leaven "puffs up". Eating the "bread of affliction" is both a lesson in humility and an act that enhances the appreciation of freedom.
Another explanation is that matzo has been used to replace the pesach, or the traditional Passover offering that was made before the destruction of the Temple. During the Seder the third time the matzo is eaten it is preceded with the Sephardic rite, "zekher l'korban pesach hane'ekhal al hasova". This means "remembrance of the Passover offering, eaten while full". This last piece of the matzo eaten is called afikoman and many explain it as a symbol of salvation in the future.
The Passover Seder meal is full of symbols of salvation, including the closing line, "Next year in Jerusalem," but the use of matzo is the oldest symbol of salvation in the Seder.
At the Passover seder, simple matzo made of flour and water is mandatory. Sephardic tradition additionally permits the inclusion of eggs in the recipe. The flour must be ground from one of the five grains specified in Jewish law for Passover matzo: wheat, barley, spelt, rye or oat. Per Ashkenazic tradition, matzo made with wine, fruit juice, onion, garlic, etc., is not acceptable for use at any time during the Passover festival except by the elderly or unwell.
Non-Passover matzo may be made with onion, garlic, poppy seed, etc. It can even be made from rice, maize, buckwheat and other non-traditional flours that can never be used for Passover matzo.
Some manufacturers produce gluten-free matzo lookalikes made from potato starch, tapioca, and other non-traditional flours, marketed to those who cannot safely eat gluten, such as people with coeliac disease. The Orthodox Union states that these gluten-free products may be eaten on Passover, but that they do not fulfill the commandment ("mitzvah") of eating matzo, because matzo must be made from one of the five grains (wheat, barley, oat, spelt, and rye).
The only one of the five grains that does not contain gluten is oat, but the resulting matzo would be gluten-free only if there were no cross-contamination with gluten-containing grains. Additionally, some authorities have expressed doubt about whether oat is properly listed among the five grains, or whether it resulted from a historical mistranslation.
Matzo dough is quickly mixed and rolled out without an autolyse step as used for leavened breads. Most forms are pricked with a fork or a similar tool to keep the finished product from puffing up, and the resulting flat piece of dough is cooked at high temperature until it develops dark spots, then set aside to cool and, if sufficiently thin, to harden to crispness. Dough is considered to begin the leavening process 18 minutes from the time it gets wet; sooner if eggs, fruit juice, or milk is added to the dough. The entire process of making matzo takes only a few minutes in efficient modern matzo bakeries.
After baking, matzo may be ground into fine crumbs, known as matzo meal. Matzo meal can be used like flour during the week of Passover when flour can otherwise be used only to make matzo.
There are two major forms of matzo. In many Western countries the most common is the hard form, cracker-like in appearance and taste, which is used in all Ashkenazic and most Sephardic communities. Yemenite and Iraqi Jews traditionally made a form of soft matzo that looks like Greek pita or a tortilla. Soft matzo is made only by hand, and generally with "shmurah" flour.
Flavored varieties of matzo are produced commercially, such as poppy seed- or onion-flavored. Oat and spelt matzo with kosher certification are produced. Oat matzo is generally suitable for those who cannot eat gluten. Whole wheat, bran and organic matzo are also available. Chocolate-covered matzo is a favorite among children, although some consider it "enriched matzo" and will not eat it during the Passover holiday. A quite different flat confection of chocolate and nuts that resembles matzo is sometimes called "chocolate matzo".
Matzo contains typically 111 calories per 1-ounce/28g (USDA Nutrient Database), about the same as rye crispbread.
"Shĕmura" ("guarded") matzo ("matsa shĕmura") is made from grain that has been under special supervision from the time it was harvested to ensure that no fermentation has occurred, and that it is suitable for eating on the first night of Passover. ("Shĕmura" wheat may be formed into either handmade or machine-made matzo, while non-"shĕmura" wheat is only used for machine-made matzo. It is possible to hand-bake matzo in "shĕmura" style from non-"shĕmura" flour, though this is a matter of style and the result is not actually "shĕmura" in any way; such matzo has rarely been produced since the introduction of machine-made matzo.)
Haredi Judaism is scrupulous about the supervision of matzo, and its adherents have the custom of baking their own or at least participating in some stage of the baking process. Rabbi Haim Halberstam of Sanz ruled that machine-made matzos were chametz. According to that opinion, handmade non-"shmurah" matzo may be used on the eighth day of Passover outside of the Holy Land. However, the non-Hasidic Haredi community of Jerusalem follows the custom that machine-made matzo may be used, with preference given to the use of "shĕmurah" flour, in accordance with the ruling of Rabbi Yosef Haim Sonnenfeld, who held that machine-made matzo may be preferable to handmade in some cases. The commentators to the Shulhan `Aruch record that it is the custom of some of Diaspora Jewry to be scrupulous in giving Hallah from the dough used for baking "Matzot Mitzvah" (the "shĕmurah" matzo eaten during Passover) to a Kohen child to eat.
"Egg (sometimes "enriched") matzo" are matzot usually made with fruit juice, often grape or apple juice instead of water, but not necessarily with eggs themselves. There is a custom among some Ashkenazi Jews not to eat them during Passover, except for the elderly, infirm, or children, who cannot digest plain matzo; these matzot are considered to be kosher for Passover if prepared otherwise properly. The issue of whether egg matzo is allowed for Passover comes down to whether there is a difference between the various liquids that can be used. Water facilitates a fermentation of grain flour specifically into what is defined as chometz, but the question is whether fruit juice, eggs, honey, oil or milk are also deemed to do so within the strict definitions of Jewish laws regarding chometz.
The "Talmud", Pesachim 35a, states that liquid food extracts do not cause flour to leaven the way that water does. According to this view, flour mixed with other liquids would not need to be treated with the same care as flour mixed with water. The "Tosafot" (commentaries) explain that such liquids only produce a leavening reaction within flour "if" they themselves have had water added to them and otherwise the dough they produce is completely permissible for consumption during Passover, whether or not made according to the laws applying to matzot.
As a result, Joseph ben Ephraim Karo, author of the "Shulchan Aruch" or "Code of Jewish Law" (Orach Chayim 462:4) granted blanket permission for the use of any matzo made from non-water-based dough, including egg matzo, on Passover. Many egg matzo boxes no longer include the message, "Ashkenazi custom is that egg matzo is only allowed for children, elderly and the infirm during Passover." Even amongst those who consider that enriched matzot may not be "eaten" during Passover, it is permissible to "retain" it in the home.
Matzo may be used whole, broken, chopped ("matzo farfel"), or finely ground ("matzo meal") to make numerous matzo-based cooked dishes. These include matzo balls, which are traditionally served in chicken soup; matzo brei, a dish of Ashkenazi origin made from matzo soaked in water, mixed with beaten egg, and fried; matzo pizza, in which the piece of matzo takes the place of pizza crust and is topped with melted cheese and sauce; and kosher for Passover cakes and cookies, which are made with matzo meal or a finer variety called "cake meal" that gives them a denser texture than ordinary baked foods made with flour. Hasidic Jews do not cook with matzo, believing that mixing it with water may allow leavening; this stringency is known as "gebrochts". However, Jews who avoid eating "gebrochts" will eat cooked matzo dishes on the eighth day of Passover outside the Land of Israel, as the eighth day is of rabbinic and not Torah origin.
Sephardim use matzo soaked in water or stock to make pies or lasagne, known as "mina", "méguena", or "mayena".
Communion wafers used by the Catholic Church, as well as in some Protestant traditions, for the Eucharist are flat, unleavened bread. The main reason for the use of this bread is the belief that, because the Last Supper is described in the Synoptic Gospels as a Passover meal, unleavened matzo bread was what Jesus used when he held it up and said "this is my body". All Byzantine Rite churches use leavened bread for the Eucharist, as this symbolizes the risen Christ.
Some Oriental Orthodox and Eastern Catholic Christians use leavened bread, as in the east there is the tradition, based upon the gospel of John, that leavened bread was on the table of the Last Supper. In the Armenian Apostolic Church, the Ethiopian Orthodox Tewahedo Church and Eritrean Orthodox Tewahedo Church, unleavened bread called "qǝddus qurban" in Ge'ez, the liturgical language of the Eritreans and Ethiopians, is used for communion.
Catholic Christians living on the Malabar coast of Kerala, India have the customary celebration of Pesaha in their homes. On the evening before Good Friday, Pesaha bread is made at home. It is made with unleavened flour, and they consume a sweet drink made of coconut milk and jaggery along with this bread. On the Pesaha night, the bread is baked (steamed) immediately after rice flour is mixed with water, and they pierce it many times with the handle of a spoon to let out steam so that the bread will not rise (this custom is called "juthante kannu kuthal" in the Malayalam language, meaning "piercing the bread according to the custom of Jews"). This bread is cut by the head of the family and shared among the family members.
At the end of World War II, the National Jewish Welfare Board had a matzo factory (according to the American Jewish Historical Society, it was probably the Manischewitz matzo factory in New Jersey) produce matzo in the form of a giant "V" for "Victory", for shipment to military bases overseas and in the U.S., for Passover seders for Jewish military personnel. Passover in 1945 began on 1 April, when the collapse of the Axis in Europe was clearly imminent; Germany surrendered five weeks later.
https://en.wikipedia.org/wiki?curid=20923
Michel Tremblay
Michel Tremblay, CQ (born 25 June 1942) is a French Canadian novelist and playwright.
He was born in Montreal, Quebec, where he grew up in the French-speaking neighbourhood of Plateau Mont-Royal, at the time of his birth a working-class neighbourhood where the joual dialect was spoken, something that would heavily influence his work. Tremblay's first professionally produced play, "Les Belles-Sœurs", was written in 1965 and premiered at the Théâtre du Rideau Vert on August 28, 1968. It transformed the old guard of Canadian theatre and introduced joual to the mainstream. It stirred up controversy by portraying the lives of working-class women and attacking the strait-laced, deeply religious society of mid-20th century Quebec.
The most profound and lasting effects of Tremblay's early plays, including "Hosanna" and "La Duchesse de Langeais", were the barriers they toppled in Quebec society. Until the Quiet Revolution of the early 1960s, Quebec was a poor, working-class province dominated by an English-speaking elite and the Roman Catholic Church. Tremblay's work was part of a vanguard of liberal, nationalist thought that helped create an essentially modern society. His most famous plays are usually centered on gay characters. The first Canadian play about and starring a drag queen was his "Hosanna", first performed at Théâtre de Quat'Sous in Montreal in 1973. The women in his plays are usually strong but beset by demons they must vanquish. He is said to see Quebec as a matriarchal society, and he is considered one of the best playwrights for women. In the late 1980s, "Les Belles-soeurs" ("The Sisters-in-Law") was produced in Scotland in Scots as "The Guid-Sisters" ("guid-sister" being Scots for "sister-in-law"). His work has been translated into many languages, including Yiddish, and includes such works as "Sainte-Carmen de la Main", "Ç'ta ton tour, Laura Cadieux", and "Forever Yours, Marilou" ("À toi pour toujours, ta Marie-Lou").
He has been openly gay throughout his public life, and he has written many novels ("The Duchess and the Commoner", "La nuit des princes charmants", "Le Coeur découvert", "Le Coeur éclaté") and plays ("Hosanna", "La duchesse de Langeais", "Fragments de mensonges inutiles") centred on gay characters. In a 1987 interview with Shelagh Rogers for CBC Radio's "The Arts Tonight", he remarked that he had always avoided behaviours he considered masculine; for example, he does not smoke, and he noted that he was 45 years old and did not know how to drive a car. "I think I am a rare breed," he said, "A homosexual who doesn't like men." He has said that one of his biggest regrets in life was not telling his mother he was gay before she died.
His latest play to receive wide acclaim is "For the Pleasure of Seeing Her Again", a funny and nostalgic play, centered on the memories of his mother. He later published the Plateau Mont-Royal Chronicles, a cycle of six novels including "The Fat Woman Next Door is Pregnant" ("La grosse femme d'à côté est enceinte", 1978) and "The Duchess and the Commoner" ("La duchesse et le roturier", 1982). The second novel of this series, "Therese and Pierrette and the Little Hanging Angel" ("Thérèse et Pierrette à l'école des Saints-Anges", 1980), was one of the novels chosen for inclusion in the French version of "Canada Reads", "Le combat des livres", broadcast on Radio-Canada in 2005, where it was championed by union activist Monique Simard.
Tremblay worked also on a television series entitled "Le Cœur découvert" ("The Heart Laid Bare"), about the lives of a gay couple in Quebec, for the French-language TV network Radio-Canada. In 2005 he completed another novel cycle, the "Cahiers" ("Le Cahier noir" (translated as "The Black Notebook"), "Le Cahier rouge", "Le Cahier bleu"), dealing with the changes that occurred in 1960s Montreal during the Quiet Revolution. In 2009 "The Fat Woman Next Door" was a finalist in CBC's prestigious Canada Reads competition.
For many years, Tremblay has believed that the only reasonable solution for Quebec is to separate from Canada. Once the Parti Québécois was elected in Quebec, he softened his views on allowing his plays to be produced in English there. He made it clear, however, that that did not mean that he agreed with bilingualism, calling it "stupid" and stating that he thought it ridiculous to expect a housewife in Vancouver to be fluent in both English and French.
Despite his often outspoken views in public, Tremblay's treatment of politics in his plays is subtle. Speaking of politics and the theatre in a CBC interview in 1978, Tremblay said:
"I know what I want in the theatre. I want a real political theatre, but I know that political theatre is dull. I write fables."
In April 2006 he declared that he did not support the arguments put forward for the separation of Quebec. But he clarified his thoughts some time later by saying he was still a supporter of Quebec sovereignty, though critical of the actual state of the debate, which in his opinion was too much focused on economic issues. In response to this, the columnist Marc Cassivi of "La Presse" wrote that "there was only one closet a Quebec artist could never exit and that was the federalist one."
Tremblay has received numerous awards in recognition of his work. These include the "Prix Victor-Morin" (1974), the "Prix France-Québec" (1984), the Chalmers Award (1986) and the Molson Prize (1994).
He received the Lieutenant-Governor's award for Ontario in 1976 and 1977. Tremblay was named the "Montréalais le plus remarquable des deux dernières décennies dans le domaine du théâtre" (the most remarkable Montrealer of the past two decades in theatre) (1978). In 1991 he was appointed "Officier de l'Ordre de France," and in the same year, "Chevalier de l'Ordre National du Québec." He is also a recipient of the "Chevalier de l'Ordre des Arts et des Lettres de France" (1994).
In 1999, Tremblay received a Governor General's Performing Arts Award, Canada's highest honour in the performing arts. This produced controversy when several well-known Quebec nationalists suggested that he should refuse the award. While he did not do this, he did admit, for the first time, that he had refused the Order of Canada in 1990.
In 2000, "Encore une fois, si vous le permettez" ("For The Pleasure of Seeing Her Again") won a Chalmers Award and a Dora Mavor Moore Award.
Note: Most titles also available in English translations
|
https://en.wikipedia.org/wiki?curid=20925
|
Supervised learning
Supervised learning is the machine learning task of learning a function that maps an input to an output based on example input-output pairs. It infers a function from labeled training data consisting of a set of "training examples". In supervised learning, each example is a "pair" consisting of an input object (typically a vector) and a desired output value (also called the "supervisory signal"). A supervised learning algorithm analyzes the training data and produces an inferred function, which can be used for mapping new examples. An optimal scenario will allow the algorithm to correctly determine the class labels for unseen instances. This requires the learning algorithm to generalize from the training data to unseen situations in a "reasonable" way (see inductive bias).
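The idea of learning a mapping from example input-output pairs can be sketched with a toy program. The data, and the choice of 1-nearest-neighbour as the "learning algorithm", are invented here purely for illustration:

```python
# Supervised learning in miniature: each training example is an
# (input, output) pair, and the learned function maps new inputs to outputs.
# The "learning algorithm" here is 1-nearest-neighbour: predict the label of
# the closest training input.

training_data = [(1.0, "low"), (2.0, "low"), (8.0, "high"), (9.0, "high")]

def predict(x):
    # Generalize to an unseen input by copying the label of the nearest example.
    nearest_input, nearest_label = min(training_data,
                                       key=lambda pair: abs(pair[0] - x))
    return nearest_label

print(predict(1.5))   # an unseen input near the "low" examples
print(predict(8.5))   # an unseen input near the "high" examples
```

The two `print` calls query inputs that never appeared in the training set, which is exactly the generalization step the paragraph above describes.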
The parallel task in human and animal psychology is often referred to as concept learning.
In order to solve a given problem of supervised learning, a number of steps must be performed, from assembling a training set and choosing an input representation to selecting, training, and evaluating a learning algorithm.
A wide range of supervised learning algorithms are available, each with its strengths and weaknesses. There is no single learning algorithm that works best on all supervised learning problems (see the No free lunch theorem).
There are four major issues to consider in supervised learning:
A first issue is the tradeoff between "bias" and "variance". Imagine that we have available several different, but equally good, training data sets. A learning algorithm is biased for a particular input formula_1 if, when trained on each of these data sets, it is systematically incorrect when predicting the correct output for formula_1. A learning algorithm has high variance for a particular input formula_1 if it predicts different output values when trained on different training sets. The prediction error of a learned classifier is related to the sum of the bias and the variance of the learning algorithm. Generally, there is a tradeoff between bias and variance. A learning algorithm with low bias must be "flexible" so that it can fit the data well. But if the learning algorithm is too flexible, it will fit each training data set differently, and hence have high variance. A key aspect of many supervised learning methods is that they are able to adjust this tradeoff between bias and variance (either automatically or by providing a bias/variance parameter that the user can adjust).
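The bias-variance tradeoff described above can be observed empirically by training two models on many independently drawn training sets and examining their predictions at a fixed query point. The data-generating function and both models below are invented for illustration:

```python
import random
import statistics

random.seed(0)

def true_f(x):
    return 2.0 * x

def sample_training_set(n=10):
    # Noisy observations of the true function on [0, 10].
    xs = [random.uniform(0, 10) for _ in range(n)]
    return [(x, true_f(x) + random.gauss(0, 3.0)) for x in xs]

def constant_model(data):
    # High bias, low variance: predicts the mean label, ignoring x.
    mean_y = statistics.mean(y for _, y in data)
    return lambda x: mean_y

def nearest_model(data):
    # Low bias, high variance: copies the label of the nearest training point.
    return lambda x: min(data, key=lambda p: abs(p[0] - x))[1]

query = 9.0
const_preds = [constant_model(sample_training_set())(query) for _ in range(300)]
nn_preds = [nearest_model(sample_training_set())(query) for _ in range(300)]

# Bias: systematic error of the average prediction at the query point.
const_bias = abs(statistics.mean(const_preds) - true_f(query))   # large
nn_bias = abs(statistics.mean(nn_preds) - true_f(query))         # small
# Variance: spread of the predictions across training sets.
const_var = statistics.pvariance(const_preds)                    # small
nn_var = statistics.pvariance(nn_preds)                          # large
```

The inflexible constant model is systematically wrong at the query point but barely changes between training sets; the flexible nearest-neighbour model is right on average but swings with each training set.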
The second issue is the amount of training data available relative to the complexity of the "true" function (classifier or regression function). If the true function is simple, then an "inflexible" learning algorithm with high bias and low variance will be able to learn it from a small amount of data. But if the true function is highly complex (e.g., because it involves complex interactions among many different input features and behaves differently in different parts of the input space), then the function will only be learnable from a very large amount of training data paired with a "flexible" learning algorithm with low bias and high variance.
A third issue is the dimensionality of the input space. If the input feature vectors have very high dimension, the learning problem can be difficult even if the true function only depends on a small number of those features. This is because the many "extra" dimensions can confuse the learning algorithm and cause it to have high variance. Hence, high input dimensionality typically requires tuning the classifier to have low variance and high bias. In practice, if the engineer can manually remove irrelevant features from the input data, this is likely to improve the accuracy of the learned function. In addition, there are many algorithms for feature selection that seek to identify the relevant features and discard the irrelevant ones. This is an instance of the more general strategy of dimensionality reduction, which seeks to map the input data into a lower-dimensional space prior to running the supervised learning algorithm.
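One simple "filter" approach to feature selection is to score each input dimension by its absolute Pearson correlation with the target and keep only the strongest. The toy data below is invented; only feature 0 actually drives the target:

```python
import math

# Each row has 3 input features; features 1 and 2 are irrelevant
# "extra" dimensions of the kind the paragraph above describes.
X = [[1.0, 0.3, 7.1], [2.0, 0.9, 2.4], [3.0, 0.1, 5.5],
     [4.0, 0.7, 1.9], [5.0, 0.5, 6.2], [6.0, 0.2, 3.3]]
y = [2.1, 4.0, 6.2, 7.9, 10.1, 12.0]   # roughly 2 * feature 0

def pearson(a, b):
    # Pearson correlation coefficient, computed from scratch.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    sa = math.sqrt(sum((ai - ma) ** 2 for ai in a))
    sb = math.sqrt(sum((bi - mb) ** 2 for bi in b))
    return cov / (sa * sb)

# Score each feature and keep the one most correlated with the target.
scores = [abs(pearson([row[j] for row in X], y)) for j in range(3)]
best_feature = max(range(3), key=lambda j: scores[j])
```

Correlation-based filtering is only one of many feature-selection strategies, but it illustrates the goal: discard dimensions that carry no information about the output before training.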
A fourth issue is the degree of noise in the desired output values (the supervisory target variables). If the desired output values are often incorrect (because of human error or sensor errors), then the learning algorithm should not attempt to find a function that exactly matches the training examples. Attempting to fit the data too carefully leads to overfitting. You can overfit even when there are no measurement errors (stochastic noise) if the function you are trying to learn is too complex for your learning model. In such a situation, the part of the target function that cannot be modeled "corrupts" your training data - this phenomenon has been called deterministic noise. When either type of noise is present, it is better to go with a higher bias, lower variance estimator.
In practice, there are several approaches to alleviating noise in the output values, such as early stopping to prevent overfitting, and detecting and removing noisy training examples prior to training. Several algorithms identify noisy training examples, and removing the suspected noisy examples prior to training has been shown to decrease generalization error with statistical significance.
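One noise-filtering heuristic (in the spirit of edited nearest-neighbour rules) flags a training example as suspect when its label disagrees with the majority label of its k nearest neighbours. The data below is invented, with one deliberately mislabeled point:

```python
from collections import Counter

# One-dimensional inputs with class labels; (1.5, "b") sits inside the
# "a" cluster and looks like a labeling error.
data = [(0.0, "a"), (0.5, "a"), (1.0, "a"), (1.5, "b"),
        (5.0, "b"), (5.5, "b"), (6.0, "b")]

def filter_noisy(examples, k=3):
    # Keep only examples whose label matches the majority of their k
    # nearest neighbours (excluding the example itself).
    kept = []
    for i, (x, label) in enumerate(examples):
        others = [e for j, e in enumerate(examples) if j != i]
        neighbours = sorted(others, key=lambda e: abs(e[0] - x))[:k]
        majority, _ = Counter(lbl for _, lbl in neighbours).most_common(1)[0]
        if label == majority:
            kept.append((x, label))
    return kept

cleaned = filter_noisy(data)   # drops the suspected mislabeled example
```

This is a sketch of the general idea rather than any specific published algorithm; real noise filters typically also guard against removing legitimate minority-class examples.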
Other factors to consider when choosing and applying a learning algorithm include the heterogeneity of the data, its redundancy, and the presence of interactions and non-linearities among features.
When considering a new application, the engineer can compare multiple learning algorithms and experimentally determine which one works best on the problem at hand (see cross validation). Tuning the performance of a learning algorithm can be very time-consuming. Given fixed resources, it is often better to spend more time collecting additional training data and more informative features than it is to spend extra time tuning the learning algorithms.
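The cross-validation mentioned above partitions the examples into k folds, trains on k-1 of them and evaluates on the held-out fold, rotating through all k. A minimal index-splitting sketch:

```python
# Generate (train_indices, test_indices) pairs for k-fold cross validation.
def k_fold_splits(n_examples, k):
    indices = list(range(n_examples))
    folds = [indices[i::k] for i in range(k)]   # round-robin fold assignment
    for i in range(k):
        test_idx = folds[i]
        train_idx = [j for fold in folds[:i] + folds[i + 1:] for j in fold]
        yield train_idx, test_idx

splits = list(k_fold_splits(10, 5))   # 5 folds over 10 examples
```

Each example appears in exactly one test fold, so averaging the per-fold scores gives an estimate of how the algorithm will perform on unseen data. In practice the indices are usually shuffled first; that step is omitted here for clarity.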
Among the most widely used learning algorithms are support vector machines, linear regression, logistic regression, naive Bayes, linear discriminant analysis, decision trees, k-nearest neighbor algorithms, and neural networks.
Given a set of N training examples of the form (x_1, y_1), ..., (x_N, y_N) such that x_i is the feature vector of the i-th example and y_i is its label (i.e., class), a learning algorithm seeks a function g: X → Y, where X is the input space and Y is the output space. The function g is an element of some space of possible functions G, usually called the "hypothesis space". It is sometimes convenient to represent g using a scoring function f: X × Y → R such that g is defined as returning the y value that gives the highest score: g(x) = arg max_y f(x, y). Let F denote the space of scoring functions.
Although G and F can be any space of functions, many learning algorithms are probabilistic models where g takes the form of a conditional probability model g(x) = P(y | x), or f takes the form of a joint probability model f(x, y) = P(x, y). For example, naive Bayes and linear discriminant analysis are joint probability models, whereas logistic regression is a conditional probability model.
There are two basic approaches to choosing f or g: empirical risk minimization and structural risk minimization. Empirical risk minimization seeks the function that best fits the training data. Structural risk minimization includes a "penalty function" that controls the bias/variance tradeoff.
In both cases, it is assumed that the training set consists of a sample of independent and identically distributed pairs (x_i, y_i). In order to measure how well a function fits the training data, a loss function L: Y × Y → R≥0 is defined. For training example (x_i, y_i), the loss of predicting the value ŷ is L(y_i, ŷ).
The "risk" R(g) of function g is defined as the expected loss of g. This can be estimated from the training data as the empirical risk R_emp(g) = (1/N) Σ_i L(y_i, g(x_i)).
In empirical risk minimization, the supervised learning algorithm seeks the function g that minimizes R(g). Hence, a supervised learning algorithm can be constructed by applying an optimization algorithm to find g.
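Empirical risk minimization can be made concrete by scoring each candidate function in a tiny hypothesis space by its average loss on the training set and selecting the minimizer. The hypothesis space and data here are invented for illustration:

```python
# Toy training set: y is roughly 2 * x.
training = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9), (4.0, 8.2)]

# A tiny hypothesis space G of three candidate functions.
hypotheses = {
    "g(x) = x":     lambda x: x,
    "g(x) = 2x":    lambda x: 2 * x,
    "g(x) = x + 1": lambda x: x + 1,
}

def empirical_risk(g, data):
    # Average squared-error loss L(y, y_hat) = (y - y_hat)^2 over the data.
    return sum((y - g(x)) ** 2 for x, y in data) / len(data)

# Empirical risk minimization: pick the hypothesis with the lowest risk.
best_name = min(hypotheses,
                key=lambda name: empirical_risk(hypotheses[name], training))
```

Real hypothesis spaces are continuously parameterized rather than finite, so the `min` over candidates is replaced by a numerical optimization algorithm, but the objective being minimized is the same average loss.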
When g is a conditional probability distribution P(y | x) and the loss function is the negative log likelihood, L(y, ŷ) = −log P(y | x), then empirical risk minimization is equivalent to maximum likelihood estimation.
When G contains many candidate functions or the training set is not sufficiently large, empirical risk minimization leads to high variance and poor generalization. The learning algorithm is able to memorize the training examples without generalizing well. This is called overfitting.
Structural risk minimization seeks to prevent overfitting by incorporating a regularization penalty into the optimization. The regularization penalty can be viewed as implementing a form of Occam's razor that prefers simpler functions over more complex ones.
A wide variety of penalties have been employed that correspond to different definitions of complexity. For example, consider the case where the function g is a linear function of the form g(x) = Σ_j β_j x_j.
A popular regularization penalty is Σ_j β_j², the squared Euclidean norm of the weights, also known as the L2 norm. Other norms include the L1 norm, Σ_j |β_j|, and the L0 "norm", which is the number of non-zero β_j. The penalty will be denoted by C(g).
The supervised learning optimization problem is to find the function g that minimizes J(g) = R_emp(g) + λ C(g). The parameter λ controls the bias-variance tradeoff. When λ = 0, this gives empirical risk minimization with low bias and high variance. When λ is large, the learning algorithm will have high bias and low variance. The value of λ can be chosen empirically via cross validation.
The complexity penalty has a Bayesian interpretation as the negative log prior probability of g, C(g) = −log P(g), in which case J(g) is the negative log posterior probability of g.
The training methods described above are "discriminative training" methods, because they seek to find a function g that discriminates well between the different output values (see discriminative model). For the special case where f(x, y) = P(x, y) is a joint probability distribution and the loss function is the negative log likelihood, −Σ_i log P(x_i, y_i), a risk minimization algorithm is said to perform "generative training", because f can be regarded as a generative model that explains how the data were generated. Generative training algorithms are often simpler and more computationally efficient than discriminative training algorithms. In some cases, the solution can be computed in closed form as in naive Bayes and linear discriminant analysis.
There are several ways in which the standard supervised learning problem can be generalized, including semi-supervised learning, active learning, structured prediction, and learning to rank.
|
https://en.wikipedia.org/wiki?curid=20926
|
Martin Helwig
Martin Helwig (5 November 1516 – 26 January 1574) was a German cartographer and pedagogue from Silesia. He was born in Neisse and died in Breslau, in the Holy Roman Empire.
A former pupil of the eminent German scholar and educationist Valentin Friedland, Martin Helwig went on to study at the University of Wittenberg, where as a student of Martin Luther and Philip Melanchthon he earned the academic degree of Magister. In 1552, he became Rector of St. Maria Magdalena School in Breslau (now Wrocław, in Poland). Equally proficient in mathematics and geography as well as classical languages, he produced the first woodcut map of Silesia made on the basis of surveys and data collected from local inhabitants, which he published in 1561 under the title "Silesiae Typus" and dedicated to a wealthy Silesian merchant, banker, philanthropist, governor and patron of the principality of Breslau (Wrocław) who sponsored the map. Martin Helwig's map went on to receive acclaim in a public writing by Caspar Peucer, an eminent German scholar at the University of Wittenberg, and was later republished in several versions of Abraham Ortelius's pioneering world atlas, "Theatrum Orbis Terrarum".
Until the middle of the 18th century, Helwig's first map of Silesia remained the main model and source of information for the cartographic presentation of this region of Europe on the maps of the most famous cartographers and publishers of the time.
|
https://en.wikipedia.org/wiki?curid=20932
|
Macro virus
In computing terminology, a macro virus is a virus that is written in a macro language: a programming language which is embedded inside a software application (e.g., word processors and spreadsheet applications). Some applications, such as Microsoft Word, Excel, and PowerPoint, allow macro programs to be embedded in documents such that the macros are run automatically when the document is opened, and this provides a distinct mechanism by which malicious computer instructions can spread. This is one reason it can be dangerous to open unexpected attachments in e-mails. Many antivirus programs can detect macro viruses; however, the macro virus' behavior can still be difficult to detect.
A macro is a series of commands and actions that helps automate some task - usually a quite short and simple program. However they are created, they need to be executed by some system which interprets the stored commands. Some macro systems are self-contained programs, but others are built into complex applications (for example word processors) to allow users to repeat sequences of commands easily, or to allow developers to tailor the application to local needs.
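The "stored commands replayed by an interpreter" idea can be sketched as a toy macro system for an imaginary word processor. The command names and the signature-appending action are invented for illustration and carry no malicious behavior:

```python
# A minimal macro interpreter: a macro is just a recorded sequence of
# command names, each mapped to an action on the current document text.
def run_macro(commands, document):
    actions = {
        "UPPERCASE": str.upper,
        "TRIM":      str.strip,
        "SIGN":      lambda text: text + "\n-- sent from my editor",
    }
    for command in commands:
        document = actions[command](document)
    return document

# Replaying a stored macro transforms the document step by step.
result = run_macro(["TRIM", "UPPERCASE", "SIGN"], "  hello world  ")
```

A macro virus exploits exactly this mechanism: because the host application will faithfully execute whatever command sequence is stored in a document, a hostile sequence runs with the same ease as a helpful one.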
A macro virus can be spread through e-mail attachments, removable media, networks and the Internet, and is notoriously difficult to detect. A common way for a macro virus to infect a computer is by replacing normal macros with a virus. The macro virus replaces regular commands with the same name and runs when the command is selected. These malicious macros may start automatically when a document is opened or closed, without the user's knowledge.
Once a file containing a macro virus is opened, the virus can infect the system. When triggered, it will begin to embed itself in other documents and templates. It may corrupt other parts of the system, depending on what resources a macro in this application can access. When the infected documents are shared with other users and systems, the virus spreads. Macro viruses have been used as a method of installing software on a system without the user's consent, as they can be used to download and install software from the internet through the use of automated key-presses. However, this is uncommon as it is usually not fruitful for the virus coder since the installed software is usually noticed and uninstalled by the user.
Since a macro virus depends on the application rather than the operating system, it can infect a computer running any operating system to which the targeted application has been ported. In particular, since Microsoft Word is available on Macintosh computers, word macro viruses can attack some Macs in addition to Windows platforms.
An example of a macro virus is the Melissa virus which appeared in March 1999. When a user opens a Microsoft Word document containing the Melissa virus, their computer becomes infected. The virus then sends itself by email to the first 50 people in the person's address book. This made the virus replicate at a fast rate.
Not all macro viruses are detected by antivirus software. Caution when opening email attachments and other documents decreases the chance of becoming infected.
|
https://en.wikipedia.org/wiki?curid=20934
|
Microsoft Access
Microsoft Access is a database management system (DBMS) from Microsoft that combines the relational Microsoft Jet Database Engine with a graphical user interface and software-development tools. It is a member of the Microsoft Office suite of applications, included in the Professional and higher editions or sold separately.
Microsoft Access stores data in its own format based on the Access Jet Database Engine. It can also import or link directly to data stored in other applications and databases.
Software developers, data architects and power users can use Microsoft Access to develop application software. Like other Microsoft Office applications, Access is supported by Visual Basic for Applications (VBA), an object-based programming language that can reference a variety of objects including the legacy DAO (Data Access Objects), ActiveX Data Objects, and many other ActiveX components. Visual objects used in forms and reports expose their methods and properties in the VBA programming environment, and VBA code modules may declare and call Windows operating system operations.
Prior to the introduction of Access, Borland (with Paradox and dBase) and Fox (with FoxPro) dominated the desktop database market. Microsoft Access was the first mass-market database program for Windows. With Microsoft's purchase of FoxPro in 1992 and the incorporation of Fox's Rushmore query optimization routines into Access, Microsoft Access quickly became the dominant database for Windows—effectively eliminating the competition which failed to transition from the MS-DOS world.
Microsoft's first attempt to sell a relational database product was during the mid 1980s, when Microsoft obtained the license to sell . In the late 1980s Microsoft developed its own solution codenamed Omega. It was confirmed in 1988 that a database product for Windows and OS/2 was in development. It was going to include the "EB" Embedded Basic language, which was going to be the language for writing macros in all Microsoft applications, but the unification of macro languages did not happen until the introduction of Visual Basic for Applications (VBA). Omega was also expected to provide a front end to the Microsoft SQL Server. The application was very resource-hungry, and there were reports that it was working slowly on the 386 processors that were available at the time. It was scheduled to be released in the 1st quarter of 1990, but in 1989 the development of the product was reset and it was rescheduled to be delivered no sooner than in January 1991. Parts of the project were later used for other Microsoft projects: Cirrus (codename for Access) and Thunder (codename for Visual Basic, where the Embedded Basic engine was used). After Access's premiere, the Omega project was demonstrated in 1992 to several journalists and included features that were not available in Access.
After the Omega project was scrapped, some of its developers were assigned to the Cirrus project (most were assigned to the team which created Visual Basic). Its goal was to create a competitor for applications like Paradox or dBase that would work on Windows. After Microsoft acquired FoxPro, there were rumors that the Microsoft project might get replaced with it, but the company decided to develop them in parallel. It was assumed that the project would make use of Extensible Storage Engine (Jet Blue) but, in the end, only support for Microsoft Jet Database Engine (Jet Red) was provided. The project used some of the code from both the Omega project and a pre-release version of Visual Basic. In July 1992, betas of Cirrus shipped to developers and the name Access became the official name of the product. "Access" was originally used for an older terminal emulation program from Microsoft. Years after the program was abandoned, they decided to reuse the name here.
1992: Microsoft released Access version 1.0 on November 13, 1992, and an Access 1.1 release in May 1993 to improve compatibility with other Microsoft products and to include the Access Basic programming language.
1994: Microsoft specified the minimum hardware requirements for Access v2.0 as: Microsoft Windows v3.1 with 4 MB of RAM required, 6 MB RAM recommended; 8 MB of available hard disk space required, 14 MB hard disk space recommended. The product shipped on seven 1.44 MB diskettes. The manual shows a 1994 copyright date.
With Office 95, Microsoft Access 7.0 (a.k.a. "Access 95") became part of the Microsoft Office Professional Suite, joining Microsoft Excel, Word, and PowerPoint and transitioning from Access Basic to VBA. Since then, Microsoft has released new versions of Microsoft Access with each release of Microsoft Office. This includes Access 97 (version 8.0), Access 2000 (version 9.0), Access 2002 (version 10.0), Access 2003 (version 11.0), Access 2007 (version 12.0), Access 2010 (version 14.0), and Access 2013 (version 15.0).
Versions 3.0 and 3.5 of Microsoft Jet database engine (used by Access 7.0 and the later-released Access 97 respectively) had a critical issue which made these versions of Access unusable on a computer with more than 1 GB of memory. While Microsoft fixed this problem for Jet 3.5/Access 97 post-release, it never fixed the issue with Jet 3.0/Access 95.
The native Access database format (the Jet MDB Database) has also evolved over the years. Formats include Access 1.0, 1.1, 2.0, 7.0, 97, 2000, 2002, and 2007. The most significant transition was from the Access 97 to the Access 2000 format, which is not backward compatible with earlier versions of Access. All newer versions of Access support the Access 2000 format. New features were added to the Access 2002 format which can be used by Access 2002, 2003, 2007, and 2010.
Microsoft Access 2000 increased the maximum database size to 2GB from 1GB in Access 97.
Microsoft Access 2007 introduced a new database format: ACCDB. It supports links to SharePoint lists and complex data types such as multivalue and attachment fields. These new field types are essentially recordsets in fields and allow the storage of multiple values or files in one field. Microsoft Access 2007 also introduced the File Attachment field, which stored data more efficiently than the OLE (Object Linking and Embedding) field.
Microsoft Access 2010 introduced a new version of the ACCDB format that supported hosting Access Web services on a SharePoint 2010 server. For the first time, this allowed Access applications to be run without installing Access on the user's PC, and was the first version to support Mac users. Any user on the SharePoint site with sufficient rights could use the Access Web service. A copy of Access was still required for the developer to create the Access Web service, and the desktop version of Access remained part of Access 2010. The Access Web services were not the same as the desktop applications. Automation was only through the macro language (not VBA) which Access automatically converted to JavaScript. The data was no longer in an Access database but SharePoint lists. An Access desktop database could link to the SharePoint data, so hybrid applications were possible so that SharePoint users needing basic views and edits could be supported while the more sophisticated, traditional applications could remain in the desktop Access database.
Microsoft Access 2013 offers traditional Access desktop applications plus a significantly updated SharePoint 2013 web service. The Access Web model in Access 2010 was replaced by a new architecture that stores its data in actual SQL Server databases. Unlike SharePoint lists, this offers true relational database design with referential integrity, scalability, extensibility and performance one would expect from SQL Server. The database solutions that can be created on SharePoint 2013 offer a modern user interface designed to display multiple levels of relationships that can be viewed and edited, along with resizing for different devices and support for touch. The Access 2013 desktop is similar to Access 2010 but several features were discontinued including support for Access Data Projects (ADPs), pivot tables, pivot charts, Access data collections, source code control, replication, and other legacy features. Access desktop database maximum size remained 2GB (as it has been since the 2000 version).
In addition to using its own database storage file, Microsoft Access also may be used as the 'front-end' of a program while other products act as the 'back-end' tables, such as Microsoft SQL Server and non-Microsoft products such as Oracle and Sybase. Multiple backend sources can be used by a Microsoft Access Jet Database (ACCDB and MDB formats). Similarly, some applications such as Visual Basic, ASP.NET, or Visual Studio .NET will use the Microsoft Access database format for its tables and queries. Microsoft Access may also be part of a more complex solution, where it may be integrated with other technologies such as Microsoft Excel, Microsoft Outlook, Microsoft Word, Microsoft PowerPoint and ActiveX controls.
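External programs typically reach an Access back end through ODBC. The sketch below only assembles the connection pieces: the driver name is the standard Windows Access ODBC driver string, while the file path and table names are hypothetical. Actually connecting (e.g., with `pyodbc.connect(conn_str)`) requires a machine with that driver installed:

```python
# Build an ODBC connection string for a Jet/ACCDB back-end file.
def access_connection_string(db_path):
    return (
        r"Driver={Microsoft Access Driver (*.mdb, *.accdb)};"
        f"Dbq={db_path};"
    )

conn_str = access_connection_string(r"C:\data\backend.accdb")

# A parameterized query as it would be passed to cursor.execute(sql, params);
# the Customers table and its columns are invented for this example.
sql = "SELECT CustomerID, Name FROM Customers WHERE Region = ?"
```

This mirrors the front-end/back-end split described above: the `.accdb` file on a shared folder holds only tables, while each client (Access itself or external code like this) links to it over Jet or ODBC.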
Access tables support a variety of standard field types, indices, and referential integrity including cascading updates and deletes. Access also includes a query interface, forms to display and enter data, and reports for printing. The underlying Jet database, which contains these objects, is multi-user and handles record-locking.
Repetitive tasks can be automated through macros with point-and-click options. It is also easy to place a database on a network and have multiple users share and update data without overwriting each other's work. Data is locked at the record level which is significantly different from Excel which locks the entire spreadsheet.
There are template databases within the program and for download from Microsoft's website. These options are available upon starting Access and allow users to enhance a database with predefined tables, queries, forms, reports, and macros. Database templates support VBA code but Microsoft's templates do not include VBA code.
Programmers can create solutions using VBA, which is similar to Visual Basic 6.0 (VB6) and used throughout the Microsoft Office programs such as Excel, Word, Outlook and PowerPoint. Most VB6 code, including the use of Windows API calls, can be used in VBA. Power users and developers can extend basic end-user solutions to a professional solution with advanced automation, data validation, error trapping, and multi-user support.
The number of simultaneous users that can be supported depends on the amount of data, the tasks being performed, level of use, and application design. Generally accepted limits are solutions with 1 GB or less of data (Access supports up to 2 GB) and it performs quite well with 100 or fewer simultaneous connections (255 concurrent users are supported). This capability is often a good fit for department solutions. If using an Access database solution in a multi-user scenario, the application should be "split". This means that the tables are in one file called the back end (typically stored on a shared network folder) and the application components (forms, reports, queries, code, macros, linked tables) are in another file called the front end. The linked tables in the front end point to the back end file. Each user of the Access application would then receive his or her own copy of the front end file.
Applications that run complex queries or analysis across large datasets would naturally require greater bandwidth and memory. Microsoft Access is designed to scale to support more data and users by linking to multiple Access databases or using a back-end database like Microsoft SQL Server. With the latter design, the amount of data and users can scale to enterprise-level solutions.
Microsoft Access's role in web development prior to version 2010 is limited. User interface features of Access, such as forms and reports, only work in Windows. In versions 2000 through 2003 an Access object type called Data Access Pages created publishable web pages. Data Access Pages are no longer supported. The Microsoft Jet Database Engine, core to Access, can be accessed through technologies such as ODBC or OLE DB. The data (i.e., tables and queries) can be accessed by web-based applications developed in ASP.NET, PHP, or Java. With the use of Microsoft's Terminal Services and Remote Desktop Application in Windows Server 2008 R2, organizations can host Access applications so they can be run over the web. This technique does not scale the way a web application would but is appropriate for a limited number of users depending on the configuration of the host.
Access 2010 allows databases to be published to SharePoint 2010 web sites running Access Services. These web-based forms and reports run in any modern web browser. The resulting web forms and reports, when accessed via a web browser, don't require any add-ins or extensions (e.g. ActiveX, Silverlight).
Access 2013 can create web applications directly in SharePoint 2013 sites running Access Services. Access 2013 web solutions store their data in an underlying SQL Server database, which is much more scalable and robust than the Access 2010 version, which used SharePoint lists to store its data.
Access Services in SharePoint has since been retired.
A compiled version of an Access database (file extensions .MDE/.ACCDE or .ADE; ACCDE only works with Access 2007 or later) can be created to prevent users from accessing the design surfaces to modify module code, forms, and reports. An MDE or ADE file is a Microsoft Access database file with all modules compiled and all editable source code removed. Both the .MDE and .ADE versions of an Access database are used when end-user modifications are not allowed or when the application's source code should be kept confidential.
Microsoft also offers developer extensions for download to help distribute Access 2007 applications, create database templates, and integrate source code control with Microsoft Visual SourceSafe.
Users can create tables, queries, forms and reports, and connect them together with macros. Advanced users can use VBA to write rich solutions with advanced data manipulation and user control. Access also has report creation features that can work with any data source that Access can access.
The original concept of Access was for end users to be able to access data from any source. Other features include the import and export of data to many formats, including Excel, Outlook, ASCII, dBase, Paradox, FoxPro, SQL Server and Oracle. Access can also link to data in its existing location and use it for viewing, querying, editing, and reporting. This allows the existing data to change while ensuring that Access uses the latest data. It can perform heterogeneous joins between data sets stored across different platforms. Access is often used by people downloading data from enterprise-level databases for local manipulation, analysis, and reporting.
There is also the Jet Database format (MDB or ACCDB in Access 2007) which can contain the application and data in one file. This makes it very convenient to distribute the entire application to another user, who can run it in disconnected environments.
One of the benefits of Access from a programmer's perspective is its relative compatibility with SQL (Structured Query Language): queries can be viewed graphically or edited as SQL statements, and SQL statements can be used directly in macros and VBA modules to manipulate Access tables. Users can mix both VBA and macros when programming forms and logic, and both offer object-oriented possibilities. VBA can also be included in queries.
Microsoft Access offers parameterized queries. These queries and Access tables can be referenced from other programs like VB6 and .NET through DAO or ADO. From Microsoft Access, VBA can reference parameterized stored procedures via ADO.
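As an illustrative sketch only (Access itself runs such queries through the Jet/ACE engine via DAO or ADO, and the table and field names here are invented), Python's sqlite3 module can stand in to show what a parameterized query buys you: the user-supplied value never becomes part of the SQL text.

```python
import sqlite3

# Stand-in for an Access/Jet parameter query; sqlite3 is used so the
# sketch is runnable. The table and its data are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Orders (ID INTEGER, Region TEXT, Amount REAL)")
conn.executemany("INSERT INTO Orders VALUES (?, ?, ?)",
                 [(1, "East", 100.0), (2, "West", 250.0), (3, "East", 75.0)])

# The placeholder keeps the parameter out of the SQL string itself,
# the same benefit an Access parameter query provides.
region = "East"
rows = conn.execute(
    "SELECT ID, Amount FROM Orders WHERE Region = ? ORDER BY ID",
    (region,)).fetchall()
print(rows)  # [(1, 100.0), (3, 75.0)]
```

In Access itself the same idea is expressed with a PARAMETERS clause in Jet SQL or with DAO/ADO parameter objects.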
The desktop editions of Microsoft SQL Server can be used with Access as an alternative to the Jet Database Engine. This support started with MSDE (Microsoft SQL Server Desktop Engine), a scaled down version of Microsoft SQL Server 2000, and continues with the SQL Server Express versions of SQL Server 2005 and 2008.
Microsoft Access is a file-server-based database. Unlike client–server relational database management systems (RDBMSs), Microsoft Access does not implement database triggers, stored procedures, or transaction logging. Access 2010 includes table-level triggers and stored procedures built into the ACE data engine, so a client–server database system is not a requirement for using stored procedures or table triggers with Access 2010.
Tables, queries, forms, reports and macros can now be developed specifically for web based applications in Access 2010. Integration with Microsoft SharePoint 2010 is also highly improved.
The 2013 edition of Microsoft Access introduced a mostly flat design and the ability to install apps from the Office Store, but otherwise introduced few new features. The theme was partially updated again for 2016, but no dark theme was created for Access.
ASP.NET web forms can query a Microsoft Access database, retrieve records and display them on the browser.
SharePoint Server 2010 via Access Services allows for Access 2010 databases to be published to SharePoint, thus enabling multiple users to interact with the database application from any standards-compliant Web browser. Access Web databases published to SharePoint Server can use standard objects such as tables, queries, forms, macros, and reports. Access Services stores those objects in SharePoint.
Access 2013 offers the ability to publish Access web solutions on SharePoint 2013. Rather than using SharePoint lists as its data source, Access 2013 uses an actual SQL Server database hosted by SharePoint or SQL Azure. This offers a true relational database with referential integrity, scalability, maintainability, and extensibility compared to the SharePoint views Access 2010 used. The macro language is enhanced to support more sophisticated programming logic and database level automation.
Microsoft Access can also import or link directly to data stored in other applications and databases. Microsoft Office Access 2007 and newer can import from or link to:
Microsoft offers free runtime versions of Microsoft Access which allow users to run an Access desktop application without needing to purchase or install a retail version of Microsoft Access. This allows Access developers to create databases that can be freely distributed to an unlimited number of end users. These runtime versions of Access 2007 and later can be downloaded for free from Microsoft. The runtime versions for Access 2003 and earlier were part of the Office Developer Extensions/Toolkit and required a separate purchase.
The runtime version allows users to view, edit and delete data, along with running queries, forms, reports, macros and VBA module code. The runtime version does not allow users to change the design of Microsoft Access tables, queries, forms, reports, macros or module code. The runtime versions are similar to their corresponding full versions of Access and are usually compatible with earlier versions; for example, Access Runtime 2010 allows a user to run an Access application made with the 2010 version as well as 2007 through 2000. Due to deprecated features in Access 2013, its runtime version is unable to support those older features. During development, one can simulate the runtime environment from the fully functional version by using the /runtime command-line option.
Access stores all database tables, queries, forms, reports, macros, and modules in the Access Jet database as a single file.
For query development, Access offers a "Query Designer", a graphical user interface that allows users to build queries without knowledge of structured query language. In the Query Designer, users can "show" the datasources of the query (which can be tables or queries) and select the fields they want returned by clicking and dragging them into the grid. One can set up joins by clicking and dragging fields in tables to fields in other tables. Access allows users to view and manipulate the SQL code if desired. Any Access table, including linked tables from different data sources, can be used in a query.
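As a sketch of the correspondence between a designer-built query and its SQL view (sqlite3 stands in for the Jet engine here, and the tables are invented for illustration), the click-and-drag join described above is equivalent to an INNER JOIN in SQL:

```python
import sqlite3

# Hypothetical tables standing in for the datasources shown in the
# Query Designer grid; sqlite3 makes the sketch runnable.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Customers (ID INTEGER PRIMARY KEY, Name TEXT);
    CREATE TABLE Orders (ID INTEGER PRIMARY KEY, CustomerID INTEGER, Total REAL);
    INSERT INTO Customers VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO Orders VALUES (10, 1, 99.0), (11, 2, 45.0), (12, 1, 12.5);
""")

# Dragging Customers.ID onto Orders.CustomerID in the designer
# corresponds to this ON clause in the SQL view.
rows = conn.execute("""
    SELECT c.Name, COUNT(o.ID) AS OrderCount
    FROM Customers AS c INNER JOIN Orders AS o ON o.CustomerID = c.ID
    GROUP BY c.Name ORDER BY c.Name
""").fetchall()
print(rows)  # [('Ada', 2), ('Grace', 1)]
```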
Access also supports the creation of "pass-through queries". These snippets of SQL code can address external data sources through the use of ODBC connections on the local machine. This enables users to interact with data stored outside the Access program without using linked tables or Jet.
Users construct the pass-through queries using the SQL syntax supported by the external data source.
When developing reports in Design View, additions or changes to controls cause any linked queries to execute in the background, and the designer is forced to wait for records to be returned before being able to make another change. This behavior cannot be turned off.
Non-programmers can use the macro feature to automate simple tasks through a series of drop-down selections. Macros allow users to easily chain commands together, such as running queries, importing or exporting data, opening and closing forms, and previewing and printing reports. Macros support basic logic (IF conditions) and the ability to call other macros. Macros can also contain sub-macros, which are similar to subroutines. In Access 2007, enhanced macros included error handling and support for temporary variables. Access 2007 also introduced embedded macros that are essentially properties of an object's event; this eliminated the need to store macros as individual objects. However, macros were limited in their functionality by a lack of programming loops and advanced coding logic until Access 2013. With the significant enhancements introduced in Access 2013, the capabilities of macros became fully comparable to VBA. They made feature-rich, web-based application deployments practical, via a greatly enhanced Microsoft SharePoint interface and tools, as well as on traditional Windows desktops.
In common with other products in the Microsoft Office suite, the other programming language used in Access is Microsoft VBA. It is similar to Visual Basic 6.0 (VB6) and code can be stored in modules, classes, and code behind forms and reports. To create a richer, more efficient and maintainable finished product with good error handling, most professional Access applications are developed using the VBA programming language rather than macros, except where web deployment is a business requirement.
To manipulate data in tables and queries in VBA or macros, Microsoft provides two database access libraries of COM components:
As well as DAO and ADO, developers can also use OLE DB and ODBC for developing native C/C++ programs for Access. For ADPs and the direct manipulation of SQL Server data, ADO is required. DAO is most appropriate for managing data in Access/Jet databases, and the only way to manipulate the complex field types in ACCDB tables.
In the database container or navigation pane in Access 2007 and later versions, the system automatically categorizes each object by type (e.g., table, query, macro). Many Access developers use the Leszynski naming convention, though this is not universal; it is a programming convention, not a DBMS-enforced rule. It is particularly helpful in VBA, where references to object names may not indicate their data type (e.g. tbl for tables, qry for queries).
Developers deploy Microsoft Access most often for individual and workgroup projects (the Access 97 speed characterization was done for 32 users). Since Access 97, and through Access 2003 and 2007, Microsoft Access and computer hardware have evolved significantly. Databases under 1 GB in size (which can now fit entirely in RAM) and 200 simultaneous users are well within the capabilities of Microsoft Access. Of course, performance depends on the database design and tasks. Disk-intensive work, such as complex searching and querying, takes the most time.
As data from a Microsoft Access database can be cached in RAM, processing speed may substantially improve when there is only a single user or if the data is not changing. In the past, the effect of packet latency on the record-locking system caused Access databases to run slowly on a virtual private network (VPN) or a wide area network (WAN) against a Jet database; broadband connections have mitigated this issue. Performance can also be enhanced if a continuous connection is maintained to the back-end database throughout the session rather than opening and closing it for each table access. If Access database performance over a VPN or WAN suffers, then a client using Remote Desktop Protocol (such as Microsoft Terminal Services) can provide an effective solution. Access databases linked to SQL Server or to Access Data Projects work well over VPNs and WANs.
In July 2011, Microsoft acknowledged an intermittent query performance problem with all versions of Access and Windows 7 and Windows Server 2008 R2 due to the nature of resource management being vastly different in newer operating systems. This issue severely affects query performance on both Access 2003 and earlier with the Jet Database Engine code, as well as Access 2007 and later with the Access Database Engine (ACE). Microsoft has issued hotfixes KB2553029 for Access 2007 and KB2553116 for Access 2010, but will not fix the issue with Jet 4.0 as it is out of mainstream support.
In earlier versions of Microsoft Access, the ability to distribute applications required the purchase of the Developer Toolkit; in Access 2007, 2010 and Access 2013 the "Runtime Only" version is offered as a free download, making the distribution of royalty-free applications possible on Windows XP, Vista, 7 and Windows 8.x.
Microsoft Access applications can adopt a split-database architecture. The single database can be divided into a separate "back-end" file that contains the data tables (shared on a file server) and a "front-end" file containing the application's objects (queries, forms, reports, macros, and modules). The "front-end" Access application is distributed to each user's desktop and linked to the shared database. Using this approach, each user has a copy of Microsoft Access (or the runtime version) installed on their machine along with their application database. This reduces network traffic, since the application is not retrieved for each use. The "front-end" database can still contain local tables for storing a user's settings or temporary data. This split-database design also allows development of the application independent of the data. One disadvantage is that users may make various changes to their own local copy of the application, which makes version control hard to manage. When a new version is ready, the front-end database is replaced without affecting the back-end data file. Microsoft Access has two built-in utilities, Database Splitter and Linked Table Manager, to facilitate this architecture.
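The split can be sketched in miniature. This is not Access code: Python's sqlite3 stands in, with SQLite's ATTACH playing the role of Access's linked tables, and the file and table names are invented.

```python
import os
import sqlite3
import tempfile

# Back end: the shared file that holds only the data tables.
backend = os.path.join(tempfile.mkdtemp(), "backend.db")
be = sqlite3.connect(backend)
be.execute("CREATE TABLE Customers (ID INTEGER, Name TEXT)")
be.execute("INSERT INTO Customers VALUES (1, 'Ada')")
be.commit()
be.close()

# Front end: each user's own database, which links to the shared
# back end (Access uses linked tables; ATTACH plays that role here).
fe = sqlite3.connect(":memory:")
fe.execute("ATTACH DATABASE ? AS be", (backend,))
name = fe.execute("SELECT Name FROM be.Customers WHERE ID = 1").fetchone()[0]
print(name)  # Ada
```

Access's Linked Table Manager automates the equivalent re-pointing of the front end at the shared back-end file.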
Linked tables in Access use absolute rather than relative paths, so the development environment must either have the same path as the production environment, or a "dynamic-linker" routine must be written in VBA.
For very large Access databases, this architecture may cause performance issues, and a SQL back end should be considered in those circumstances. This is less of an issue if the entire database can fit in the PC's RAM, since Access caches data and indexes.
To scale Access applications to enterprise or web solutions, one possible technique involves migrating to Microsoft SQL Server or equivalent server database. A client–server design significantly reduces maintenance and increases security, availability, stability, and transaction logging.
Access 2000 through Access 2010 included a feature called the Upsizing Wizard that allowed users to upgrade their databases to Microsoft SQL Server, an ODBC client–server database. This feature was removed from Access 2013. An additional solution, the SQL Server Migration Assistant for Access (SSMA), continues to be available for free download from Microsoft.
A variety of upgrading options are available. After migrating the data and queries to SQL Server, the Access database can be linked to the SQL database. However, certain data types are problematic, most notably "Yes/No". In Microsoft Access there are three states for the Yes/No (True/False) data type: empty, no/false (zero), and yes/true (-1). The corresponding SQL Server bit data type permits only two values, zero and 1. Regardless, SQL Server remains the easiest migration path. Retrieving data from linked tables is optimized to just the records needed, but this scenario may operate less efficiently than native SQL Server access; for example, multi-table joins may still require copying a whole table across the network.
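The Yes/No mapping can be made concrete with a small hedged sketch (the helper function is hypothetical, not part of any migration tool): Access stores Yes as -1 and No as 0, while SQL Server's bit column expects 1 and 0, with NULL available for an empty value.

```python
def access_bool_to_bit(value):
    """Map an Access Yes/No value to a SQL Server bit value.

    Hypothetical helper: Access uses -1 for Yes/True, 0 for No/False,
    and None (Null) for an empty field; SQL Server bit uses 1, 0, NULL.
    """
    if value is None:          # empty field stays NULL
        return None
    return 1 if value == -1 else 0

print(access_bool_to_bit(-1))    # 1
print(access_bool_to_bit(0))     # 0
print(access_bool_to_bit(None))  # None
```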
In previous versions of Access, including Access 2010, databases could also be converted to Access Data Projects (ADPs), which are tied directly to one SQL Server database. This feature was removed from Access 2013. ADPs support the ability to directly create and modify SQL Server objects such as tables, views, stored procedures, and SQL Server constraints. The views and stored procedures can significantly reduce the network traffic for multi-table joins. SQL Server supports temporary tables and links to other data sources beyond the single SQL Server database.
Finally, some Access databases are completely replaced by another technology such as ASP.NET or Java once the data is converted. However, any such migration may require major effort, since Access SQL is a more powerful superset of standard SQL. Further, Access application procedures, whether VBA or macros, are written at a relatively higher level than the currently available alternatives that are both robust and comprehensive. Note that the Access macro language, allowing an even higher level of abstraction than VBA, was significantly enhanced in Access 2010 and again in Access 2013.
In many cases, developers build direct web-to-data interfaces using ASP.NET, while the major business automation, administrative, and reporting functions that do not need to be distributed to everyone remain in Access for information workers to maintain.
While all Access data can migrate to SQL Server directly, some queries cannot migrate successfully. In some situations, VBA functions and user-defined functions may need to be translated into T-SQL or .NET functions/procedures. Crosstab queries can be migrated to SQL Server using the PIVOT command.
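SQLite, used here as a runnable stand-in, has no PIVOT, so this sketch expresses the same crosstab shape as conditional aggregation, a common fallback when PIVOT is unavailable; the table and values are invented for illustration.

```python
import sqlite3

# Hypothetical sales data to be pivoted: one output column per Quarter,
# the shape an Access TRANSFORM query or SQL Server PIVOT would produce.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Sales (Region TEXT, Quarter TEXT, Amount REAL)")
conn.executemany("INSERT INTO Sales VALUES (?, ?, ?)", [
    ("East", "Q1", 100.0), ("East", "Q2", 150.0),
    ("West", "Q1", 200.0), ("West", "Q2", 50.0),
])

# Conditional aggregation: each CASE picks out one pivoted value.
rows = conn.execute("""
    SELECT Region,
           SUM(CASE WHEN Quarter = 'Q1' THEN Amount ELSE 0 END) AS Q1,
           SUM(CASE WHEN Quarter = 'Q2' THEN Amount ELSE 0 END) AS Q2
    FROM Sales GROUP BY Region ORDER BY Region
""").fetchall()
print(rows)  # [('East', 100.0, 150.0), ('West', 200.0, 50.0)]
```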
Microsoft Access applications can be made secure by various methods, the most basic being password access control; this is a relatively weak form of protection.
A higher level of protection is the use of workgroup security, requiring a user name and password. Users and groups can be specified along with their rights at the object-type or individual-object level. This can be used to grant people read-only or data-entry rights, but such fine-grained permissions can be challenging to administer. A separate workgroup security file contains the settings, which can be used to manage multiple databases. Workgroup security is not supported in the Access 2007 and Access 2010 ACCDB database format, although Access 2007 and Access 2010 still support it for MDB databases.
Databases can also be encrypted. The ACCDB format offers significantly stronger encryption than previous versions.
Additionally, if the database design needs to be secured to prevent changes, Access databases can be locked/protected (and the source code compiled) by converting the database to a .MDE file. All changes to the VBA project (modules, forms, or reports) need to be made to the original MDB and then reconverted to MDE. In Access 2007 and Access 2010, the ACCDB database is converted to an ACCDE file. Some tools are available for unlocking and "decompiling", although certain elements including original VBA comments and formatting are normally irretrievable.
Microsoft Access saves information under the following file formats:
There are no Access versions between 2.0 and 7.0 because the Office 95 version was launched alongside Word 7. All of the Office 95 products have OLE 2 capabilities, and numbering Access as version 7 signaled its compatibility with Word 7.
Version number 13 was skipped.
|
https://en.wikipedia.org/wiki?curid=20935
|
Metabolic pathway
In biochemistry, a metabolic pathway is a linked series of chemical reactions occurring within a cell. The reactants, products, and intermediates of an enzymatic reaction are known as metabolites, which are modified by a sequence of chemical reactions catalyzed by enzymes. In most cases of a metabolic pathway, the product of one enzyme acts as the substrate for the next. Any side products are considered waste and removed from the cell. These enzymes often require dietary minerals, vitamins, and other cofactors to function.
Different metabolic pathways function based on their position within a eukaryotic cell and their significance in the given compartment of the cell. For instance, the citric acid cycle, the electron transport chain, and oxidative phosphorylation all take place in the mitochondrion. In contrast, glycolysis, the pentose phosphate pathway, and fatty acid biosynthesis all occur in the cytosol of a cell.
There are two types of metabolic pathways, characterized by their ability either to synthesize molecules with the utilization of energy (anabolic pathways) or to break down complex molecules, releasing energy in the process (catabolic pathways). The two pathways complement each other in that the energy released by one is used up by the other. The degradative process of a catabolic pathway provides the energy required to conduct the biosynthesis of an anabolic pathway. In addition to these two distinct metabolic pathways is the amphibolic pathway, which can be either catabolic or anabolic based on the need for or the availability of energy.
Pathways are required for the maintenance of homeostasis within an organism and the flux of metabolites through a pathway is regulated depending on the needs of the cell and the availability of the substrate. The end product of a pathway may be used immediately, initiate another metabolic pathway or be stored for later use. The metabolism of a cell consists of an elaborate network of interconnected pathways that enable the synthesis and breakdown of molecules (anabolism and catabolism).
Each metabolic pathway consists of a series of biochemical reactions that are connected by their intermediates: the products of one reaction are the substrates for subsequent reactions, and so on. Metabolic pathways are often considered to flow in one direction. Although all chemical reactions are technically reversible, conditions in the cell are often such that it is thermodynamically more favorable for flux to proceed in one direction of a reaction. For example, one pathway may be responsible for the synthesis of a particular amino acid, but the breakdown of that amino acid may occur via a separate and distinct pathway. One example of an exception to this "rule" is the metabolism of glucose. Glycolysis results in the breakdown of glucose, but several reactions in the glycolysis pathway are reversible and participate in the re-synthesis of glucose (gluconeogenesis).
A catabolic pathway is a series of reactions that bring about a net release of energy in the form of a high energy phosphate bond formed with the energy carriers adenosine diphosphate (ADP) and guanosine diphosphate (GDP) to produce adenosine triphosphate (ATP) and guanosine triphosphate (GTP), respectively. The net reaction is, therefore, thermodynamically favorable, for it results in a lower free energy for the final products. A catabolic pathway is an exergonic system that produces chemical energy in the form of ATP, GTP, NADH, NADPH, FADH2, etc. from energy containing sources such as carbohydrates, fats, and proteins. The end products are often carbon dioxide, water, and ammonia. Coupled with an endergonic reaction of anabolism, the cell can synthesize new macromolecules using the original precursors of the anabolic pathway. An example of a coupled reaction is the phosphorylation of fructose-6-phosphate to form the intermediate fructose-1,6-bisphosphate by the enzyme phosphofructokinase accompanied by the hydrolysis of ATP in the pathway of glycolysis. The resulting chemical reaction within the metabolic pathway is highly thermodynamically favorable and, as a result, irreversible in the cell.
Fructose 6-phosphate + ATP → Fructose 1,6-bisphosphate + ADP
A core set of energy-producing catabolic pathways occurs within all living organisms in some form. These pathways transfer the energy released by the breakdown of nutrients into ATP and other small energy-carrying molecules (e.g. GTP, NADPH, FADH2). All cells can perform anaerobic respiration by glycolysis. Additionally, most organisms can perform more efficient aerobic respiration through the citric acid cycle and oxidative phosphorylation. Plants, algae and cyanobacteria can furthermore use sunlight to anabolically synthesize compounds from non-living matter by photosynthesis.
In contrast to catabolic pathways, anabolic pathways require an energy input to construct macromolecules such as polypeptides, nucleic acids, proteins, polysaccharides, and lipids. The isolated reaction of anabolism is unfavorable in a cell due to a positive Gibbs free energy (+ΔG). Thus, an input of chemical energy through coupling with an exergonic reaction is necessary. The coupled catabolic reaction alters the thermodynamics of the overall process by making the net free-energy change negative, allowing the reaction to take place; otherwise, an endergonic reaction is non-spontaneous.
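A worked arithmetic sketch of such coupling (the numbers are illustrative textbook-style values, not taken from this article): an endergonic step with a positive ΔG becomes spontaneous when summed with the negative ΔG of ATP hydrolysis.

```python
# Illustrative values only: +14.2 kJ/mol is a textbook standard
# free-energy change for glutamate + NH3 -> glutamine, and -30.5 kJ/mol
# is the commonly quoted value for ATP hydrolysis.
dG_anabolic = +14.2          # kJ/mol, unfavorable on its own
dG_atp_hydrolysis = -30.5    # kJ/mol, exergonic

# Free-energy changes of coupled reactions add.
dG_coupled = dG_anabolic + dG_atp_hydrolysis
print(dG_coupled)  # negative, so the coupled reaction is spontaneous
```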
An anabolic pathway is a biosynthetic pathway, meaning that it combines smaller molecules to form larger and more complex ones. An example is the reversed pathway of glycolysis, otherwise known as gluconeogenesis, which occurs in the liver and sometimes in the kidney to maintain proper glucose concentration in the blood and supply the brain and muscle tissues with adequate amount of glucose. Although gluconeogenesis is similar to the reverse pathway of glycolysis, it contains three distinct enzymes from glycolysis that allow the pathway to occur spontaneously. An example of the pathway for gluconeogenesis is illustrated in the image titled "Gluconeogenesis Mechanism".
An amphibolic pathway is one that can be either catabolic or anabolic based on the availability of or the need for energy. The currency of energy in a biological cell is adenosine triphosphate (ATP), which stores its energy in the phosphoanhydride bonds. The energy is utilized to conduct biosynthesis, facilitate movement, and regulate active transport inside of the cell. Examples of amphibolic pathways are the citric acid cycle and the glyoxylate cycle. These sets of chemical reactions contain both energy producing and utilizing pathways. To the right is an illustration of the amphibolic properties of the TCA cycle.
The glyoxylate shunt pathway is an alternative to the tricarboxylic acid (TCA) cycle: it redirects the TCA pathway to prevent full oxidation of carbon compounds and to preserve high-energy carbon sources as future energy sources. This pathway occurs only in plants and bacteria, in the absence of glucose molecules.
The flux of the entire pathway is regulated by the rate-determining steps. These are the slowest steps in a network of reactions. The rate-limiting step occurs near the beginning of the pathway and is regulated by feedback inhibition, which ultimately controls the overall rate of the pathway. The metabolic pathway in the cell is regulated by covalent or non-covalent modifications. A covalent modification involves an addition or removal of a chemical bond, whereas a non-covalent modification (also known as allosteric regulation) is the binding of the regulator to the enzyme via hydrogen bonds, electrostatic interactions, and van der Waals forces.
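Feedback inhibition can be sketched with a simple hypothetical rate law (a common textbook form, not taken from this article), in which accumulating end product P slows the rate-limiting step:

```python
def inhibited_rate(vmax, product_conc, ki):
    """Rate of the rate-limiting step under end-product inhibition.

    Hypothetical textbook-style model: v = Vmax / (1 + [P]/Ki), so the
    rate falls as the pathway's end product P accumulates.
    """
    return vmax / (1 + product_conc / ki)

v_no_product = inhibited_rate(10.0, 0.0, 2.0)    # no P: full speed
v_with_product = inhibited_rate(10.0, 8.0, 2.0)  # high P: inhibited
print(v_no_product, v_with_product)  # 10.0 2.0
```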
The rate of turnover in a metabolic pathway, also known as the metabolic flux, is regulated based on the stoichiometric reaction model, the utilization rate of metabolites, and the translocation pace of molecules across the lipid bilayer. The regulation methods are based on experiments involving 13C labeling, which is then analyzed by nuclear magnetic resonance (NMR) or gas chromatography–mass spectrometry (GC-MS)-derived mass compositions. These techniques link a statistical interpretation of the mass distribution in proteinogenic amino acids to the catalytic activities of enzymes in a cell.
|
https://en.wikipedia.org/wiki?curid=20941
|
Malthusian catastrophe
A Malthusian catastrophe (also known as Malthusian check, Malthusian crisis, Malthusian spectre or Malthusian crunch) occurs when population growth outpaces agricultural production.
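Malthus's underlying arithmetic, geometric population growth against arithmetic growth in subsistence, can be sketched in a few lines (the period counts and starting values are arbitrary):

```python
# Geometric (doubling) population vs. arithmetic (fixed-increment) food
# supply, the comparison at the heart of Malthus's 1798 argument.
periods = 6
population = [2 ** n for n in range(periods)]  # 1, 2, 4, 8, 16, 32
food = [1 + n for n in range(periods)]         # 1, 2, 3, 4, 5, 6

# Population per unit of food grows without bound under these assumptions.
pressure = [p / f for p, f in zip(population, food)]
print(pressure)
```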
In 1798, Thomas Malthus wrote:
The dramatic imagery of the final sentence should not obscure the force of the earlier lines. Malthus is suggesting that the growth of population need not be restricted by plague or famine; more quotidian pressures work more quietly to slow population growth.
Malthus proposed two kinds of population checks: preventive and positive.
A preventive check is a conscious decision to delay marriage or abstain from procreation based on a lack of resources. Malthus argued that man is incapable of ignoring the consequences of uncontrolled population growth, and would intentionally avoid contributing to it. According to Malthus, a positive check is any event or circumstance that shortens the human life span. The primary examples of this are war, plague and famine. However, poor health and economic conditions are also considered instances of positive checks.
After World War II, mechanized agriculture produced a dramatic increase in productivity of agriculture and the Green Revolution greatly increased crop yields, expanding the world's food supply while lowering food prices. In response, the growth rate of the world's population accelerated rapidly, resulting in predictions by Paul R. Ehrlich, Simon Hopkins, and many others of an imminent Malthusian catastrophe. However, populations of most developed countries grew slowly enough to be outpaced by gains in productivity.
By the early 21st century, many technologically-developed countries had passed through the demographic transition, a complex social development encompassing a drop in total fertility rates in response to various fertility factors, including lower infant mortality, increased urbanization, and a wider availability of effective birth control.
On the assumption that the demographic transition is now spreading from the developed countries to less developed countries, the United Nations Population Fund estimates that human population may peak in the late 21st century rather than continue to grow until it has exhausted available resources.
Historians have estimated the total human population back to 10,000 BC. The figure on the right shows the trend of total population from 1800 to 2005, and from there in three projections out to 2100 (low, medium, and high). The United Nations population projections out to 2100 (the red, orange, and green lines) show a possible peak in the world's population occurring by 2040 in the first scenario, a peak by 2100 in the second scenario, and never-ending growth in the third.
The graph of annual growth rates in world population (outlined in yellow) shows exponential growth at varying rates. Any point above zero shows exponential growth. A constant exponential growth would be a horizontal straight line at any height above 0. A stable population has a growth rate of 0. Population decline would take the curve below the chart into the yellow border. The hump in the graph began about 1920, peaked in the mid-1960s, and for the last 40 years the rate of exponential growth has been steadily declining. The downward direction of the curve does not mean that the population is declining, only that it is growing less quickly. The sharp fluctuation between 1959 and 1960 was due to the combined effects of the Great Leap Forward and a natural disaster in China. Also visible on this graph are the effects of the Great Depression, the two world wars, and possibly also the 1918 flu pandemic.
A 2002 study by the UN Food and Agriculture Organization predicts that world food production will be in excess of the needs of the human population by the year 2030; however, that source also states that hundreds of millions will remain hungry (presumably due to economic realities and political issues).
A 2004 study by a group of prominent economists and ecologists, including Kenneth Arrow and Paul Ehrlich, suggests that the central concerns regarding sustainability have shifted from population growth to the consumption/savings ratio, due to shifts in population growth rates since the 1970s. Empirical estimates show that public policy (taxes or the establishment of more complete property rights) can promote more efficient consumption and investment that are sustainable in an ecological sense; that is, given the current (relatively low) population growth rate, the Malthusian catastrophe can be avoided by either a shift in consumer preferences or public policy that induces a similar shift.
A study conducted in 2009 said that food production will have to increase by 70% over the next 40 years, and food production in the developing world will need to double.
This is a result of the increasing population: world population is projected to reach 9.1 billion in 2050, compared with about 7.8 billion people today. Another issue is that the effects of global warming (floods, droughts, extreme weather events, and so on) will negatively affect food production, to such a degree that food shortages are likely to occur in the coming decades. As a result, we will need to use scarce natural resources more efficiently and adapt to climate change. Lastly, some factors (such as the increase in agricultural production for first-generation biofuels) have not been taken into account in FAO's forecast.
Karl Marx and Friedrich Engels argued that Malthus failed to recognize a crucial difference between humans and other species. In capitalist societies, as Engels put it, scientific and technological "progress is as unlimited and at least as rapid as that of population". Marx argued, even more broadly, that the growth of both a human population "in toto" and the "relative surplus population" within it, occurred in direct proportion to accumulation.
Henry George in "Progress and Poverty" (1879) criticized Malthus's view that population growth was a cause of poverty, arguing that poverty was caused by the concentration of ownership of land and natural resources. George noted that humans are distinct from other species, because unlike most species humans can use their minds to leverage the reproductive forces of nature to their advantage. He wrote, "Both the jayhawk and the man eat chickens; but the more jayhawks, the fewer chickens, while the more men, the more chickens."
D. E. C. Eversley observed that Malthus appeared unaware of the extent of industrialization, and either ignored or discredited the possibility that it could improve living conditions of the poorer classes.
Barry Commoner argued in "The Closing Circle" (1971) that technological progress will eventually reduce the demographic growth and environmental damage created by civilization. He also opposed the coercive measures postulated by the neo-Malthusian movements of his time, arguing that their cost would fall disproportionately on low-income populations who were already struggling.
Ester Boserup suggested that expanding population leads to agricultural intensification and the development of more productive and less labor-intensive methods of farming. Thus, human population levels determine agricultural methods, rather than agricultural methods determining population.
Environmentalist Stewart Brand summarized how the Malthusian predictions of The Population Bomb and The Limits to Growth failed to materialize due to radical changes in fertility:
Short-term trends, even on the scale of decades or centuries, cannot prove or disprove the existence of mechanisms promoting a Malthusian catastrophe over longer periods. However, due to the prosperity of a major fraction of the human population at the beginning of the 21st century, and the debatability of the predictions for ecological collapse made by Paul R. Ehrlich in the 1960s and 1970s, some people, such as economist Julian L. Simon and medical statistician Hans Rosling questioned its inevitability.
Joseph Tainter noted that science has diminishing marginal returns and that scientific progress is becoming more difficult and more costly, which may reduce the efficiency of the factors that prevented the Malthusian scenarios from happening in the past.
https://en.wikipedia.org/wiki?curid=20942
Millennialism
Millennialism (from millennium, Latin for "a thousand years") or chiliasm (from the Greek equivalent), is a belief advanced by some religious denominations that a Golden Age or Paradise will occur on Earth prior to the final judgment and future eternal state of the "World to Come".
Christianity and Judaism have both produced messianic movements which featured millennialist teachings—such as the notion that an earthly kingdom of God was at hand. These millenarian movements often led to considerable social unrest.
Similarities to millennialism appear in Zoroastrianism, which identified successive thousand-year periods, each of which will end in a cataclysm of heresy and destruction, until the final destruction of evil and of the spirit of evil by a triumphant king of peace at the end of the final millennial age. "Then Saoshyant makes the creatures again pure, and the resurrection and future existence occur" ("Zand-i Vohuman Yasht 3:62").
Scholars have also linked various other social and political movements, both religious and secular, to millennialist metaphors.
Christian millennialist thinking is primarily based upon the Book of Revelation, which describes the vision of an angel who descended from heaven with a large chain and a key to a bottomless pit, and captured Satan, imprisoning him for a thousand years:
The Book of Revelation then describes a series of judges who are seated on thrones, as well as John's vision of the souls of those who were beheaded for their testimony in favor of Jesus and their rejection of the mark of the beast. These souls:
Thus, John of Patmos characterizes a millennium where Christ and the Father will rule over a theocracy of the righteous. While there are an abundance of biblical references to such a kingdom of God throughout the Old and New Testaments, this is the only reference in the Bible to such a period lasting one thousand years.
During the first centuries after Christ, various forms of chiliasm (millennialism) were to be found in the Church, both East and West. It was a decidedly majority view at that time, as admitted by Eusebius, himself an opponent of the doctrine:
Nevertheless, strong opposition later developed from some quarters, most notably from Augustine of Hippo. The Church never took a formal position on the issue at any of the ecumenical councils, and thus both pro and con positions remained consistent with orthodoxy. The addition to the Nicene Creed was intended to refute the perceived Sabellianism of Marcellus of Ancyra and others, a doctrine which includes an end to Christ's reign and which is explicitly singled out for condemnation by the council [Canon #1]. The "Catholic Encyclopedia" notes that the 2nd century proponents of various Gnostic beliefs (themselves considered heresies) also rejected millenarianism.
Millennialism was taught by various earlier writers such as Justin Martyr, Irenaeus, Tertullian, Commodian, Lactantius, Methodius, and Apollinaris of Laodicea in a form now called premillennialism. According to religious scholar Rev. Dr. Francis Nigel Lee, "Justin's 'Occasional Chiliasm' sui generis which was strongly anti-pretribulationistic was followed possibly by Pothinus in A.D. 175 and more probably (around 185) by Irenaeus". Justin Martyr, discussing his own premillennial beliefs in his "Dialogue with Trypho the Jew", Chapter 110, observed that they were not necessary to Christians:
Melito of Sardis is frequently listed as a second century proponent of premillennialism. The support usually given for the supposition is that "Jerome [Comm. on Ezek. 36] and Gennadius [De Dogm. Eccl., Ch. 52] both affirm that he was a decided millenarian."
In the early third century, Hippolytus of Rome wrote:
Around 220, there were some similar influences on Tertullian, although only with very important and extremely optimistic (if not perhaps even postmillennial) modifications and implications. On the other hand, "Christian Chiliastic" ideas were indeed advocated in 240 by Commodian; in 250 by the Egyptian Bishop Nepos in his Refutation of Allegorists; in 260 by the almost unknown Coracion; and in 310 by Lactantius. Into the late fourth century, Bishop Ambrose of Milan had millennial leanings (Ambrose of Milan. Book II. On the Belief in the Resurrection, verse 108). Lactantius is the last great literary defender of chiliasm in the early Christian church. Jerome and Augustine vigorously opposed chiliasm by teaching the symbolic interpretation of the Revelation of St. John, especially chapter 20.
In a letter to Queen Gerberga of France around 950, Adso of Montier-en-Der established the idea of a "last World Emperor" who would conquer non-Christians before the arrival of the Antichrist.
Christian views on the future order of events diversified after the Protestant reformation (c.1517). In particular, new emphasis was placed on the passages in the Book of Revelation which seemed to say that as Christ would return to judge the living and the dead, Satan would be locked away for 1000 years, but then released on the world to instigate a final battle against God and his Saints. Previous Catholic and Orthodox theologians had no clear or consensus view on what this actually meant (only the concept of the end of the world coming unexpectedly, "like a thief in the night", and the concept of "the antichrist" were almost universally held). Millennialist theories try to explain what this "1000 years of Satan bound in chains" would be like.
Various types of millennialism exist with regard to Christian eschatology, especially within Protestantism, such as Premillennialism, Postmillennialism, and Amillennialism. The first two refer to different views of the relationship between the "millennial Kingdom" and Christ's second coming.
Premillennialism sees Christ's second advent as preceding the millennium, thereby separating the second coming from the final judgment. In this view, "Christ's reign" will be physically on the earth.
Postmillennialism sees Christ's second coming as subsequent to the millennium and concurrent with the final judgment. In this view "Christ's reign" (during the millennium) will be spiritual in and through the church.
Amillennialism basically denies a future literal 1000 year kingdom and sees the church age metaphorically described in Rev. 20:1–6 in which "Christ's reign" is current in and through the church.
The Catholic Church strongly condemns millennialism as the following shows:
The Bible Student movement is a millennialist movement based on views expressed in "The Divine Plan of the Ages" (1886), Volume One of the "Studies in the Scriptures" series by Charles Taze Russell. (This series has been continuously published since 1927 by the Dawn Bible Students Association.)
Jehovah's Witnesses believe that Christ will rule from heaven for 1,000 years as king over the earth, assisted by 144,000 holy ones.
Also known as Eastern Lightning, The Church of Almighty God mentions in its teachings the Age of Millennial Kingdom, which will follow the catastrophes prophesied in the Book of Revelation.
Millennialist thinking first emerged in Jewish apocryphal literature of the tumultuous Second Temple period,
producing writings such as the Psalm 46, the Book of Enoch, the Book of Jubilees, Esdras, Book of Daniel, and the additions to Daniel.
Passages within these texts, including 1 Enoch 6-36, 91-104, 2 Enoch 33:1, and Jubilees 23:27, refer to the establishment of a "millennial kingdom" by a messianic figure, occasionally suggesting that this kingdom would endure for a thousand years. However, the actual number of years given for the duration of the kingdom varied. In 4 Ezra 7:28-9, for example, the kingdom lasts only 400 years.
This notion of the millennium no doubt helped some Jews to cope with the socio-political conflicts that they faced. This concept of the millennium served to reverse the previous period of evil and suffering, rewarding the virtuous for their courage while punishing evil-doers, with a clear separation of those who are good from those who are evil. The vision of a thousand-year period of bliss for the faithful, to be enjoyed here in the physical world as "heaven on earth", exerted an irresistible power over the imagination of Jews in the inter-testamental period as well as on early Christians. Millennialism, which had already existed in Jewish thought, received a new interpretation and fresh impetus with the development of Christianity.
Gershom Scholem profiles medieval and early modern Jewish millennialist teachings in his book "Sabbatai Sevi, the Mystical Messiah", which focuses on the 17th-century movement centered on the self-proclaimed messiahship (1648) of Sabbatai Zevi (1626–1676).
Bahá'u'lláh mentioned in the "Kitáb-i-Íqán" that God will renew the "City of God" about every thousand years, and specifically mentioned that a new Manifestation of God would not appear within 1,000 years (1893–2893) of Bahá'u'lláh's message, but that the authority of Bahá'u'lláh's message could last up to 500,000 years.
The Theosophist Alice Bailey taught that Christ (in her books she refers to the powerful spiritual being best known by Theosophists as "Maitreya" as "The Christ" or "The World Teacher", not as "Maitreya") would return “sometime after AD 2025”, and that this would be the New Age equivalent of the Christian concept of the Second Coming of Christ.
Millennial social movements are a specific form of millenarianism that are based on some concept of a one thousand-year cycle. Sometimes the two terms are used as synonyms, but this is not entirely accurate for a purist. Millennial social movements need not be religious, but they must have a vision of an apocalypse that can be utopian or dystopian. Those who are part of millennial social movements are "prone to be violent," with certain types of millennialism connected to violence. The first is progressive, where the "transformation of the social order is gradual and humans play a role in fostering that transformation." The second is catastrophic millennialism which "deems the current social order as irrevocably corrupt, and total destruction of this order is necessary as the precursor to the building of a new, godly order." However the link between millennialism and violence may be problematic, as new religious movements may stray from the catastrophic view as time progresses.
The most controversial interpretation of the Three Ages philosophy and of millennialism in general is Adolf Hitler's "Third Reich" ("Drittes Reich"), which in his vision would last for a thousand years to come ("Tausendjähriges Reich") but ultimately lasted for only 12 years (1933–1945).
The phrase "Third Reich" was originally coined by the German thinker Arthur Moeller van den Bruck, who in 1923 published a book titled "Das Dritte Reich". Looking back at German history, he distinguished two separate periods, and identified them with the ages of the 12th century Italian theologian Joachim of Fiore:
After the interval of the Weimar Republic (1918–1933), during which constitutionalism, parliamentarism and even pacifism ruled, these were then to be followed by:
Although van den Bruck was unimpressed by Hitler when he met him in 1922 and did not join the Nazi Party, the phrase was nevertheless adopted by the Nazis to describe the totalitarian state they wanted to set up when they gained power, which they succeeded in doing in 1933. Later, however, the Nazi authorities banned the informal use of "Third Reich" throughout the German press in the summer of 1939, instructing it to use more official terms such as "German Reich", "Greater German Reich", and "National Socialist Germany" exclusively.
During the early part of the Third Reich many Germans also referred to Hitler as being the "German Messiah", especially when he conducted the Nuremberg Rallies, which came to be held at a date somewhat before the Autumn Equinox in Nuremberg, Germany.
In a speech held on 27 November 1937, Hitler commented on his plans to have major parts of Berlin torn down and rebuilt:
After Adolf Hitler's unsuccessful attempt to implement a thousand-year-reign, the Vatican issued an official statement that millennial claims could not be safely taught and that the related scriptures in Revelation (also called the Apocalypse) should be understood spiritually. Catholic author Bernard LeFrois wrote:
The early Christian concept had ramifications far beyond strictly religious concern during the centuries to come, as it was blended and enhanced with ideas of utopia.
In the wake of early millennial thinking, the Three Ages philosophy developed. The Italian monk and theologian Joachim of Fiore (died 1202) claimed that all of human history was a succession of three ages:
It was believed that the Age of the Holy Spirit would begin at around 1260, and that from then on all believers would be living as monks, mystically transfigured and full of praise for God, for a thousand years until Judgment Day would put an end to the history of our planet.
The New Age movement was also highly influenced by Joachim of Fiore's divisions of time, and transformed the Three Ages philosophy into astrological terminology. The Age of the Father was recast as the Age of Aries, the Age of the Son became the Age of Pisces, and the Age of the Holy Spirit was called the Aquarian New Age. The current so-called "Age of Aquarius" will supposedly witness the development of a number of great changes for humankind, reflecting the typical features of millennialism.
https://en.wikipedia.org/wiki?curid=20943
Might and Magic
Might and Magic is a series of role-playing video games from New World Computing, which in 1996 became a subsidiary of The 3DO Company. The original "Might and Magic" series ended with the closure of the 3DO Company. The rights to the "Might and Magic" name were purchased for US$1.3 million by Ubisoft, who "rebooted" the franchise with a new series with no apparent connection to the previous continuity, starting with the games "Heroes of Might and Magic V" and "Dark Messiah of Might and Magic".
There have been several spin-offs from the main series, including the long-running "Heroes of Might and Magic" series, "Crusaders of Might and Magic", "Warriors of Might and Magic", "Legends of Might and Magic", "", and the fan-made "Swords of Xeen".
In August 2003, Ubisoft acquired the rights to the "Might and Magic" franchise for US$1.3 million after 3DO filed for Chapter 11 bankruptcy. Ubisoft has since released multiple new projects using the "Might and Magic" brand, including a fifth installment of the "Heroes" series developed by Nival, an action-style game "Dark Messiah of Might and Magic" developed by Arkane Studios, a puzzle RPG "" developed by Capybara Games, and the mobile strategy RPG titled "Might & Magic: Elemental Guardians".
The majority of the gameplay takes place in a medieval fantasy setting, while later sections of the games are often based on science fiction tropes, the transition often serving as a plot twist. The player controls a party of player characters, which can consist of members of various character classes. The game world is presented to the player in first person perspective. In the earlier games the interface is very similar to that of "Bard's Tale", but from "" onward, the interface features a three-dimensional environment. Combat is turn-based, though the later games allowed the player to choose to conduct combat in real time.
The game worlds in all of the Might and Magic games are quite large, and a player can expect each game to provide several dozen hours of gameplay. It is usually quite combat-intensive and often involves large groups of enemy creatures. Monsters and situations encountered throughout the series tend to be well-known fantasy staples such as giant rats, werewolf curses, dragon flights and zombie hordes, rather than original creations. "Isles of Terra" and the "Xeen" games featured a more distinct environment, blending fantasy and science fiction elements in a unique way.
The Might and Magic games have some replay value as the player can choose their party composition, develop different skills, choose sides, do quests in a different order, hunt for hidden secrets and easter eggs, and/or change difficulty level.
Although most of the gameplay reflects a distinctly fantasy genre, the overarching plot of the first nine games has something of a science fiction background. The series is set in a fictional galaxy as part of an alternative universe, where planets are overseen by a powerful race of space travelers known as Ancients. In each of the games, a party of characters fights monsters and completes quests on one of these planets, until they eventually become involved in the affairs of the Ancients. Might and Magic could as such be considered an example of science fantasy.
The producer of the series was Jon Van Caneghem. Van Caneghem has stated in interview that the "Might and Magic" setting is inspired by his love for both science fiction and fantasy. He cites "The Twilight Zone" and the "Star Trek" episode "For the World is Hollow and I Have Touched the Sky" as having inspired "Might and Magic" lore.
The first five games in the series concern the renegade guardian of the planet Terra, named Sheltem, who becomes irrevocably corrupted, developing a penchant for throwing planets into their suns. Sheltem establishes himself on a series of flat worlds known as nacelles (which are implied to be giant spaceships) and Corak, a second guardian and creation of the Ancients, with the assistance of the player characters, pursues him across the Void. Eventually both Corak and Sheltem are destroyed in a climactic battle on the nacelle of Xeen.
The sixth, seventh and eighth games take place on Enroth, a single planet partially ruled by the Ironfist dynasty, and chronicle the events and aftermath of an invasion by the Kreegan (colloquially referred to as Devils), the demonlike arch-enemies of the Ancients. It is also revealed that the destruction wrought by the Ancients' wars with the Kreegan is the reason why the worlds of Might & Magic exist as medieval fantasy settings despite once being seeded with futuristic technology – the worlds have been 'cut off' from the Ancients and descended into barbarism. The first through third games in the "Heroes of Might and Magic" series traces the fortunes of the Ironfists in more detail. None of the science fiction elements appear in the "Heroes" series besides the appearance of Kreegan characters in "Heroes of Might and Magic III" and "IV".
The Ubisoft release "" departs from this continuity and is set in the world of Ashan. Ashan is a high fantasy setting with no science fiction elements in its lore.
"Might and Magic" is considered one of the defining examples of early role-playing video games, along with "The Bard's Tale", "Ultima" and "Wizardry" series. By March 1994, combined sales of the "Might and Magic" series totaled 1 million units. The number rose to 2.5 million by November 1996, and 4 million by March 1999.
https://en.wikipedia.org/wiki?curid=20945
Guitar
The guitar is a fretted musical instrument that usually has six strings. It is typically played with both hands by strumming or plucking the strings with either a guitar pick or the fingers/fingernails of one hand, while simultaneously fretting (pressing the strings against the frets) with the fingers of the other hand. The sound of the vibrating strings is projected either acoustically, by means of the hollow chamber of the guitar (for an acoustic guitar), or through an electrical amplifier and a speaker.
The guitar is a type of chordophone, traditionally constructed from wood and strung with either gut, nylon or steel strings and distinguished from other chordophones by its construction and tuning. The modern guitar was preceded by the gittern, the vihuela, the four-course Renaissance guitar, and the five-course baroque guitar, all of which contributed to the development of the modern six-string instrument.
There are three main types of modern acoustic guitar: the classical guitar (Spanish guitar/nylon-string guitar), the steel-string acoustic guitar, and the archtop guitar, which is sometimes called a "jazz guitar". The tone of an acoustic guitar is produced by the strings' vibration, amplified by the hollow body of the guitar, which acts as a resonating chamber. The classical guitar is often played as a solo instrument using a comprehensive finger-picking technique where each string is plucked individually by the player's fingers, as opposed to being strummed. The term "finger-picking" can also refer to a specific tradition of folk, blues, bluegrass, and country guitar playing in the United States. The acoustic bass guitar is a low-pitched instrument that is one octave below a regular guitar.
Electric guitars, introduced in the 1930s, use an amplifier and a loudspeaker that both makes the sound of the instrument loud enough for the performers and audience to hear, and, given that it produces an electric signal when played, that can electronically manipulate and shape the tone using an equalizer (e.g., bass and treble tone controls) and a huge variety of electronic effects units, the most commonly used ones being distortion (or "overdrive") and reverb. Early amplified guitars employed a hollow body, but solid wood guitars began to dominate during the 1960s and 1970s, as they are less prone to unwanted acoustic feedback "howls". As with acoustic guitars, there are a number of types of electric guitars, including hollowbody guitars, archtop guitars (used in jazz guitar, blues and rockabilly) and solid-body guitars, which are widely used in rock music.
The loud, amplified sound and sonic power of the electric guitar played through a guitar amp has played a key role in the development of blues and rock music, both as an accompaniment instrument (playing riffs and chords) and performing guitar solos, and in many rock subgenres, notably heavy metal music and punk rock. The electric guitar has had a major influence on popular culture. The guitar is used in a wide variety of musical genres worldwide. It is recognized as a primary instrument in genres such as blues, bluegrass, country, flamenco, folk, jazz, jota, mariachi, metal, punk, reggae, rock, soul, and pop.
Before the development of the electric guitar and the use of synthetic materials, a guitar was defined as being an instrument having "a long, fretted neck, flat wooden soundboard, ribs, and a flat back, most often with incurved sides." The term is used to refer to a number of chordophones that were developed and used across Europe, beginning in the 12th century and, later, in the Americas. A 3,300-year-old stone carving of a Hittite bard playing a stringed instrument is the oldest iconographic representation of a chordophone and clay plaques from Babylonia show people playing an instrument that has a strong resemblance to the guitar, indicating a possible Babylonian origin for the guitar.
The modern word "guitar," and its antecedents, have been applied to a wide variety of chordophones since classical times, and as such the term causes confusion. The English word "guitar," the German "Gitarre," and the French "guitare" were all adopted from the Spanish "guitarra," which comes from the Andalusian Arabic "qīthārah" and the Latin "cithara," which in turn came from the Ancient Greek "kithara." Kithara appears in the Bible four times (1 Cor. 14:7, Rev. 5:8, 14:2 and 15:2), and is usually translated into English as "harp".
Many influences are cited as antecedents to the modern guitar. Although the development of the earliest "guitars" is lost in the history of medieval Spain, two instruments are commonly cited as their most influential predecessors, the European lute and its cousin, the four-string oud; the latter was brought to Iberia by the Moors in the 8th century.
At least two instruments called "guitars" were in use in Spain by 1200: the "guitarra latina" (Latin guitar) and the so-called "guitarra morisca" (Moorish guitar). The guitarra morisca had a rounded back, wide fingerboard, and several sound holes. The guitarra latina had a single sound hole and a narrower neck. By the 14th century the qualifiers "moresca" or "morisca" and "latina" had been dropped, and these two chordophones were simply referred to as guitars.
The Spanish vihuela, called in Italian the "viola da mano", a guitar-like instrument of the 15th and 16th centuries, is widely considered to have been the single most important influence in the development of the baroque guitar. It had six courses (usually), lute-like tuning in fourths and a guitar-like body, although early representations reveal an instrument with a sharply cut waist. It was also larger than the contemporary four-course guitars. By the 16th century, the vihuela's construction had more in common with the modern guitar, with its curved one-piece ribs, than with the viols, and more like a larger version of the contemporary four-course guitars. The vihuela enjoyed only a relatively short period of popularity in Spain and Italy during an era dominated elsewhere in Europe by the lute; the last surviving published music for the instrument appeared in 1576.
Meanwhile, the five-course baroque guitar, which was documented in Spain from the middle of the 16th century, enjoyed popularity, especially in Spain, Italy and France from the late 16th century to the mid-18th century. In Portugal, the word "viola" referred to the guitar, as "guitarra" meant the "Portuguese guitar", a variety of cittern.
Many different plucked instruments were invented and used in Europe during the Middle Ages. By the 16th century, most of these forms of guitar had fallen out of use, never to be seen again. Midway through the 16th century, however, the five-course guitar was established, though not by a straightforward process. There were two types of five-course guitar, differing in the location of the major third and in the interval pattern. The addition of a fifth course made the instrument capable of playing seventeen notes or more. The strings within each course were tuned in unison; the instrument was tuned by placing a finger on the second fret of the thinnest string and working from bottom to top, with the courses an octave apart from one another, which is the reason for the different method of tuning. Because the instrument was so different, there was major controversy as to who created the five-course guitar. A literary source, Lope de Vega's "Dorotea", gives the credit to the poet and musician Vicente Espinel. This claim was repeated by Nicolás Doizi de Velasco in 1640; however, it has been refuted by others, who state that Espinel's birth year (1550) makes it impossible for him to be responsible for the tradition. Velasco believed that this tuning was the reason the instrument became known as the Spanish guitar in Italy. Later in the same century, Gaspar Sanz wrote that other nations such as Italy and France added to the Spanish guitar, and these nations even imitated the five-course guitar by "recreating" their own.
Finally, circa 1850, the form and structure of the modern guitar were established by Spanish makers such as Manuel de Soto y Solares and, perhaps the most important of all guitar makers, Antonio Torres Jurado, who increased the size of the guitar body, altered its proportions, and invented the breakthrough fan-braced pattern. Bracing, which refers to the internal pattern of wood reinforcements used to secure the guitar's top and back and prevent the instrument from collapsing under tension, is an important factor in how the guitar sounds. Torres' design greatly improved the volume, tone, and projection of the instrument, and it has remained essentially unchanged since.
Guitars can be divided into two broad categories, acoustic and electric guitars. Within each of these categories, there are also further sub-categories. For example, an electric guitar can be purchased in a six-string model (the most common model) or in seven- or twelve-string models.
The acoustic guitar group includes several notable subcategories: classical and flamenco guitars; steel-string guitars, which include the flat-topped, or "folk", guitar; twelve-string guitars; and the arched-top guitar. It also includes unamplified guitars designed to play in different registers, such as the acoustic bass guitar, which has a similar tuning to that of the electric bass guitar.
Renaissance and Baroque guitars are the ancestors of the modern classical and flamenco guitar. They are substantially smaller, more delicate in construction, and generate less volume. The strings are paired in courses as in a modern 12-string guitar, but they have only four or five courses of strings rather than the six single strings used on modern guitars. They were more often used as rhythm instruments in ensembles than as solo instruments, and can often be seen in that role in early music performances. (Gaspar Sanz's "Instrucción de Música sobre la Guitarra Española" of 1674 contains his whole output for the solo guitar.) Renaissance and Baroque guitars are easily distinguished, because the Renaissance guitar is very plain and the Baroque guitar is very ornate, with ivory or wood inlays all over the neck and body, and a paper-cutout inverted "wedding cake" inside the hole.
Classical guitars, also known as "Spanish" guitars, are typically strung with nylon strings, plucked with the fingers, played in a seated position and are used to play a diversity of musical styles including classical music. The classical guitar's wide, flat neck allows the musician to play scales, arpeggios, and certain chord forms more easily and with less adjacent string interference than on other styles of guitar. Flamenco guitars are very similar in construction, but they are associated with a more percussive tone. In Portugal, the same instrument is often used with steel strings particularly in its role within fado music. The guitar is called viola, or violão in Brazil, where it is often used with an extra seventh string by choro musicians to provide extra bass support.
In Mexico, the popular mariachi band includes a range of guitars, from the small "requinto" to the "guitarrón," a guitar larger than a cello, which is tuned in the bass register. In Colombia, the traditional quartet includes a range of instruments too, from the small "bandola" (well suited for use when traveling or in confined rooms or spaces), to the slightly larger tiple, to the full-sized classical guitar. The requinto also appears in other Latin-American countries as a complementary member of the guitar family, with its smaller size and scale, permitting more projection for the playing of single-lined melodies. Modern dimensions of the classical instrument were established by the Spaniard Antonio de Torres Jurado (1817–1892).
Flat-top or steel-string guitars are similar to the classical guitar, but among the varied sizes of the steel-string guitar the body is usually significantly larger than that of a classical guitar, with a narrower, reinforced neck and stronger structural design. The robust X-bracing typical of the steel-string was developed in the 1840s by German-American luthiers, of whom Christian Friedrich "C. F." Martin is the best known. Originally used on gut-strung instruments, the strength of the system allowed the guitar to withstand the additional tension of steel strings when this fortunate combination arose in the early 20th century. The steel strings produce a brighter tone, and according to many players, a louder sound. The acoustic guitar is used in many kinds of music including folk, country, bluegrass, pop, jazz, and blues. Many variations are possible from the roughly classical-sized OO and Parlour to the large Dreadnought (the most commonly available type) and Jumbo. Ovation makes a modern variation, with a rounded back/side assembly molded from artificial materials.
Archtop guitars are steel-string instruments in which the top (and often the back) of the instrument are carved, from a solid billet, into a curved, rather than a flat, shape. This violin-like construction is usually credited to the American Orville Gibson. Lloyd Loar of the Gibson Mandolin-Guitar Mfg. Co introduced the violin-inspired "F"-shaped hole design now usually associated with archtop guitars, after designing a style of mandolin of the same type. The typical archtop guitar has a large, deep, hollow body whose form is much like that of a mandolin or a violin-family instrument. Nowadays, most archtops are equipped with magnetic pickups, and they are therefore both acoustic and electric. F-hole archtop guitars were immediately adopted, upon their release, by both jazz and country musicians, and have remained particularly popular in jazz music, usually with flatwound strings.
All three principal types of resonator guitars were invented by the Slovak-American John Dopyera (1893–1988) for the National and Dobro (Dopyera Brothers) companies. Similar to the flat top guitar in appearance, but with a body that may be made of brass, nickel-silver, or steel as well as wood, the sound of the resonator guitar is produced by one or more aluminum resonator cones mounted in the middle of the top. The physical principle of the guitar is therefore similar to the loudspeaker.
The original purpose of the resonator was to produce a very loud sound; this purpose has been largely superseded by electrical amplification, but the resonator guitar is still played because of its distinctive tone. Resonator guitars may have either one or three resonator cones. The method of transmitting sound resonance to the cone is either a "biscuit" bridge, made of a small piece of hardwood at the vertex of the cone (Nationals), or a "spider" bridge, made of metal and mounted around the rim of the (inverted) cone (Dobros). Three-cone resonators always use a specialized metal bridge. The type of resonator guitar with a neck with a square cross-section—called "square neck" or "Hawaiian"—is usually played face up, on the lap of the seated player, and often with a metal or glass slide. The round neck resonator guitars are normally played in the same fashion as other guitars, although slides are also often used, especially in blues.
The twelve-string guitar usually has steel strings, and it is widely used in folk music, blues, and rock and roll. Rather than having only six strings, the 12-string guitar has six courses made up of two strings each, like a mandolin or lute. The highest two courses are tuned in unison, while the others are tuned in octaves. The 12-string guitar is also made in electric forms. The chime-like sound of the 12-string electric guitar was the basis of jangle pop.
The acoustic bass guitar is a bass instrument with a hollow wooden body similar to, though usually somewhat larger than, that of a 6-string acoustic guitar. Like the traditional electric bass guitar and the double bass, the acoustic bass guitar commonly has four strings, which are normally tuned E-A-D-G, an octave below the lowest four strings of the 6-string guitar, which is the same tuning pitch as an electric bass guitar. It can, more rarely, be found with five or six strings, which provide a wider range of notes with less movement up and down the neck.
Electric guitars can have solid, semi-hollow, or hollow bodies; solid bodies produce little sound without amplification. Electromagnetic pickups, and sometimes piezoelectric pickups, convert the vibration of the steel strings into signals, which are fed to an amplifier through a patch cable or radio transmitter. The sound is frequently modified by other electronic devices (effects units) or the natural distortion of valves (vacuum tubes) or the pre-amp in the amplifier. There are two main types of magnetic pickups, single- and double-coil (or humbucker), each of which can be passive or active. The electric guitar is used extensively in jazz, blues, R & B, and rock and roll. The first successful magnetic pickup for a guitar was invented by George Beauchamp, and incorporated into the 1931 Ro-Pat-In (later Rickenbacker) "Frying Pan" lap steel; other manufacturers, notably Gibson, soon began to install pickups in archtop models. After World War II the completely solid-body electric was popularized by Gibson in collaboration with Les Paul, and independently by Leo Fender of Fender Music. The lower fretboard action (the height of the strings from the fingerboard), lighter (thinner) strings, and its electrical amplification lend the electric guitar to techniques less frequently used on acoustic guitars. These include tapping, extensive use of legato through pull-offs and hammer-ons (also known as slurs), pinch harmonics, volume swells, and use of a tremolo arm or effects pedals.
Some electric guitar models feature piezoelectric pickups, which function as transducers to provide a sound closer to that of an acoustic guitar with the flip of a switch or knob, rather than switching guitars. Those that combine piezoelectric pickups and magnetic pickups are sometimes known as hybrid guitars.
Hybrids of acoustic and electric guitars are also common. There are also more exotic varieties, such as guitars with two, three, or rarely four necks, all manner of alternate string arrangements, fretless fingerboards (used almost exclusively on bass guitars, meant to emulate the sound of a stand-up bass), 5.1 surround guitar, and such.
Solid body seven-string guitars were popularized in the 1980s and 1990s. Other artists go a step further, by using an eight-string guitar with two extra low strings. Although the most common seven-string has a low B string, Roger McGuinn (of The Byrds and Rickenbacker) uses an octave G string paired with the regular G string as on a 12-string guitar, allowing him to incorporate chiming 12-string elements in standard six-string playing. In 1982 Uli Jon Roth developed the "Sky Guitar", with a vastly extended number of frets, which was the first guitar to venture into the upper registers of the violin. Roth's seven-string and "Mighty Wing" guitar features a wider octave range.
The bass guitar (also called an "electric bass", or simply a "bass") is similar in appearance and construction to an electric guitar, but with a longer neck and scale length, and four to six strings. The four-string bass, by far the most common, is usually tuned the same as the double bass, which corresponds to pitches one octave lower than the four lowest pitched strings of a guitar (E, A, D, and G). The bass guitar is a transposing instrument, as it is notated in bass clef an octave higher than it sounds (as is the double bass) to avoid excessive ledger lines being required below the staff. Like the electric guitar, the bass guitar has pickups and it is plugged into an amplifier and speaker for live performances.
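The octave relationship between bass and guitar tunings described above can be illustrated numerically. This is an illustrative Python sketch, assuming equal temperament with A4 = 440 Hz; the names `note_freq`, `GUITAR_STRINGS`, and `BASS_STRINGS` are invented for the example:

```python
A4 = 440.0  # concert pitch reference, in Hz

def note_freq(semitones_from_a4):
    """Equal-tempered frequency a given number of semitones from A4."""
    return A4 * 2 ** (semitones_from_a4 / 12)

# Open-string offsets from A4 for a standard-tuned six-string guitar.
GUITAR_STRINGS = {"E2": -29, "A2": -24, "D3": -19, "G3": -14, "B3": -10, "E4": -5}

# A four-string bass sounds one octave (12 semitones) below the
# guitar's lowest four strings: E1, A1, D2, G2.
BASS_STRINGS = {"E1": -41, "A1": -36, "D2": -31, "G2": -26}
```

Here `note_freq(GUITAR_STRINGS["E2"])` gives about 82.4 Hz, exactly twice the bass's low E at about 41.2 Hz, reflecting the one-octave transposition.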
Modern guitars can be constructed to suit both left- and right-handed players. Normally, the dominant hand (in most people, the right hand) is used to pluck or strum the strings. This is similar to the convention of the violin family of instruments where the right hand controls the bow.
Left-handed players sometimes choose an opposite-handed (mirror) instrument, although some play in a standard-handed manner, others play a standard-handed guitar reversed, and still others (for example Jimi Hendrix) played a standard-handed guitar strung in reverse. This last configuration differs from a true opposite handed guitar in that the saddle is normally angled in such a way that the bass strings are slightly longer than the treble strings to improve intonation. Reversing the strings, therefore, reverses the relative orientation of the saddle, adversely affecting intonation, although in Hendrix's case, this is believed to have been an important element in his unique sound.
The headstock is located at the end of the guitar neck farthest from the body. It is fitted with machine heads that adjust the tension of the strings, which in turn affects the pitch. The traditional tuner layout is "3+3", in which each side of the headstock has three tuners (such as on Gibson Les Pauls). In this layout, the headstocks are commonly symmetrical. Many guitars feature other layouts, including six-in-line tuners (featured on Fender Stratocasters) or even "4+2" (e.g. Ernie Ball Music Man). Some guitars (such as Steinbergers) do not have headstocks at all, in which case the tuning machines are located elsewhere, either on the body or the bridge.
The nut is a small strip of bone, plastic, brass, corian, graphite, stainless steel, or other medium-hard material, at the joint where the headstock meets the fretboard. Its grooves guide the strings onto the fretboard, giving consistent lateral string placement. It is one of the endpoints of the strings' vibrating length. It must be accurately cut, or it can contribute to tuning problems due to string slippage or string buzz. To reduce string friction in the nut, which can adversely affect tuning stability, some guitarists fit a roller nut. Some instruments use a zero fret just in front of the nut. In this case the nut is used only for lateral alignment of the strings, the string height and length being dictated by the zero fret.
A guitar's frets, fretboard, tuners, headstock, and truss rod, all attached to a long wooden extension, collectively constitute its neck. The wood used to make the fretboard usually differs from the wood in the rest of the neck. The bending stress on the neck is considerable, particularly when heavier gauge strings are used (see Tuning), and the ability of the neck to resist bending (see Truss rod) is important to the guitar's ability to hold a constant pitch during tuning or when strings are fretted. The rigidity of the neck with respect to the body of the guitar is one determinant of a good instrument versus a poor-quality one.
The shape of the neck (from a cross-sectional perspective) can also vary, from a gentle "C" curve to a more pronounced "V" curve. There are many different types of neck profiles available, giving the guitarist many options. Some aspects to consider in a guitar neck may be the overall width of the fretboard, scale (distance between the frets), the neck wood, the type of neck construction (for example, the neck may be glued in or bolted on), and the shape (profile) of the back of the neck. Other types of material used to make guitar necks are graphite (Steinberger guitars), aluminum (Kramer Guitars, Travis Bean and Veleno guitars), or carbon fiber (Modulus Guitars and ThreeGuitars). Double neck electric guitars have two necks, allowing the musician to quickly switch between guitar sounds.
The neck joint or heel is the point at which the neck is either bolted or glued to the body of the guitar. Almost all acoustic steel-string guitars, with the primary exception of Taylors, have glued (otherwise known as set) necks, while electric guitars are constructed using both types. Most classical guitars have a neck and headblock carved from one piece of wood, known as a "Spanish heel." Commonly used set neck joints include mortise and tenon joints (such as those used by C. F. Martin & Co.), dovetail joints (also used by C. F. Martin on the D-28 and similar models) and Spanish heel neck joints, which are named after the shoe they resemble and commonly found in classical guitars. All three types offer stability.
Bolt-on necks, though they are historically associated with cheaper instruments, do offer greater flexibility in the guitar's set-up, and allow easier access for neck joint maintenance and repairs. Another type of neck, only available for solid body electric guitars, is the neck-through-body construction. These are designed so that everything from the machine heads down to the bridge is located on the same piece of wood. The sides (also known as wings) of the guitar are then glued to this central piece. Some luthiers prefer this method of construction as they claim it allows better sustain of each note. Some instruments may not have a neck joint at all, having the neck and sides built as one piece and the body built around it.
The fingerboard, also called the fretboard, is a piece of wood embedded with metal frets that comprises the top of the neck. It is flat on classical guitars and slightly curved crosswise on acoustic and electric guitars. The curvature of the fretboard is measured by the fretboard radius, which is the radius of a hypothetical circle of which the fretboard's surface constitutes a segment. The smaller the fretboard radius, the more noticeably curved the fretboard is. Most modern guitars feature a 12" neck radius, while older guitars from the 1960s and 1970s usually feature a 6-8" neck radius. Pressing a string against a fret on the fretboard effectively shortens the vibrating length of the string, producing a higher pitch.
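The effect of fretboard radius can be quantified with a little circle geometry. This is an illustrative Python sketch (the function name and the 43 mm nut width are assumptions for the example), showing that a smaller radius produces a more pronounced arch:

```python
import math

def fretboard_drop_mm(radius_in, width_mm=43.0):
    """Height drop from the center of the fretboard to its edge for a
    given radius (in inches), across an assumed 43 mm nut width.
    This is the sagitta of the circular arc: r - sqrt(r^2 - (w/2)^2)."""
    r = radius_in * 25.4  # convert radius to millimeters
    half = width_mm / 2
    return r - math.sqrt(r * r - half * half)
```

A vintage-style 7.25" radius drops roughly 1.3 mm from center to edge over that width, against roughly 0.8 mm for a modern 12" radius, which is why the smaller radius feels noticeably more curved under the hand.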
Fretboards are most commonly made of rosewood, ebony, maple, and sometimes manufactured using composite materials such as HPL or resin. See the "Neck" section for the importance of the length of the fretboard in connection to other dimensions of the guitar. The fingerboard plays an essential role in the treble tone for acoustic guitars. The quality of vibration of the fingerboard is the principal characteristic for generating the best treble tone. For that reason, ebony is regarded as the best fretboard wood, but heavy demand has made it rare and extremely expensive, and most guitar manufacturers have adopted rosewood instead.
Almost all guitars have frets, which are metal strips (usually nickel alloy or stainless steel) embedded along the fretboard and located at exact points that divide the scale length in accordance with a specific mathematical formula. The exceptions include fretless bass guitars and very rare fretless guitars. Pressing a string against a fret determines the strings' vibrating length and therefore its resultant pitch. The pitch of each consecutive fret is defined at a half-step interval on the chromatic scale. Standard classical guitars have 19 frets and electric guitars between 21 and 24 frets, although guitars have been made with as many as 27 frets. Frets are laid out to accomplish an equal tempered division of the octave. Each set of twelve frets represents an octave. The twelfth fret divides the scale length exactly into two halves, and the 24th fret position divides one of those halves in half again.
The ratio of the spacing of two consecutive frets is the twelfth root of two, 2^(1/12) ≈ 1.0595. In practice, luthiers determine fret positions using the constant 17.817, an approximation to 1/(1 - 2^(-1/12)). If the nth fret is a distance x from the bridge, then the distance from the (n+1)th fret to the bridge is x - (x/17.817). Frets are available in several different gauges and can be fitted according to player preference. Among these are "jumbo" frets, which have much thicker gauge, allowing for use of a slight vibrato technique from pushing the string down harder and softer. "Scalloped" fretboards, where the wood of the fretboard itself is "scooped out" between the frets, allow a dramatic vibrato effect. Fine frets, much flatter, allow a very low string-action, but require that other conditions, such as curvature of the neck, be well-maintained to prevent buzz.
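The two ways of computing fret positions, the exact twelfth-root-of-two formula and the luthier's 17.817 rule, can be sketched as follows. This is an illustrative Python sketch; the 648 mm (25.5") scale length and the function names are assumptions for the example:

```python
SCALE_MM = 648.0  # assumed 25.5" scale length
DIVISOR = 1 / (1 - 2 ** (-1 / 12))  # ~17.817, the luthier's constant

def fret_distance_from_nut(n, scale=SCALE_MM):
    """Exact equal-temperament position: fret n sits where the
    remaining string length to the bridge is scale / 2**(n/12)."""
    return scale - scale / 2 ** (n / 12)

def fret_positions_rule(count, scale=SCALE_MM):
    """Iterative luthier's rule: each fret is placed 1/17.817 of the
    remaining bridge distance beyond the previous fret."""
    positions, to_bridge = [], scale
    for _ in range(count):
        step = to_bridge / DIVISOR
        positions.append((positions[-1] if positions else 0.0) + step)
        to_bridge -= step
    return positions
```

Both methods place the 12th fret at exactly half the scale length (324 mm here), matching the octave division described above, and the 24th fret at three quarters of it.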
The truss rod is a thin, strong metal rod that runs along the inside of the neck. It is used to correct changes to the neck's curvature caused by aging of the neck timbers, changes in humidity, or to compensate for changes in the tension of strings. The tension of the rod and neck assembly is adjusted by a hex nut or an allen-key bolt on the rod, usually located either at the headstock, sometimes under a cover, or just inside the body of the guitar underneath the fretboard and accessible through the sound hole. Some truss rods can only be accessed by removing the neck. The truss rod counteracts the immense amount of tension the strings place on the neck, bringing the neck back to a straighter position. Turning the truss rod clockwise tightens it, counteracting the tension of the strings and straightening the neck or creating a backward bow. Turning the truss rod counter-clockwise loosens it, allowing string tension to act on the neck and creating a forward bow.
Adjusting the truss rod affects the intonation of a guitar as well as the height of the strings from the fingerboard, called the action. Some truss rod systems, called "double action" truss systems, tighten both ways, pushing the neck both forward and backward (standard truss rods can only release to a point beyond which the neck is no longer compressed and pulled backward). The artist and luthier Irving Sloane pointed out, in his book "Steel-String Guitar Construction," that truss rods are intended primarily to remedy concave bowing of the neck, but cannot correct a neck with "back bow" or one that has become twisted. Classical guitars do not require truss rods, as their nylon strings exert a lower tensile force with lesser potential to cause structural problems. However, their necks are often reinforced with a strip of harder wood, such as an ebony strip that runs down the back of a cedar neck. There is no tension adjustment on this form of reinforcement.
Inlays are visual elements set into the exterior surface of a guitar, both for decoration and artistic purposes and, in the case of the markings on the 3rd, 5th, 7th and 12th fret (and in higher octaves), to provide guidance to the performer about the location of frets on the instrument. The typical locations for inlay are on the fretboard, headstock, and on acoustic guitars around the soundhole, known as the rosette. Inlays range from simple plastic dots on the fretboard to intricate works of art covering the entire exterior surface of a guitar (front and back). Some guitar players have used LEDs in the fretboard to produce unique lighting effects onstage. Fretboard inlays are most commonly shaped like dots, diamond shapes, parallelograms, or large blocks in between the frets.
Dots are usually inlaid into the upper edge of the fretboard in the same positions, small enough to be visible only to the player. These usually appear on the odd numbered frets, but also on the 12th fret (the one octave mark) instead of the 11th and 13th frets. Some older or high-end instruments have inlays made of mother of pearl, abalone, ivory, colored wood or other exotic materials and designs. Simpler inlays are often made of plastic or painted. High-end classical guitars seldom have fretboard inlays as a well-trained player is expected to know his or her way around the instrument. In addition to fretboard inlay, the headstock and soundhole surround are also frequently inlaid. The manufacturer's logo or a small design is often inlaid into the headstock. Rosette designs vary from simple concentric circles to delicate fretwork mimicking the historic rosette of lutes. Bindings that edge the finger and sound boards are sometimes inlaid. Some instruments have a filler strip running down the length and behind the neck, used for strength or to fill the cavity through which the truss rod was installed in the neck.
In acoustic guitars, string vibration is transmitted through the bridge and saddle to the body via sound board. The sound board is typically made of tone woods such as spruce or cedar. Timbers for tone woods are chosen for both strength and ability to transfer mechanical energy from the strings to the air within the guitar body. Sound is further shaped by the characteristics of the guitar body's resonant cavity. In expensive instruments, the entire body is made of wood. In inexpensive instruments, the back may be made of plastic.
In an acoustic instrument, the body of the guitar is a major determinant of the overall sound quality. The guitar top, or soundboard, is a finely crafted and engineered element made of tonewoods such as spruce and red cedar. This thin piece of wood, often only 2 or 3 mm thick, is strengthened by differing types of internal bracing. Many luthiers consider the top the dominant factor in determining the sound quality. The majority of the instrument's sound is heard through the vibration of the guitar top as the energy of the vibrating strings is transferred to it. The body of an acoustic guitar has a sound hole through which sound projects. The sound hole is usually a round hole in the top of the guitar under the strings. Air inside the body vibrates as the guitar top and body is vibrated by the strings, and the response of the air cavity at different frequencies is characterized, like the rest of the guitar body, by a number of resonance modes at which it responds more strongly.
The top, back and ribs of an acoustic guitar body are very thin (1–2 mm), so a flexible piece of wood called lining is glued into the corners where the rib meets the top and back. This interior reinforcement provides 5 to 20 mm of solid gluing area for these corner joints. Solid linings are often used in classical guitars, while kerfed lining is most often found in steel string acoustics. Kerfed lining is also called kerfing because it is scored, or "kerfed" (incompletely sawn through), to allow it to bend with the shape of the rib. During final construction, a small section of the outside corners is carved or routed out and filled with binding material on the outside corners and decorative strips of material next to the binding, which are called purfling. This binding serves to seal off the end grain of the top and back. Purfling can also appear on the back of an acoustic guitar, marking the edge joints of the two or three sections of the back. Binding and purfling materials are generally made of either wood or plastic.
Body size, shape and style has changed over time. 19th century guitars, now known as salon guitars, were smaller than modern instruments. Differing patterns of internal bracing have been used over time by luthiers. Torres, Hauser, Ramirez, Fleta, and C. F. Martin were among the most influential designers of their time. Bracing not only strengthens the top against potential collapse due to the stress exerted by the tensioned strings, but also affects the resonance characteristics of the top. The back and sides are made out of a variety of timbers such as mahogany, Indian rosewood and highly regarded Brazilian rosewood ("Dalbergia nigra"). Each one is primarily chosen for their aesthetic effect and can be decorated with inlays and purfling.
Instruments with larger areas for the guitar top were introduced by Martin in an attempt to create greater volume levels. The popularity of the larger "dreadnought" body size amongst acoustic performers is related to the greater sound volume produced.
Most electric guitar bodies are made of wood and include a plastic pick guard. Boards wide enough to use as a solid body are very expensive due to the worldwide depletion of hardwood stock since the 1970s, so the wood is rarely one solid piece. Most bodies are made from two pieces of wood with some of them including a seam running down the center line of the body. The most common woods used for electric guitar body construction include maple, basswood, ash, poplar, alder, and mahogany. Many bodies consist of good-sounding, but inexpensive woods, like ash, with a "top", or thin layer of another, more attractive wood (such as maple with a natural "flame" pattern) glued to the top of the basic wood. Guitars constructed like this are often called "flame tops". The body is usually carved or routed to accept the other elements, such as the bridge, pickup, neck, and other electronic components. Most electrics have a polyurethane or nitrocellulose lacquer finish. Other alternative materials to wood are used in guitar body construction. Some of these include carbon composites, plastic material, such as polycarbonate, and aluminum alloys.
The main purpose of the bridge on an acoustic guitar is to transfer the vibration from the strings to the soundboard, which vibrates the air inside of the guitar, thereby amplifying the sound produced by the strings. On electric and acoustic guitars alike, the bridge holds the strings in place on the body. There are many varied bridge designs. There may be some mechanism for raising or lowering the bridge saddles to adjust the distance between the strings and the fretboard (action), or fine-tuning the intonation of the instrument. Some are spring-loaded and feature a "whammy bar", a removable arm that lets the player modulate the pitch by changing the tension on the strings. The whammy bar is sometimes also called a "tremolo bar". (The effect of rapidly changing pitch is properly called "vibrato". See Tremolo for further discussion of this term.) Some bridges also allow for alternate tunings at the touch of a button.
On almost all modern electric guitars, the bridge has saddles that are adjustable for each string so that intonation stays correct up and down the neck. If the open string is in tune, but sharp or flat when frets are pressed, the bridge saddle position can be adjusted with a screwdriver or hex key to remedy the problem. In general, flat notes are corrected by moving the saddle forward and sharp notes by moving it backwards. On an instrument correctly adjusted for intonation, the actual length of each string from the nut to the bridge saddle is slightly, but measurably longer than the scale length of the instrument. This additional length is called compensation, which flattens all notes a bit to compensate for the sharping of all fretted notes caused by stretching the string during fretting.
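The compensation idea can be sketched with the ideal-string relation that frequency is inversely proportional to vibrating length. This illustrative Python sketch (the function name and example values are assumptions) ignores string stiffness and gauge, which real per-string setups also account for:

```python
def saddle_setback_mm(scale_mm, cents_flat):
    """Extra length to add at the saddle to flatten pitch by the given
    number of cents, for an ideal string (frequency ~ 1/length).
    Lengthening by a factor of 2**(cents/1200) lowers pitch by that
    many cents."""
    return scale_mm * (2 ** (cents_flat / 1200) - 1)
```

For a 648 mm scale, flattening all notes by 12 cents calls for about 4.5 mm of setback; in practice each string needs a slightly different amount, which is why electric bridges provide individually adjustable saddles.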
The saddle of a guitar refers to the part of the bridge that physically supports the strings. It may be one piece (typically on acoustic guitars) or separate pieces, one for each string (electric guitars and basses). The saddle's basic purpose is to provide the end point for the string's vibration at the correct location for proper intonation, and on acoustic guitars to transfer the vibrations through the bridge into the top wood of the guitar. Saddles are typically made of plastic or bone for acoustic guitars, though synthetics and some exotic animal tooth variations (e.g. fossilized tooth, ivory, etc.) have become popular with some players. Electric guitar saddles are typically metal, though some synthetic saddles are available.
The pickguard, also known as the scratchplate, is usually a piece of laminated plastic or other material that protects the finish of the top of the guitar from damage due to the use of a plectrum ("pick") or fingernails. Electric guitars sometimes mount pickups and electronics on the pickguard. It is a common feature on steel-string acoustic guitars. Some performance styles that use the guitar as a percussion instrument (tapping the top or sides between notes, etc.), such as flamenco, require that a scratchplate or pickguard be fitted to nylon-string instruments.
The standard guitar has six strings, but four-, seven-, eight-, nine-, ten-, eleven-, twelve-, thirteen- and eighteen-string guitars are also available. Classical and flamenco guitars historically used gut strings, but these have been superseded by polymer materials, such as nylon and fluorocarbon. Modern guitar strings are constructed from metal, polymers, or animal or plant product materials. Instruments utilizing "steel" strings may have strings made from alloys incorporating steel, nickel or phosphor bronze. Bass strings for both instruments are wound rather than monofilament.
Pickups are transducers attached to a guitar that detect (or "pick up") string vibrations and convert the mechanical energy of the string into electrical energy. The resultant electrical signal can then be electronically amplified. The most common type of pickup is electromagnetic in design. These contain magnets that are within a coil, or coils, of copper wire. Such pickups are usually placed directly underneath the guitar strings. Electromagnetic pickups work on the same principles and in a similar manner to an electric generator. The vibration of the strings creates a small electric current in the coils surrounding the magnets. This signal current is carried to a guitar amplifier that drives a loudspeaker.
Traditional electromagnetic pickups are either single-coil or double-coil. Single-coil pickups are susceptible to noise induced by stray electromagnetic fields, usually mains-frequency (60 or 50 hertz) hum. The introduction of the double-coil humbucker in the mid-1950s solved this problem through the use of two coils, one of which is wired in opposite polarity to cancel or "buck" stray fields.
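The "bucking" can be illustrated with a toy model (a sketch of the polarity bookkeeping only, not real pickup physics): the string signal couples through both the magnet polarity and the winding direction, while stray hum couples through the winding alone, so reversing both in the second coil doubles the string signal and cancels the hum.

```python
def coil_voltage(string_sig, hum, magnet_polarity, winding_polarity):
    # The string's motion is sensed via the magnet, so its sign depends on
    # both the magnet and the winding polarity; stray hum is induced directly
    # in the winding, so its sign depends on the winding polarity only.
    return magnet_polarity * winding_polarity * string_sig + winding_polarity * hum

def single_coil(string_sig, hum):
    return coil_voltage(string_sig, hum, +1, +1)

def humbucker(string_sig, hum):
    # Second coil has both magnet and winding reversed:
    # the string signal adds, the hum "bucks" (cancels).
    return (coil_voltage(string_sig, hum, +1, +1)
            + coil_voltage(string_sig, hum, -1, -1))

print(single_coil(0.5, 0.1))  # signal plus hum (about 0.6)
print(humbucker(0.5, 0.1))    # doubled signal, hum cancelled (about 1.0)
```

The same bookkeeping explains why a humbucker is also somewhat louder than a single coil: the two in-phase string contributions sum.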
The types and models of pickups used can greatly affect the tone of the guitar. Humbuckers, which are two magnet-coil assemblies attached to each other, are traditionally associated with a heavier sound. Single-coil pickups, one magnet wrapped in copper wire, are used by guitarists seeking a brighter, twangier sound with greater dynamic range.
Modern pickups are tailored to the sound desired. A commonly applied approximation used in selection of a pickup is that less wire (lower electrical impedance) gives brighter sound, more wire gives a "fat" tone. Other options include specialized switching that produces coil-splitting, in/out of phase and other effects. Guitar circuits are either active, needing a battery to power their circuit, or, as in most cases, equipped with a passive circuit.
Fender Stratocaster-type guitars generally utilize three single-coil pickups, while most Gibson Les Paul types use humbucker pickups.
Piezoelectric, or piezo, pickups represent another class of pickup. These employ piezoelectricity to generate the musical signal and are popular in hybrid electro-acoustic guitars. A crystal is located under each string, usually in the saddle. When the string vibrates, the shape of the crystal is distorted, and the stresses associated with this change produce tiny voltages across the crystal that can be amplified and manipulated. Piezo pickups usually require a powered pre-amplifier to lift their output to match that of electromagnetic pickups. Power is typically delivered by an on-board battery.
Most pickup-equipped guitars feature onboard controls, such as volume or tone, or pickup selection. At their simplest, these consist of passive components, such as potentiometers and capacitors, but may also include specialized integrated circuits or other active components requiring batteries for power, for preamplification and signal processing, or even for electronic tuning. In many cases, the electronics have some sort of shielding to prevent pickup of external interference and noise.
Guitars may be shipped or retrofitted with a hexaphonic pickup, which produces a separate output for each string, usually from a discrete piezoelectric or magnetic pickup. This arrangement lets on-board or external electronics process the strings individually for modeling or Musical Instrument Digital Interface (MIDI) conversion. Roland makes "GK" hexaphonic pickups for guitar and bass, and a line of guitar modeling and synthesis products. Line 6's hexaphonic-equipped Variax guitars use on-board electronics to model the sound after various vintage instruments, and vary pitch on individual strings.
MIDI converters use a hexaphonic guitar signal to determine pitch, duration, attack, and decay characteristics. The MIDI converter sends the note information to an internal or external sound bank device. The resulting sound closely mimics numerous instruments. The MIDI setup can also let the guitar be used as a game controller (i.e., Rock Band Squier) or as an instructional tool, as with the Fretlight Guitar.
Notationally, the guitar is considered a transposing instrument. Its pitch sounds one octave lower than it is notated on a score.
A variety of tunings may be used. The most common tuning, known as "Standard Tuning", has the strings tuned from a low E to a high E, traversing a two-octave range (EADGBE). When all strings are played open, the resulting chord is an Em7/add11.
The pitches are as follows, from the lowest string to the highest: E2, A2, D3, G3, B3, E4.
The table below shows the names of the pitches found on the six strings of a guitar in standard tuning, from the nut (fret zero) to the twelfth fret.
For four of the five string pairs, the note at the 5th fret of the lower string is the same as the next open string; for example, the 5th-fret note on the sixth string is the same note as the open fifth string. Between the second and third strings, however, an irregularity occurs: the note at the 4th fret of the third string is equivalent to the open second string.
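These relations can be checked with a short script (note names and MIDI octave numbering are standard; strings are numbered 1 for the high E through 6 for the low E):

```python
NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']
OPEN_MIDI = {6: 40, 5: 45, 4: 50, 3: 55, 2: 59, 1: 64}  # E2 A2 D3 G3 B3 E4

def name_at(string, fret):
    """Pitch name (with octave) at a given string and fret in standard tuning."""
    m = OPEN_MIDI[string] + fret          # each fret raises pitch one semitone
    return NAMES[m % 12] + str(m // 12 - 1)

print(name_at(6, 5), name_at(5, 0))  # both A2: 5th fret matches next open string
print(name_at(3, 4), name_at(2, 0))  # both B3: the G-to-B irregularity
```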
Standard tuning has evolved to provide a good compromise between simple fingering for many chords and the ability to play common scales with reasonable left-hand movement. There are also a variety of commonly used alternative tunings, for example, the classes of "open", "regular", and "dropped" tunings.
"Open tuning" refers to a guitar tuned so that strumming the open strings produces a chord, typically a major chord. The base chord consists of at least 3 notes and may include all the strings or a subset. The tuning is named for the open chord, Open D, open G, and open A are popular tunings. All similar chords in the chromatic scale can then be played by barring a single fret. Open tunings are common in blues music and folk music, and they are used in the playing of slide and bottleneck guitars. Many musicians use open tunings when playing slide guitar.
For the standard tuning, there is exactly one interval of a major third between the second and third strings, and all the other intervals are fourths. The irregularity has a price – chords cannot be shifted around the fretboard in the standard tuning E-A-D-G-B-E, which requires four chord-shapes for the major chords. There are separate chord-forms for chords having their root note on the third, fourth, fifth, and sixth strings.
In contrast, "regular" tunings have equal intervals between the strings, and so they have symmetrical scales all along the fretboard. This makes it simpler to translate chords. For the regular tunings, chords may be moved diagonally around the fretboard. The diagonal movement of chords is especially simple for the regular tunings that are repetitive, in which case chords can be moved vertically: Chords can be moved three strings up (or down) in major-thirds tuning and chords can be moved two strings up (or down) in augmented-fourths tuning. Regular tunings thus appeal to new guitarists and also to jazz-guitarists, whose improvisation is simplified by regular intervals.
On the other hand, some chords are more difficult to play in a regular tuning than in standard tuning. It can be difficult to play conventional chords especially in augmented-fourths tuning and all-fifths tuning, in which the large spacings require hand stretching. Some chords, which are conventional in folk music, are difficult to play even in all-fourths and major-thirds tunings, which do not require more hand-stretching than standard tuning.
Another class of alternative tunings are called drop tunings, because the tuning "drops down" the lowest string. Dropping the lowest string a whole tone results in the "drop-D" (or "dropped D") tuning. Its open-string notes DADGBE (from low to high) allow for a deep bass D note, which can be used in keys such as D major, D minor, and G major. It simplifies the playing of simple fifths (power chords). Many contemporary rock bands re-tune all strings down as well, producing, for example, Drop-C or Drop-B tunings.
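Assuming the usual convention that a "drop" tuning lowers every string by some amount and then the lowest string by a whole tone more, the common variants can be derived mechanically (a sketch working with note names only, ignoring octaves):

```python
NOTE = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']
STANDARD = ['E', 'A', 'D', 'G', 'B', 'E']  # low string to high string

def shift(note, semitones):
    return NOTE[(NOTE.index(note) + semitones) % 12]

def drop_tuning(down_all=0):
    # Lower every string by `down_all` semitones, then the lowest string
    # by a whole tone (two semitones) more.
    strings = [shift(n, -down_all) for n in STANDARD]
    strings[0] = shift(strings[0], -2)
    return strings

print(drop_tuning(0))  # Drop D: ['D', 'A', 'D', 'G', 'B', 'E']
print(drop_tuning(2))  # Drop C: ['C', 'G', 'C', 'F', 'A', 'D']
```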
Many scordatura (alternate tunings) modify the standard tuning of the lute, especially when playing Renaissance music repertoire originally written for that instrument. Some scordatura drop the pitch of one or more strings, giving access to new lower notes. Some scordatura make it easier to play in unusual keys.
Though a guitar may be played on its own, there are a variety of common accessories used for holding and playing the guitar.
A capo (short for "capotasto") is used to change the pitch of open strings. Capos are clipped onto the fretboard with the aid of spring tension or, in some models, elastic tension. To raise the guitar's pitch by one semitone, the player would clip the capo onto the fretboard just below the first fret. Its use allows players to play in different keys without having to change the chord formations they use. For example, if a folk guitar player wanted to play a song in the key of B Major, they could put a capo on the second fret of the instrument, and then play the song as if it were in the key of A Major, but with the capo the instrument would make the sounds of B Major. This is because with the capo barring the entire second fret, open chords would all sound two semitones (in other words, one tone) higher in pitch. For example, if a guitarist played an open A Major chord (a very common open chord), it would sound like a B Major chord. All of the other open chords would be similarly modified in pitch. Because of the ease with which they allow guitar players to change keys, they are sometimes referred to with pejorative names, such as "cheaters" or the "hillbilly crutch". Despite this negative viewpoint, another benefit of the capo is that it enables guitarists to obtain the ringing, resonant sound of the common keys (C, G, A, etc.) in "harder" and less-commonly used keys. Classical performers are known to use them to enable modern instruments to match the pitch of historical instruments such as the Renaissance music lute.
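The capo arithmetic described above is plain modular transposition: the chord quality carries over unchanged, and only the root shifts by the capo fret. A minimal sketch:

```python
PITCHES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']

def sounding_root(shape_root, capo_fret):
    """Root that actually sounds when a chord shape is played behind a capo.

    Each capo fret raises the sounding pitch by one semitone.
    """
    i = PITCHES.index(shape_root)
    return PITCHES[(i + capo_fret) % 12]

# An open A-major shape played with a capo at the 2nd fret sounds as B major:
print(sounding_root('A', 2))  # B
```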
A slide (the neck of a bottle, a knife blade, or a round metal or glass bar or cylinder) is used in blues and rock to create a glissando or "Hawaiian" effect. The slide is used to fret notes on the neck, instead of using the fretting hand's fingers. The characteristic use of the slide is to move up to the intended pitch by, as the name implies, sliding up the neck to the desired note. The necks of bottles were often used in blues and country music as improvised slides. Modern slides are constructed of glass, plastic, ceramic, chrome, brass or steel bars or cylinders, depending on the weight and tone desired (and the amount of money a guitarist can spend). An instrument that is played exclusively in this manner (using a metal bar) is called a steel guitar or pedal steel. Slide playing to this day is very popular in blues music and country music. Some slide players use a so-called Dobro guitar. Some performers who have become famous for playing slide are Robert Johnson, Elmore James, Ry Cooder, George Harrison, Bonnie Raitt, Derek Trucks, Warren Haynes, Duane Allman, Muddy Waters, Rory Gallagher, and George Thorogood.
A "guitar pick" or "plectrum" is a small piece of hard material generally held between the thumb and first finger of the picking hand and is used to "pick" the strings. Though most classical players pick with a combination of fingernails and fleshy fingertips, the pick is most often used for electric and steel-string acoustic guitars. Though today they are mainly plastic, variations do exist, such as bone, wood, steel or tortoise shell. Tortoise shell was the most commonly used material in the early days of pick-making, but as tortoises and turtles became endangered, the practice of using their shells for picks or anything else was banned. Tortoise-shell picks made before the ban are often coveted for a supposedly superior tone and ease of use, and their scarcity has made them valuable.
Picks come in many shapes and sizes. Picks vary from the small jazz pick to the large bass pick. The thickness of the pick often determines its use. A thinner pick (between 0.2 and 0.5 mm) is usually used for strumming or rhythm playing, whereas thicker picks (between 0.7 and 1.5+ mm) are usually used for single-note lines or lead playing. The distinctive guitar sound of Billy Gibbons is attributed to using a quarter or peso as a pick. Similarly, Brian May is known to use a sixpence coin as a pick, while noted 1970s and early 1980s session musician David Persons is known for using old credit cards, cut to the correct size, as plectrums.
Thumb picks and finger picks that attach to the finger tips are sometimes employed in finger-picking styles on steel strings. These allow the fingers and thumb to operate independently, whereas a flat pick requires the thumb and one or two fingers to manipulate.
A guitar strap is a strip of material with an attachment mechanism on each end, made to hold a guitar via the shoulders at an adjustable length. Guitars have varying accommodations for attaching a strap. The most common are strap buttons, also called strap pins, which are flanged steel posts anchored to the guitar with screws. Two strap buttons come pre-attached to virtually all electric guitars, and many steel-string acoustic guitars. Strap buttons are sometimes replaced with "strap locks", which connect the guitar to the strap more securely.
The lower strap button is usually located at the bottom (bridge end) of the body. The upper strap button is usually located near or at the top (neck end) of the body: on the upper body curve, at the tip of the upper "horn" (on a double cutaway), or at the neck joint (heel). Some electrics, especially those with odd-shaped bodies, have one or both strap buttons on the back of the body. Some Steinberger electric guitars, owing to their minimalist and lightweight design, have both strap buttons at the bottom of the body. Rarely, on some acoustics, the upper strap button is located on the headstock. Some acoustic and classical guitars only have a single strap button at the bottom of the body—the other end must be tied onto the headstock, above the nut and below the machine heads.
Electric guitars and bass guitars have to be used with a guitar amplifier and loudspeaker or a bass amplifier and speaker, respectively, in order to make enough sound to be heard by the performer and audience. Electric guitars and bass guitars almost always use magnetic pickups, which generate an electric signal when the musician plucks, strums or otherwise plays the instrument. The amplifier and speaker strengthen this signal using a power amplifier and a loudspeaker. Acoustic guitars that are equipped with a piezoelectric pickup or microphone can also be plugged into an instrument amplifier, acoustic guitar amp or PA system to make them louder. With electric guitar and bass, the amplifier and speaker are not just used to make the instrument louder; by adjusting the equalizer controls, the preamplifier, and any onboard effects units (reverb, distortion/overdrive, etc.) the player can also modify the tone (also called the timbre or "colour") and sound of the instrument. Acoustic guitar players can also use the amp to change the sound of their instrument, but in general, acoustic guitar amps are used to make the natural acoustic sound of the instrument louder without significantly changing its sound.
https://en.wikipedia.org/wiki?curid=11846
Gnutella
Gnutella is a large peer-to-peer network. It was the first decentralized peer-to-peer network of its kind, leading to other, later networks adopting the model. It celebrated two decades of existence on March 14, 2020, and has a user base in the millions for peer-to-peer file sharing.
In June 2005, Gnutella's population was 1.81 million computers, increasing to over three million nodes by January 2006. In late 2007, it was the most popular file-sharing network on the Internet, with an estimated market share of more than 40%.
The first client (also called Gnutella) from which the network got its name was developed by Justin Frankel and Tom Pepper of Nullsoft in early 2000, soon after the company's acquisition by AOL. On March 14, the program was made available for download on Nullsoft's servers. The event was prematurely announced on Slashdot, and thousands downloaded the program that day. The source code was to be released later, under the GNU General Public License (GPL); however, the original developers never got the chance to accomplish this purpose.
The next day, AOL stopped the availability of the program over legal concerns and restrained Nullsoft from doing any further work on the project. This did not stop Gnutella; after a few days, the protocol had been reverse engineered, and compatible free and open source clones began to appear. This parallel development of different clients by different groups remains the "modus operandi" of Gnutella development today.
Among the first independent Gnutella pioneers were Gene Kan and Spencer Kimball, who launched the first portal aimed at assembling the open-source community to work on Gnutella, and who also developed "GNUbile", one of the first open-source (GNU-GPL) programs to implement the Gnutella protocol.
The Gnutella network is a fully distributed alternative to such semi-centralized systems as FastTrack (KaZaA) and the original Napster. The initial popularity of the network was spurred on by Napster's threatened legal demise in early 2001. This growing surge in popularity revealed the limits of the initial protocol's scalability. In early 2001, variations on the protocol (first implemented in proprietary and closed source clients) allowed an improvement in scalability. Instead of treating every user as client and server, some users were now treated as "ultrapeers", routing search requests and responses for users connected to them.
This allowed the network to grow in popularity. In late 2001, the Gnutella client LimeWire Basic became free and open source. In February 2002, Morpheus, a commercial file sharing group, abandoned its FastTrack-based peer-to-peer software and released a new client based on the free and open source Gnutella client Gnucleus.
The word "Gnutella" today refers not to any one project or piece of software, but to the open protocol used by the various clients.
The name is a portmanteau of "GNU" and "Nutella", the brand name of an Italian hazelnut flavored spread: supposedly, Frankel and Pepper ate a lot of Nutella working on the original project, and intended to license their finished program under the GNU General Public License. Gnutella is not associated with the GNU project or GNU's own peer-to-peer network, GNUnet.
On October 26, 2010, the popular Gnutella client LimeWire was ordered shut down by Judge Kimba Wood of the United States District Court for the Southern District of New York when she signed a Consent Decree to which recording industry plaintiffs and LimeWire had agreed. This event was the likely cause of a notable drop in the size of the network, because, while negotiating the injunction, LimeWire staff had inserted remote-disabling code into the software. As the injunction came into force, users who had installed affected versions (newer than 5.5.10) were cut off from the P2P network. Since LimeWire was free software, nothing had prevented the creation of forks that omitted the disabling code, as long as LimeWire trademarks were not used. The shutdown did not affect, for example, FrostWire, a fork of LimeWire created in 2004 that carries neither the remote-disabling code nor adware.
On November 9, 2010, LimeWire was resurrected by a secret team of developers and named LimeWire Pirate Edition. It was based on LimeWire 5.6 BETA. This version had its server dependencies removed and all the PRO features enabled for free.
To envision how Gnutella originally worked, imagine a large circle of users (called nodes), each of whom has Gnutella client software. On initial startup, the client software must bootstrap and find at least one other node. Various methods have been used for this, including a pre-existing address list of possibly working nodes shipped with the software, using updated web caches of known nodes (called "Gnutella Web Caches"), UDP host caches and, rarely, even IRC. Once connected, the client requests a list of working addresses. The client tries to connect to the nodes it was shipped with, as well as nodes it receives from other clients, until it reaches a certain quota. It connects to only that many nodes, locally caching the addresses it has not yet tried, and discards the addresses it tried that were invalid.
When the user wants to do a search, the client sends the request to each actively connected node. In version 0.4 of the protocol, the number of actively connected nodes for a client was quite small (around 5), so each node then forwarded the request to all its actively connected nodes, and they, in turn, forwarded the request, and so on, until the packet reached a predetermined number of "hops" from the sender (maximum 7).
Since version 0.6 (2002), Gnutella is a composite network made of leaf nodes and ultra nodes (also called ultrapeers). The leaf nodes are connected to a small number of ultrapeers (typically 3) while each ultrapeer is connected to more than 32 other ultrapeers. With this higher outdegree, the maximum number of "hops" a query can travel was lowered to 4.
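The effect of the higher out-degree and lower hop limit can be seen from a simple upper bound on flooding reach (a sketch that ignores overlap between neighbor sets, which inflates both numbers, but the comparison still holds):

```python
def max_reach(degree, ttl):
    """Upper bound on nodes reached by flooding a query.

    The query goes to `degree` neighbors; each of them forwards it to its
    other (degree - 1) neighbors, and so on, for `ttl` hops in total.
    """
    total, frontier = 0, degree
    for _ in range(ttl):
        total += frontier
        frontier *= degree - 1
    return total

# Version 0.4: ~5 connections, 7 hops vs. 0.6 ultrapeers: 32 connections, 4 hops
print(max_reach(5, 7))   # the flat 0.4 topology
print(max_reach(32, 4))  # the 0.6 ultrapeer topology reaches far more nodes
```

Even with fewer hops, the ultrapeer topology covers vastly more of the network per query, while leaf nodes are spared the forwarding traffic entirely.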
Leaves and ultrapeers use the Query Routing Protocol to exchange a Query Routing Table (QRT), a table of 64 Ki-slots and up to 2 Mi-slots consisting of hashed keywords. A leaf node sends its QRT to each of the ultrapeers it is connected to, and ultrapeers merge the QRT of all their leaves (downsized to 128 Ki-slots) plus their own QRT (if they share files) and exchange that with their own neighbors. Query routing is then done by hashing the words of the query and seeing whether all of them match in the QRT. Ultrapeers do that check before forwarding a query to a leaf node, and also before forwarding the query to a peer ultra node provided this is the last hop the query can travel.
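A simplified sketch of query routing with a QRT follows (real Gnutella uses a specific hash function and bit-packed, compressed tables; Python's built-in hash stands in here):

```python
QRT_SLOTS = 65536  # 64 Ki-slots

def slot(word):
    # Stand-in for Gnutella's real keyword hash.
    return hash(word.lower()) % QRT_SLOTS

def build_qrt(filenames):
    """Build a Query Routing Table: the set of slots hit by shared keywords."""
    table = set()
    for name in filenames:
        for word in name.lower().split():
            table.add(slot(word))
    return table

def might_match(qrt, query):
    # Forward the query only if every keyword hashes to an occupied slot;
    # a miss on any keyword proves the node cannot have a full match.
    return all(slot(w) in qrt for w in query.lower().split())

qrt = build_qrt(["free jazz album", "live blues set"])
print(might_match(qrt, "jazz album"))  # True: worth forwarding
```

Like a Bloom filter, the table can produce false positives (hash collisions) but never false negatives, so it safely prunes traffic without hiding results.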
If a search request turns up a result, the node that has the result contacts the searcher. In the classic Gnutella protocol, response messages were sent back along the route the query came through, as the query itself did not contain identifying information of the node. This scheme was later revised, so that search results now are delivered over User Datagram Protocol (UDP) directly to the node that initiated the search, usually an ultrapeer of the node. Thus, in the current protocol, the queries carry the IP address and port number of either node. This lowers the amount of traffic routed through the Gnutella network, making it significantly more scalable.
If the user decides to download the file, they negotiate the file transfer. If the node which has the requested file is not firewalled, the querying node can connect to it directly. However, if the node is firewalled, preventing it from receiving incoming connections, the client wanting to download the file sends it a so-called "push request", asking the remote client to initiate the connection instead (to "push" the file). At first, these push requests were routed along the original chain the query had traveled. This was rather unreliable because routes would often break, and routed packets are always subject to flow control. So-called "push proxies" were therefore introduced. These are usually the ultrapeers of a leaf node, and they are announced in search results. The client connects to one of these push proxies using an HTTP request, and the proxy sends a push request to the leaf on behalf of the client. Normally, it is also possible to send a push request over UDP to the push proxy, which is more efficient than using TCP. Push proxies have two advantages: first, ultrapeer-leaf connections are more stable than routes, which makes push requests much more reliable; second, they reduce the amount of traffic routed through the Gnutella network.
Finally, when a user disconnects, the client software saves the list of nodes that it was actively connected to and those collected from pong packets for use the next time it attempts to connect so that it becomes independent from any kind of bootstrap services.
In practice, this method of searching on the Gnutella network was often unreliable. Each node is a regular computer user; as such, they are constantly connecting and disconnecting, so the network is never completely stable. Also, the bandwidth cost of searching on Gnutella grew exponentially with the number of connected users, often saturating connections and rendering slower nodes useless. Therefore, search requests would often be dropped, and most queries reached only a very small part of the network. This observation identified the Gnutella network as an unscalable distributed system, and inspired the development of distributed hash tables, which are much more scalable but support only exact-match, rather than keyword, search.
To address the problems of bottlenecks, Gnutella developers implemented a tiered system of "ultrapeers" and "leaves". Instead of all nodes being considered equal, nodes entering into the network were kept at the 'edge' of the network as a leaf, not responsible for any routing, and nodes which were capable of routing messages were promoted to ultrapeers, which would accept leaf connections and route searches and network maintenance messages. This allowed searches to propagate further through the network, and allowed for numerous alterations in the topology which have improved the efficiency and scalability greatly.
Additionally, Gnutella adopted a number of other techniques to reduce traffic overhead and make searches more efficient. Most notable are the Query Routing Protocol (QRP) and Dynamic Querying (DQ). With QRP, a search reaches only those clients which are likely to have the files, so searches for rare files grow vastly more efficient; with DQ, the search stops as soon as the program has acquired enough search results, which vastly reduces the amount of traffic caused by popular searches. Gnutella For Users has a vast amount of information about these and other improvements to Gnutella in a user-friendly style.
One of the benefits of having Gnutella so decentralized is to make it very difficult to shut the network down and to make it a network in which the users are the only ones who can decide which content will be available. Unlike Napster, where the entire network relied on the central server, Gnutella cannot be shut down by shutting down any one node. A decentralized network prevents bad actors from taking control of the contents of the network and/or manipulating data by controlling the central server.
Gnutella did once operate on a purely query flooding-based protocol. The outdated Gnutella version 0.4 network protocol employs five different packet types, namely ping, pong, query, query hit, and push.
These are mainly concerned with searching the Gnutella network. File transfers are handled using HTTP.
The development of the Gnutella protocol is currently led by the Gnutella Developers Forum (The GDF). Many protocol extensions have been and are being developed by the software vendors and free Gnutella developers of the GDF. These extensions include intelligent query routing, SHA-1 checksums, query hit transmission via UDP, querying via UDP, dynamic queries via TCP, file transfers via UDP, XML metadata, source exchange (also termed "the download mesh") and parallel downloading in slices (swarming).
There are efforts to finalize these protocol extensions in the Gnutella 0.6 specification at the Gnutella protocol development website. Although the Gnutella 0.4 standard is still the latest formal protocol specification (all extensions so far exist only as proposals), it is outdated; in fact, it is hard or impossible to connect today with the 0.4 handshakes, and according to developers in the GDF, version 0.6 is what new developers should pursue, using the work-in-progress specifications.
The Gnutella protocol remains under development and in spite of attempts to make a clean break with the complexity inherited from the old Gnutella 0.4 and to design a clean new message architecture, it is still one of the most successful file-sharing protocols to date.
The following tables compare general and technical information for a number of applications supporting the Gnutella network. The tables do not attempt to give a complete list of Gnutella clients. The tables are limited to clients that can participate in the current Gnutella network.
The Gnutella2 protocol (often referred to as G2), despite its name, is not a successor protocol of Gnutella nor related to the original Gnutella project, but rather is a completely different protocol that forked from the original project and piggybacked on the Gnutella name. A sore point with many Gnutella developers is that the "Gnutella2" name conveys an upgrade or superiority, which led to a flame war. Other criticism included the use of the Gnutella network to bootstrap G2 peers and poor documentation of the G2 protocol. Additionally, the more frequent search retries of the Shareaza client, one of the initial G2 clients, could unnecessarily burden the Gnutella network.
Both protocols have undergone significant changes since the fork in 2002. G2 has advantages and disadvantages compared to Gnutella. An advantage often cited is that Gnutella2's hybrid search is more efficient than the original Gnutella's query flooding, which was later replaced by more efficient search methods, starting with Query Routing in 2002, which had been proposed in 2001 by LimeWire developers. An advantage for Gnutella is that its users number in the millions, whereas the G2 network is approximately an order of magnitude smaller. It is difficult to compare the protocols in their current form; the individual client choice will probably have as much of an effect on an end user as either network.
https://en.wikipedia.org/wiki?curid=11856
George Lucas
George Walton Lucas Jr. (born May 14, 1944) is an American film director, producer, screenwriter, and entrepreneur. Lucas is best known for creating the "Star Wars" and "Indiana Jones" franchises and founding Lucasfilm, LucasArts, and Industrial Light & Magic. He served as chairman of Lucasfilm before selling it to The Walt Disney Company in 2012.
After graduating from the University of Southern California in 1967, Lucas co-founded American Zoetrope with filmmaker Francis Ford Coppola. Lucas wrote and directed "THX 1138" (1971), based on his earlier student short "", which was a critical success but a financial failure. His next work as a writer-director was the film "American Graffiti" (1973), inspired by his youth in the early 1960s Modesto, California, and produced through the newly founded Lucasfilm. The film was critically and commercially successful and received five Academy Award nominations, including Best Picture.
Lucas's next film, the epic space opera "Star Wars" (1977), had a troubled production but was a surprise hit, becoming the highest-grossing film at the time, winning six Academy Awards and sparking a cultural phenomenon. Lucas produced and co-wrote the sequels "The Empire Strikes Back" (1980) and "Return of the Jedi" (1983). With director Steven Spielberg, he created, produced, and co-wrote the "Indiana Jones" films "Raiders of the Lost Ark" (1981), "The Temple of Doom" (1984), "The Last Crusade" (1989) and "The Kingdom of the Crystal Skull" (2008). He also produced and wrote a variety of films and television series through Lucasfilm between the 1970s and the 2010s.
In 1997, Lucas re-released the "Star Wars" Trilogy as part of a special edition featuring several alterations; home media versions with further changes were released in 2004 and 2011. He returned to directing with a "Star Wars" prequel trilogy comprising "" (1999), "" (2002) and "" (2005). He last collaborated on the CGI-animated television series "" (2008–2014, 2020), the war film "Red Tails" (2012), and the CGI film "Strange Magic" (2015).
Lucas is one of history's most financially successful filmmakers and has been nominated for four Academy Awards. His films are among the 100 highest-grossing movies at the North American box office, adjusted for ticket-price inflation. Lucas is considered a significant figure of the 20th-century New Hollywood movement.
Lucas was born and raised in Modesto, California, the son of Dorothy Ellinore Lucas (née Bomberger) and George Walton Lucas Sr., and is of German, Swiss-German, English, Scottish, and distant Dutch and French descent. His family attended Disneyland during its opening week in July 1955, and Lucas would remain enthusiastic about the park. He was interested in comics and science fiction, including television programs such as the "Flash Gordon" serials. Long before Lucas began making films, he yearned to be a racecar driver, and he spent most of his high school years racing on the underground circuit at fairgrounds and hanging out at garages. On June 12, 1962, a few days before his high school graduation, Lucas was driving his souped-up Autobianchi Bianchina when another driver broadsided him, flipping his car several times before it crashed into a tree; Lucas's seatbelt had snapped, ejecting him and thereby saving his life. However, his lungs were bruised from severe hemorrhaging and he required emergency medical treatment. This incident caused him to lose interest in racing as a career, but also inspired him to pursue his other interests.
Lucas's father owned a stationery store, and had wanted George to work for him when he turned 18. Lucas had been planning to go to art school, and declared upon leaving home that he would be a millionaire by the age of 30. He attended Modesto Junior College, where he studied anthropology, sociology, and literature, amongst other subjects. He also began shooting with an 8 mm camera, including filming car races.
At this time, Lucas and his friend John Plummer became interested in Canyon Cinema: screenings of films by underground, avant-garde 16 mm filmmakers like Jordan Belson, Stan Brakhage, and Bruce Conner. Lucas and Plummer also saw classic European films of the time, including Jean-Luc Godard's "Breathless", François Truffaut's "Jules et Jim", and Federico Fellini's "8½". "That's when George really started exploring," Plummer said. Through his interest in autocross racing, Lucas met renowned cinematographer Haskell Wexler, another race enthusiast. Wexler, who would later work with Lucas on several occasions, was impressed by his talent. "George had a very good eye, and he thought visually," he recalled.
At Plummer's recommendation, Lucas then transferred to the University of Southern California (USC) School of Cinematic Arts. USC was one of the earliest universities to have a school devoted to motion picture film. During his years at USC, Lucas shared a dorm room with Randal Kleiser. Along with classmates such as Walter Murch, Hal Barwood, and John Milius, they became a clique of film students known as The Dirty Dozen. He also became good friends with fellow acclaimed student filmmaker and future "Indiana Jones" collaborator Steven Spielberg. Lucas was deeply influenced by the Filmic Expression course taught at the school by filmmaker Lester Novros, which concentrated on the non-narrative elements of film form such as color, light, movement, space, and time. Another inspiration was the Serbian montagist (and dean of the USC Film Department) Slavko Vorkapić, a film theoretician who made stunning montage sequences for Hollywood studio features at MGM, RKO, and Paramount. Vorkapić taught the autonomous nature of the cinematic art form, emphasizing the kinetic energy inherent in motion pictures.
Lucas saw many inspiring films in class, particularly the visual films coming out of the National Film Board of Canada like Arthur Lipsett's "21-87", the French-Canadian cameraman Jean-Claude Labrecque's cinéma vérité "60 Cycles", the work of Norman McLaren, and the documentaries of Claude Jutra. Lucas fell madly in love with pure cinema and quickly became prolific at making 16 mm nonstory noncharacter visual tone poems and cinéma vérité with such titles as "Look at Life", "Herbie", "", "The Emperor", "Anyone Lived in a Pretty (how) Town", "Filmmaker", and "6-18-67". He was passionate and interested in camerawork and editing, defining himself as a filmmaker as opposed to being a director, and he loved making abstract visual films that created emotions purely through cinema.
After graduating with a bachelor of fine arts in film in 1967, he tried joining the United States Air Force as an officer, but he was immediately turned down because of his numerous speeding tickets. He was later drafted by the Army for military service in Vietnam, but he was exempted from service after medical tests showed he had diabetes, the disease that killed his paternal grandfather.
In 1967, Lucas re-enrolled as a USC graduate student in film production. He began working under Verna Fields for the United States Information Agency, where he met his future wife, Marcia Griffin. Working as a teaching instructor for a class of U.S. Navy students who were being taught documentary cinematography, Lucas directed the short film "Electronic Labyrinth: THX 1138 4EB", which won first prize at the 1967–68 National Student Film Festival. Lucas was awarded a student scholarship by Warner Bros. to observe and work on the making of a film of his choosing. The film he chose was "Finian's Rainbow" (1968), which was being directed by Francis Ford Coppola, who was revered among film school students of the time as a cinema graduate who had "made it" in Hollywood. In 1969, Lucas was one of the camera operators on the classic Rolling Stones concert film "Gimme Shelter".
In 1969, Lucas co-founded the studio American Zoetrope with Coppola, hoping to create a liberating environment for filmmakers to direct outside the perceived oppressive control of the Hollywood studio system. Coppola thought Lucas's "Electronic Labyrinth" could be adapted into his first full-length feature film, which was produced by American Zoetrope as "THX 1138", but was not a success. Lucas then created his own company, Lucasfilm, Ltd., and directed the successful "American Graffiti" (1973).
Lucas then set his sights on adapting Flash Gordon, an adventure serial from his childhood that he fondly remembered. When he was unable to obtain the rights, he set out to write an original space adventure that would eventually become "Star Wars". Despite his success with his previous film, all but one studio turned "Star Wars" down. It was only because Alan Ladd, Jr., at 20th Century Fox liked "American Graffiti" that he forced through a production and distribution deal for the film, which ended up restoring Fox to financial stability after a number of flops. "Star Wars" was significantly influenced by samurai films of Akira Kurosawa, Spaghetti Westerns, as well as classic sword and sorcery fantasy stories.
"Star Wars" quickly became the highest-grossing film of all time, displaced five years later by Spielberg's "E.T. the Extra-Terrestrial". After the success of "American Graffiti" and prior to the beginning of filming on "Star Wars", Lucas was encouraged to renegotiate for a higher fee for writing and directing "Star Wars" than the $150,000 agreed. He declined to do so, instead negotiating for advantage in some of the as-yet-unspecified parts of his contract with Fox, in particular ownership of licensing and merchandising rights (for novelizations, clothing, toys, etc.) and contractual arrangements for sequels. Lucasfilm has earned hundreds of millions of dollars from licensed games, toys, and collectibles created for the franchise.
The original "Star Wars" film went through a tumultuous production, and during editing, Lucas suffered chest pains that were initially feared to be a heart attack but turned out to be hypertension and exhaustion.
Following the release of the first "Star Wars" film, Lucas worked extensively as a writer and producer, including on the many "Star Wars" spinoffs made for film, television, and other media. Lucas acted as executive producer for the next two "Star Wars" films, commissioning Irvin Kershner to direct "The Empire Strikes Back", and Richard Marquand to direct "Return of the Jedi", while receiving a story credit on the former and sharing a screenwriting credit with Lawrence Kasdan on the latter. He also acted as story writer and executive producer on all four of the "Indiana Jones" films, which his colleague and good friend Steven Spielberg directed.
Other successful projects where Lucas acted as an executive producer and occasional story writer in this period include Kurosawa's "Kagemusha" (1980), Lawrence Kasdan's "Body Heat" (1981), "" (1984), "" (1985), Jim Henson's "Labyrinth" (1986), Godfrey Reggio's "Powaqqatsi" (1986), Don Bluth's "The Land Before Time" (1988), and the "Indiana Jones" television spinoff "The Young Indiana Jones Chronicles" (1992–96). There were also unsuccessful projects, including "More American Graffiti" (1979), Willard Huyck's "Howard the Duck" (1986), which was the biggest flop of Lucas's career, Ron Howard's "Willow" (1988), Coppola's "Tucker: The Man and His Dream" (1988), and Mel Smith's "Radioland Murders" (1994).
The animation studio Pixar was founded in 1979 as the Graphics Group, one third of the Computer Division of Lucasfilm. Pixar's early computer graphics research resulted in groundbreaking effects in films such as "Star Trek II: The Wrath of Khan" and "Young Sherlock Holmes", and the group was purchased in 1986 by Steve Jobs shortly after he left Apple Computer. Jobs paid Lucas US$5 million and put US$5 million as capital into the company. The sale reflected Lucas' desire to stop the cash flow losses from his seven-year research projects associated with new entertainment technology tools, as well as his company's new focus on creating entertainment products rather than tools. As of June 1983, Lucas was worth US$60 million, but he met cash-flow difficulties following his divorce that year, concurrent with the sudden dropoff in revenues from "Star Wars" licenses following the theatrical run of "Return of the Jedi". At this point, Lucas had no desire to return to "Star Wars", and had unofficially canceled the sequel trilogy.
Also in 1983, Lucas and Tomlinson Holman founded the audio company THX Ltd. The company, formerly owned by Lucasfilm, develops standards and equipment for stereo, digital, and theatrical sound for films and music. Skywalker Sound and Industrial Light & Magic are the sound and visual effects subdivisions of Lucasfilm, while Lucasfilm Games, later renamed LucasArts, produces products for the gaming industry.
Having lost much of his fortune in a divorce settlement in 1987, Lucas was reluctant to return to "Star Wars". However, the prequels, which were still only a series of basic ideas partially pulled from his original drafts of "The Star Wars", continued to tantalize him with technical possibilities that would make it worthwhile to revisit his older material. When "Star Wars" became popular once again, in the wake of Dark Horse's comic book line and Timothy Zahn's trilogy of spin-off novels, Lucas realized that there was still a large audience. His children were older, and with the explosion of CGI technology he began to consider directing once again.
By 1993, it was announced, in "Variety" among other sources, that Lucas would be making the prequels. He began expanding the story, indicating that the series would be a tragic one examining Anakin Skywalker's fall to the dark side. Lucas also began to change the prequels' status relative to the originals; at first, they were supposed to be a "filling-in" of history tangential to the originals, but now he saw that they could form the beginning of one long story that started with Anakin's childhood and ended with his death. This was the final step towards turning the film series into a "saga". In 1994, Lucas began work on the screenplay of the first prequel, tentatively titled "Episode I: The Beginning".
In 1997, to celebrate the 20th anniversary of "Star Wars," Lucas returned to the original trilogy and made numerous modifications using newly available digital technology, releasing them in theaters as the "Star Wars Special Edition". For DVD releases in 2004 and Blu-ray releases in 2011, the trilogy received further revisions to make them congruent with the prequel trilogy. Besides the additions to the "Star Wars" franchise, Lucas released a "Director's Cut" of "THX 1138" in 2004, with the film re-cut and containing a number of CGI revisions.
The first "Star Wars" prequel was finished and released in 1999 as "The Phantom Menace", the first film Lucas had directed in over two decades. Following its release, Lucas announced that he would also be directing the next two, and began working on "Episode II". The first draft of "Episode II" was completed just weeks before principal photography, and Lucas hired Jonathan Hales, a writer from "The Young Indiana Jones Chronicles", to polish it. It was completed and released in 2002 as "Attack of the Clones". The final prequel, "Revenge of the Sith", began production in 2002 and was released in 2005. Numerous fans and critics considered the prequels inferior to the original trilogy, though they were box office successes. From 2003 to 2005, Lucas also served as an executive producer on "Star Wars: Clone Wars", an animated microseries on Cartoon Network created by Genndy Tartakovsky that bridged the events between "Attack of the Clones" and "Revenge of the Sith".
Lucas collaborated with Jeff Nathanson as a writer of the 2008 film "Indiana Jones and the Kingdom of the Crystal Skull", directed by Steven Spielberg. Like the "Star Wars" prequels, reception was mixed, with numerous fans and critics once again considering it inferior to its predecessors. From 2008 to 2014, Lucas also served as the creator and executive producer of a second "Star Wars" animated series on Cartoon Network, "Star Wars: The Clone Wars", which premiered with a theatrical feature film before airing its first episode. The supervising director for this series was Dave Filoni, who was chosen by Lucas and closely collaborated with him on its development. Like the previous series, it bridged the events between "Attack of the Clones" and "Revenge of the Sith". The animated series also featured the last "Star Wars" stories in which Lucas was closely involved.
In 2012, Lucas served as executive producer for "Red Tails", a war film based on the exploits of the Tuskegee Airmen during World War II. He also took over direction of reshoots while director Anthony Hemingway worked on other projects.
In January 2012, Lucas announced his retirement from producing large blockbuster films, refocusing his career instead on smaller, independently budgeted features.
In June 2012, it was announced that producer Kathleen Kennedy, a long-term collaborator with Steven Spielberg and a producer of the "Indiana Jones" films, had been appointed as co-chair of Lucasfilm Ltd. It was reported that Kennedy would work alongside Lucas, who would remain chief executive and serve as co-chairman for at least one year, after which she would succeed him as the company's sole leader. With the sale of Lucasfilm to Disney, Lucas is currently Disney's second largest single shareholder after the estate of Steve Jobs.
Lucas worked as a creative consultant on the "Star Wars" sequel trilogy's first film, "The Force Awakens". As creative consultant on the film, Lucas' involvement included attending early story meetings; according to Lucas, "I mostly say, 'You can't do this. You can do that.' You know, 'The cars don't have wheels. They fly with antigravity.' There's a million little pieces ... I know all that stuff." Lucas' son Jett told "The Guardian" that his father was "very torn" about having sold the rights to the franchise, despite having hand-picked Abrams to direct, and that his father was "there to guide" but that "he wants to let it go and become its new generation." Among the materials turned over to the production team were rough story treatments Lucas developed when he considered creating episodes "VII"–"IX" himself years earlier; in January 2015, Lucas stated that Disney had discarded his story ideas.
"The Force Awakens", directed by J. J. Abrams, was released on December 18, 2015. Kathleen Kennedy executive produced the film and its sequels. The new sequel trilogy was jointly produced by Lucasfilm and The Walt Disney Company, which had acquired Lucasfilm in 2012. During an interview with talk show host and journalist Charlie Rose that aired on December 24, 2015, Lucas likened his decision to sell Lucasfilm to Disney to a divorce and outlined the creative differences between him and the producers of "The Force Awakens". Lucas described the previous six "Star Wars" films as his "children" and defended his vision for them, while criticizing "The Force Awakens" for having a "retro feel", saying, "I worked very hard to make them completely different, with different planets, with different spaceships – you know, to make it new." Lucas also drew some criticism and subsequently apologized for his remark likening Disney to "white slavers".
In 2015, Lucas wrote the CGI film "Strange Magic", his first musical. The film was produced at Skywalker Ranch and directed by Gary Rydstrom. At the same time the sequel trilogy was announced, a fifth installment of the "Indiana Jones" series also entered pre-development, with Harrison Ford and Steven Spielberg set to return. Lucas originally did not specify whether the sale of Lucasfilm would affect his involvement with the film. In October 2016, Lucas announced his decision not to be involved in the story of the film, but said he would remain an executive producer. In 2016, "Rogue One", the first film of a "Star Wars" anthology series, was released. It told the story of the rebels who stole the plans for the Death Star featured in the original "Star Wars" film, and it was reported that Lucas liked it more than "The Force Awakens". "The Last Jedi", the second film in the sequel trilogy, was released in 2017; Lucas described the film as "beautifully made".
Lucas has had cursory involvement with "Solo: A Star Wars Story" (2018), the "Star Wars" streaming series "The Mandalorian", and the premiere of the eighth season of "Game of Thrones". Lucas met with J. J. Abrams before the latter began writing the script to the sequel trilogy's final film, "The Rise of Skywalker", which was released in 2019.
Lucas has pledged to give half of his fortune to charity as part of an effort called The Giving Pledge led by Bill Gates and Warren Buffett to persuade America's richest individuals to donate their financial wealth to charities.
In 1991, The George Lucas Educational Foundation was founded as a nonprofit operating foundation to celebrate and encourage innovation in schools. The Foundation's content is available under the brand Edutopia, in an award-winning web site, social media and via documentary films. Lucas, through his foundation, was one of the leading proponents of the E-rate program in the universal service fund, which was enacted as part of the Telecommunications Act of 1996. On June 24, 2008, Lucas testified before the United States House of Representatives subcommittee on Telecommunications and the Internet as the head of his Foundation to advocate for a free wireless broadband educational network.
In 2012, Lucas sold Lucasfilm to The Walt Disney Company for a reported sum of $4.05 billion. It was widely reported at the time that Lucas intended to give the majority of the proceeds from the sale to charity. A spokesperson for Lucasfilm said, "George Lucas has expressed his intention, in the event the deal closes, to donate the majority of the proceeds to his philanthropic endeavors." Lucas also spoke on the matter: "For 41 years, the majority of my time and money has been put into the company. As I start a new chapter in my life, it is gratifying that I have the opportunity to devote more time and resources to philanthropy."
By June 2013, Lucas was considering establishing a museum, the Lucas Cultural Arts Museum, to be built on Crissy Field near the Golden Gate Bridge in San Francisco, which would display his collection of illustrations and pop art, with an estimated value of more than $1 billion. Lucas offered to pay the estimated $300 million cost of constructing the museum, and would endow it with $400 million when it opened, eventually adding an additional $400 million to its endowment. After being unable to reach an agreement with The Presidio Trust, Lucas turned to Chicago. A potential lakefront site on Museum Campus in Chicago was proposed in May 2014. By June 2014, Chicago had been selected, pending approval of the Chicago Plan Commission, which was granted. The museum project was renamed the Lucas Museum of Narrative Art. On June 24, 2016, Lucas announced that he was abandoning his plans to locate the museum in Chicago, due to a lawsuit by a local preservation group, Friends of the Parks, and would instead build the museum in California. On January 17, 2017, Lucas announced that the museum would be constructed in Exposition Park, Los Angeles, California.
In 2005, Lucas gave US$1 million to help build the Martin Luther King Jr. Memorial on the National Mall in Washington D.C. to commemorate American civil rights leader Martin Luther King, Jr.
On September 19, 2006, USC announced that Lucas had donated $175–180 million to his alma mater to expand the film school. It is the largest single donation to USC and the largest gift to a film school anywhere. Previous donations led to the already existing George Lucas Instructional Building and Marcia Lucas Post-Production building.
In 2013, Lucas and his wife Mellody Hobson donated $25 million to the Chicago-based not-for-profit After School Matters, of which Hobson is the chair.
On April 15, 2016, it was reported that Lucas had donated between $501,000 and $1 million through the Lucas Family Foundation to the Obama Foundation, which is charged with overseeing the construction of the Barack Obama Presidential Center on Chicago's South Side.
In 1969, Lucas married film editor Marcia Lou Griffin, who went on to win an Academy Award for her editing work on the original "Star Wars" film. They adopted a daughter, Amanda Lucas, in 1981, and divorced in 1983. Lucas subsequently adopted two more children as a single parent: daughter Katie Lucas, born in 1988, and son Jett Lucas, born in 1993. His three eldest children all appeared in the three "Star Wars" prequels, as did Lucas himself. Following his divorce, Lucas was in a relationship with singer Linda Ronstadt in the 1980s.
Lucas began dating Mellody Hobson, president of Ariel Investments and chair of DreamWorks Animation, in 2006. Lucas and Hobson announced their engagement in January 2013, and married on June 22, 2013, at Lucas's Skywalker Ranch in Marin County, California. They have one daughter together, born via gestational carrier in August 2013.
Lucas was born and raised in a Methodist family. The religious and mythical themes in "Star Wars" were inspired by Lucas's interest in the writings of mythologist Joseph Campbell, and he would eventually come to identify strongly with the Eastern religious philosophies he studied and incorporated into his films, which were a major inspiration for "the Force". Lucas has come to state that his religion is "Buddhist Methodist". He resides in Marin County.
Lucas is a major collector of the American illustrator and painter Norman Rockwell. A collection of 57 Rockwell paintings and drawings owned by Lucas and fellow Rockwell collector and film director Steven Spielberg were displayed at the Smithsonian American Art Museum from July 2, 2010, to January 2, 2011, in an exhibition titled "Telling Stories".
Lucas has said that he is a fan of Seth MacFarlane's hit TV show "Family Guy". MacFarlane has said that Lucasfilm was extremely helpful when the "Family Guy" crew wanted to parody "Star Wars".
Lucas supported Democratic candidate Hillary Clinton in the run-up for the 2016 U.S. presidential election.
The American Film Institute awarded Lucas its Life Achievement Award on June 9, 2005. This was shortly after the release of "Revenge of the Sith", about which he joked that, since he views the entire "Star Wars" series as one film, he could actually receive the award now that he had finally "gone back and finished the movie."
Lucas was nominated for four Academy Awards: Best Director and Best Screenplay for "American Graffiti" and "Star Wars". He received the Academy's Irving G. Thalberg Award in 1991. He appeared at the 79th Academy Awards ceremony in 2007 with Steven Spielberg and Francis Ford Coppola to present the Best Director award to their friend Martin Scorsese. During the speech, Spielberg and Coppola talked about the joy of winning an Oscar, making fun of Lucas, who has never won a competitive Oscar.
The Science Fiction Hall of Fame inducted Lucas in 2006, its second "Film, Television, and Media" contributor, after Spielberg. The Discovery Channel named him one of the 100 "Greatest Americans" in September 2008. Lucas served as Grand Marshal for the Tournament of Roses Parade and made the ceremonial coin toss at the Rose Bowl, New Year's Day 2007. In 2009, he was one of 13 California Hall of Fame inductees in The California Museum's yearlong exhibit.
In July 2013, Lucas was awarded the National Medal of Arts by President Barack Obama for his contributions to American cinema.
In October 2014, Lucas received Honorary Membership of the Society of Motion Picture and Television Engineers.
In August 2015, Lucas was inducted as a Disney Legend, and on December 6, 2015, he was an honoree at the Kennedy Center Honors.
Gothenburg
Gothenburg (abbreviated Gbg) is the second-largest city in Sweden, fifth-largest in the Nordic countries, and capital of the Västra Götaland County. It is situated by Kattegat, on the west coast of Sweden, and has a population of approximately 570,000 in the city proper and about 1 million inhabitants in the metropolitan area.
Gothenburg was founded as a heavily fortified, primarily Dutch, trading colony, by royal charter in 1621 by King Gustavus Adolphus. In addition to the generous privileges (e.g. tax relaxation) given to his Dutch allies from the then-ongoing Thirty Years' War, the king also attracted significant numbers of his German and Scottish allies to populate his only town on the western coast. At a key strategic location at the mouth of the Göta älv, where Scandinavia's largest drainage basin enters the sea, the Port of Gothenburg is now the largest port in the Nordic countries.
Gothenburg is home to many students, as the city includes the University of Gothenburg and Chalmers University of Technology. Volvo was founded in Gothenburg in 1927. The original parent Volvo Group and the now separate Volvo Car Corporation are still headquartered on the island of Hisingen in the city. Other key companies are SKF and AstraZeneca.
Gothenburg is served by Göteborg Landvetter Airport southeast of the city center. The smaller Göteborg City Airport, from the city center, was closed to regular airline traffic in 2015.
The city hosts the Gothia Cup, the world's largest youth football tournament, and Europe's largest youth basketball tournament, alongside some of the largest annual events in Scandinavia. The Gothenburg Film Festival, held in January since 1979, is the leading Scandinavian film festival, with over 155,000 visitors each year. In summer, a wide variety of music festivals are held in the city, including the popular Way Out West Festival.
The city was named Göteborg in the city's charter in 1621 and simultaneously given the German and English name Gothenburg. The Swedish name was given after the "Göta älv", called Göta River in English, and other cities ending in "-borg".
Both the Swedish and German/English names were in use before 1621 and had already been used for the previous city, founded in 1604 and burned down in 1611. Gothenburg is one of the few Swedish cities to still have an official and widely used exonym. Another example is the province of Scania in southern Sweden.
The city council of 1641 consisted of four Swedish, three Dutch, three German, and two Scottish members. In Dutch, Scots, English, and German, all languages with a long history in this trade and maritime-oriented city, the name Gothenburg is or was (in the case of German) used for the city. Variations of the official German/English name Gothenburg in the city's 1621 charter existed or exist in many languages. The French form of the city name is "Gothembourg", but in French texts, the Swedish name "Göteborg" is more frequent. "Gothenburg" can also be seen in some older English texts. In Spanish and Portuguese the city is called Gotemburgo. These traditional forms are sometimes replaced with the use of the Swedish "Göteborg", for example by The Göteborg Opera and the Göteborg Ballet. However, "Göteborgs universitet", previously designated as the Göteborg University in English, changed its name to the University of Gothenburg in 2008. The Gothenburg municipality has also reverted to the use of the English name in international contexts.
In 2009, the city council launched a new logotype for Gothenburg. Since the name "Göteborg" contains the Swedish letter "ö", the council planned to make the name more international and "up to date" by turning the "ö" sideways. The name is spelled "Go:teborg" on a large number of signs in the city.
In the early modern period, the configuration of Sweden's borders made Gothenburg strategically critical as the only Swedish gateway to Skagerrak, the North Sea and Atlantic, situated on the west coast in a very narrow strip of Swedish territory between Danish Halland in the south and Norwegian Bohuslän in the north. After several failed attempts, Gothenburg was successfully founded in 1621 by King Gustavus Adolphus (Gustaf II Adolf).
The site of the first church built in Gothenburg, subsequently destroyed by Danish invaders, is marked by a stone near the north end of the Älvsborg Bridge in the Färjenäs Park. The church was built in 1603 and destroyed in 1611. The city was heavily influenced by the Dutch, Germans, and Scots, and Dutch planners and engineers were contracted to construct the city as they had the skills needed to drain and build in the marshy areas chosen for the city. The town was designed like Dutch cities such as Amsterdam, Batavia (Jakarta) and New Amsterdam (Manhattan). The planning of the streets and canals of Gothenburg closely resembled that of Jakarta, which was built by the Dutch around the same time. The Dutchmen initially won political power, and it was not until 1652, when the last Dutch politician in the city's council died, that Swedes acquired political power over Gothenburg. During the Dutch period, the town followed Dutch town laws and Dutch was proposed as the official language in the town. Robust city walls were built during the 17th century. In 1807, a decision was made to tear down most of the city's wall. The work started in 1810 and was carried out by 150 soldiers from the Bohus regiment.
Along with the Dutch, the town was also heavily influenced by Scots who settled in Gothenburg, many of whom became high-profile figures. William Chalmers, the son of a Scottish immigrant, donated his fortune to set up what later became the Chalmers University of Technology. In 1841, the Scotsman Alexander Keiller founded the Götaverken shipbuilding company, which was in business until 1989. His son James Keiller donated Keiller Park to the city in 1906.
The Gothenburg coat of arms was based on the lion of the coat of arms of Sweden, symbolically holding a shield with the national emblem, the Three Crowns, to defend the city against its enemies.
In the Treaty of Roskilde (1658), Denmark–Norway ceded the then Danish province Halland, in the south, and the Norwegian province of Bohus County or "Bohuslän" in the north, leaving Gothenburg less exposed. Gothenburg was able to grow into a significant port and trade centre on the west coast, because it was the only city on the west coast that, along with Marstrand, was granted the rights to trade with merchants from other countries.
In the 18th century, fishing was the most important industry. However, in 1731, the Swedish East India Company was founded, and the city flourished due to its foreign trade with highly profitable commercial expeditions to China.
The harbour developed into Sweden's main harbour for trade towards the west, and when Swedish emigration to the United States increased, Gothenburg became Sweden's main point of departure for these travellers. The impact of Gothenburg as a main port of embarkation for Swedish emigrants is reflected by Gothenburg, Nebraska, a small Swedish settlement in the United States.
During the 19th century, Gothenburg evolved into a modern industrial city, a development that continued into the 20th century. The population increased tenfold over the century, from 13,000 (1800) to 130,000 (1900). Major companies that developed in the 20th century included SKF (1907) and Volvo (1927).
Gothenburg is located on the west coast, in southwestern Sweden, about halfway between the capitals Copenhagen, Denmark, and Oslo, Norway. The location at the mouth of the Göta älv, which feeds into Kattegat, an arm of the North Sea, has helped the city grow in significance as a trading city. The archipelago of Gothenburg consists of rough, barren rocks and cliffs, which is also typical of the coast of Bohuslän. Due to the Gulf Stream, the city has a mild climate and moderately heavy precipitation. It is the second-largest city in Sweden after the capital, Stockholm.
The Gothenburg Metropolitan Area ("Stor-Göteborg") has 982,360 inhabitants and extends to the municipalities of Ale, Alingsås, Göteborg, Härryda, Kungälv, Lerum, Lilla Edet, Mölndal, Partille, Stenungsund, Tjörn, Öckerö within Västra Götaland County, and Kungsbacka within Halland County.
Angered, a suburb outside Gothenburg, consists of Hjällbo, Eriksbo, Rannebergen, Hammarkullen, Gårdsten, and Lövgärdet. It is a Million Programme part of Gothenburg, like Rosengård in Malmö and Botkyrka in Stockholm. Angered had about 50,000 inhabitants in 2015. It lies north of Gothenburg and is isolated from the rest of the city. Bergsjön, another Million Programme suburb north of Gothenburg, has 14,000 inhabitants. Biskopsgården is the biggest multicultural suburb on the island of Hisingen, which is a part of Gothenburg but separated from the city by the river.
Gothenburg has a warm-summer humid continental climate using the 0 °C isotherm (Köppen "Dfb") and an oceanic climate using the -3 °C isotherm (Köppen "Cfb") according to the Köppen climate classification. Despite its northern latitude, temperatures are quite mild throughout the year and warmer than in places at a similar latitude, for example Stockholm, or even somewhat further south, mainly because of the moderating influence of the warm Gulf Stream. During the summer, daylight extends to 18 hours and 5 minutes, but lasts only 6 hours and 32 minutes in late December. The climate has become significantly milder in recent decades, particularly in summer and winter; July temperatures used to be below Stockholm's 1961–1990 averages, but have since been warmer than that benchmark.
Summers are warm and pleasant with average high temperatures of and lows of , but temperatures of occur on many days during the summer.
Winters are cold and windy with temperatures of around , though it rarely drops below . Precipitation is regular but generally moderate throughout the year. Snow mainly occurs from December to March, but is not unusual in November and April and can sometimes occur even in October and May, in extreme cases even in September.
Gothenburg has several parks and nature reserves ranging in size from tens of square metres to hundreds of hectares. It also has many green areas that are not designated as parks or reserves.
Selection of parks:
Very few houses are left from the 17th century when the city was founded, since all but the military and royal houses were built of wood. A rare exception is the Skansen Kronan.
The first major architecturally interesting period is the 18th century when the East India Company made Gothenburg an important trade city. Imposing stone houses in Neo-Classical style were erected around the canals. One example from this period is the East India House, which today houses the Göteborg City Museum.
In the 19th century, the wealthy bourgeoisie began to move outside the city walls which had protected the city. The favoured middle-class style was now eclectic, academic, and somewhat overdecorated. The working class lived in wooden houses in the overcrowded city district of Haga.
In the 19th century, the first comprehensive town plan after the founding of the city was created, which led to the construction of the main street, Kungsportsavenyen. Perhaps the city's most significant type of housing, the Landshövdingehus, was built at the end of the 19th century – three-storey houses with the first floor in stone and the other two in wood.
The early 20th century, characterized by the National Romantic style, was rich in architectural achievements. Masthugg Church is a noted example of the style of this period. In the early 1920s, on the city's 300th anniversary, the Götaplatsen square with its Neoclassical look was built.
After this, the predominant style in Gothenburg and the rest of Sweden was Functionalism, which especially dominated suburbs such as Västra Frölunda and Bergsjön. The Swedish functionalist architect Uno Åhrén served as city planner from 1932 through 1943. In the 1950s, the big stadium Ullevi was built when Sweden hosted the 1958 FIFA World Cup.
The modern architecture of the city has been formed by such architects as Gert Wingårdh, who started as a Post-modernist in the 1980s.
Gustaf Adolf Square is a town square located in central Gothenburg. Noted buildings on the square include Gothenburg City Hall (formerly the stock exchange, opened in 1849) and the Nordic Classicism law court. The main canal of Gothenburg also flanks the square.
The Gothenburg Central Station is in the centre of the city, next to Nordstan and Drottningtorget. The building has been renovated and expanded numerous times since the grand opening in October 1858. In 2003, a major reconstruction was finished which brought the 19th-century building into the 21st century, expanding the capacity for trains, travellers, and shopping. Not far from the central station is the Skanskaskrapan, more commonly known as "The Lipstick". It is high with 22 floors and coloured in red-white stripes. The skyscraper was designed by Ralph Erskine and built by Skanska in the late 1980s as the headquarters for the company.
By the shore of the Göta Älv at Lilla Bommen is The Göteborg Opera. It was completed in 1994. The architect Jan Izikowitz was inspired by the landscape and described his vision as "Something that makes your mind float over the squiggling landscape like the wings of a seagull."
Feskekörka, or "Fiskhallen", is an indoor fish market by the Rosenlundskanalen in central Gothenburg. Feskekörkan opened on 1 November 1874 and takes its name from the building's resemblance to a Gothic church. The Gothenburg city hall is in the Beaux-Arts architectural style. The Gothenburg Synagogue at Stora Nygatan, near Drottningtorget, was built in 1855 according to the designs of the German architect August Krüger.
The Gunnebo House is a country house located to the south of Gothenburg, in Mölndal. It was built in a neoclassical style towards the end of the 18th century. The Vasa Church, completed in the early 1900s, is located in Vasastan and is built of granite in a neo-Romanesque style.
Another noted construction is Brudaremossen TV Tower, one of the few partially guyed towers in the world.
The sea, trade, and industrial history of the city are evident in the cultural life of Gothenburg. It is also a popular destination for tourists on the Swedish west coast.
Many of the cultural institutions, as well as hospitals and the university, were created by donations from rich merchants and industrialists, for example the Röhsska Museum. On 29 December 2004, the Museum of World Culture opened near Korsvägen. Museums include the Gothenburg Museum of Art, and several museums of sea and navigation history, natural history, the sciences, and East India. Aeroseum, close to the Göteborg City Airport, is an aircraft museum in a former military underground air force base. The Volvo museum has exhibits of the history of Volvo and the development from 1927 until today. Products shown include cars, trucks, marine engines, and buses.
Universeum is a public science centre that opened in 2001, the largest of its kind in Scandinavia. It is divided into six sections, each containing experimental workshops and a collection of reptiles, fish, and insects. Universeum occasionally hosts debates between Swedish secondary-school students and Nobel Prize laureates or other scholars.
The most noted attraction is the amusement park Liseberg, located in the central part of the city. It is the largest amusement park in Scandinavia by number of rides, and was chosen as one of the top ten amusement parks in the world (2005) by "Forbes". It is the most popular attraction in Sweden by number of visitors per year (more than 3 million).
There are a number of independent theatre ensembles in the city, besides institutions such as Gothenburg City Theatre, Backa Theatre (youth theatre), and Folkteatern.
The main boulevard is called Kungsportsavenyn (commonly known as "Avenyn", "The Avenue"). It is about long and starts at Götaplatsen – which is the location of the Gothenburg Museum of Art, the city's theatre, and the city library, as well as the concert hall – and stretches all the way to Kungsportsplatsen in the old city centre of Gothenburg, crossing a canal and a small park. The "Avenyn" was created in the 1860s and 1870s as a result of an international architecture contest, and is the product of a period of extensive town planning and remodelling. "Avenyn" has Gothenburg's highest concentration of pubs and clubs. Gothenburg's largest shopping centre (8th largest in Sweden), Nordstan, is located in central Gothenburg.
Gothenburg's Haga district is known for its picturesque wooden houses and its cafés serving the well-known "Haga bulle" – a large cinnamon roll similar to the "kanelbulle".
Six Gothenburg restaurants have a star in the 2008 "Michelin Guide": 28 +, Basement, Fond, Kock & Vin, Fiskekrogen, and Sjömagasinet.
The city has a number of star chefs – over the past decade, seven of the Swedish Chef of the Year awards have been won by people from Gothenburg.
The Gustavus Adolphus pastry, eaten in Sweden every 6 November, Gustavus Adolphus Day, is especially connected to, and appreciated in, Gothenburg because the city was founded by King Gustavus Adolphus.
One of Gothenburg's most popular natural tourist attractions is the southern Gothenburg archipelago, which is a set of several islands that can be reached by ferry boats mainly operating from Saltholmen. Within the archipelago are the Älvsborg fortress, Vinga and Styrsö islands.
The annual Gothenburg Film Festival is the largest film festival in Scandinavia. The Gothenburg Book Fair, held each year in September, is the largest literary festival in Scandinavia and the second-largest book fair in Europe.
The International Science Festival in Gothenburg, held annually in central Gothenburg since April 1997, offers thought-provoking science activities for the public. The festival is visited by about people each year, making it the largest popular-science event in Sweden and one of the leading popular-science events in Europe.
Citing the financial crisis, the International Federation of Library Associations and Institutions moved the 2010 World Library and Information Congress, previously scheduled to be held in Brisbane, Australia, to Gothenburg. The event took place on 10–15 August 2010.
Gothenburg has a diverse music community—the Gothenburg Symphony Orchestra is the best-known in classical music. Gothenburg also was the birthplace of the Swedish composer Kurt Atterberg. The first internationally successful Swedish group, the instrumental rock band The Spotnicks, came from Gothenburg. Bands such as The Soundtrack of Our Lives and Ace of Base are well-known pop representatives of the city. During the 1970s, Gothenburg had strong roots in the Swedish progressive movement (progg) with such groups as Nationalteatern, Nynningen, and Motvind. The record company Nacksving and the editorial office of the magazine Musikens Makt, which were also part of the progg movement, were located in Gothenburg during this time as well. There is also an active indie scene in Gothenburg. For example, the musician Jens Lekman was born in the suburb of Angered and named his 2007 release "Night Falls Over Kortedala" after another suburb, Kortedala. Other internationally acclaimed indie artists include the electro pop duos Studio, The Knife, Air France, The Tough Alliance, songwriter José González, and pop singer El Perro del Mar, as well as genre-bending quartet Little Dragon fronted by vocalist Yukimi Nagano. Another son of the city is one of Sweden's most popular singers, Håkan Hellström, who often includes many places from the city in his songs. The glam rock group Supergroupies derives from Gothenburg.
Gothenburg's own commercially successful At the Gates, In Flames, and Dark Tranquillity are credited with pioneering melodic death metal. Other well-known bands of the Gothenburg scene are thrash metal band The Haunted, progressive power metal band Evergrey, and power metal bands HammerFall and Dream Evil.
Many music festivals take place in the city every year. The Metaltown Festival is a two-day festival featuring heavy metal music bands, held in Gothenburg. It has been arranged annually since 2004, taking place at the Frihamnen venue. In June 2012, the festival included bands such as In Flames, Marilyn Manson, Slayer, Lamb of God, and Mastodon. Another popular festival, Way Out West, focuses more on rock, electronic, and hip-hop genres.
As in all of Sweden, a variety of sports are followed, including football, ice hockey, basketball, handball, baseball, and figure skating, and there is a varied scene of amateur and professional sports clubs.
Gothenburg is the birthplace of football in Sweden as the first football match in Sweden was played there in 1892. The city's three major football clubs, IFK Göteborg, Örgryte IS, and GAIS share a total of 34 Swedish championships between them. IFK has also won the UEFA Cup twice. Other notable clubs include BK Häcken (football), Pixbo Wallenstam IBK (floorball), multiple national handball champion Redbergslids IK, and four-time national ice hockey champion Frölunda HC. Gothenburg had a professional basketball team, Gothia Basket, until it ceased in 2010. The bandy department of GAIS, GAIS Bandy, played last season in the highest division, Elitserien. The group stage match between the main rivals Sweden and Russia in the 2013 Bandy World Championship was played at Arena Heden in central Gothenburg.
The city's most notable sports venues are Scandinavium and Ullevi (multisport) and the newly built Gamla Ullevi (football).
The 2003 World Allround Speed Skating Championships were held in Rudhallen, Sweden's only indoor speed-skating arena. It is a part of Ruddalens IP, which also has a bandy field and several football fields.
The only Swedish heavyweight champion of the world in boxing, Ingemar Johansson, who took the title from Floyd Patterson in 1959, was from Gothenburg.
Gothenburg has hosted a number of international sporting events including the 1958 FIFA World Cup, the 1983 European Cup Winners' Cup Final, an NFL preseason game on 14 August 1988 between the Chicago Bears and the Minnesota Vikings, the 1992 European Football Championship, the 1993 and the 2002 World Men's Handball Championship, the 1995 World Championships in Athletics, the 1997 World Championships in Swimming (short track), the 2002 Ice Hockey World Championships, the 2004 UEFA Cup final, the 2006 European Championships in Athletics, and the 2008 World Figure Skating Championships. Annual events held in the city are the Gothia Cup and the Göteborgsvarvet. The Gothia Cup is the world's largest football tournament in terms of the number of participants: in 2011, a total of 35,200 players from 1,567 teams and 72 nations participated.
Gothenburg hosted the XIII FINA World Masters Championships in 2010. Diving, swimming, synchronized swimming and open-water competitions were held from 28 July to 7 August. The water polo events were played in the neighbouring city of Borås.
Gothenburg is also home to the Gothenburg Sharks, a professional baseball team in the Elitserien division of baseball in Sweden.
With around 25,000 sailboats and yachts scattered about the city, sailing is a popular sporting activity in the region, particularly because of the nearby Gothenburg archipelago. In June 2015, the Volvo Ocean Race, professional sailing's leading crewed offshore race, concluded in Gothenburg, and the city also hosted an event in the 2015–2016 America's Cup World Series in August 2015.
The Gothenburg Amateur Diving Club (Göteborgs amatördykarklubb) has been operating since October 1938.
Due to Gothenburg's advantageous location in the centre of Scandinavia, trade and shipping have always played a major role in the city's economic history, and they continue to do so. Gothenburg port has come to be the largest harbour in Scandinavia.
Apart from trade, the second pillar of Gothenburg has traditionally been manufacturing and industry, which significantly contributes to the city's wealth. Major companies operating plants in the area include SKF, Volvo (both cars and trucks), and Ericsson. Volvo Cars is the largest employer in Gothenburg, not including jobs in supply companies. The blue-collar industries which have long dominated the city are still important factors in the city's economy, but they are gradually being replaced by high-tech industries.
Banking and finance are also important, as well as the event and tourist industry.
Gothenburg is the terminus of the Valdemar-Göteborg gas pipeline, which brings natural gas from the North Sea fields to Sweden, through Denmark.
Historically, Gothenburg was the home base of the Swedish East India Company from the 18th century. From its founding until the late 1970s, the city was a world leader in shipbuilding, with such shipyards as Eriksbergs Mekaniska Verkstad, Götaverken, Arendalsvarvet, and Lindholmens varv. Gothenburg is classified as a global city by GaWC, with a ranking of Gamma−. The city has been ranked as the 12th-most inventive city in the world by "Forbes".
Gothenburg became a city municipality with an elected city council when the first Swedish local government acts were implemented in 1863. The municipality has an assembly consisting of 81 members, elected every fourth year. Political decisions depend on citizens considering them legitimate. Political legitimacy can be based on various factors: legality, due process, and equality before the law, as well as the efficiency and effectiveness of public policy. One method used to achieve greater legitimacy for controversial policy reforms such as congestion charges is to allow citizens to decide or advise on the issue in public referendums. In December 2010 a petition for a local referendum on the congestion tax, signed by 28,000 citizens, was submitted to the City Council. This right to submit so-called “people's initiatives” was inscribed in the Local Government Act, which obliged local governments to hold a local referendum if petitioned by 5% of the citizens unless the issue was deemed to be outside their area of jurisdiction or if a majority in the City Council voted against holding such a referendum. A second petition for a referendum, signed by 57,000 citizens, was submitted to the local government in February 2013. This petition followed a campaign organised by a local newspaper – Göteborgs Tidningen – whose editor-in-chief argued that the paper's involvement was justified by the large public response to a series of articles on the congestion tax, as well as out of concern for the local democracy.
In 2019, approximately 28% (159,342 residents) of the population of Gothenburg were foreign born and approximately 46% (265,019 residents) had at least one parent born abroad. In addition, approximately 12% (69,263 residents) were foreign citizens.
In 2016, 45% of Gothenburg's immigrant population was from other parts of Europe, and 10% of the total population was from another Nordic country.
Gothenburg has two universities, both of which started as colleges founded by private donations in the 19th century. The University of Gothenburg has about 38,000 students and is one of the largest universities in Scandinavia, and one of the most versatile in Sweden. Chalmers University of Technology is a well-known university located in Johanneberg, south of the inner city, and has lately also established itself at Lindholmen in Norra Älvstranden, Hisingen.
In 2015, there were ten adult education centres in Gothenburg: "Agnesbergs folkhögskola", "Arbetarrörelsens folkhögskola i Göteborg", "Finska folkhögskolan", "Folkhögskolan i Angered", "Göteborgs folkhögskola", "Kvinnofolkhögskolan", "Mo Gård folkhögskola", "S:ta Birgittas folkhögskola", "Västra Götalands folkhögskolor" and "Wendelsbergs folkhögskola".
In 2015, there were 49 high schools in Gothenburg. Some of the more notable schools are Hvitfeldtska gymnasiet, Göteborgs Högre Samskola, Sigrid Rudebecks gymnasium and Polhemsgymnasiet. Some high schools are also connected to large Swedish corporations, such as the SKF technical high school, owned by SKF, and Gothenburg's technical high school, jointly owned by Volvo, Volvo Cars and Gothenburg municipality.
There are two folkhögskola that teach fine arts: Domen and Goteborg Folkhögskola.
With over of double track, the Gothenburg tram network covers most of the city and is the largest tram/light rail network in Scandinavia. Gothenburg also has a bus network. Boat and ferry services connect the Gothenburg archipelago to the mainland. The lack of a subway is due to the soft ground on which Gothenburg is situated. Tunneling is very expensive in such conditions.
The Gothenburg commuter rail, with three lines, serves several nearby cities and towns.
Other major transportation hubs are "Centralstationen" (Gothenburg Central Station) and the Nils Ericson Terminal with trains and buses to various destinations in Sweden, as well as connections to Oslo and Copenhagen (via Malmö).
Gothenburg is served by Göteborg Landvetter Airport, located about 20 km (12 mi) east of the city centre and named after the nearby locality of Landvetter. Flygbussarna offers frequent bus connections to and from Gothenburg with a travel time of 20–30 minutes. Swebus, Flixbus and Nettbuss also serve the airport with several daily departures to Gothenburg, Borås and other destinations along European route E4. Västtrafik, the local public transport provider in the area, offers additional connections to Landvetter.
The airport is operated by Swedish national airport operator Swedavia, and with 6.8 million passengers served in 2017, it is Sweden's second-largest airport after Stockholm Arlanda. It serves as a base for several domestic and international airlines, e.g. Scandinavian Airlines, Norwegian Air Shuttle and Ryanair. Göteborg Landvetter, however, does not serve as a hub for any airline. In total, there are about 50 destinations with scheduled direct flights to and from Gothenburg, most of them European. An additional 40 destinations are served via charter.
The second airport in the area, Göteborg City Airport, is closed. On 13 January 2015, the Swedish airport operator Swedavia announced that Göteborg City Airport would not reopen for commercial services following an extensive rebuild of the airport started in November 2014, citing that the cost of making the airport viable for commercial operations again was too high, at 250 million kronor ($31 million). Commercial operations were gradually wound down. The airport was located northwest of the city centre, within the borders of Gothenburg Municipality, and was formerly known as "Säve Flygplats". In addition to commercial airlines, the airport also hosted a number of rescue services, including the Swedish Coast Guard, and was used for other general aviation. Most civil air traffic to Göteborg City Airport was via low-cost airlines such as Ryanair and Wizz Air, which have since relocated to Landvetter Airport.
The Swedish company Stena Line operates ferry routes between Gothenburg and Frederikshavn in Denmark and between Gothenburg and Kiel in Germany.
The "England ferry" ("Englandsfärjan") to Newcastle via Kristiansand (run by the Danish company DFDS Seaways) ceased at the end of October 2006, after being a Gothenburg institution since the 19th century. DFDS Seaways' sister company, DFDS Tor Line, continues to run scheduled cargo ships between Gothenburg and several English ports, and these used to have limited capacity for passengers and their private vehicles. Freight ships to North America and East Asia also leave from the port.
Gothenburg is an intermodal logistics hub and Gothenburg harbour has access to Sweden and Norway via rail and trucks. Gothenburg harbour is the largest port in Scandinavia with a cargo turnover of 36.9 million tonnes per year in 2004.
Two of the noted people from Gothenburg are fictional, but have become synonymous with "people from Gothenburg". They are a working class couple called Kal and Ada, featured in "Gothenburg jokes" ("göteborgsvitsar"), songs, plays and names of events. Each year two persons who have significantly contributed to culture in the city are given the honorary titles of "Kal and Ada". A bronze statue of the couple, made by Svenrobert Lundquist, was placed outside the entrance to Liseberg in 1995.
Some of the noted people from Gothenburg are Academy Award-winning actress Alicia Vikander, cookbook author Sofia von Porat, footballer Gunnar Gren, artist Evert Taube, golfer Helen Alfredsson, industrialist Victor Hasselblad, singer-songwriter Björn Ulvaeus, diplomat Jan Eliasson, British Open winner and professional golfer Henrik Stenson, YouTuber PewDiePie (Felix Kjellberg), the most subscribed-to individual on the platform with over 100 million subscribers, and YouTuber RoomieOfficial (Joel Berghult).
Gothenburg has performed well in international rankings, some of which are mentioned below:
The Global Destination Sustainability Index has named Gothenburg the world's most sustainable destination every year since 2016.
In 2019 Gothenburg was selected by the EU as one of the top 2020 European Capitals of Smart Tourism.
In 2020 Business Region Göteborg received the 'European Entrepreneurial Region Award 2020' (EER Award 2020) from the EU.
The Gothenburg Award is the city's international prize that recognises and supports work to achieve sustainable development – in the Gothenburg region and from a global perspective. The award, which is one million Swedish crowns, is administrated and funded by a coalition of the City of Gothenburg and 12 companies. Past winners of the award have included Kofi Annan, Al Gore, and Michael Biddle.
Gothenburg is twinned with:
With Lyon (France) there is no formal partnership, but "a joint willingness to cooperate".
Gothenburg had signed an agreement with Shanghai in 1986 which was upgraded in 2003 to include exchanges in culture, economics, trade and sport. The agreement was allowed to lapse in 2020.
Global Positioning System
The Global Positioning System (GPS), originally NAVSTAR GPS, is a satellite-based radionavigation system owned by the United States government and operated by the United States Space Force. It is one of the global navigation satellite systems (GNSS) that provides geolocation and time information to a GPS receiver anywhere on or near the Earth where there is an unobstructed line of sight to four or more GPS satellites. Obstacles such as mountains and buildings block the relatively weak GPS signals.
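The requirement of four satellites follows from the four unknowns a receiver must solve for: three position coordinates plus its own clock bias. A minimal sketch of that idea, using made-up satellite coordinates and pseudoranges and Gauss-Newton iteration as the solver (an illustrative choice, not the system's specified algorithm):

```python
import numpy as np

# Synthetic satellite positions in metres; the receiver position and clock
# bias below are invented for illustration, not real ephemeris data.
sats = 1e3 * np.array([
    [15600.0,  7540.0, 20140.0],
    [18760.0,  2750.0, 18610.0],
    [17610.0, 14630.0, 13480.0],
    [19170.0,   610.0, 18390.0],
])
truth = 1e3 * np.array([1111.0, 2222.0, 3333.0])   # receiver position, m
bias = 85e3                                         # clock bias, expressed in metres
rho = np.linalg.norm(sats - truth, axis=1) + bias   # simulated pseudoranges

# Gauss-Newton: solve rho_i = |p_i - r| + b for the four unknowns (r, b).
x = np.zeros(4)  # initial guess: Earth's centre, zero bias
for _ in range(10):
    d = np.linalg.norm(sats - x[:3], axis=1)
    residual = rho - (d + x[3])
    # Jacobian rows: d|p_i - r|/dr = -(p_i - r)/d_i, and 1 for the bias term
    J = np.hstack([-(sats - x[:3]) / d[:, None], np.ones((4, 1))])
    x += np.linalg.lstsq(J, residual, rcond=None)[0]

print(np.round(x[:3] / 1e3, 3), round(x[3] / 1e3, 3))  # recovers position and bias, km
```

With exactly four satellites the linearized system is square and solved exactly each step; a fifth or sixth satellite would turn the same update into an overdetermined least-squares fit, improving robustness to measurement noise.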
The GPS does not require the user to transmit any data, and it operates independently of any telephonic or internet reception, though these technologies can enhance the usefulness of the GPS positioning information. The GPS provides critical positioning capabilities to military, civil, and commercial users around the world. The United States government created the system, maintains it, and makes it freely accessible to anyone with a GPS receiver.
The GPS project was started by the U.S. Department of Defense in 1973, with the first prototype spacecraft launched in 1978 and the full constellation of 24 satellites operational in 1993. Originally limited to use by the United States military, civilian use was allowed from the 1980s following an executive order from President Ronald Reagan. Advances in technology and new demands on the existing system have now led to efforts to modernize the GPS and implement the next generation of GPS Block IIIA satellites and Next Generation Operational Control System (OCX). Announcements from Vice President Al Gore and the White House in 1998 initiated these changes. In 2000, the U.S. Congress authorized the modernization effort, GPS III. During the 1990s, GPS quality was degraded by the United States government in a program called "Selective Availability"; this was discontinued in May 2000 by a law signed by President Bill Clinton.
The GPS service is provided by the United States government, which can selectively deny access to the system, as happened to the Indian military in 1999 during the Kargil War, or degrade the service at any time. As a result, several countries have developed or are in the process of setting up other global or regional satellite navigation systems. The Russian Global Navigation Satellite System (GLONASS) was developed contemporaneously with GPS, but suffered from incomplete coverage of the globe until the mid-2000s. GLONASS can be added to GPS devices, making more satellites available and enabling positions to be fixed more quickly and accurately, to within . China's BeiDou Navigation Satellite System began global services in 2018, and finished its full deployment in 2020.
There are also the European Union Galileo positioning system, and India's NavIC. Japan's Quasi-Zenith Satellite System (QZSS) is a GNSS satellite-based augmentation system to enhance GNSS's accuracy in Asia-Oceania, with satellite navigation independent of GPS scheduled for 2023.
When selective availability was lifted in 2000, GPS had about a accuracy. The latest stage of accuracy enhancement uses the L5 band and is now fully deployed. GPS receivers released in 2018 that use the L5 band can have much higher accuracy, pinpointing to within .
The GPS project was launched in the United States in 1973 to overcome the limitations of previous navigation systems, integrating ideas from several predecessors, including classified engineering design studies from the 1960s. The U.S. Department of Defense developed the system, which originally used 24 satellites. It was initially developed for use by the United States military and became fully operational in 1995. Civilian use was allowed from the 1980s. Roger L. Easton of the Naval Research Laboratory, Ivan A. Getting of The Aerospace Corporation, and Bradford Parkinson of the Applied Physics Laboratory are credited with inventing it. The work of Gladys West is credited as instrumental in the development of computational techniques for detecting satellite positions with the precision needed for GPS.
The design of GPS is based partly on similar ground-based radio-navigation systems, such as LORAN and the Decca Navigator, developed in the early 1940s.
In 1955, Friedwardt Winterberg proposed a test of general relativity – detecting time slowing in a strong gravitational field using accurate atomic clocks placed in orbit inside artificial satellites.
Special and general relativity predict that the clocks on the GPS satellites are seen by observers on Earth to run 38 microseconds per day faster than clocks on the ground. Left uncorrected, the GPS-calculated positions would quickly drift into error, accumulating to . This was corrected for in the design of GPS.
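The quoted offset translates directly into ranging error, since position is derived from signal travel time. A back-of-the-envelope check (38 µs/day is the commonly quoted net of the general- and special-relativistic effects):

```python
# If the satellite clocks drifted 38 microseconds per day uncorrected,
# the corresponding pseudorange error would grow by c * drift each day.
c = 299_792_458          # speed of light, m/s
drift = 38e-6            # net satellite clock offset, seconds per day
error_m = c * drift      # range error accumulated per day, metres
print(f"{error_m / 1000:.1f} km of range error per day")  # about 11.4 km
```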
When the Soviet Union launched the first artificial satellite (Sputnik 1) in 1957, two American physicists, William Guier and George Weiffenbach, at Johns Hopkins University's Applied Physics Laboratory (APL) decided to monitor its radio transmissions. Within hours they realized that, because of the Doppler effect, they could pinpoint where the satellite was along its orbit. The Director of the APL gave them access to their UNIVAC to do the heavy calculations required.
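The effect Guier and Weiffenbach exploited can be sketched numerically: the received frequency depends on the satellite's line-of-sight velocity, so tracking the shift over a pass constrains the orbit. The transmit frequency and radial velocity below are illustrative assumptions (Sputnik 1 broadcast near 20 MHz; the actual line-of-sight component varies through a pass):

```python
# Classical Doppler approximation for a radio source moving with radial
# velocity v_r along the line of sight (v_r < 0 while approaching).
c = 299_792_458.0    # speed of light, m/s
f0 = 20e6            # assumed transmit frequency, Hz
v_r = -7000.0        # assumed radial velocity, m/s (approaching)
f_obs = f0 * (1 - v_r / c)
print(f"observed shift: {f_obs - f0:+.1f} Hz")  # about +467 Hz
```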
Early the next year, Frank McClure, the deputy director of the APL, asked Guier and Weiffenbach to investigate the inverse problem—pinpointing the user's location, given the satellite's. (At the time, the Navy was developing the submarine-launched Polaris missile, which required them to know the submarine's location.) This led them and APL to develop the TRANSIT system. In 1959, ARPA (renamed DARPA in 1972) also played a role in TRANSIT.
TRANSIT was first successfully tested in 1960. It used a constellation of five satellites and could provide a navigational fix approximately once per hour.
In 1967, the U.S. Navy developed the Timation satellite, which proved the feasibility of placing accurate clocks in space, a technology required for GPS.
In the 1970s, the ground-based OMEGA navigation system, based on phase comparison of signal transmission from pairs of stations, became the first worldwide radio navigation system. Limitations of these systems drove the need for a more universal navigation solution with greater accuracy.
Although there were wide needs for accurate navigation in the military and civilian sectors, almost none of them was seen as justification for the billions of dollars it would cost to research, develop, deploy, and operate a constellation of navigation satellites. During the Cold War arms race, the nuclear threat to the existence of the United States was the one need that did justify this cost in the view of the United States Congress. This deterrent effect is why GPS was funded. It is also the reason for the ultra-secrecy at that time. The nuclear triad consisted of the United States Navy's submarine-launched ballistic missiles (SLBMs) along with United States Air Force (USAF) strategic bombers and intercontinental ballistic missiles (ICBMs). Considered vital to the nuclear deterrence posture, accurate determination of the SLBM launch position was a force multiplier.
Precise navigation would enable United States ballistic missile submarines to get an accurate fix of their positions before they launched their SLBMs. The USAF, with two thirds of the nuclear triad, also had requirements for a more accurate and reliable navigation system. The Navy and Air Force were developing their own technologies in parallel to solve what was essentially the same problem.
To increase the survivability of ICBMs, there was a proposal to use mobile launch platforms (comparable to the Soviet SS-24 and SS-25) and so the need to fix the launch position had similarity to the SLBM situation.
In 1960, the Air Force proposed a radio-navigation system called MOSAIC (MObile System for Accurate ICBM Control) that was essentially a 3-D LORAN. A follow-on study, Project 57, was carried out in 1963, and it was "in this study that the GPS concept was born." That same year, the concept was pursued as Project 621B, which had "many of the attributes that you now see in GPS" and promised increased accuracy for Air Force bombers as well as ICBMs.
Updates from the Navy TRANSIT system were too slow for the high speeds of Air Force operation. The Naval Research Laboratory continued making advances with their Timation (Time Navigation) satellites, first launched in 1967, with the third one in 1974 carrying the first atomic clock into orbit.
Another important predecessor to GPS came from a different branch of the United States military. In 1964, the United States Army orbited its first Sequential Collation of Range (SECOR) satellite used for geodetic surveying. The SECOR system included three ground-based transmitters at known locations that would send signals to the satellite transponder in orbit. A fourth ground-based station, at an undetermined position, could then use those signals to fix its location precisely. The last SECOR satellite was launched in 1969.
With these parallel developments in the 1960s, it was realized that a superior system could be developed by synthesizing the best technologies from 621B, Transit, Timation, and SECOR in a multi-service program. Satellite orbital position errors, induced by variations in the gravity field and radar refraction among others, had to be resolved. A team led by Harold L. Jury of Pan Am Aerospace Division in Florida from 1970 to 1973 used real-time data assimilation and recursive estimation to do so, reducing systematic and residual errors to a manageable level that permitted accurate navigation.
During Labor Day weekend in 1973, a meeting of about twelve military officers at the Pentagon discussed the creation of a "Defense Navigation Satellite System (DNSS)". It was at this meeting that the real synthesis that became GPS was created. Later that year, the DNSS program was named "Navstar." Navstar is often erroneously considered an acronym for "NAVigation System Using Timing and Ranging" but was never considered as such by the GPS Joint Program Office (TRW may have once advocated for a different navigational system that used that acronym). With the individual satellites being associated with the name Navstar (as with the predecessors Transit and Timation), a more fully encompassing name was used to identify the constellation of Navstar satellites, "Navstar-GPS". Ten "Block I" prototype satellites were launched between 1978 and 1985 (an additional unit was destroyed in a launch failure).
The effect of the ionosphere on radio transmission was investigated in a geophysics laboratory of Air Force Cambridge Research Laboratory. Located at Hanscom Air Force Base, outside Boston, the lab was renamed the Air Force Geophysical Research Lab (AFGRL) in 1974. AFGRL developed the Klobuchar model for computing ionospheric corrections to GPS location. Of note is work done by Australian space scientist Elizabeth Essex-Cohen at AFGRL in 1974. She was concerned with the curving of the paths of radio waves traversing the ionosphere from NavSTAR satellites.
After Korean Air Lines Flight 007, a Boeing 747 carrying 269 people, was shot down in 1983 after straying into the USSR's prohibited airspace, in the vicinity of Sakhalin and Moneron Islands, President Ronald Reagan issued a directive making GPS freely available for civilian use, once it was sufficiently developed, as a common good. The first Block II satellite was launched on February 14, 1989, and the 24th satellite was launched in 1994. The GPS program cost at this point, not including the cost of the user equipment but including the costs of the satellite launches, has been estimated at US$5 billion (then-year dollars).
Initially, the highest-quality signal was reserved for military use, and the signal available for civilian use was intentionally degraded, under a policy known as Selective Availability. This changed on May 1, 2000, when President Bill Clinton signed a policy directive turning off Selective Availability, giving civilian users the same accuracy afforded to the military. The directive was proposed by the U.S. Secretary of Defense, William Perry, in view of the widespread growth of differential GPS services by private industry to improve civilian accuracy. Moreover, the U.S. military was actively developing technologies to deny GPS service to potential adversaries on a regional basis.
Since its deployment, the U.S. has implemented several improvements to the GPS service, including new signals for civil use and increased accuracy and integrity for all users, all the while maintaining compatibility with existing GPS equipment. Modernization of the satellite system has been an ongoing initiative by the U.S. Department of Defense through a series of satellite acquisitions to meet the growing needs of the military, civilians, and the commercial market.
As of early 2015, high-quality, FAA-grade Standard Positioning Service (SPS) GPS receivers provided horizontal accuracy of better than 3.5 meters, although many factors such as receiver quality and atmospheric conditions can affect this accuracy.
GPS is owned and operated by the United States government as a national resource. The Department of Defense is the steward of GPS. The "Interagency GPS Executive Board (IGEB)" oversaw GPS policy matters from 1996 to 2004. After that, the National Space-Based Positioning, Navigation and Timing Executive Committee was established by presidential directive in 2004 to advise and coordinate federal departments and agencies on matters concerning the GPS and related systems. The executive committee is chaired jointly by the Deputy Secretaries of Defense and Transportation. Its membership includes equivalent-level officials from the Departments of State, Commerce, and Homeland Security, the Joint Chiefs of Staff and NASA. Components of the executive office of the president participate as observers to the executive committee, and the FCC chairman participates as a liaison.
The U.S. Department of Defense is required by law to "maintain a Standard Positioning Service (as defined in the federal radio navigation plan and the standard positioning service signal specification) that will be available on a continuous, worldwide basis," and "develop measures to prevent hostile use of GPS and its augmentations without unduly disrupting or degrading civilian uses."
On February 10, 1993, the National Aeronautic Association selected the GPS Team as winners of the 1992 Robert J. Collier Trophy, the US's most prestigious aviation award. This team combined researchers from the Naval Research Laboratory, the USAF, the Aerospace Corporation, Rockwell International Corporation, and IBM Federal Systems Company. The citation honors them "for the most significant development for safe and efficient navigation and surveillance of air and spacecraft since the introduction of radio navigation 50 years ago."
Two GPS developers received the National Academy of Engineering Charles Stark Draper Prize for 2003:
GPS developer Roger L. Easton received the National Medal of Technology on February 13, 2006.
Francis X. Kane (Col. USAF, ret.) was inducted into the U.S. Air Force Space and Missile Pioneers Hall of Fame at Lackland A.F.B., San Antonio, Texas, March 2, 2010 for his role in space technology development and the engineering design concept of GPS conducted as part of Project 621B.
In 1998, GPS technology was inducted into the Space Foundation Space Technology Hall of Fame.
On October 4, 2011, the International Astronautical Federation (IAF) awarded the Global Positioning System (GPS) its 60th Anniversary Award, nominated by IAF member, the American Institute for Aeronautics and Astronautics (AIAA). The IAF Honors and Awards Committee recognized the uniqueness of the GPS program and the exemplary role it has played in building international collaboration for the benefit of humanity.
Gladys West was inducted into the Air Force Space and Missile Pioneers Hall of Fame in 2018 for recognition of her computational work which led to breakthroughs for GPS technology.
On February 12, 2019, four founding members of the project were awarded the Queen Elizabeth Prize for Engineering with the chair of the awarding board stating "Engineering is the foundation of civilisation; there is no other foundation; it makes things happen. And that's exactly what today's Laureates have done - they've made things happen. They've re-written, in a major way, the infrastructure of our world."
The GPS concept is based on time and the known positions of specialized GPS satellites. The satellites carry very stable atomic clocks that are synchronized with one another and with ground clocks. Any drift from time maintained on the ground is corrected daily. In the same manner, the satellite locations are known with great precision. GPS receivers have clocks as well, but they are less stable and less precise.
Each GPS satellite continuously transmits a radio signal containing the current time and data about its position. Since the speed of radio waves is constant and independent of the satellite speed, the time delay between when the satellite transmits a signal and the receiver receives it is proportional to the distance from the satellite to the receiver. A GPS receiver monitors multiple satellites and solves equations to determine the precise position of the receiver and its deviation from true time. At a minimum, four satellites must be in view of the receiver for it to compute four unknown quantities (three position coordinates and clock deviation from satellite time).
Each GPS satellite continually broadcasts a signal (carrier wave with modulation) that includes:
Conceptually, the receiver measures the TOAs (according to its own clock) of four satellite signals. From the TOAs and the TOTs, the receiver forms four time-of-flight (TOF) values, which (given the speed of light) are approximately equivalent to the receiver-satellite ranges plus the time difference between the receiver and the GPS satellites multiplied by the speed of light; these quantities are called pseudoranges. The receiver then computes its three-dimensional position and clock deviation from the four TOFs.
In practice the receiver position (in three dimensional Cartesian coordinates with origin at the Earth's center) and the offset of the receiver clock relative to the GPS time are computed simultaneously, using the navigation equations to process the TOFs.
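In a simplified form, the navigation equations can be solved by Newton iteration on rho_i = |s_i - x| + c*b. The sketch below assumes exactly four satellites and noise-free pseudoranges (real receivers use weighted least squares over all visible satellites); `solve_position` and its helper are our names, not standard API:

```python
import math

C = 299_792_458.0  # speed of light (m/s)

def solve_4x4(A, b):
    """Gaussian elimination with partial pivoting for a 4x4 linear system."""
    n = 4
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= f * M[col][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def solve_position(sats, pseudoranges, iters=20):
    """Newton iteration on rho_i = |s_i - x| + c*b for four satellites,
    starting from the Earth's center. Returns ((x, y, z), clock_bias_seconds)."""
    x = y = z = cb = 0.0  # cb holds c * (receiver clock bias), in meters
    for _ in range(iters):
        H, resid = [], []
        for (sx, sy, sz), rho in zip(sats, pseudoranges):
            dx, dy, dz = x - sx, y - sy, z - sz
            d = math.sqrt(dx * dx + dy * dy + dz * dz)
            H.append([dx / d, dy / d, dz / d, 1.0])  # row of the Jacobian
            resid.append(rho - (d + cb))             # pseudorange residual
        ddx, ddy, ddz, dcb = solve_4x4(H, resid)
        x, y, z, cb = x + ddx, y + ddy, z + ddz, cb + dcb
    return (x, y, z), cb / C
```

Starting from the Earth's center is the conventional cold-start guess; with reasonable satellite geometry the iteration converges in a handful of steps.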
The receiver's Earth-centered solution location is usually converted to latitude, longitude and height relative to an ellipsoidal Earth model. The height may then be further converted to height relative to the geoid, which is essentially mean sea level. These coordinates may be displayed, such as on a moving map display, or recorded or used by some other system, such as a vehicle guidance system.
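The conversion from Earth-centered coordinates to latitude, longitude, and height has no closed form for an ellipsoid, but a short fixed-point iteration converges quickly. A sketch using the WGS84 constants (the function names are ours):

```python
import math

A = 6378137.0             # WGS84 semi-major axis (m)
F = 1 / 298.257223563     # WGS84 flattening
E2 = F * (2 - F)          # first eccentricity squared

def ecef_to_geodetic(x, y, z):
    """Iteratively convert ECEF coordinates to WGS84 (lat_deg, lon_deg, height_m)."""
    lon = math.atan2(y, x)
    p = math.hypot(x, y)
    lat = math.atan2(z, p * (1 - E2))  # spherical-ish initial guess
    h = 0.0
    for _ in range(10):
        n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)  # prime-vertical radius
        h = p / math.cos(lat) - n
        lat = math.atan2(z, p * (1 - E2 * n / (n + h)))
    return math.degrees(lat), math.degrees(lon), h

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Forward conversion, useful for round-trip checks."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)
    return ((n + h) * math.cos(lat) * math.cos(lon),
            (n + h) * math.cos(lat) * math.sin(lon),
            (n * (1 - E2) + h) * math.sin(lat))
```

Note this sketch breaks down near the poles, where cos(lat) approaches zero; production code handles that case separately.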
Although usually not formed explicitly in the receiver processing, the conceptual time differences of arrival (TDOAs) define the measurement geometry. Each TDOA corresponds to a hyperboloid of revolution (see Multilateration). The line connecting the two satellites involved (and its extensions) forms the axis of the hyperboloid. The receiver is located at the point where three hyperboloids intersect.
It is sometimes incorrectly said that the user location is at the intersection of three spheres. While simpler to visualize, this is the case only if the receiver has a clock synchronized with the satellite clocks (i.e., the receiver measures true ranges to the satellites rather than range differences). There are marked performance benefits to the user carrying a clock synchronized with the satellites. Foremost is that only three satellites are needed to compute a position solution. If it were an essential part of the GPS concept that all users needed to carry a synchronized clock, a smaller number of satellites could be deployed, but the cost and complexity of the user equipment would increase.
The description above is representative of a receiver start-up situation. Most receivers have a track algorithm, sometimes called a "tracker", that combines sets of satellite measurements collected at different times—in effect, taking advantage of the fact that successive receiver positions are usually close to each other. After a set of measurements are processed, the tracker predicts the receiver location corresponding to the next set of satellite measurements. When the new measurements are collected, the receiver uses a weighting scheme to combine the new measurements with the tracker prediction. In general, a tracker can (a) improve receiver position and time accuracy, (b) reject bad measurements, and (c) estimate receiver speed and direction.
The disadvantage of a tracker is that changes in speed or direction can be computed only with a delay, and that derived direction becomes inaccurate when the distance traveled between two position measurements drops below or near the random error of position measurement. GPS units can use measurements of the Doppler shift of the signals received to compute velocity accurately. More advanced navigation systems use additional sensors like a compass or an inertial navigation system to complement GPS.
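One simple instance of the weighting scheme described above is the classic alpha-beta filter, shown here in one dimension purely as an illustration (the gains are arbitrary; real GPS trackers are typically three-dimensional Kalman filters):

```python
def alpha_beta_track(measurements, dt=1.0, alpha=0.5, beta=0.1):
    """Blend each new position fix with a constant-velocity prediction.
    Returns the final (position, velocity) estimate."""
    x, v = measurements[0], 0.0
    for z in measurements[1:]:
        x_pred = x + v * dt      # predict from the previous state
        r = z - x_pred           # innovation: measurement minus prediction
        x = x_pred + alpha * r   # weighted position update
        v = v + (beta / dt) * r  # weighted velocity update
    return x, v
```

On a target moving at constant speed the estimated velocity settles to the true value, illustrating how a tracker derives speed and direction only after a delay.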
GPS requires four or more satellites to be visible for accurate navigation. The solution of the navigation equations gives the position of the receiver along with the difference between the time kept by the receiver's on-board clock and the true time-of-day, thereby eliminating the need for a more precise and possibly impractical receiver based clock. Applications for GPS such as time transfer, traffic signal timing, and synchronization of cell phone base stations, make use of this cheap and highly accurate timing. Some GPS applications use this time for display, or, other than for the basic position calculations, do not use it at all.
Although four satellites are required for normal operation, fewer may suffice in special cases. If one variable is already known, a receiver can determine its position using only three satellites. For example, a ship or aircraft may have a known elevation. Some GPS receivers may use additional clues or assumptions, such as reusing the last known altitude, dead reckoning, inertial navigation, or information from the vehicle computer, to give a (possibly degraded) position when fewer than four satellites are visible.
The current GPS consists of three major segments: the space segment, the control segment, and the user segment. The U.S. Space Force develops, maintains, and operates the space and control segments. GPS satellites broadcast signals from space, and each GPS receiver uses these signals to calculate its three-dimensional location (latitude, longitude, and altitude) and the current time.
The space segment (SS) is composed of 24 to 32 satellites, or Space Vehicles (SV), in medium Earth orbit, and also includes the payload adapters to the boosters required to launch them into orbit. The GPS design originally called for 24 SVs, eight each in three approximately circular orbits, but this was modified to six orbital planes with four satellites each. The six orbit planes have approximately 55° inclination (tilt relative to the Earth's equator) and are separated by 60° right ascension of the ascending node (angle along the equator from a reference point to the orbit's intersection). The orbital period is one-half a sidereal day, i.e., 11 hours and 58 minutes, so that the satellites pass over the same locations or almost the same locations every day. The orbits are arranged so that at least six satellites are always within line of sight from everywhere on the Earth's surface (see animation at right). The result of this objective is that the four satellites are not evenly spaced (90°) apart within each orbit. In general terms, the angular differences between satellites in each orbit are 30°, 105°, 120°, and 105°, which sum to 360°.
Orbiting at an altitude of approximately 20,200 km (12,550 mi), for an orbital radius of approximately 26,600 km (16,500 mi), each SV makes two complete orbits each sidereal day, repeating the same ground track each day. This was very helpful during development because even with only four satellites, correct alignment means all four are visible from one spot for a few hours each day. For military operations, the ground track repeat can be used to ensure good coverage in combat zones.
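The orbital geometry follows directly from the period: half a sidereal day plugged into Kepler's third law gives the semi-major axis, and hence the altitude:

```python
import math

MU = 3.986004418e14        # Earth's gravitational parameter (m^3/s^2)
SIDEREAL_DAY = 86164.0905  # seconds

T = SIDEREAL_DAY / 2                               # orbital period: half a sidereal day
a = (MU * T ** 2 / (4 * math.pi ** 2)) ** (1 / 3)  # Kepler's third law: semi-major axis
altitude = a - 6.371e6                             # minus mean Earth radius

print(f"radius ≈ {a / 1e3:.0f} km, altitude ≈ {altitude / 1e3:.0f} km")
```

The result, roughly a 26,560 km radius and 20,190 km altitude, matches the figures quoted for the constellation.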
There are 31 satellites in the GPS constellation, 27 of which are in use at a given time with the rest allocated as stand-bys. A 32nd was launched in 2018 and remains in evaluation. More decommissioned satellites are in orbit and available as spares. The additional satellites beyond 24 improve the precision of GPS receiver calculations by providing redundant measurements. With the increased number of satellites, the constellation was changed to a nonuniform arrangement. Such an arrangement was shown not only to improve accuracy but also to improve reliability and availability of the system, relative to a uniform system, when multiple satellites fail. With the expanded constellation, nine satellites are usually visible from any point on the ground at any one time, ensuring considerable redundancy over the minimum four satellites needed for a position.
The control segment (CS) is composed of:
The MCS can also access U.S. Air Force Satellite Control Network (AFSCN) ground antennas (for additional command and control capability) and NGA (National Geospatial-Intelligence Agency) monitor stations. The flight paths of the satellites are tracked by dedicated U.S. Space Force monitoring stations in Hawaii, Kwajalein Atoll, Ascension Island, Diego Garcia, Colorado Springs, Colorado and Cape Canaveral, along with shared NGA monitor stations operated in England, Argentina, Ecuador, Bahrain, Australia and Washington DC. The tracking information is sent to the MCS at Schriever Air Force Base ESE of Colorado Springs, which is operated by the 2nd Space Operations Squadron (2 SOPS) of the U.S. Space Force. Then 2 SOPS contacts each GPS satellite regularly with a navigational update using dedicated or shared (AFSCN) ground antennas (GPS dedicated ground antennas are located at Kwajalein, Ascension Island, Diego Garcia, and Cape Canaveral). These updates synchronize the atomic clocks on board the satellites to within a few nanoseconds of each other, and adjust the ephemeris of each satellite's internal orbital model. The updates are created by a Kalman filter that uses inputs from the ground monitoring stations, space weather information, and various other inputs.
Satellite maneuvers are not precise by GPS standards—so to change a satellite's orbit, the satellite must be marked "unhealthy", so receivers don't use it. After the satellite maneuver, engineers track the new orbit from the ground, upload the new ephemeris, and mark the satellite healthy again.
The operation control segment (OCS) currently serves as the control segment of record. It provides the operational capability that supports GPS users and keeps the GPS operational and performing within specification.
OCS successfully replaced the legacy 1970s-era mainframe computer at Schriever Air Force Base in September 2007. After installation, the system helped enable upgrades and provide a foundation for a new security architecture that supported U.S. armed forces.
OCS will continue to be the ground control system of record until the new segment, Next Generation GPS Operation Control System (OCX), is fully developed and functional. The new capabilities provided by OCX will be the cornerstone for revolutionizing GPS's mission capabilities, enabling the U.S. Space Force to greatly enhance GPS operational services to U.S. combat forces, civil partners, and myriad domestic and international users. The GPS OCX program will also reduce cost, schedule, and technical risk. It is designed to provide 50% sustainment cost savings through efficient software architecture and performance-based logistics. In addition, GPS OCX is expected to cost millions less than the cost to upgrade OCS while providing four times the capability.
The GPS OCX program represents a critical part of GPS modernization and provides significant information assurance improvements over the current GPS OCS program.
On September 14, 2011, the U.S. Air Force announced the completion of GPS OCX Preliminary Design Review and confirmed that the OCX program is ready for the next phase of development.
The GPS OCX program has missed major milestones and is pushing its launch into 2021, five years past the original deadline. According to the Government Accountability Office, even this new deadline looks shaky.
The user segment (US) is composed of hundreds of thousands of U.S. and allied military users of the secure GPS Precise Positioning Service, and tens of millions of civil, commercial and scientific users of the Standard Positioning Service (see GPS navigation devices). In general, GPS receivers are composed of an antenna, tuned to the frequencies transmitted by the satellites, receiver-processors, and a highly stable clock (often a crystal oscillator). They may also include a display for providing location and speed information to the user. A receiver is often described by its number of channels, which signifies how many satellites it can monitor simultaneously. Originally limited to four or five, this number has progressively increased so that receivers typically have between 12 and 20 channels. Though there are many receiver manufacturers, they almost all use one of the chipsets produced for this purpose.
GPS receivers may include an input for differential corrections using the RTCM SC-104 format. This is typically in the form of an RS-232 port at 4,800 bit/s. Data is actually sent at a much lower rate, which limits the accuracy of the signal sent using RTCM. Receivers with internal DGPS receivers can outperform those using external RTCM data. Even low-cost units commonly include Wide Area Augmentation System (WAAS) receivers.
Many GPS receivers can relay position data to a PC or other device using the NMEA 0183 protocol. Although this protocol is officially defined by the National Marine Electronics Association (NMEA), references to this protocol have been compiled from public records, allowing open source tools like gpsd to read the protocol without violating intellectual property laws. Other proprietary protocols exist as well, such as the SiRF and MTK protocols. Receivers can interface with other devices using methods including a serial connection, USB, or Bluetooth.
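As an illustration of the protocol's flavor, the sketch below computes the NMEA checksum (XOR of the characters between "$" and "*") and extracts a position from a GGA fix sentence; the helper names are ours, and only the happy path is handled:

```python
from functools import reduce

def nmea_checksum(payload):
    """XOR of every character between '$' and '*'."""
    return reduce(lambda acc, ch: acc ^ ord(ch), payload, 0)

def parse_gga(sentence):
    """Extract (latitude, longitude) in decimal degrees from a GGA sentence;
    raises ValueError if the checksum does not match."""
    body, cksum = sentence.lstrip("$").rsplit("*", 1)
    if nmea_checksum(body) != int(cksum, 16):
        raise ValueError("bad NMEA checksum")
    fields = body.split(",")
    # Latitude is ddmm.mmmm, longitude is dddmm.mmmm
    lat = float(fields[2][:2]) + float(fields[2][2:]) / 60.0
    lon = float(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[3] == "S":
        lat = -lat
    if fields[5] == "W":
        lon = -lon
    return lat, lon
```

A sentence can be assembled and parsed with the same helpers, e.g. `"$%s*%02X" % (body, nmea_checksum(body))` for a GGA body string.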
While originally a military project, GPS is considered a dual-use technology, meaning it has significant civilian applications as well.
GPS has become a widely deployed and useful tool for commerce, scientific uses, tracking, and surveillance. GPS's accurate time facilitates everyday activities such as banking, mobile phone operations, and even the control of power grids by allowing well synchronized hand-off switching.
Many civilian applications use one or more of GPS's three basic components: absolute location, relative movement, and time transfer.
The U.S. government controls the export of some civilian receivers. All GPS receivers capable of functioning above 18 kilometers (60,000 ft) above sea level and faster than 515 meters per second (1,000 knots), or designed or modified for use with unmanned missiles and aircraft, are classified as munitions (weapons), which means they require State Department export licenses.
This rule applies even to otherwise purely civilian units that only receive the L1 frequency and the C/A (Coarse/Acquisition) code.
Disabling operation above these limits exempts the receiver from classification as a munition. Vendor interpretations differ. The rule refers to operation at both the target altitude and speed, but some receivers stop operating even when stationary. This has caused problems with some amateur radio balloon launches, which regularly reach 30 kilometers (about 100,000 feet).
These limits only apply to units or components exported from the United States. A growing trade in various components exists, including GPS units from other countries. These are expressly sold as ITAR-free.
As of 2009, military GPS applications include:
GPS-type navigation was first used in war in the 1991 Persian Gulf War, before GPS was fully developed in 1995, to help Coalition forces navigate and perform maneuvers. The war also demonstrated the vulnerability of GPS to jamming: Iraqi forces installed jamming devices on likely targets, emitting radio noise that disrupted reception of the weak GPS signal.
GPS's vulnerability to jamming is a threat that continues to grow as jamming equipment and experience grows. GPS signals have been reported to have been jammed many times over the years for military purposes. Russia seems to have several objectives for this behavior, such as intimidating neighbors while undermining confidence in their reliance on American systems, promoting their GLONASS alternative, disrupting Western military exercises, and protecting assets from drones. China uses jamming to discourage US surveillance aircraft near the contested Spratly Islands. North Korea has mounted several major jamming operations near its border with South Korea and offshore, disrupting flights, shipping and fishing operations.
While most clocks derive their time from Coordinated Universal Time (UTC), the atomic clocks on the satellites are set to GPS time (GPST; see United States Naval Observatory). The difference is that GPS time is not corrected to match the rotation of the Earth, so it does not contain leap seconds or other corrections that are periodically added to UTC. GPS time was set to match UTC in 1980 but has since diverged. The lack of corrections means that GPS time remains at a constant offset from International Atomic Time (TAI): TAI − GPS = 19 seconds. Periodic corrections are applied to the on-board clocks to keep them synchronized with ground clocks.
The GPS navigation message includes the difference between GPS time and UTC. GPS time is 18 seconds ahead of UTC because of the leap second added to UTC on December 31, 2016. Receivers subtract this offset from GPS time to calculate UTC and specific time zone values. New GPS units may not show the correct UTC time until after receiving the UTC offset message. The GPS-UTC offset field can accommodate 255 leap seconds (eight bits).
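The offset handling can be sketched in a few lines. Here `GPS_UTC_OFFSET` is hard-coded to the current 18-second value, whereas a real receiver reads it from the navigation message (and the offset was smaller in earlier years):

```python
from datetime import datetime, timedelta

GPS_EPOCH = datetime(1980, 1, 6)
GPS_UTC_OFFSET = 18  # leap seconds since 1980, as of the Dec 31, 2016 insertion

def gps_to_utc(week, seconds_of_week):
    """Convert a GPS week / seconds-of-week pair to UTC using the current offset."""
    gps_time = GPS_EPOCH + timedelta(weeks=week, seconds=seconds_of_week)
    return gps_time - timedelta(seconds=GPS_UTC_OFFSET)
```

For example, `gps_to_utc(2048, 0)` yields 23:59:42 UTC on April 6, 2019, the instant of the second week-number rollover.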
GPS time is theoretically accurate to about 14 nanoseconds, owing to the drift of the atomic clocks in GPS transmitters relative to International Atomic Time. Most receivers lose some accuracy in interpreting the signals and are accurate only to about 100 nanoseconds.
As opposed to the year, month, and day format of the Gregorian calendar, the GPS date is expressed as a week number and a seconds-into-week number. The week number is transmitted as a ten-bit field in the C/A and P(Y) navigation messages, and so it becomes zero again every 1,024 weeks (19.6 years). GPS week zero started at 00:00:00 UTC (00:00:19 TAI) on January 6, 1980, and the week number became zero again for the first time at 23:59:47 UTC on August 21, 1999 (00:00:19 TAI on August 22, 1999). It happened the second time at 23:59:42 UTC on April 6, 2019. To determine the current Gregorian date, a GPS receiver must be provided with the approximate date (to within 3,584 days) to correctly translate the GPS date signal. To address this concern in the future the modernized GPS civil navigation (CNAV) message will use a 13-bit field that only repeats every 8,192 weeks (157 years), thus lasting until 2137 (157 years after GPS week zero).
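Resolving the truncated 10-bit week number against an approximate date can be sketched as follows (`resolve_week` is our name for this hypothetical helper):

```python
from datetime import datetime, timedelta

GPS_EPOCH = datetime(1980, 1, 6)

def resolve_week(week_mod_1024, approx_date):
    """Recover the full GPS week number from the 10-bit broadcast value,
    given any date correct to within about 512 weeks (~9.8 years)."""
    approx_week = (approx_date - GPS_EPOCH).days // 7
    rollovers = round((approx_week - week_mod_1024) / 1024)
    return week_mod_1024 + 1024 * rollovers
```

The same idea, with 8,192 in place of 1,024, applies to the 13-bit CNAV week field.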
The navigational signals transmitted by GPS satellites encode a variety of information including satellite positions, the state of the internal clocks, and the health of the network. These signals are transmitted on two separate carrier frequencies that are common to all satellites in the network. Two different encodings are used: a public encoding that enables lower resolution navigation, and an encrypted encoding used by the U.S. military.
Each GPS satellite continuously broadcasts a "navigation message" on the L1 (C/A and P/Y) and L2 (P/Y) frequencies at a rate of 50 bits per second (see bitrate). Each complete message takes 750 seconds (12 1/2 minutes) to transmit. The message structure has a basic format of a 1,500-bit frame made up of five subframes, each subframe being 300 bits (6 seconds) long. Subframes 4 and 5 are subcommutated 25 times each, so that a complete data message requires the transmission of 25 full frames. Each subframe consists of ten words, each 30 bits long. Thus, with 300 bits per subframe times 5 subframes per frame times 25 frames per message, each message is 37,500 bits long. At a transmission rate of 50 bit/s, this gives 750 seconds to transmit an entire almanac message. Each 30-second frame begins precisely on the minute or half-minute as indicated by the atomic clock on each satellite.
The first subframe of each frame encodes the week number and the time within the week, as well as the data about the health of the satellite. The second and the third subframes contain the "ephemeris" – the precise orbit for the satellite. The fourth and fifth subframes contain the "almanac", which contains coarse orbit and status information for up to 32 satellites in the constellation as well as data related to error correction. Thus, to obtain an accurate satellite location from this transmitted message, the receiver must demodulate the message from each satellite it includes in its solution for 18 to 30 seconds. To collect all transmitted almanacs, the receiver must demodulate the message for 732 to 750 seconds or 12 1/2 minutes.
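The frame arithmetic above can be checked directly:

```python
BITS_PER_WORD = 30
WORDS_PER_SUBFRAME = 10
SUBFRAMES_PER_FRAME = 5
FRAMES_PER_MESSAGE = 25
BIT_RATE = 50.0  # bits per second

subframe_bits = BITS_PER_WORD * WORDS_PER_SUBFRAME  # 300 bits -> 6 s
frame_bits = subframe_bits * SUBFRAMES_PER_FRAME    # 1,500 bits -> 30 s
message_bits = frame_bits * FRAMES_PER_MESSAGE      # 37,500 bits

print(frame_bits / BIT_RATE, "s per frame")         # 30.0
print(message_bits / BIT_RATE, "s per message")     # 750.0
```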
All satellites broadcast at the same frequencies, encoding signals using unique code division multiple access (CDMA) so receivers can distinguish individual satellites from each other. The system uses two distinct CDMA encoding types: the coarse/acquisition (C/A) code, which is accessible by the general public, and the precise (P(Y)) code, which is encrypted so that only the U.S. military and other NATO nations that have been given access to the encryption code can use it.
The ephemeris is updated every 2 hours and is generally valid for 4 hours, with provisions for updates every 6 hours or longer in non-nominal conditions. The almanac is updated typically every 24 hours. Additionally, data for a few weeks following is uploaded in case of transmission updates that delay data upload.
All satellites broadcast at the same two frequencies, 1.57542 GHz (L1 signal) and 1.2276 GHz (L2 signal). The satellite network uses a CDMA spread-spectrum technique where the low-bitrate message data is encoded with a high-rate pseudo-random (PRN) sequence that is different for each satellite. The receiver must be aware of the PRN codes for each satellite to reconstruct the actual message data. The C/A code, for civilian use, transmits data at 1.023 million chips per second, whereas the P code, for U.S. military use, transmits at 10.23 million chips per second. The actual internal reference of the satellites is 10.22999999543 MHz to compensate for relativistic effects that make observers on the Earth perceive a different time reference with respect to the transmitters in orbit. The L1 carrier is modulated by both the C/A and P codes, while the L2 carrier is only modulated by the P code. The P code can be encrypted as a so-called P(Y) code that is only available to military equipment with a proper decryption key. Both the C/A and P(Y) codes impart the precise time-of-day to the user.
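The relation between the nominal 10.23 MHz rate and the quoted on-board reference frequency can be made explicit; the fractional offset below (about 4.467 × 10⁻¹⁰) is simply the difference of the two figures given in the text:

```python
# The satellites' oscillators are set slightly below the nominal 10.23 MHz
# so that, after relativistic effects, ground observers see the nominal rate.
nominal_hz = 10.23e6
actual_hz = 10.22999999543e6        # on-board reference quoted above

offset_hz = nominal_hz - actual_hz           # ~4.57 mHz
fractional_offset = offset_hz / nominal_hz   # ~4.467e-10
```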
The L3 signal at a frequency of 1.38105 GHz is used to transmit data from the satellites to ground stations. This data is used by the United States Nuclear Detonation (NUDET) Detection System (USNDS) to detect, locate, and report nuclear detonations (NUDETs) in the Earth's atmosphere and near space. One usage is the enforcement of nuclear test ban treaties.
The L4 band at 1.379913 GHz is being studied for additional ionospheric correction.
The L5 frequency band at 1.17645 GHz was added in the process of GPS modernization. This frequency falls into an internationally protected range for aeronautical navigation, promising little or no interference under all circumstances. The first Block IIF satellite that provides this signal was launched in May 2010. On February 5, 2016, the 12th and final Block IIF satellite was launched. The L5 signal consists of two carrier components that are in phase quadrature with each other. Each carrier component is bi-phase shift key (BPSK) modulated by a separate bit train. "L5, the third civil GPS signal, will eventually support safety-of-life applications for aviation and provide improved availability and accuracy."
In 2011, a conditional waiver was granted to LightSquared to operate a terrestrial broadband service near the L1 band. Although LightSquared had applied for a license to operate in the 1525 to 1559 band as early as 2003 and it was put out for public comment, the FCC asked LightSquared to form a study group with the GPS community to test GPS receivers and identify issues that might arise due to the larger signal power from the LightSquared terrestrial network. The GPS community had not objected to the LightSquared (formerly MSV and SkyTerra) applications until November 2010, when LightSquared applied for a modification to its Ancillary Terrestrial Component (ATC) authorization. This filing (SAT-MOD-20101118-00239) amounted to a request to run several orders of magnitude more power in the same frequency band for terrestrial base stations, essentially repurposing what was supposed to be a "quiet neighborhood" for signals from space as the equivalent of a cellular network. Testing in the first half of 2011 demonstrated that the impact of the lower 10 MHz of spectrum on GPS devices is minimal (less than 1% of the total GPS devices are affected). The upper 10 MHz intended for use by LightSquared may have some impact on GPS devices. There is some concern that this may seriously degrade the GPS signal for many consumer uses. "Aviation Week" magazine reports that the latest testing (June 2011) confirms "significant jamming" of GPS by LightSquared's system.
Because all of the satellite signals are modulated onto the same L1 carrier frequency, the signals must be separated after demodulation. This is done by assigning each satellite a unique binary sequence known as a Gold code. The signals are decoded after demodulation using addition of the Gold codes corresponding to the satellites monitored by the receiver.
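The Gold codes themselves are generated from two 10-stage linear feedback shift registers. The sketch below follows the standard public construction; the tap pair (2, 6) used here is the assignment for PRN 1 in the public interface specification, and other PRNs use other G2 tap pairs:

```python
def ca_code(prn_taps=(2, 6)):
    """Generate one 1023-chip period of a GPS C/A Gold code.

    prn_taps: the pair of G2 stages whose XOR forms the delayed G2
    sequence; (2, 6) is the assignment for PRN 1.
    """
    g1 = [1] * 10                     # G1 LFSR, polynomial 1 + x^3 + x^10
    g2 = [1] * 10                     # G2 LFSR, 1 + x^2 + x^3 + x^6 + x^8 + x^9 + x^10
    chips = []
    for _ in range(1023):
        g2_out = g2[prn_taps[0] - 1] ^ g2[prn_taps[1] - 1]
        chips.append(g1[9] ^ g2_out)  # chip = G1 output XOR delayed G2
        g1_fb = g1[2] ^ g1[9]
        g2_fb = g2[1] ^ g2[2] ^ g2[5] ^ g2[7] ^ g2[8] ^ g2[9]
        g1 = [g1_fb] + g1[:9]         # shift, feedback enters stage 1
        g2 = [g2_fb] + g2[:9]
    return chips
```

The first ten chips of the PRN 1 sequence produced this way are 1100100000 (octal 1440), which is the usual sanity check for an implementation.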
If the almanac information has previously been acquired, the receiver picks the satellites to listen for by their PRNs, unique numbers in the range 1 through 32. If the almanac information is not in memory, the receiver enters a search mode until a lock is obtained on one of the satellites. To obtain a lock, it is necessary that there be an unobstructed line of sight from the receiver to the satellite. The receiver can then acquire the almanac and determine the satellites it should listen for. As it detects each satellite's signal, it identifies it by its distinct C/A code pattern. There can be a delay of up to 30 seconds before the first estimate of position because of the need to read the ephemeris data.
Processing of the navigation message enables the determination of the time of transmission and the satellite position at this time. For more information see Demodulation and Decoding, Advanced.
The receiver uses messages received from satellites to determine the satellite positions and time sent. The "x, y," and "z" components of satellite position and the time sent are designated as ["xi, yi, zi, si"] where the subscript "i" denotes the satellite and has the value 1, 2, ..., "n", where "n" ≥ 4. When the time of message reception indicated by the on-board receiver clock is "t̃i", the true reception time is "t̃i − b", where "b" is the receiver's clock bias from the much more accurate GPS clocks employed by the satellites. The receiver clock bias is the same for all received satellite signals (assuming the satellite clocks are all perfectly synchronized). The message's transit time is "t̃i − b − si", where "si" is the satellite time. Assuming the message traveled at the speed of light, "c", the distance traveled is "(t̃i − b − si) c".
For n satellites, the equations to satisfy are:
where "di" is the geometric distance or range between receiver and satellite "i" (the values without subscripts are the "x, y," and "z" components of receiver position):
Defining "pseudoranges" as formula_3, we see they are biased versions of the true range:
Since the equations have four unknowns ["x, y, z, b"]—the three components of GPS receiver position and the clock bias—signals from at least four satellites are necessary to attempt solving these equations. They can be solved by algebraic or numerical methods. Existence and uniqueness of GPS solutions are discussed by Abell and Chaffee. When "n" is greater than 4 this system is overdetermined and a fitting method must be used.
The amount of error in the results varies with the received satellites' locations in the sky, since certain configurations (when the received satellites are close together in the sky) cause larger errors. Receivers usually calculate a running estimate of the error in the calculated position. This is done by multiplying the basic resolution of the receiver by quantities called the geometric dilution of precision (GDOP) factors, calculated from the relative sky directions of the satellites used. The receiver location is expressed in a specific coordinate system, such as latitude and longitude using the WGS 84 geodetic datum or a country-specific system.
The GPS equations can be solved by numerical and analytical methods. Geometrical interpretations can enhance the understanding of these solution methods.
The measured ranges, called pseudoranges, contain clock errors. In a simplified idealization in which the ranges are synchronized, these true ranges represent the radii of spheres, each centered on one of the transmitting satellites. The solution for the position of the receiver is then at the intersection of the surfaces of these spheres. Signals from at minimum three satellites are required, and their three spheres would typically intersect at two points. One of the points is the location of the receiver, and the other moves rapidly in successive measurements and would not usually be on Earth's surface.
In practice, there are many sources of inaccuracy besides clock bias, including random errors as well as the potential for precision loss from subtracting numbers close to each other if the centers of the spheres are relatively close together. This means that the position calculated from three satellites alone is unlikely to be accurate enough. Data from more satellites can help because of the tendency for random errors to cancel out and also by giving a larger spread between the sphere centers. But at the same time, more spheres will not generally intersect at one point. Therefore, a near intersection gets computed, typically via least squares. The more signals available, the better the approximation is likely to be.
If the pseudorange between the receiver and satellite "i" and the pseudorange between the receiver and satellite "j" are subtracted, "pi − pj", the common receiver clock bias ("b") cancels out, resulting in a difference of distances "di − dj". The locus of points having a constant difference in distance to two points (here, two satellites) is a hyperbola on a plane and a hyperboloid of revolution in 3D space (see Multilateration). Thus, from four pseudorange measurements, the receiver can be placed at the intersection of the surfaces of three hyperboloids each with foci at a pair of satellites. With additional satellites, the multiple intersections are not necessarily unique, and a best-fitting solution is sought instead.
The receiver position can be interpreted as the center of an inscribed sphere (insphere) of radius "bc", given by the receiver clock bias "b" (scaled by the speed of light "c"). The insphere location is such that it touches other spheres (see Problem of Apollonius#Applications). The circumscribing spheres are centered at the GPS satellites, whose radii equal the measured pseudoranges "p"i. This configuration is distinct from the one described in section #Spheres, in which the spheres' radii were the unbiased or geometric ranges "d"i.
The clock in the receiver is usually not of the same quality as the ones in the satellites and will not be accurately synchronised to them. This produces large errors in the computed distances to the satellites. Therefore, in practice, the time difference between the receiver clock and the satellite time is defined as an unknown clock bias "b". The equations are then solved simultaneously for the receiver position and the clock bias. The solution space ["x, y, z, b"] can be seen as a four-dimensional geometric space, and signals from at minimum four satellites are needed. In that case each of the equations describes a spherical cone, with the cusp located at the satellite, and the base a sphere around the satellite. The receiver is at the intersection of four or more of such cones.
When more than four satellites are available, the calculation can use the four best, or more than four simultaneously (up to all visible satellites), depending on the number of receiver channels, processing capability, and geometric dilution of precision (GDOP).
Using more than four involves an over-determined system of equations with no unique solution; such a system can be solved by a least-squares or weighted least squares method.
Both the equations for four satellites, or the least squares equations for more than four, are non-linear and need special solution methods. A common approach is by iteration on a linearized form of the equations, such as the Gauss–Newton algorithm.
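A minimal sketch of this linearize-and-iterate approach is shown below. All positions and the clock bias are hypothetical synthetic values chosen only to exercise the solver; real receivers add weighting, convergence checks, and error modelling:

```python
import numpy as np

def solve_position(sats, pseudoranges, iters=10):
    """Gauss-Newton iteration on the linearized pseudorange equations.

    sats:         (n, 3) satellite positions in metres
    pseudoranges: (n,) measurements p_i = ||s_i - r|| + b
    Returns the estimate [x, y, z, b] (b is the clock bias as a range).
    """
    est = np.zeros(4)                              # start at Earth's centre
    for _ in range(iters):
        diff = est[:3] - sats                      # (n, 3)
        ranges = np.linalg.norm(diff, axis=1)      # geometric ranges d_i
        residual = pseudoranges - (ranges + est[3])
        # Jacobian of the modelled pseudorange w.r.t. [x, y, z, b]
        J = np.hstack([diff / ranges[:, None], np.ones((len(sats), 1))])
        est += np.linalg.lstsq(J, residual, rcond=None)[0]
    return est

# Synthetic scenario (hypothetical numbers): receiver on Earth's surface,
# clock bias of 3000 m, five satellites near GPS altitude.
truth = np.array([3910e3, 3910e3, 3170e3, 3000.0])
sats = np.array([
    [15600e3,  7540e3, 20140e3],
    [18760e3,  2750e3, 18610e3],
    [17610e3, 14630e3, 13480e3],
    [19170e3,  6100e3, 18390e3],
    [ 1000e3, 26000e3,  5000e3],
])
p = np.linalg.norm(sats - truth[:3], axis=1) + truth[3]
est = solve_position(sats, p)
```

With noise-free measurements the iteration recovers the true position and clock bias; the least-squares step also handles the overdetermined case (five satellites, four unknowns) directly.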
The GPS was initially developed assuming use of a numerical least-squares solution method—i.e., before closed-form solutions were found.
One closed-form solution to the above set of equations was developed by S. Bancroft. Its properties are well known; in particular, proponents claim it is superior in low-GDOP situations, compared to iterative least squares methods.
Bancroft's method is algebraic, as opposed to numerical, and can be used for four or more satellites. When four satellites are used, the key steps are inversion of a 4x4 matrix and solution of a single-variable quadratic equation. Bancroft's method provides one or two solutions for the unknown quantities. When there are two (usually the case), only one is a near-Earth sensible solution.
When a receiver uses more than four satellites for a solution, Bancroft uses the generalized inverse (i.e., the pseudoinverse) to find a solution. A case has been made that iterative methods, such as the Gauss–Newton algorithm approach for solving over-determined non-linear least squares (NLLS) problems, generally provide more accurate solutions.
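A compact sketch of a Bancroft-style closed form is given below, using the Lorentz inner product with metric diag(1, 1, 1, −1) and the pseudoinverse for more than four satellites; the synthetic scenario is hypothetical and serves only to show that one of the two algebraic roots reproduces the receiver state:

```python
import numpy as np

def bancroft(sats, p):
    """Closed-form GPS solution in the style of Bancroft (1985).

    sats: (n, 3) satellite positions in metres, n >= 4
    p:    (n,) pseudoranges p_i = ||s_i - r|| + b
    Returns the candidate (x, y, z, b) solutions; the caller keeps the
    physically sensible (near-Earth) one.
    """
    M = np.diag([1.0, 1.0, 1.0, -1.0])            # Lorentz metric
    B = np.hstack([sats, p[:, None]])             # rows (s_i, p_i)
    a = 0.5 * np.array([row @ M @ row for row in B])   # 0.5(|s_i|^2 - p_i^2)
    Bp = np.linalg.pinv(B)                        # generalized inverse for n > 4
    u, v = Bp @ np.ones(len(p)), Bp @ a
    E = u @ M @ u
    F = u @ M @ v - 1.0
    G = v @ M @ v
    lams = np.roots([E, 2.0 * F, G])              # quadratic in the scalar Lambda
    return [M @ (v + lam * u) for lam in np.real(lams)]

# Synthetic scenario (hypothetical numbers): receiver on Earth's surface
# with a 3000 m clock-bias range, five satellites near GPS altitude.
truth = np.array([3910e3, 3910e3, 3170e3, 3000.0])
sats = np.array([
    [15600e3,  7540e3, 20140e3],
    [18760e3,  2750e3, 18610e3],
    [17610e3, 14630e3, 13480e3],
    [19170e3,  6100e3, 18390e3],
    [ 1000e3, 26000e3,  5000e3],
])
p = np.linalg.norm(sats - truth[:3], axis=1) + truth[3]
candidates = bancroft(sats, p)
```

Of the two roots of the quadratic, one yields the near-Earth solution and the other the spurious mirror solution mentioned above; in practice the candidate with the smaller pseudorange residual is kept.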
Leick et al. (2015) states that "Bancroft's (1985) solution is a very early, if not the first, closed-form solution."
Other closed-form solutions were published afterwards, although their adoption in practice is unclear.
GPS error analysis examines error sources in GPS results and the expected size of those errors. GPS makes corrections for receiver clock errors and other effects, but some residual errors remain uncorrected. Error sources include signal arrival time measurements, numerical calculations, atmospheric effects (ionospheric/tropospheric delays), ephemeris and clock data, multipath signals, and natural and artificial interference. Magnitude of residual errors from these sources depends on geometric dilution of precision. Artificial errors may result from jamming devices and threaten ships and aircraft or from intentional signal degradation through selective availability, which limited accuracy to ≈ , but has been switched off since May 1, 2000.
Integrating external information into the calculation process can materially improve accuracy. Such augmentation systems are generally named or described based on how the information arrives. Some systems transmit additional error information (such as clock drift, ephemera, or ionospheric delay), others characterize prior errors, while a third group provides additional navigational or vehicle information.
Examples of augmentation systems include the Wide Area Augmentation System (WAAS), European Geostationary Navigation Overlay Service (EGNOS), Differential GPS (DGPS), inertial navigation systems (INS) and Assisted GPS. The standard accuracy of about can be augmented to with DGPS, and to about with WAAS.
Accuracy can be improved through precise monitoring and measurement of existing GPS signals in additional or alternative ways.
The largest remaining error is usually the unpredictable delay through the ionosphere. The spacecraft broadcast ionospheric model parameters, but some errors remain. This is one reason GPS spacecraft transmit on at least two frequencies, L1 and L2. Ionospheric delay is a well-defined function of frequency and the total electron content (TEC) along the path, so measuring the arrival time difference between the frequencies determines TEC and thus the precise ionospheric delay at each frequency.
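The standard first-order model behind this correction puts the group delay proportional to TEC/f², so the L1/L2 arrival-time difference determines TEC. The round trip below is a sketch of that algebra with a hypothetical TEC value:

```python
# First-order ionospheric group delay (standard approximation):
#   delay_seconds = 40.3 * TEC / (c * f**2),  TEC in electrons/m^2.
C = 299_792_458.0                  # speed of light, m/s
F_L1, F_L2 = 1575.42e6, 1227.60e6  # GPS carrier frequencies, Hz

def iono_delay(tec, f):
    return 40.3 * tec / (C * f ** 2)

def tec_from_dual_freq(dt):
    """Recover TEC from the measured L2-minus-L1 arrival-time difference."""
    return dt * C / (40.3 * (1.0 / F_L2 ** 2 - 1.0 / F_L1 ** 2))

tec_true = 3.0e17                  # hypothetical slant TEC, electrons/m^2
dt = iono_delay(tec_true, F_L2) - iono_delay(tec_true, F_L1)
tec_est = tec_from_dual_freq(dt)
```

Because L2 is the lower frequency, it arrives later; once TEC is known, the delay at each frequency follows from the same formula.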
Military receivers can decode the P(Y) code transmitted on both L1 and L2. Without decryption keys, it is still possible to use a "codeless" technique to compare the P(Y) codes on L1 and L2 to gain much of the same error information. This technique is slow, so it is currently available only on specialized surveying equipment. In the future, additional civilian codes are expected to be transmitted on the L2 and L5 frequencies (see GPS modernization). All users will then be able to perform dual-frequency measurements and directly compute ionospheric delay errors.
A second form of precise monitoring is called "Carrier-Phase Enhancement" (CPGPS). This corrects the error that arises because the pulse transition of the PRN is not instantaneous, and thus the correlation (satellite–receiver sequence matching) operation is imperfect. CPGPS uses the L1 carrier wave, which has a period of formula_6, which is about one-thousandth of the C/A Gold code bit period of formula_7, to act as an additional clock signal and resolve the uncertainty. The phase difference error in the normal GPS amounts to of ambiguity. CPGPS working to within 1% of perfect transition reduces this error to of ambiguity. By eliminating this error source, CPGPS coupled with DGPS normally realizes between of absolute accuracy.
"Relative Kinematic Positioning" (RKP) is a third alternative for a precise GPS-based positioning system. In this approach, determination of range signal can be resolved to a precision of less than . This is done by resolving the number of cycles that the signal is transmitted and received by the receiver by using a combination of differential GPS (DGPS) correction data, transmitting GPS signal phase information and ambiguity resolution techniques via statistical tests—possibly with processing in real-time (real-time kinematic positioning, RTK).
Another method that is used in surveying applications is carrier phase tracking. The period of the carrier frequency multiplied by the speed of light gives the wavelength, which is about for the L1 carrier. Accuracy within 1% of wavelength in detecting the leading edge reduces this component of pseudorange error to as little as . This compares to for the C/A code and for the P code.
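The lengths involved follow from dividing the speed of light by the carrier frequency and the chipping rates, which is where the roughly 2000-fold advantage of carrier-phase tracking over C/A code tracking comes from:

```python
C = 299_792_458.0                   # speed of light, m/s

l1_wavelength = C / 1575.42e6       # L1 carrier wavelength: ~0.19 m
ca_chip_length = C / 1.023e6        # C/A code chip length: ~293 m
p_chip_length = C / 10.23e6         # P code chip length: ~29.3 m

# Accuracy within 1% of these lengths gives the corresponding
# pseudorange error contributions:
carrier_err = 0.01 * l1_wavelength  # ~2 mm
ca_err = 0.01 * ca_chip_length      # ~3 m
p_err = 0.01 * p_chip_length        # ~0.3 m
```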
Triple differencing followed by numerical root finding, and a mathematical technique called least squares can estimate the position of one receiver given the position of another. First, compute the difference between satellites, then between receivers, and finally between epochs. Other orders of taking differences are equally valid. Detailed discussion of the errors is omitted.
The total carrier phase of a satellite can be measured with an ambiguity in the number of whole cycles. Let formula_8 denote the phase of the carrier of satellite "j" measured by receiver "i" at time formula_9. This notation shows the meaning of the subscripts "i, j," and "k": the receiver ("r"), satellite ("s"), and time ("t") come in alphabetical order as arguments of formula_10, and for conciseness let formula_11 be an abbreviation of it. We also define three functions, formula_12, which return differences between receivers, satellites, and time points, respectively; each takes variables with three subscripts as its arguments. If formula_13 is a function of the three integer arguments "i, j," and "k", then it is a valid argument for the functions formula_12, with the values defined as
Also if formula_18 are valid arguments for the three functions and "a" and "b" are constants then
formula_19 is a valid argument with values defined as
Receiver clock errors can be approximately eliminated by differencing the phases measured from satellite 1 with that from satellite 2 at the same epoch. This difference is designated as formula_23
Double differencing computes the difference of receiver 1's satellite difference from that of receiver 2. This approximately eliminates satellite clock errors. This double difference is:
Triple differencing subtracts the receiver difference from time 1 from that of time 2. This eliminates the ambiguity associated with the integral number of wavelengths in carrier phase provided this ambiguity does not change with time. Thus the triple difference result eliminates practically all clock bias errors and the integer ambiguity. Atmospheric delay and satellite ephemeris errors have been significantly reduced. This triple difference is:
Triple difference results can be used to estimate unknown variables. For example, if the position of receiver 1 is known but the position of receiver 2 unknown, it may be possible to estimate the position of receiver 2 using numerical root finding and least squares. Triple difference results for three independent time pairs may be sufficient to solve for receiver 2's three position components, which may require a numerical procedure. Such a method needs an initial approximation of receiver 2's position, which can be provided from the navigation message and the intersection of sphere surfaces; a reasonable initial estimate can be key to successful multidimensional root finding. Iterating from three time pairs and a fairly good initial value produces one estimate of receiver 2's position. Processing additional time pairs can improve accuracy, but overdetermines the system. Least squares then determines the position of receiver 2 that best fits the observed triple difference results under the criterion of minimizing the sum of the squared residuals.
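The cancellation properties of the three differencing steps can be demonstrated on a toy phase model. Everything below is a hypothetical sketch: the geometric term, clock offsets, and ambiguities are made-up numbers, chosen so that the triple difference provably contains only geometry:

```python
import random

random.seed(1)

# Simplified carrier-phase model (in cycles), ignoring atmosphere and noise:
#   phi(i, j, k) = geom(i, j, k) + rc[i][k] - sc[j][k] + N[i][j]
# with receiver i, satellite j, epoch k, receiver clock rc, satellite
# clock sc, and constant integer ambiguity N.
def geom(i, j, k):                  # stand-in for the geometric term
    return 10.0 * (i + 1) * (j + 1) * (k + 1)

rc = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(2)]
sc = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(2)]
N = [[random.randint(10, 99) for _ in range(2)] for _ in range(2)]

def phi(i, j, k):
    return geom(i, j, k) + rc[i][k] - sc[j][k] + N[i][j]

# Single difference (between satellites) cancels the receiver clock rc[i][k]:
def sd(i, k):
    return phi(i, 0, k) - phi(i, 1, k)

# Double difference (between receivers) then cancels the satellite clocks:
def dd(k):
    return sd(0, k) - sd(1, k)

# Triple difference (between epochs) finally cancels the constant
# ambiguities N; only the pure-geometry triple difference (here 10.0)
# remains, regardless of the random clock and ambiguity values.
td = dd(1) - dd(0)
```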
In the United States, GPS receivers are regulated under the Federal Communications Commission's (FCC) Part 15 rules. As indicated in the manuals of GPS-enabled devices sold in the United States, as a Part 15 device, it "must accept any interference received, including interference that may cause undesired operation." With respect to GPS devices in particular, the FCC states that GPS receiver manufacturers, "must use receivers that reasonably discriminate against reception of signals outside their allocated spectrum." For the last 30 years, GPS receivers have operated next to the Mobile Satellite Service band, and have discriminated against reception of mobile satellite services, such as Inmarsat, without any issue.
The spectrum allocated for GPS L1 use by the FCC is 1559 to 1610 MHz, while the spectrum allocated for satellite-to-ground use owned by Lightsquared is the Mobile Satellite Service band. Since 1996, the FCC has authorized licensed use of the spectrum neighboring the GPS band of 1525 to 1559 MHz to the Virginia company LightSquared. On March 1, 2001, the FCC received an application from LightSquared's predecessor, Motient Services, to use their allocated frequencies for an integrated satellite-terrestrial service. In 2002, the U.S. GPS Industry Council came to an out-of-band-emissions (OOBE) agreement with LightSquared to prevent transmissions from LightSquared's ground-based stations from emitting transmissions into the neighboring GPS band of 1559 to 1610 MHz. In 2004, the FCC adopted the OOBE agreement in its authorization for LightSquared to deploy a ground-based network ancillary to their satellite system – known as the Ancillary Tower Components (ATCs) – "We will authorize MSS ATC subject to conditions that ensure that the added terrestrial component remains ancillary to the principal MSS offering. We do not intend, nor will we permit, the terrestrial component to become a stand-alone service." This authorization was reviewed and approved by the U.S. Interdepartment Radio Advisory Committee, which includes the U.S. Department of Agriculture, U.S. Space Force, U.S. Army, U.S. Coast Guard, Federal Aviation Administration, National Aeronautics and Space Administration, Interior, and U.S. Department of Transportation.
In January 2011, the FCC conditionally authorized LightSquared's wholesale customers—such as Best Buy, Sharp, and C Spire—to only purchase an integrated satellite-ground-based service from LightSquared and re-sell that integrated service on devices that are equipped to only use the ground-based signal using LightSquared's allocated frequencies of 1525 to 1559 MHz. In December 2010, GPS receiver manufacturers expressed concerns to the FCC that LightSquared's signal would interfere with GPS receiver devices although the FCC's policy considerations leading up to the January 2011 order did not pertain to any proposed changes to the maximum number of ground-based LightSquared stations or the maximum power at which these stations could operate. The January 2011 order makes final authorization contingent upon studies of GPS interference issues carried out by a LightSquared led working group along with GPS industry and Federal agency participation. On February 14, 2012, the FCC initiated proceedings to vacate LightSquared's Conditional Waiver Order based on the NTIA's conclusion that there was currently no practical way to mitigate potential GPS interference.
GPS receiver manufacturers design GPS receivers to use spectrum beyond the GPS-allocated band. In some cases, GPS receivers are designed to use up to 400 MHz of spectrum in either direction of the L1 frequency of 1575.42 MHz, because mobile satellite services in those regions are broadcasting from space to ground, and at power levels commensurate with mobile satellite services. As regulated under the FCC's Part 15 rules, GPS receivers are not warranted protection from signals outside GPS-allocated spectrum. This is why GPS operates next to the Mobile Satellite Service band, and also why the Mobile Satellite Service band operates next to GPS. The symbiotic relationship of spectrum allocation ensures that users of both bands are able to operate cooperatively and freely.
The FCC adopted rules in February 2003 that allowed Mobile Satellite Service (MSS) licensees such as LightSquared to construct a small number of ancillary ground-based towers in their licensed spectrum to "promote more efficient use of terrestrial wireless spectrum." In those 2003 rules, the FCC stated "As a preliminary matter, terrestrial [Commercial Mobile Radio Service (“CMRS”)] and MSS ATC are expected to have different prices, coverage, product acceptance and distribution; therefore, the two services appear, at best, to be imperfect substitutes for one another that would be operating in predominantly different market segments... MSS ATC is unlikely to compete directly with terrestrial CMRS for the same customer base...". In 2004, the FCC clarified that the ground-based towers would be ancillary, noting that "We will authorize MSS ATC subject to conditions that ensure that the added terrestrial component remains ancillary to the principal MSS offering. We do not intend, nor will we permit, the terrestrial component to become a stand-alone service." In July 2010, the FCC stated that it expected LightSquared to use its authority to offer an integrated satellite-terrestrial service to "provide mobile broadband services similar to those provided by terrestrial mobile providers and enhance competition in the mobile broadband sector." GPS receiver manufacturers have argued that LightSquared's licensed spectrum of 1525 to 1559 MHz was never envisioned as being used for high-speed wireless broadband based on the 2003 and 2004 FCC ATC rulings making clear that the Ancillary Tower Component (ATC) would be, in fact, ancillary to the primary satellite component. To build public support of efforts to continue the 2004 FCC authorization of LightSquared's ancillary terrestrial component vs. a simple ground-based LTE service in the Mobile Satellite Service band, GPS receiver manufacturer Trimble Navigation Ltd. formed the "Coalition To Save Our GPS."
The FCC and LightSquared have each made public commitments to solve the GPS interference issue before the network is allowed to operate. According to Chris Dancy of the Aircraft Owners and Pilots Association, airline pilots with the type of systems that would be affected "may go off course and not even realize it." The problems could also affect the Federal Aviation Administration upgrade to the air traffic control system, United States Defense Department guidance, and local emergency services including 911.
On February 14, 2012, the U.S. Federal Communications Commission (FCC) moved to bar LightSquared's planned national broadband network after being informed by the National Telecommunications and Information Administration (NTIA), the federal agency that coordinates spectrum uses for the military and other federal government entities, that "there is no practical way to mitigate potential interference at this time". LightSquared is challenging the FCC's action.
Other notable satellite navigation systems in use or in various states of development include:
https://en.wikipedia.org/wiki?curid=11866
Germany
Germany, officially the Federal Republic of Germany, is a country in Central and Western Europe. Covering an area of , it lies between the Baltic and North seas to the north, and the Alps to the south. It borders Denmark to the north, Poland and the Czech Republic to the east, Austria and Switzerland to the south, and France, Luxembourg, Belgium, and the Netherlands to the west.
Various Germanic tribes have inhabited the northern parts of modern Germany since classical antiquity. A region named Germania was documented before AD 100. Beginning in the 10th century, German territories formed a central part of the Holy Roman Empire. During the 16th century, northern German regions became the centre of the Protestant Reformation. After the collapse of the Holy Roman Empire, the German Confederation was formed in 1815. In 1871, Germany became a nation state when most of the German states unified into the Prussian-dominated German Empire. After World War I and the German Revolution of 1918–1919, the Empire was replaced by the parliamentary Weimar Republic. The Nazi seizure of power in 1933 led to the establishment of a dictatorship, World War II, and the Holocaust. After the end of World War II in Europe and a period of Allied occupation, two new German states were founded: West Germany and East Germany. The Federal Republic of Germany was a founding member of the European Economic Community and the European Union. The country was reunified on 3 October 1990.
Today, Germany is a federal parliamentary republic led by a chancellor. With 83 million inhabitants of its 16 constituent states, it is the second-most populous country in Europe after Russia, as well as the most populous member state of the European Union. Its capital and largest city is Berlin, and its financial centre is Frankfurt; the largest urban area is the Ruhr.
Germany is a great power with a strong economy; it has the largest economy in Europe, the world's fourth-largest economy by nominal GDP, and the fifth-largest by PPP. As a global leader in several industrial and technological sectors, it is both the world's third-largest exporter and importer of goods. A highly developed country with a very high standard of living, it offers social security and a universal health care system, environmental protections, and a tuition-free university education. Germany is also a member of the United Nations, NATO, the G7, the G20, and the OECD. Known for its long and rich cultural history, Germany has many World Heritage sites and is among the top tourism destinations in the world.
The English word "Germany" derives from the Latin , which came into use after Julius Caesar adopted it for the peoples east of the Rhine. The German term , originally ("the German lands") is derived from , descended from Old High German "of the people" (from or "people"), originally used to distinguish the language of the common people from Latin and its Romance descendants. This in turn descends from Proto-Germanic "of the people" (see also the Latinised form ), derived from , descended from Proto-Indo-European *"" "people", from which the word "Teutons" also originates.
Ancient humans were present in Germany at least 600,000 years ago. The first non-modern human fossil (the Neanderthal) was discovered in the Neander Valley. Similarly dated evidence of modern humans has been found in the Swabian Jura, including 42,000-year-old flutes which are the oldest musical instruments ever found, the 40,000-year-old Lion Man, and the 35,000-year-old Venus of Hohle Fels. The Nebra sky disk, created during the European Bronze Age, is attributed to a German site.
The Germanic tribes are thought to date from the Nordic Bronze Age or the Pre-Roman Iron Age. From southern Scandinavia and north Germany, they expanded south, east and west, coming into contact with the Celtic, Iranian, Baltic, and Slavic tribes.
Under Augustus, Rome began to invade Germania. In 9 AD, three Roman legions were defeated by Arminius. By 100 AD, when Tacitus wrote "Germania", Germanic tribes had settled along the Rhine and the Danube (the Limes Germanicus), occupying most of modern Germany. However, Baden-Württemberg, southern Bavaria, southern Hesse and the western Rhineland had been incorporated into Roman provinces. Around 260, Germanic peoples broke into Roman-controlled lands. After the invasion of the Huns in 375, and with the decline of Rome from 395, Germanic tribes moved farther southwest: the Franks established the Frankish Kingdom and pushed east to subjugate Saxony and Bavaria, and areas of what is today eastern Germany were inhabited by Western Slavic tribes.
Charlemagne founded the Carolingian Empire in 800; it was divided in 843 and the Holy Roman Empire emerged from the eastern portion. The territory initially known as East Francia stretched from the Rhine in the west to the Elbe River in the east and from the North Sea to the Alps. The Ottonian rulers (919–1024) consolidated several major duchies. In 996 Gregory V became the first German Pope, appointed by his cousin Otto III, whom he shortly after crowned Holy Roman Emperor. The Holy Roman Empire absorbed northern Italy and Burgundy under the Salian emperors (1024–1125), although the emperors lost power through the Investiture controversy.
Under the Hohenstaufen emperors (1138–1254), German princes encouraged German settlement to the south and east "(Ostsiedlung)". Members of the Hanseatic League, mostly north German towns, prospered in the expansion of trade. Population declined starting with the Great Famine in 1315, followed by the Black Death of 1348–50. The Golden Bull issued in 1356 provided the constitutional structure of the Empire and codified the election of the emperor by seven prince-electors.
Johannes Gutenberg introduced moveable-type printing to Europe, laying the basis for the democratization of knowledge. In 1517, Martin Luther incited the Protestant Reformation; the 1555 Peace of Augsburg tolerated the "Evangelical" faith (Lutheranism), but also decreed that the faith of the prince was to be the faith of his subjects ("cuius regio, eius religio"). From the Cologne War through the Thirty Years' War (1618–1648), religious conflict devastated German lands and significantly reduced the population.
The Peace of Westphalia ended religious warfare among the Imperial Estates; their mostly German-speaking rulers were able to choose Roman Catholicism, Lutheranism, or the Reformed faith as their official religion. The legal system initiated by a series of Imperial Reforms (approximately 1495–1555) provided for considerable local autonomy and a stronger Imperial Diet. The House of Habsburg held the imperial crown from 1438 until the death of Charles VI in 1740. Following the War of the Austrian Succession and the Treaty of Aix-la-Chapelle, Charles VI's daughter Maria Theresa ruled as Empress Consort when her husband, Francis I, became Emperor.
From 1740, dualism between the Austrian Habsburg Monarchy and the Kingdom of Prussia dominated German history. In 1772, 1793, and 1795, Prussia and Austria, along with the Russian Empire, agreed to the Partitions of Poland. During the period of the French Revolutionary Wars, the Napoleonic era and the subsequent final meeting of the Imperial Diet, most of the Free Imperial Cities were annexed by dynastic territories; the ecclesiastical territories were secularised and annexed. In 1806 the "Imperium" was dissolved; France, Russia, Prussia and the Habsburgs (Austria) competed for hegemony in the German states during the Napoleonic Wars.
Following the fall of Napoleon, the Congress of Vienna founded the German Confederation, a loose league of 39 sovereign states. The appointment of the Emperor of Austria as the permanent president reflected the Congress's rejection of Prussia's rising influence. Disagreement within restoration politics partly led to the rise of liberal movements, followed by new measures of repression by Austrian statesman Metternich. The "Zollverein", a tariff union, furthered economic unity. In light of revolutionary movements in Europe, intellectuals and commoners started the Revolutions of 1848 in the German states. King Frederick William IV of Prussia was offered the title of Emperor, but with a loss of power; he rejected the crown and the proposed constitution, a temporary setback for the movement.
King William I appointed Otto von Bismarck as the Minister President of Prussia in 1862. Bismarck successfully concluded a war with Denmark in 1864; the subsequent decisive Prussian victory in the Austro-Prussian War of 1866 enabled him to create the North German Confederation, which excluded Austria. After the French defeat in the Franco-Prussian War, the German princes proclaimed the founding of the German Empire in 1871. Prussia was the dominant constituent state of the new empire; the King of Prussia ruled as its Kaiser, and Berlin became its capital.
In the period following the unification of Germany, Bismarck's foreign policy as Chancellor of Germany secured Germany's position as a great nation by forging alliances and avoiding war. However, under Wilhelm II, Germany took an imperialistic course, leading to friction with neighbouring countries. A dual alliance was created with the multinational realm of Austria-Hungary; the Triple Alliance of 1882 included Italy. Britain, France and Russia also concluded alliances to protect against Habsburg interference with Russian interests in the Balkans or German interference against France. At the Berlin Conference in 1884, Germany claimed several colonies including German East Africa, German South West Africa, Togoland, and Kamerun. Later, Germany further expanded its colonial empire to include holdings in the Pacific and China. The colonial government in South West Africa (present-day Namibia), from 1904 to 1907, carried out the annihilation of the local Herero and Namaqua peoples as punishment for an uprising; this was the 20th century's first genocide.
The assassination of Austria's crown prince on 28 June 1914 provided the pretext for the Austrian Empire to attack Serbia and trigger World War I. After four years of warfare, in which approximately two million German soldiers were killed, a general armistice ended the fighting. In the German Revolution (November 1918), Emperor Wilhelm II and the ruling princes abdicated their positions, and Germany was declared a federal republic. Germany's new leadership signed the Treaty of Versailles in 1919, accepting defeat by the Allies. Germans perceived the treaty as humiliating, which historians saw as influential in the rise of Adolf Hitler. Germany lost around 13% of its European territory and ceded all of its colonial possessions in Africa and the South Sea.
On 11 August 1919, President Friedrich Ebert signed the democratic Weimar Constitution. In the subsequent struggle for power, Communists seized power in Bavaria, but conservative elements elsewhere attempted to overthrow the Republic in the Kapp Putsch. Street fighting in the major industrial centres, the occupation of the Ruhr by Belgian and French troops, and a period of hyperinflation followed. A debt restructuring plan and the creation of a new currency in 1924 ushered in the Golden Twenties, an era of artistic innovation and liberal cultural life.
The worldwide Great Depression hit Germany in 1929. Chancellor Heinrich Brüning's government pursued a policy of fiscal austerity and deflation which caused unemployment of nearly 30% by 1932. The Nazi Party led by Adolf Hitler won a special election in 1932 and Hindenburg appointed Hitler as Chancellor of Germany on 30 January 1933. After the Reichstag fire, a decree abrogated basic civil rights and the first Nazi concentration camp opened. The Enabling Act gave Hitler unrestricted legislative power, overriding the constitution; his government established a centralised totalitarian state, withdrew from the League of Nations, and dramatically increased the country's rearmament. A government-sponsored programme for economic renewal focused on public works, the most famous of which were the German autobahns.
In 1935, the regime withdrew from the Treaty of Versailles and introduced the Nuremberg Laws which targeted Jews and other minorities. Germany also reacquired control of the Saar in 1935, remilitarised the Rhineland in 1936, annexed Austria in 1938, annexed the Sudetenland in 1938 with the Munich Agreement, and in violation of the agreement occupied Czechoslovakia in March 1939. "Kristallnacht" saw the burning of synagogues, the destruction of Jewish businesses, and mass arrests of Jewish people.
In August 1939, Hitler's government negotiated the Molotov–Ribbentrop pact that divided Eastern Europe into German and Soviet spheres of influence. On 1 September 1939, Germany invaded Poland, beginning World War II in Europe; Britain and France declared war on Germany on 3 September. In the spring of 1940, Germany conquered Denmark and Norway, the Netherlands, Belgium, Luxembourg, and France, forcing the French government to sign an armistice. The British repelled German air attacks in the Battle of Britain in the same year. In 1941, German troops invaded Yugoslavia, Greece and the Soviet Union. By 1942, Germany and other Axis powers controlled most of continental Europe and North Africa, but following the Soviet victory at the Battle of Stalingrad, the Allies' reconquest of North Africa, and the invasion of Italy in 1943, German forces suffered repeated military defeats. In 1944, the Soviets pushed into Eastern Europe; the Western Allies landed in France and entered Germany despite a final German counteroffensive. Following Hitler's suicide during the Battle of Berlin, Germany surrendered on 8 May 1945, ending World War II in Europe. After World War II, Nazi officials were tried for war crimes at the Nuremberg trials.
In what later became known as the Holocaust, the German government persecuted minorities, including interning them in concentration and death camps across Europe. In total 17 million were systematically murdered, including 6 million Jews, at least 130,000 Romani, 275,000 persons with disabilities, thousands of Jehovah's Witnesses, thousands of homosexuals, and hundreds of thousands of political and religious opponents. Nazi policies in German-occupied countries resulted in the deaths of 2.7 million Poles, 1.3 million Ukrainians, 1 million Belarusians and 3.5 million Soviet war prisoners. German military war casualties have been estimated at 5.3 million, and around 900,000 German civilians died. Around 12 million ethnic Germans were expelled from across Eastern Europe, and Germany lost roughly one-quarter of its pre-war territory.
After Nazi Germany surrendered, the Allies partitioned Berlin and Germany's remaining territory into four occupation zones. The western sectors, controlled by France, the United Kingdom, and the United States, were merged on 23 May 1949 to form the Federal Republic of Germany ("Bundesrepublik Deutschland (BRD)"); on 7 October 1949, the Soviet Zone became the German Democratic Republic ("Deutsche Demokratische Republik (DDR)"). They were informally known as West Germany and East Germany. East Germany selected East Berlin as its capital, while West Germany chose Bonn as a provisional capital, to emphasise its stance that the two-state solution was temporary.
West Germany was established as a federal parliamentary republic with a "social market economy". Starting in 1948 West Germany became a major recipient of reconstruction aid under the Marshall Plan. Konrad Adenauer was elected the first Federal Chancellor of Germany in 1949. The country enjoyed prolonged economic growth ("Wirtschaftswunder") beginning in the early 1950s. West Germany joined NATO in 1955 and was a founding member of the European Economic Community.
East Germany was an Eastern Bloc state under political and military control by the USSR via occupation forces and the Warsaw Pact. Although East Germany claimed to be a democracy, political power was exercised solely by leading members ("Politbüro") of the communist-controlled Socialist Unity Party of Germany, supported by the Stasi, an immense secret service. While East German propaganda was based on the benefits of the GDR's social programmes and the alleged threat of a West German invasion, many of its citizens looked to the West for freedom and prosperity. The Berlin Wall, built in 1961, prevented East German citizens from escaping to West Germany, becoming a symbol of the Cold War.
Tensions between East and West Germany were reduced in the late 1960s by Chancellor Willy Brandt's "Ostpolitik". In 1989, Hungary decided to dismantle the Iron Curtain and open its border with Austria, causing the emigration of thousands of East Germans to West Germany via Hungary and Austria. This had devastating effects on the GDR, where regular mass demonstrations received increasing support. In an effort to help retain East Germany as a state, the East German authorities eased border restrictions, but this actually led to an acceleration of the "Wende" reform process, culminating in the "Two Plus Four Treaty" under which Germany regained full sovereignty. This permitted German reunification on 3 October 1990, with the accession of the five re-established states of the former GDR. The fall of the Berlin Wall in 1989 became a symbol of the fall of communism, the dissolution of the Soviet Union, German reunification and "Die Wende".
United Germany was considered the enlarged continuation of West Germany so it retained its memberships in international organisations. Based on the Berlin/Bonn Act (1994), Berlin again became the capital of Germany, while Bonn obtained the unique status of a "Bundesstadt" (federal city) retaining some federal ministries. The relocation of the government was completed in 1999, and modernisation of the east German economy was scheduled to last until 2019.
Since reunification, Germany has taken a more active role in the European Union, signing the Maastricht Treaty in 1992 and the Lisbon Treaty in 2007, and co-founding the Eurozone. Germany sent a peacekeeping force to secure stability in the Balkans and sent German troops to Afghanistan as part of a NATO effort to provide security in that country after the ousting of the Taliban.
In the 2005 elections, Angela Merkel became the first female chancellor. In 2009 the German government approved a €50 billion stimulus plan. Among the major German political projects of the early 21st century are the advancement of European integration, the energy transition ("Energiewende") for a sustainable energy supply, the "Debt Brake" for balanced budgets, measures to increase the fertility rate (pronatalism), and high-tech strategies for the transition of the German economy, summarised as Industry 4.0. Germany was affected by the European migrant crisis in 2015: the country took in over a million migrants and developed a quota system which redistributed migrants around its federal states.
Germany is in Western and Central Europe, bordering Denmark to the north, Poland and the Czech Republic to the east, Austria to the southeast, and Switzerland to the south-southwest. France, Luxembourg and Belgium are situated to the west, with the Netherlands to the northwest. Germany is also bordered by the North Sea and, at the north-northeast, by the Baltic Sea. German territory covers , consisting of of land and of water. It is the seventh largest country by area in Europe and the 62nd largest in the world.
Elevation ranges from the mountains of the Alps (highest point: the Zugspitze at ) in the south to the shores of the North Sea ("Nordsee") in the northwest and the Baltic Sea ("Ostsee") in the northeast. The forested uplands of central Germany and the lowlands of northern Germany (lowest point: Wilstermarsch at below sea level) are traversed by such major rivers as the Rhine, Danube and Elbe. Significant natural resources include iron ore, coal, potash, timber, lignite, uranium, copper, natural gas, salt, and nickel.
Most of Germany has a temperate climate, ranging from oceanic in the north to continental in the east and southeast. Winters range from cold in the southern Alps to mild and are generally overcast with limited precipitation, while summers can vary from hot and dry to cool and rainy. The northern regions have prevailing westerly winds that bring in moist air from the North Sea, moderating the temperature and increasing precipitation. Conversely, the southeast regions have more extreme temperatures.
From February 2019 to February 2020, average monthly temperatures in Germany ranged from a low of in January 2020 to a high of in June 2019. Average monthly precipitation ranged from 30 litres per square metre in February and April 2019 to 125 litres per square metre in February 2020. Average monthly hours of sunshine ranged from 45 in November 2019 to 300 in June 2019.
The territory of Germany can be divided into two ecoregions: European-Mediterranean montane mixed forests and Northeast-Atlantic shelf marine. 51% of Germany's land area is devoted to agriculture, while 30% is forested and 14% is covered by settlements or infrastructure.
Plants and animals include those generally common to Central Europe. According to the National Forest Inventory, beeches, oaks, and other deciduous trees constitute just over 40% of the forests; roughly 60% are conifers, particularly spruce and pine. There are many species of ferns, flowers, fungi, and mosses. Wild animals include roe deer, wild boar, mouflon (a subspecies of wild sheep), fox, badger, hare, and small numbers of the Eurasian beaver. The blue cornflower was once a German national symbol.
The 16 national parks in Germany include the Jasmund National Park, the Vorpommern Lagoon Area National Park, the Müritz National Park, the Wadden Sea National Parks, the Harz National Park, the Hainich National Park, the Black Forest National Park, the Saxon Switzerland National Park, the Bavarian Forest National Park and the Berchtesgaden National Park. In addition, there are 17 Biosphere Reserves and 105 nature parks. More than 400 zoos and animal parks operate in Germany. The Berlin Zoo, which opened in 1844, is the oldest in Germany, and claims the most comprehensive collection of species in the world.
Germany is a federal, parliamentary, representative democratic republic. Federal legislative power is vested in the parliament, consisting of the "Bundestag" (Federal Diet) and the "Bundesrat" (Federal Council). The "Bundestag" is elected through direct elections: half by majority vote and half by proportional representation. The members of the "Bundesrat" represent and are appointed by the governments of the sixteen federated states. The German political system operates under a framework laid out in the 1949 constitution known as the "Grundgesetz" (Basic Law). Amendments generally require a two-thirds majority of both the "Bundestag" and the "Bundesrat"; the fundamental principles of the constitution, as expressed in the articles guaranteeing human dignity, the separation of powers, the federal structure, and the rule of law, are valid in perpetuity.
The president, currently Frank-Walter Steinmeier, is the head of state and invested primarily with representative responsibilities and powers. He is elected by the "Bundesversammlung" (federal convention), an institution consisting of the members of the "Bundestag" and an equal number of state delegates. The second-highest official in the German order of precedence is the "Bundestagspräsident" (president of the "Bundestag"), who is elected by the "Bundestag" and responsible for overseeing the daily sessions of the body. The third-highest official and the head of government is the chancellor, who is appointed by the "Bundespräsident" after being elected by the party or coalition with the most seats in the "Bundestag". The chancellor, currently Angela Merkel, is the head of government and exercises executive power through their Cabinet.
Since 1949, the party system has been dominated by the Christian Democratic Union and the Social Democratic Party of Germany. So far every chancellor has been a member of one of these parties. However, the smaller liberal Free Democratic Party and the Alliance '90/The Greens have also achieved some success. Since 2007, the left-wing populist party The Left has been a staple in the German "Bundestag", though they have never been part of the federal government. In the 2017 German federal election, the right-wing populist Alternative for Germany gained enough votes to attain representation in the parliament for the first time.
Germany comprises sixteen federal states which are collectively referred to as "Bundesländer". Each state has its own state constitution, and is largely autonomous in regard to its internal organisation. Germany is divided into 401 districts ("Kreise") at a municipal level; these consist of 294 rural districts and 107 urban districts.
Germany has a civil law system based on Roman law with some references to Germanic law. The "Bundesverfassungsgericht" (Federal Constitutional Court) is the German Supreme Court responsible for constitutional matters, with power of judicial review. Germany's supreme court system is specialised: for civil and criminal cases, the highest court of appeal is the inquisitorial Federal Court of Justice, and for other affairs the courts are the Federal Labour Court, the Federal Social Court, the Federal Finance Court and the Federal Administrative Court.
Criminal and private laws are codified on the national level in the "Strafgesetzbuch" and the "Bürgerliches Gesetzbuch" respectively. The German penal system seeks the rehabilitation of the criminal and the protection of the public. Except for petty crimes, which are tried before a single professional judge, and serious political crimes, all charges are tried before mixed tribunals on which lay judges ("Schöffen") sit side by side with professional judges.
Germany has a low murder rate, with 1.18 murders per 100,000 people. In 2018, the overall crime rate fell to its lowest since 1992.
Germany has a network of 227 diplomatic missions abroad and maintains relations with more than 190 countries. Germany is a member of NATO, the OECD, the G7, the G20, the World Bank and the IMF. It has played an influential role in the European Union since its inception and has maintained a strong alliance with France and all neighbouring countries since 1990. Germany promotes the creation of a more unified European political, economic and security apparatus. The governments of Germany and the United States are close political allies. Cultural ties and economic interests have crafted a bond between the two countries resulting in Atlanticism.
The development policy of Germany is an independent area of foreign policy. It is formulated by the Federal Ministry for Economic Cooperation and Development and carried out by the implementing organisations. The German government sees development policy as a joint responsibility of the international community. It was the world's second biggest aid donor in 2019 after the United States.
Germany's military, the "Bundeswehr", is organised into "Heer" (Army and special forces KSK), "Marine" (Navy), "Luftwaffe" (Air Force), "Bundeswehr Joint Medical Service" and "Streitkräftebasis" (Joint Support Service) branches. In absolute terms, German military expenditure is the 8th highest in the world. In 2018, military spending was at $49.5 billion, about 1.2% of the country's GDP, well below the NATO target of 2%.
The "Bundeswehr" has a strength of 184,001 active soldiers and 80,947 civilians. Reservists are available to the Armed Forces and participate in defence exercises and deployments abroad. Until 2011, military service was compulsory for men at age 18, but this has been officially suspended and replaced with a voluntary service. Since 2001 women may serve in all functions of service without restriction. According to SIPRI, Germany was the fourth largest exporter of major arms in the world from 2014 to 2018.
In peacetime, the "Bundeswehr" is commanded by the Minister of Defence. In a state of defence, the Chancellor would become commander-in-chief of the "Bundeswehr". The role of the "Bundeswehr" is described in the Constitution of Germany as defensive only. However, after a ruling of the Federal Constitutional Court in 1994, the term "defence" has been defined to include not only protection of the borders of Germany, but also crisis reaction and conflict prevention, or more broadly, guarding the security of Germany anywhere in the world. The German military has about 3,600 troops stationed in foreign countries as part of international peacekeeping forces, including about 1,200 supporting operations against Daesh, 980 in the NATO-led Resolute Support Mission in Afghanistan, and 800 in Kosovo.
Germany has a social market economy with a highly skilled labour force, a low level of corruption, and a high level of innovation. It is the world's third largest exporter of goods, and has the largest national economy in Europe, which is also the world's fourth largest by nominal GDP and the fifth by PPP. Its GDP per capita measured in purchasing power standards amounts to 121% of the EU27 average (100%). The service sector contributes approximately 69% of the total GDP, industry 31%, and agriculture 1%. The unemployment rate published by Eurostat amounts to 3.2%, which is the fourth-lowest in the EU.
Germany is part of the European single market which represents more than 450 million consumers. In 2017, the country accounted for 28% of the Eurozone economy according to the International Monetary Fund. Germany introduced the common European currency, the Euro, in 2002. Its monetary policy is set by the European Central Bank, which is headquartered in Frankfurt.
Germany is home to the modern car; its automotive industry is regarded as one of the most competitive and innovative in the world, and is the fourth largest by production. Germany's top ten exports are vehicles, machinery, chemical goods, electronic products, electrical equipment, pharmaceuticals, transport equipment, basic metals, food products, and rubber and plastics. Germany is one of the largest exporters globally.
Of the world's 500 largest stock-market-listed companies measured by revenue in 2019, the Fortune Global 500, 29 are headquartered in Germany. 30 major Germany-based companies are included in the DAX, the German stock market index which is operated by Frankfurt Stock Exchange. Well-known international brands include Mercedes-Benz, BMW, Volkswagen, Audi, Siemens, Allianz, Adidas, Porsche, Bosch and Deutsche Telekom. Berlin is a hub for startup companies and has become the leading location for venture capital funded firms in the European Union. Germany is recognised for its large portion of specialised small and medium enterprises, known as the "Mittelstand" model. These companies represent 48% of the global market leaders in their segments, and are labelled Hidden Champions.
Research and development efforts form an integral part of the German economy. In 2018 Germany ranked fourth globally in terms of the number of science and engineering research papers published. Research institutions in Germany include the Max Planck Society, the Helmholtz Association, the Fraunhofer Society and the Leibniz Association. Germany is the largest contributor to the European Space Agency.
With its central position in Europe, Germany is a transport hub for the continent. Its road network is among the densest in Europe. The motorway (Autobahn) is widely known for having no federally mandated speed limit for some classes of vehicles. The InterCityExpress or "ICE" train network serves major German cities as well as destinations in neighbouring countries with speeds up to 300 km/h. The largest German airports are Frankfurt Airport and Munich Airport. The Port of Hamburg is one of the top twenty largest container ports in the world.
Germany was the world's seventh-largest consumer of energy. The government and the nuclear power industry agreed to phase out all nuclear power plants by 2021. Renewable sources meet about 40% of the country's electricity demand. Germany is committed to the Paris Agreement and several other treaties promoting biodiversity, low emission standards, and water management. The country's household recycling rate is among the highest in the world, at around 65%. Nevertheless, the country's total greenhouse gas emissions were the highest in the EU. The German energy transition ("Energiewende") is the recognised move to a sustainable economy by means of energy efficiency and renewable energy.
Germany is the ninth most visited country in the world, with 37.4 million visits. Berlin has become the third most visited city destination in Europe. Additionally, more than 30% of Germans spend their holiday in their own country. Domestic and international travel and tourism combined directly contribute over €43.2 billion to German GDP. Including indirect and induced impacts, the industry contributes 4.5% of German GDP and supports 2 million jobs (4.8% of total employment).
Germany's most visited and popular landmarks include Neuschwanstein Castle, Cologne Cathedral, the Brandenburg Gate, the Reichstag, Hofbräuhaus Munich, Heidelberg Castle, Dresden Zwinger, Fernsehturm Berlin, the Walhalla memorial, and Aachen Cathedral. The Europa-Park near Freiburg is Europe's second most popular theme park resort.
With a population of 80.2 million according to the 2011 census, rising to 83.1 million, Germany is the most populous country in the European Union, the second most populous country in Europe after Russia, and the 19th most populous country in the world. Its population density stands at 227 inhabitants per square kilometre (588 per square mile). The overall life expectancy in Germany at birth is 80.19 years (77.93 years for males and 82.58 years for females). The fertility rate of 1.41 children born per woman (2011 estimates) is below the replacement rate of 2.1 and is one of the lowest fertility rates in the world. Since the 1970s, Germany's death rate has exceeded its birth rate. However, Germany has seen increased birth and migration rates since the beginning of the 2010s, particularly a rise in the number of well-educated migrants. Germany has the third-oldest population in the world, with an average age of 47.4 years.
Four sizeable groups of people are referred to as "national minorities" because their ancestors have lived in their respective regions for centuries: there is a Danish minority in the northernmost state of Schleswig-Holstein; the Sorbs, a Slavic population, are in the Lusatia region of Saxony and Brandenburg; the Roma and Sinti live throughout the country; and the Frisians are concentrated in Schleswig-Holstein's western coast and in the north-western part of Lower Saxony.
After the United States, Germany is the second most popular immigration destination in the world. The majority of migrants live in western Germany, in particular in urban areas. Of the country's residents, 18.6 million people (22.5%) were of immigrant or partially immigrant descent in 2016 (including persons descending or partially descending from ethnic German repatriates). In 2015, the Population Division of the United Nations Department of Economic and Social Affairs listed Germany as host to the second-highest number of international migrants worldwide, about 5% or 12 million of all 244 million migrants. Germany ranks fifth amongst EU countries in terms of the percentage of migrants in the country's population, at 12.9%.
As of 2017, Germans make up 78.7% of the population. Other significant ethnic groups include Turkish (3.2%), Poles (2.5%), Russians (1.0%), Italians (1.0%), Romanians (0.8%), Syrians (0.8%), Kazakhs (0.8%), Greeks (0.5%), Han Chinese (0.2%), Vietnamese (0.2%) and Americans (0.2%).
Germany has a number of large cities. There are 11 officially recognised metropolitan regions. The country's largest city is Berlin, while its largest urban area is the Ruhr.
The 2011 German Census showed Christianity as the largest religion in Germany, with 66.8% of the population identifying as Christian, 3.8% of whom were not church members. 31.7% declared themselves Protestants, including members of the Evangelical Church in Germany (which encompasses Lutheran, Reformed and administrative or confessional unions of both traditions) and the free churches (); 31.2% declared themselves Roman Catholics, and Orthodox believers constituted 1.3%. According to data from 2016, the Catholic Church and the Evangelical Church claimed 28.5% and 27.5%, respectively, of the population. Islam is the second largest religion in the country. In the 2011 census, 1.9% of the census population (1.52 million people) gave their religion as Islam, but this figure is deemed unreliable because a disproportionate number of adherents of this religion (and other religions, such as Judaism) are likely to have made use of their right not to answer the question. Most of the Muslims are Sunnis and Alevites from Turkey, but there are a small number of Shi'ites, Ahmadiyyas and other denominations. Other religions comprise less than one percent of Germany's population.
A study in 2018 estimated that 38% of the population are not members of any religious organization or denomination, though up to a third may still consider themselves religious. Irreligion in Germany is strongest in the former East Germany, which used to be predominantly Protestant before state atheism, and in major metropolitan areas.
German is the official and predominant spoken language in Germany. It is one of 24 official and working languages of the European Union, and one of the three working languages of the European Commission. German is the most widely spoken first language in the European Union, with around 100 million native speakers.
Recognised native minority languages in Germany are Danish, Low German, Low Rhenish, Sorbian, Romany, North Frisian and Saterland Frisian; they are officially protected by the European Charter for Regional or Minority Languages. The most used immigrant languages are Turkish, Arabic, Kurdish, Polish, the Balkan languages and Russian. Germans are typically multilingual: 67% of German citizens claim to be able to communicate in at least one foreign language and 27% in at least two.
Responsibility for educational supervision in Germany is primarily organised within the individual federal states. Optional kindergarten education is provided for all children between three and six years old, after which school attendance is compulsory for at least nine years. Primary education usually lasts for four to six years. Secondary education includes three traditional types of schools focused on different academic levels: the "Gymnasium" enrols the most gifted children and prepares students for university studies; the "Realschule" for intermediate students lasts six years and the "Hauptschule" prepares pupils for vocational education. The "Gesamtschule" unifies all secondary education.
A system of apprenticeship called "Duale Ausbildung" leads to a skilled qualification which is almost comparable to an academic degree. It allows students in vocational training to learn in a company as well as in a state-run trade school. This model is well regarded and reproduced all around the world.
Most of the German universities are public institutions, and students traditionally study without fee payment. The general requirement for university is the "Abitur". According to an OECD report in 2014, Germany is the world's third leading destination for international study. The established universities in Germany include some of the oldest in the world, with Heidelberg University (established in 1386) being the oldest. The University of Berlin, founded in 1810 by the liberal educational reformer Wilhelm von Humboldt, became the academic model for many European and Western universities. In the contemporary era Germany has developed eleven Universities of Excellence.
Germany's system of hospitals, called "Krankenhäuser", dates from medieval times, and today, Germany has the world's oldest universal health care system, dating from Bismarck's social legislation of the 1880s. Since the 1880s, reforms and provisions have ensured a balanced health care system. The population is covered by a health insurance plan provided by statute, with criteria allowing some groups to opt for a private health insurance contract. According to the World Health Organization, Germany's health care system was 77% government-funded and 23% privately funded. In 2014, Germany spent 11.3% of its GDP on health care.
Germany ranked 20th in the world in 2013 in life expectancy with 77 years for men and 82 years for women, and it had a very low infant mortality rate (4 per 1,000 live births). The principal cause of death was cardiovascular disease, at 37%. Obesity in Germany has been increasingly cited as a major health issue. A 2014 study showed that 52 percent of the adult German population was overweight or obese.
Culture in German states has been shaped by major intellectual and popular currents in Europe, both religious and secular. Historically, Germany has been called "Das Land der Dichter und Denker" ("the land of poets and thinkers"), because of the major role its writers and philosophers have played in the development of Western thought. A global opinion poll for the BBC revealed that Germany is recognised for having the most positive influence in the world in 2013 and 2014.
Germany is well known for such folk festival traditions as Oktoberfest and Christmas customs, which include Advent wreaths, Christmas pageants, Christmas trees, Stollen cakes, and other practices. UNESCO inscribed 41 properties in Germany on the World Heritage List. There are a number of public holidays in Germany determined by each state; 3 October has been a national day of Germany since 1990, celebrated as the "Tag der Deutschen Einheit" (German Unity Day).
German classical music includes works by some of the world's most well-known composers. Dieterich Buxtehude, Johann Sebastian Bach and Georg Friedrich Händel were influential composers of the Baroque period. Ludwig van Beethoven was a crucial figure in the transition between the Classical and Romantic eras. Carl Maria von Weber, Felix Mendelssohn, Robert Schumann and Johannes Brahms were significant Romantic composers. Richard Wagner was known for his operas. Richard Strauss was a leading composer of the late Romantic and early modern eras. Karlheinz Stockhausen and Hans Zimmer are important composers of the 20th and early 21st centuries.
As of 2013, Germany was the second largest music market in Europe, and fourth largest in the world. German popular music of the 20th and 21st centuries includes the movements of Neue Deutsche Welle, pop, Ostrock, heavy metal/rock, punk, pop rock, indie and schlager pop. German electronic music gained global influence, with Kraftwerk and Tangerine Dream pioneering in this genre. DJs and artists of the techno and house music scenes of Germany have become well known (e.g. Paul van Dyk, Paul Kalkbrenner, and Scooter).
German painters have influenced western art. Albrecht Dürer, Hans Holbein the Younger, Matthias Grünewald and Lucas Cranach the Elder were important German artists of the Renaissance, Peter Paul Rubens and Johann Baptist Zimmermann of the Baroque, Caspar David Friedrich and Carl Spitzweg of Romanticism, Max Liebermann of Impressionism and Max Ernst of Surrealism. Several German art groups formed in the 20th century; "Die Brücke" (The Bridge) and "Der Blaue Reiter" (The Blue Rider) influenced the development of expressionism in Munich and Berlin. The New Objectivity arose in response to expressionism during the Weimar Republic. After World War II, broad trends in German art include neo-expressionism, performance art and conceptual art.
Architectural contributions from Germany include the Carolingian and Ottonian styles, which were precursors of Romanesque. Brick Gothic is a distinctive medieval style that evolved in Germany. Also in Renaissance and Baroque art, regional and typically German elements evolved (e.g. Weser Renaissance and "Dresden Baroque"). The Wessobrunner School exerted a decisive influence on, and at times even dominated, the art of stucco in southern Germany in the 18th century. Vernacular architecture in Germany is often identified by its timber framing ("Fachwerk") traditions and varies across regions, and among carpentry styles. When industrialisation spread across Europe, Classicism and a distinctive style of historism developed in Germany, sometimes referred to as "Gründerzeit style". Regional historicist styles include the "Hanover School", "Nuremberg Style" and Dresden's "Semper-Nicolai School". Among the most famous of German buildings, the Schloss Neuschwanstein represents Romanesque Revival. Notable sub-styles that evolved since the 18th century are the German spa and seaside resort architecture. German artists, writers and gallerists like Siegfried Bing, Georg Hirth and Bruno Möhring also contributed to the development of Art Nouveau at the turn of the 20th century, known as "Jugendstil" in German.
Expressionist architecture developed in the 1910s in Germany and influenced Art Deco and other modern styles. Germany was particularly important in the early modernist movement: it is the home of Werkbund initiated by Hermann Muthesius (New Objectivity), and of the Bauhaus movement founded by Walter Gropius. Consequently, Germany is often considered the cradle of modern architecture and design. Ludwig Mies van der Rohe became one of the world's most renowned architects in the second half of the 20th century; he conceived of the glass façade skyscraper. Renowned contemporary architects and offices include Pritzker Prize winners Gottfried Böhm and Frei Otto.
German designers became early leaders of modern product design. The Berlin Fashion Week and the fashion trade fair Bread & Butter are held twice a year. Munich, Hamburg, Cologne and Düsseldorf are also important design, production and trade hubs of the domestic fashion industry.
German literature can be traced back to the Middle Ages and the works of writers such as Walther von der Vogelweide and Wolfram von Eschenbach. Well-known German authors include Johann Wolfgang von Goethe, Friedrich Schiller, Gotthold Ephraim Lessing and Theodor Fontane. The collections of folk tales published by the Brothers Grimm popularised German folklore on an international level. The Grimms also gathered and codified regional variants of the German language, grounding their work in historical principles; their "Deutsches Wörterbuch", or German Dictionary, sometimes called the Grimm dictionary, was begun in 1838 and the first volumes published in 1854.
Influential authors of the 20th century include Gerhart Hauptmann, Thomas Mann, Hermann Hesse, Heinrich Böll and Günter Grass. The German book market is the third largest in the world, after the United States and China. The Frankfurt Book Fair is the most important in the world for international deals and trading, with a tradition spanning over 500 years. The Leipzig Book Fair also retains a major position in Europe.
German philosophy is historically significant: Gottfried Leibniz's contributions to rationalism; the enlightenment philosophy of Immanuel Kant; the establishment of classical German idealism by Johann Gottlieb Fichte, Georg Wilhelm Friedrich Hegel and Friedrich Wilhelm Joseph Schelling; Arthur Schopenhauer's composition of metaphysical pessimism; the formulation of communist theory by Karl Marx and Friedrich Engels; Friedrich Nietzsche's development of perspectivism; Gottlob Frege's contributions to the dawn of analytic philosophy; Martin Heidegger's works on Being; Oswald Spengler's historical philosophy; and the development of the Frankfurt School have all been particularly influential.
The largest internationally operating media companies in Germany are the Bertelsmann enterprise, Axel Springer SE and ProSiebenSat.1 Media. Germany's television market is the largest in Europe, with some 38 million TV households. Around 90% of German households have cable or satellite TV, with a variety of free-to-view public and commercial channels. There are more than 300 public and private radio stations in Germany; Germany's national radio network is the Deutschlandradio and the public Deutsche Welle is the main German radio and television broadcaster in foreign languages. Germany's print market of newspapers and magazines is the largest in Europe. The papers with the highest circulation are "Bild", "Süddeutsche Zeitung", "Frankfurter Allgemeine Zeitung" and "Die Welt". The largest magazines include "ADAC Motorwelt" and "Der Spiegel". Germany has a large video gaming market, with over 34 million players nationwide.
German cinema has made major technical and artistic contributions to film. The first works of the Skladanowsky Brothers were shown to an audience in 1895. The renowned Babelsberg Studio in Potsdam was established in 1912, thus being the first large-scale film studio in the world. Early German cinema was particularly influential with German expressionists such as Robert Wiene and Friedrich Wilhelm Murnau. Director Fritz Lang's "Metropolis" (1927) is referred to as the first major science-fiction film. After 1945, many of the films of the immediate post-war period can be characterised as "Trümmerfilm" (rubble film). East German film was dominated by state-owned film studio DEFA, while the dominant genre in West Germany was the "Heimatfilm" ("homeland film"). During the 1970s and 1980s, New German Cinema directors such as Volker Schlöndorff, Werner Herzog, Wim Wenders, and Rainer Werner Fassbinder brought West German auteur cinema to critical acclaim.
The Academy Award for Best Foreign Language Film ("Oscar") went to the German production "Die Blechtrommel (The Tin Drum)" in 1979, to "Nirgendwo in Afrika (Nowhere in Africa)" in 2002, and to "Das Leben der Anderen (The Lives of Others)" in 2007. Various Germans have won Oscars for their performances in other films. The European Film Awards ceremony is held every other year in Berlin, home of the European Film Academy. The Berlin International Film Festival, known as "Berlinale", awarding the "Golden Bear" and held annually since 1951, is one of the world's leading film festivals. The "Lolas" are awarded annually in Berlin, at the German Film Awards.
German cuisine varies from region to region and often neighbouring regions share some culinary similarities (e.g. the southern regions of Bavaria and Swabia share some traditions with Switzerland and Austria). International varieties such as pizza, sushi, Chinese food, Greek food, Indian cuisine and doner kebab are also popular.
Bread is a significant part of German cuisine and German bakeries produce about 600 main types of bread and 1,200 types of pastries and rolls ("Brötchen"). German cheeses account for about 22% of all cheese produced in Europe. In 2012 over 99% of all meat produced in Germany was either pork, chicken or beef. Germans produce their ubiquitous sausages in almost 1,500 varieties, including Bratwursts and Weisswursts. Although wine is becoming more popular in many parts of Germany, especially close to German wine regions, the national alcoholic drink is beer. German beer consumption per person in 2013 remained among the highest in the world. German beer purity regulations date back to the 16th century.
The 2018 Michelin Guide awarded eleven restaurants in Germany three stars, giving the country a cumulative total of 300 stars.
Association football is the most popular sport in Germany. With more than 7 million official members, the German Football Association ("Deutscher Fußball-Bund") is the largest single-sport organisation worldwide, and the German top league, the Bundesliga, attracts the second highest average attendance of all professional sports leagues in the world. The German men's national football team won the FIFA World Cup in 1954, 1974, 1990, and 2014, the UEFA European Championship in 1972, 1980 and 1996, and the FIFA Confederations Cup in 2017.
Germany is one of the leading motor sports countries in the world. Constructors like BMW and Mercedes are prominent manufacturers in motor sport. Porsche has won the 24 Hours of Le Mans race 19 times, and Audi 13 times. The driver Michael Schumacher has set many motor sport records during his career, having won seven Formula One World Drivers' Championships. Sebastian Vettel is also among the top five most successful Formula One drivers of all time.
Historically, German athletes have been successful contenders in the Olympic Games, ranking third in an all-time Olympic Games medal count (when combining East and West German medals). Germany was the last country to host both the summer and winter games in the same year, in 1936: the Berlin Summer Games and the Winter Games in Garmisch-Partenkirchen. Munich hosted the Summer Games of 1972.
https://en.wikipedia.org/wiki?curid=11867
Guatemala City
Guatemala City (), locally known as Guatemala or Guate, officially New Guatemala of the Assumption (), is the capital and largest city of Guatemala, and the most populous urban area in Central America. The city is located in the south-central part of the country, nestled in a mountain valley called Valle de la Ermita (). It is estimated that its population is about 1 million. Guatemala City is also the capital of the Municipality of Guatemala and of the Guatemala Department.
Guatemala City is the site of the Mayan city of Kaminaljuyu, founded around 1500 BC. Following the Spanish conquest, a new town was established, and in 1776 it was made capital of the Kingdom of Guatemala. In 1821, Guatemala City was the scene of the declaration of independence of Central America from Spain, after which it became the capital of the newly established United Provinces of Central America (later the Federal Republic of Central America). In 1847, Guatemala declared itself an independent republic, with Guatemala City as its capital. The city was almost completely destroyed by the 1917–18 earthquakes. Reconstructions following the earthquakes have resulted in a more modern architectural landscape.
Today, Guatemala City is the political, cultural, and economic center of Guatemala. It is served by La Aurora International Airport.
Human settlement on the present site of Guatemala City began with the Maya who built a city at Kaminaljuyu. The Spanish colonists established a small town, which was made a capital city in 1775. At this period the Central Square with the Cathedral and Royal Palace were constructed. After Central American independence from Spain the city became the capital of the United Provinces of Central America in 1821.
The 19th century saw the construction of the monumental Carrera Theater in the 1850s, and the Presidential Palace in the 1890s. At this time the city was expanding around the "30 de junio" Boulevard and elsewhere, displacing native settlements from the ancient site. Earthquakes in 1917–1918 destroyed many historic structures. Under Jorge Ubico in the 1930s a hippodrome and many new public buildings were constructed, although peripheral poor neighborhoods that formed after the 1917–1918 earthquakes continued to lack basic amenities.
During the Guatemalan Civil War, terror attacks beginning with the burning of the Spanish Embassy in 1980 led to severe destruction and loss of life in the city. In May 2010 two disasters struck: the eruption of the Pacaya volcano, and two days later Tropical Storm Agatha.
Guatemala City serves as the economic, governmental, and cultural epicenter of the nation of Guatemala. The city also functions as Guatemala's main transportation hub, hosting an international airport, La Aurora International Airport, and serving as the origination or end points for most of Guatemala's major highways. The city, with its robust economy, attracts hundreds of thousands of rural migrants from Guatemala's interior hinterlands and serves as the main entry point for most foreign immigrants seeking to settle in Guatemala.
In addition to a wide variety of restaurants, hotels, shops, and a modern BRT transport system (Transmetro), the city is home to many art galleries, theaters, sports venues and museums (including some fine collections of Pre-Columbian art) and provides a growing number of cultural offerings. Guatemala City not only possesses a history and culture unique to the Central American region, it also furnishes all the modern amenities of a world-class city, ranging from an IMAX Theater to the Ícaro film festival (Festival Ícaro), where independent films produced in Guatemala and Central America are debuted.
Guatemala City is located in the mountainous highlands of the country, between the Pacific coastal plain to the south and the northern lowlands of the Peten region.
The city's metropolitan area has recently grown very rapidly and has absorbed most of the neighboring municipalities of Villa Nueva, San Miguel Petapa, Mixco, San Juan Sacatepequez, San José Pinula, Santa Catarina Pinula, Fraijanes, San Pedro Ayampuc, Amatitlán, Villa Canales, Palencia and Chinautla forming what is now known as the Guatemala City Metropolitan Area.
The city is subdivided into 22 zones ("Zonas") designed by the urban engineering of Raúl Aguilar Batres, each one with its own streets ("Calles"), avenues ("Avenidas") and sometimes "Diagonal" streets, making it relatively easy to find addresses in the city. Zones are numbered 1–25, with Zones 20, 22 and 23 not existing, as they would have fallen within two other municipalities' territory. Addresses are assigned according to the street or avenue number, followed by a dash and the number of metres the location lies from the intersection.
For example, the INGUAT Office on "7a Av. 1-17, Zona 4" is a building which is located on Avenida 7, 17 meters away from the intersection with Calle 1, toward Calle 2 in zone 4.
7a Av. 1-17, Zona 4; and 7a Av. 1-17, Zona 10, are two radically different addresses.
Short streets/avenues do not get a new sequence number; for example, 6A Calle is a short street between 6a and 7a Calle.
Some "avenidas" or "Calles" have a name in addition to their number, if it is very wide, for example Avenida la Reforma is an avenue which separates Zone 9 and 10 and Calle Montúfar is Calle 12 in Zone 9.
The intersection of Calle 1 and Avenida 1 in Zona 1 marks the center of every city in Guatemala.
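The addressing convention described above is regular enough to parse programmatically. The following is a minimal sketch in Python; the function name and the returned field names are illustrative assumptions, not any official scheme:

```python
import re

# Matches addresses such as "7a Av. 1-17, Zona 4":
# thoroughfare number, "Av." or "Calle", the intersecting street/avenue
# number, metres from that intersection, and the zone number.
_ADDRESS = re.compile(r"(\d+)a\s+(Av\.|Calle)\s+(\d+)-(\d+),\s*Zona\s+(\d+)")

def parse_address(address):
    """Split a Guatemala City address into its components (hypothetical helper)."""
    m = _ADDRESS.match(address)
    if m is None:
        raise ValueError(f"unrecognised address: {address!r}")
    number, kind, cross, metres, zone = m.groups()
    return {
        "thoroughfare": f"{'Avenida' if kind == 'Av.' else 'Calle'} {number}",
        "cross_number": int(cross),                 # intersecting street/avenue
        "metres_from_intersection": int(metres),    # distance along the block
        "zone": int(zone),
    }

# The INGUAT office example from the text: a building on Avenida 7,
# 17 metres from the intersection with Calle 1, in Zone 4.
print(parse_address("7a Av. 1-17, Zona 4"))
```

Because the zone is part of every address, the same street-and-metres pair in two different zones (as in the "7a Av. 1-17" example above) resolves to two entirely different locations.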
Zone One is the Historic Center (Centro Histórico), lying in the very heart of the city, the location of many important historic buildings including the Palacio Nacional de la Cultura (National Palace of Culture), the Metropolitan Cathedral, the National Congress, the Casa Presidencial (Presidential House), the National Library and Plaza de la Constitución (Constitution Plaza, old Central Park). Efforts to revitalize this important part of the city have been undertaken by the municipal government.
Besides the parks, the city offers a portfolio of entertainment in the region, focused on the so-called Zona Viva and the Calzada Roosevelt, as well as Cuatro Grados Norte. Casino activity is considerable, with several located in different parts of the Zona Viva. The area around the East market is being redeveloped.
Within the financial district are the tallest buildings in the country including: Club Premier, Tinttorento, Atlantis building, Atrium, Tikal Futura, Building of Finances, Towers Building Batteries, Torres Botticelli, Tadeus, building of the INTECAP, Royal Towers, Towers Geminis, Industrial Bank towers, Holiday Inn Hotel, Premier of the Americas, among many others to be used for offices, apartments etc. Also included are projects such as Zona Pradera and Interamerica's World Financial Center.
One of the most outstanding mayors was the engineer Martín Prado Vélez, who took over in 1949 and governed the city during the terms of the reformist presidents Juan José Arévalo and Jacobo Árbenz Guzmán; although he was not a member of the ruling party at the time, he was elected due to his well-known capabilities. Of cobanero origin and married to Marta Cobos, he studied at the University of San Carlos. Under his tenure, among other modernist works, infrastructure projects included the El Incienso bridge, the construction of Roosevelt Avenue (the main east–west road axis of the city), the town hall building, and numerous road works that widened the colonial city, ordered it along the cardinal points, and produced a ring road with the first cloverleaf interchange in the city.
In an attempt to control the rapid growth of the city, the municipal government (Municipalidad de Guatemala), headed by longtime Mayor Álvaro Arzú, has implemented a plan to focus growth along important arterial roads and apply Transit-oriented development (TOD) characteristics. This plan, denominated POT (Plan de Ordenamiento Territorial), aims to allow taller building structures of mixed uses to be built next to large arterial roads, with height and density gradually declining away from them. Because the airport lies in the south of the city, height limits based on aeronautical considerations have also been applied to the construction code; the maximum permitted building height therefore varies by zone, for example between Zone 10 and Zone 1.
Despite its location in the tropics, Guatemala City's relatively high altitude moderates average temperatures. The city has a tropical savanna climate (Köppen "Aw") bordering on a subtropical highland climate ("Cwb"). Guatemala City is generally very warm, almost springlike, throughout the course of the year. It occasionally gets hot during the dry season, but not as hot and humid as in Central American cities at sea level. The hottest month is April. The rainy season extends from May to October, coinciding with the tropical storm and hurricane season in the western Atlantic Ocean and Caribbean Sea, while the dry season extends from November to April. The city can at times be windy, which also leads to lower ambient temperatures.
Average annual temperatures are higher during the day than at night. Average morning relative humidity is 82%, and average evening relative humidity is 58%.
Four stratovolcanoes are visible from the city, two of them active. The nearest and most active is Pacaya, which at times erupts a considerable amount of ash. These volcanoes lie to the south of the Valle de la Ermita, providing a natural barrier between Guatemala City and the Pacific lowlands that define the southern regions of Guatemala. Agua, Fuego, Pacaya and Acatenango belong to a chain of 33 stratovolcanoes that stretches across the breadth of Guatemala, from the Salvadoran border to the Mexican border.
Lying on the Ring of Fire, the Guatemalan highlands and the Valle de la Ermita are frequently shaken by large earthquakes. The last large tremor to hit the Guatemala City region occurred in 1976, on the Motagua Fault, a left-lateral strike-slip fault that forms the boundary between the Caribbean Plate and the North American Plate. The 1976 event registered 7.5 on the moment magnitude scale. Smaller, less severe tremors are frequently felt in Guatemala City and environs.
Torrential downpours, similar to the more famous monsoons, occur frequently in the Valle de la Ermita during the rainy season, leading to flash floods that sometimes inundate the city. Due to these heavy rainfalls, some of the slums perched on the steep edges of the canyons that criss-cross the Valle de la Ermita are washed away and buried under mudslides, as in October 2005. Tropical waves, tropical storms and hurricanes sometimes strike the Guatemalan highlands, which also bring torrential rains to the Guatemala City region and trigger these deadly mudslides.
In February 2007, a very large, deep circular hole with vertical walls opened in northeastern Guatemala City, killing five people. This sinkhole, which is classified by geologists as either a "piping feature" or "piping pseudokarst", was apparently created by fluid from a sewer eroding the loose volcanic ash, limestone, and other pyroclastic deposits that underlie Guatemala City. As a result, one thousand people were evacuated from the area. This piping feature has since been mitigated by City Hall through proper maintenance of the sewerage collection system, and plans to develop the site have been proposed. However, critics believe municipal authorities have neglected needed maintenance on the city's aging sewerage system, and have speculated that more dangerous piping features are likely to develop unless action is taken.
Three years later, the 2010 Guatemala City sinkhole opened.
It is estimated that the population of Guatemala City proper is about 1 million, while its urban area holds almost 3 million. The growth of the city's population has been robust, abetted by the mass migration of Guatemalans from the rural hinterlands to the largest and most vibrant regional economy in Guatemala. The inhabitants of Guatemala City are remarkably diverse for a city of its size, with those of Spanish and Mestizo descent being the most numerous. Guatemala City also has sizable indigenous populations, divided among the 23 distinct Mayan groups present in Guatemala. The numerous Mayan languages are now spoken in certain quarters of Guatemala City, making the city a linguistically rich area. Foreigners and foreign immigrants comprise the final distinct group of Guatemala City inhabitants, representing a very small minority among the city's citizens.
Due to mass migration from impoverished rural districts wracked with political instability, Guatemala City's population has exploded since the 1970s, severely straining the existing bureaucratic and physical infrastructure of the city. As a result, chronic traffic congestion, shortages of safe potable water in some areas of the city, and a sudden and prolonged surge in crime have become perennial problems. The infrastructure, although continuing to grow and improve in some areas, is lagging behind the needs of the city's growing population of less fortunate residents. Guatemala City is not unique in facing and tackling problems all too common among rapidly expanding cities around the world.
Guatemala City is headquarters to many communications and telecom companies, among them Tigo, Claro-Telgua, and Movistar-Telefónica. These companies also offer cable television, internet services and telephone access. Due to Guatemala City's large and concentrated consumer base in comparison to the rest of the country, these telecom and communications companies provide most of their services and offerings within the confines of the city. There are also seven local television channels, in addition to numerous international channels. The international channels range from children's programming, like Nickelodeon and the Disney Channel, to more adult offerings, such as E! and HBO. While international programming is dominated by entertainment from the United States, domestic programming is dominated by shows from Mexico. Due to its small and relatively income-restricted domestic market, Guatemala City produces very little in the way of its own programming outside of local news and sports.
Guatemala City, as the capital, is home to Guatemala's central bank, from which Guatemala's monetary and fiscal policies are formulated and promulgated. Guatemala City is also headquarters to numerous regional private banks, among them CitiBank, Banco Agromercantil, Banco Promerica, Banco Industrial, Banco GyT Continental, Banco de Antigua, Banco Reformador, Banrural, Grupo Financiero de Occidente, BAC Credomatic, and Banco Internacional. By far the richest and most powerful regional economy within Guatemala, Guatemala City is the largest market for goods and services, which provides the greatest number of investment opportunities for public and private investors in all of Guatemala. Financing for these investments is provided by the regional private banks, as well as by foreign direct and capital investment, mostly from the United States. Guatemala City's ample consumer base and sophisticated service sector is represented by the large department store chains present in the city, among them Siman, Hiper Paiz & Paiz (Walmart), Price Smart, ClubCo, Cemaco, Sears and Office Depot.
Guatemala City is divided into 22 zones in accordance with the urban layout plan designed by Raúl Aguilar Batres. Each zone has its own streets and avenues, facilitating navigation within the city. The zones are numbered 1 through 25, but the numbers 20, 22 and 23 have never been assigned, so those zones do not exist within the city proper.
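The arithmetic behind the zone numbering can be sketched as follows (purely illustrative; the zone numbers are taken from the description above):

```python
# Zones are numbered 1 through 25, but 20, 22 and 23 were never assigned,
# which is why the city has 22 zones despite the numbering running to 25.
zones = [z for z in range(1, 26) if z not in (20, 22, 23)]
print(len(zones))  # 22
```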
Traditional buses are now required to discharge passengers at transfer stations at the city's edge to board the Transmetro. This is being implemented as new Transmetro lines become established. In conjunction with the new mass transit implementation in the city, there is also a prepaid bus card system called Transurbano that is being implemented in the metro area to limit cash handling for the transportation system. A new fleet of buses tailored for this system has been purchased from a Brazilian firm.
A light rail line known as Metro Riel is proposed.
Guatemala City is home to ten universities, among them the oldest institution of higher education in Central America, the University of San Carlos of Guatemala. Founded in 1676, the Universidad de San Carlos is older than all North American universities except for Harvard University.
The other nine institutions of higher education to be found in Guatemala City include the Universidad Mariano Gálvez, the Universidad Panamericana, the Universidad Mesoamericana, the Universidad Rafael Landivar, the Universidad Francisco Marroquín, the Universidad del Valle, the Universidad del Istmo, Universidad Galileo, Universidad da Vinci and the Universidad Rural. Whereas these nine named universities are private, the Universidad de San Carlos remains the only public institution of higher learning.
Guatemala City possesses several sports grounds and is home to many sports clubs. Football is the most popular sport, with CSD Municipal, Aurora FC and Comunicaciones being the main clubs. The Estadio Mateo Flores, located in Zone 5 of the city, is the largest stadium in the country, followed in capacity by the Estadio Cementos Progreso, the Estadio del Ejército and the Estadio El Trébol. An important multi-functional hall is the Domo Polideportivo de la CDAG.
The city has hosted several promotional functions and some international sports events: in 1950 it hosted the VI Central American and Caribbean Games, and in 2000 the FIFA Futsal World Championship. On July 4, 2007, the International Olympic Committee gathered in Guatemala City and voted for Sochi as the host of the 2014 Winter Olympics and Paralympics. In April 2010, it hosted the XIVth Pan-American Mountain Bike Championships.
Guatemala City hosted the 2008 edition of the CONCACAF Futsal Championship, played at the Domo Polideportivo from June 2 to June 8, 2008.
Guatemala City is twinned with:
https://en.wikipedia.org/wiki?curid=11874
GNU
GNU is an operating system and an extensive collection of computer software. GNU is composed wholly of free software, most of which is licensed under the GNU Project's own General Public License (GPL).
"GNU" is a recursive acronym for "GNU's Not Unix!", chosen because GNU's design is Unix-like, but differs from Unix by being free software and containing no Unix code.
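The self-reference in the name can be illustrated with a short sketch (purely illustrative; this expansion is not part of any GNU software):

```python
def expand(depth):
    """Expand the recursive acronym: depth 0 leaves "GNU" unexpanded,
    and each further level substitutes the full phrase for "GNU"."""
    if depth == 0:
        return "GNU"
    return "(" + expand(depth - 1) + "'s Not Unix!)"

print(expand(1))  # (GNU's Not Unix!)
print(expand(2))  # ((GNU's Not Unix!)'s Not Unix!)
```

Because each expansion reintroduces "GNU", the acronym can be unfolded indefinitely, which is exactly what makes it recursive.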
The GNU project includes an operating system kernel, GNU Hurd, which was the original focus of the Free Software Foundation (FSF). However, because the Hurd kernel is not yet production-ready, non-GNU kernels, most popularly the Linux kernel, can also be used with GNU software. The combination of GNU and Linux has become so ubiquitous that the duo is often referred to simply as "Linux", or, less frequently, "GNU/Linux" (see the GNU/Linux naming controversy).
Richard Stallman, the founder of the project, views GNU as a "technical means to a social end". Relatedly, Lawrence Lessig states in his introduction to the second edition of Stallman's book "Free Software, Free Society" that in it Stallman has written about "the social aspects of software and how Free Software can create community and social justice".
Development of the GNU operating system was initiated by Richard Stallman while he worked at the MIT Artificial Intelligence Laboratory. It was called the GNU Project, and was publicly announced on September 27, 1983, on the net.unix-wizards and net.usoft newsgroups by Stallman. Software development began on January 5, 1984, when Stallman quit his job at the lab so that it could not claim ownership of GNU components or interfere with distributing them as free software. Stallman chose the name by using various plays on words, including the song "The Gnu".
The goal was to bring a completely free software operating system into existence. Stallman wanted computer users to be free to study the source code of the software they use, share software with other people, modify the behavior of software, and publish their modified versions of the software. This philosophy was later published as the GNU Manifesto in March 1985.
Richard Stallman's experience with the Incompatible Timesharing System (ITS), an early operating system written in assembly language that became obsolete due to the discontinuation of the PDP-10, the computer architecture for which ITS was written, led to a decision that a portable system was necessary. It was thus decided that the development would be started using C and Lisp as system programming languages, and that GNU would be compatible with Unix. At the time, Unix was already a popular proprietary operating system. The design of Unix was modular, so it could be reimplemented piece by piece.
Much of the needed software had to be written from scratch, but existing compatible third-party free software components were also used such as the TeX typesetting system, the X Window System, and the Mach microkernel that forms the basis of the GNU Mach core of GNU Hurd (the official kernel of GNU). With the exception of the aforementioned third-party components, most of GNU has been written by volunteers; some in their spare time, some paid by companies, educational institutions, and other non-profit organizations. In October 1985, Stallman set up the Free Software Foundation (FSF). In the late 1980s and 1990s, the FSF hired software developers to write the software needed for GNU.
As GNU gained prominence, interested businesses began contributing to development or selling GNU software and technical support. The most prominent and successful of these was Cygnus Solutions, now part of Red Hat.
The system's basic components include the GNU Compiler Collection (GCC), the GNU C library (glibc), and GNU Core Utilities (coreutils), but also the GNU Debugger (GDB), GNU Binary Utilities (binutils), and the GNU Bash shell. GNU developers have contributed to Linux ports of GNU applications and utilities, which are now also widely used on other operating systems such as BSD variants, Solaris and macOS.
Many GNU programs have been ported to other operating systems, including proprietary platforms such as Microsoft Windows and macOS. GNU programs have been shown to be more reliable than their proprietary Unix counterparts.
As of November 2015, a total of 466 GNU packages (383 excluding decommissioned ones) are hosted on the official GNU development site.
The official kernel of GNU Project was the GNU Hurd microkernel; however, as of 2012, the Linux kernel became officially part of the GNU Project in the form of Linux-libre, a variant of Linux with all proprietary components removed.
With the April 30, 2015 release of the Debian GNU/Hurd 2015 distro, GNU OS now provides the components to assemble an operating system that users can install and use on a computer. This includes the GNU Hurd kernel, which is currently in a pre-production state. The Hurd status page states that "it may not be ready for production use, as there are still some bugs and missing features. However, it should be a good base for further development and non-critical application usage."
Due to Hurd not being ready for production use, in practice these operating systems are Linux distributions. They contain the Linux kernel, GNU components and software from many other free software projects. Looking at all program code contained in the Ubuntu Linux distribution in 2011, GNU encompassed 8% (13% including GNOME) and the Linux kernel 6% (increased to 9% when including its direct dependencies).
Other kernels like the FreeBSD kernel also work together with GNU software to form a working operating system. The FSF maintains that an operating system built using the Linux kernel and GNU tools and utilities should be considered a variant of GNU, and promotes the term "GNU/Linux" for such systems (leading to the GNU/Linux naming controversy). The GNU Project has endorsed Linux distributions, such as gNewSense, Trisquel and Parabola GNU/Linux-libre. Other GNU variants which do not use the Hurd as a kernel include Debian GNU/kFreeBSD and Debian GNU/NetBSD, bringing to fruition the early plan of GNU on a BSD kernel.
The GNU Project recommends that contributors assign the copyright for GNU packages to the Free Software Foundation, though the Free Software Foundation considers it acceptable to release small changes to an existing project to the public domain. However, this is not required; package maintainers may retain copyright to the GNU packages they maintain, though since only the copyright holder may enforce the license used (such as the GNU GPL), in that case the copyright holder, rather than the Free Software Foundation, enforces it.
For the development of needed software, Stallman wrote a license called the GNU General Public License (first called the Emacs General Public License), with the goal of guaranteeing users the freedom to share and change free software. Stallman wrote this license after his experience with James Gosling and UniPress, over a controversy around the use of software code in the GNU Emacs program. For most of the 1980s, each GNU package had its own license: the Emacs General Public License, the GCC General Public License, etc. In 1989, the FSF published a single license they could use for all their software, and which could be used by non-GNU projects: the GNU General Public License (GPL).
This license is now used by most of GNU software, as well as a large number of free software programs that are not part of the GNU Project; it also historically has been the most commonly used free software license (though recently challenged by the MIT license). It gives all recipients of a program the right to run, copy, modify and distribute it, while forbidding them from imposing further restrictions on any copies they distribute. This idea is often referred to as copyleft.
In 1991, the GNU Lesser General Public License (LGPL), then known as the Library General Public License, was written for the GNU C Library to allow it to be linked with proprietary software. 1991 also saw the release of version 2 of the GNU GPL. The GNU Free Documentation License (FDL), for documentation, followed in 2000. The GPL and LGPL were revised to version 3 in 2007, adding clauses to protect users against hardware restrictions that prevent users from running modified software on their own devices.
Besides GNU's packages, the GNU Project's licenses are used by many unrelated projects, such as the Linux kernel, often used with GNU software. A minority of the software used by most Linux distributions, such as the X Window System, is licensed under permissive free software licenses.
The logo for GNU is a gnu head. The original was drawn by Etienne Suvasa; a bolder and simpler version designed by Aurelio Heckert is now preferred. It appears in GNU software and in printed and electronic documentation for the GNU Project, and is also used in Free Software Foundation materials.
The image shown here is a modified version of the official logo. It was created by the Free Software Foundation in September 2013 to commemorate the 30th anniversary of the GNU Project.
https://en.wikipedia.org/wiki?curid=11875
Gradualism
Gradualism, from the Latin "gradus" ("step"), is a hypothesis, a theory or a tenet assuming that change comes about gradually or that variation is gradual in nature and happens over time as opposed to in large steps. Uniformitarianism, incrementalism, and reformism are similar concepts.
In the natural sciences, gradualism is the theory which holds that profound change is the cumulative product of slow but continuous processes, often contrasted with catastrophism. The theory was proposed in 1795 by James Hutton, a Scottish geologist, and was later incorporated into Charles Lyell's theory of uniformitarianism. Tenets from both theories were applied to biology and formed the basis of early evolutionary theory.
Charles Darwin was influenced by Lyell's "Principles of Geology", which explained both uniformitarian methodology and theory. Using uniformitarianism, which states that one cannot make an appeal to any force or phenomenon which cannot presently be observed (see catastrophism), Darwin theorized that the evolutionary process must occur gradually, not in saltations, since saltations are not presently observed, and extreme deviations from the usual phenotypic variation would be more likely to be selected against.
Gradualism is often confused with the concept of phyletic gradualism, a term coined by Stephen Jay Gould and Niles Eldredge to contrast with their model of punctuated equilibrium, which is itself gradualist but argues that most evolution is marked by long periods of evolutionary stability (called stasis), punctuated by rare instances of branching evolution.
Phyletic gradualism is a model of evolution which theorizes that most speciation is slow, uniform and gradual. When evolution occurs in this mode, it is usually by the steady transformation of a whole species into a new one (through a process called anagenesis). In this view no clear line of demarcation exists between an ancestral species and a descendant species, unless splitting occurs.
Punctuated gradualism is a microevolutionary hypothesis that refers to a species that has "relative stasis over a considerable part of its total duration [and] underwent periodic, relatively rapid, morphologic change that did not lead to lineage branching". It is one of the three common models of evolution. While the traditional model of palaeontology, the phyletic gradualism model, states that features evolved slowly without any direct association with speciation, the relatively newer and more controversial idea of punctuated equilibrium claims that major evolutionary changes do not happen over a gradual period but in localized, rare, rapid events of branching speciation. Punctuated gradualism is considered to be a variation of these models, lying somewhere in between the phyletic gradualism model and the punctuated equilibrium model. It states that speciation is not needed for a lineage to rapidly evolve from one equilibrium to another but may show rapid transitions between long-stable states.
In politics, gradualism is the hypothesis that social change can be achieved in small, discrete increments rather than in abrupt strokes such as revolutions or uprisings. Gradualism is one of the defining features of political liberalism and reformism. Machiavellian politics pushes politicians to espouse gradualism.
In socialist politics and within the socialist movement, the concept of gradualism is frequently distinguished from reformism, with the former insisting that short-term goals need to be formulated and implemented in such a way that they inevitably lead into long-term goals. It is most commonly associated with the libertarian socialist concept of dual power and is seen as a middle way between reformism and revolutionism.
Martin Luther King, Jr. was opposed to the idea of gradualism as a method of eliminating segregation. The government wanted to try to integrate African-Americans and European-Americans slowly into the same society, but many believed it was a way for the government to put off actually doing anything about racial segregation.
In linguistics, language change is seen as gradual, the product of chain reactions and subject to cyclic drift. The view that creole languages are the product of catastrophism is heavily disputed.
Gradualism is the approach of certain schools of Buddhism and other Eastern philosophies (e.g. Theravada or Yoga), holding that enlightenment can be achieved step by step, through an arduous practice. The opposite approach, that insight is attained all at once, is called subitism. The debate on the issue was very important to the history of the development of Zen, which rejected gradualism, and to the establishment of the opposite approach within Tibetan Buddhism, after the Debate of Samye. It was continued in other schools of Indian and Chinese philosophy.
Contradictorial gradualism is the paraconsistent treatment of fuzziness developed by Lorenzo Peña which regards true contradictions as situations wherein a state of affairs enjoys only partial existence.
Gradualism in social change implemented through reformist means is a moral principle to which the Fabian Society is committed. In a more general way, reformism is the assumption that gradual changes through and within existing institutions can ultimately change a society's fundamental economic system and political structures; and that an accumulation of reforms can lead to the emergence of an entirely different economic system and form of society than present-day capitalism. That hypothesis of social change grew out of opposition to revolutionary socialism, which contends that revolution is necessary for fundamental structural changes to occur.
In the terminology of New World Order (NWO) conspiracy theories, gradualism refers to the gradual implementation of a totalitarian world government.
https://en.wikipedia.org/wiki?curid=11877
Germanic languages
The Germanic languages are a branch of the Indo-European language family spoken natively by a population of about 515 million people mainly in Europe, North America, Oceania and Southern Africa. The most widely spoken Germanic language, English, is the world's most widely spoken language with an estimated 2 billion speakers. All Germanic languages are derived from Proto-Germanic, spoken in Iron Age Scandinavia.
The West Germanic languages include the three most widely spoken Germanic languages: English with around 360–400 million native speakers; German, with over 100 million native speakers; and Dutch, with 24 million native speakers. Other West Germanic languages include Afrikaans, an offshoot of Dutch, with over 7.1 million native speakers; Low German, considered a separate collection of unstandardized dialects, with roughly 0.3 million native speakers and probably 6.7–10 million people who can understand it (at least 5 million in Germany and 1.7 million in the Netherlands); Yiddish, once used by approximately 13 million Jews in pre-World War II Europe, and Scots, both with 1.5 million native speakers; Limburgish varieties with roughly 1.3 million speakers along the Dutch–Belgian–German border; and the Frisian languages with over 0.5 million native speakers in the Netherlands and Germany.
The largest North Germanic languages are Danish, Norwegian and Swedish, which are mutually intelligible and have a combined total of about 20 million native speakers in the Nordic countries and an additional five million second-language speakers; since the Middle Ages, however, these languages have been strongly influenced by the West Germanic language Middle Low German, and Low German words account for about 30–60% of their vocabularies according to various estimates. Other North Germanic languages are Faroese and Icelandic, which are more conservative languages with no significant Low German influence, more complex grammar and limited mutual intelligibility with the others today.
The East Germanic branch included Gothic, Burgundian, and Vandalic, all of which are now extinct. The last to die off was Crimean Gothic, spoken until the late 18th century in some isolated areas of Crimea.
The SIL "Ethnologue" lists 48 different living Germanic languages, 41 of which belong to the Western branch and six to the Northern branch; it places Riograndenser Hunsrückisch German in neither of the categories, but it is often considered a German dialect by linguists. The total number of Germanic languages throughout history is unknown as some of them, especially the East Germanic languages, disappeared during or after the Migration Period. Some of the West Germanic languages also did not survive past the Migration Period, including Lombardic. As a result of World War II, the German language suffered a significant loss of "Sprachraum", as well as the moribundness and extinction of several of its dialects. In the 21st century, its remaining dialects continue to die out as Standard German gains primacy.
The common ancestor of all of the languages in this branch is called Proto-Germanic, also known as Common Germanic, which was spoken in about the middle of the 1st millennium BC in Iron Age Scandinavia. Proto-Germanic, along with all of its descendants, is characterised by a number of unique linguistic features, most famously the consonant change known as Grimm's law. Early varieties of Germanic entered history with the Germanic tribes moving south from Scandinavia in the 2nd century BC, to settle in the area of today's northern Germany and southern Denmark.
English is an official language of Belize, Canada, Nigeria, Falkland Islands, Malta, New Zealand, Ireland, South Africa, Philippines, Jamaica, Dominica, Guyana, Trinidad and Tobago, American Samoa, Palau, St. Lucia, Grenada, Barbados, St. Vincent and the Grenadines, Puerto Rico, Guam, Hong Kong, Singapore, Pakistan, India, Papua New Guinea, Vanuatu, the Solomon Islands and former British colonies in Asia, Africa and Oceania. Furthermore, it is the "de facto" language of the United Kingdom, the United States and Australia. It is also a recognised language in Nicaragua and Malaysia. American English-speakers make up the majority of all native Germanic speakers, including also making up the bulk of West Germanic speakers.
German is an official language of Austria, Belgium, Germany, Liechtenstein, Luxembourg and Switzerland and has regional status in Italy, Poland, Namibia and Denmark. German also continues to be spoken as a minority language by immigrant communities in North America, South America, Central America, Mexico and Australia. A German dialect, Pennsylvania German, is still present amongst Anabaptist populations in Pennsylvania in the United States.
Dutch is an official language of Aruba, Belgium, Curaçao, the Netherlands, Sint Maarten, and Suriname. The Netherlands also colonised Indonesia, but Dutch was scrapped as an official language after Indonesian independence and today it is only used by older or traditionally educated people. Dutch was until 1984 an official language in South Africa but evolved into and was replaced by Afrikaans, a partially mutually intelligible daughter language of Dutch.
Afrikaans is one of the 11 official languages in South Africa and is a "lingua franca" of Namibia. It is used in other Southern African nations, as well.
Low German is a collection of very diverse dialects spoken in the northeast of the Netherlands and northern Germany.
Scots is spoken in Lowland Scotland and parts of Ulster (where the local dialect is known as Ulster Scots).
Frisian is spoken among half a million people who live on the southern fringes of the North Sea in the Netherlands, Germany, and Denmark.
Luxembourgish is a Moselle Franconian dialect that is spoken mainly in the Grand Duchy of Luxembourg, where it is considered to be an official language. Similar varieties of Moselle Franconian are spoken in small parts of Belgium, France, and Germany.
Yiddish, once a native language of some 11 to 13 million people, remains in use by some 1.5 million speakers in Jewish communities around the world, mainly in North America, Europe, Israel, and other regions with Jewish populations.
Limburgish varieties are spoken in the Limburg and Rhineland regions, along the Dutch–Belgian–German border.
In addition to being the official language in Sweden, Swedish is also spoken natively by the Swedish-speaking minority in Finland, which is a large part of the population along the coast of western and southern Finland. Swedish is also one of the two official languages in Finland, along with Finnish, and the only official language in the Åland Islands. Swedish is also spoken by some people in Estonia.
Danish is an official language of Denmark and in its overseas territory of the Faroe Islands, and it is a "lingua franca" and language of education in its other overseas territory of Greenland, where it was one of the official languages until 2009. Danish is also spoken natively by the Danish minority in the German state of Schleswig-Holstein, where it is recognised as a minority language.
Norwegian is the official language of Norway.
Icelandic is the official language of Iceland.
Faroese is the official language of the Faroe Islands, and it is also spoken by some people in Denmark.
All Germanic languages are thought to be descended from a hypothetical Proto-Germanic, united by subjection to the sound shifts of Grimm's law and Verner's law. These probably took place during the Pre-Roman Iron Age of Northern Europe from c. 500 BC. Proto-Germanic itself was likely spoken after c. 500 BC, and Proto-Norse from the 2nd century AD and later is still quite close to reconstructed Proto-Germanic, but other common innovations separating Germanic from Proto-Indo-European suggest a common history of pre-Proto-Germanic speakers throughout the Nordic Bronze Age.
From the time of their earliest attestation, the Germanic varieties are divided into three groups: West, East, and North Germanic. Their exact relation is difficult to determine from the sparse evidence of runic inscriptions.
The western group would have formed in the late Jastorf culture, and the eastern group may be derived from the 1st-century variety of Gotland, leaving southern Sweden as the original location of the northern group. The earliest period of Elder Futhark (2nd to 4th centuries) predates the division in regional script variants, and linguistically essentially still reflect the Common Germanic stage. The Vimose inscriptions include some of the oldest datable Germanic inscriptions, starting in c. 160 AD.
The earliest coherent Germanic text preserved is the 4th-century Gothic translation of the New Testament by Ulfilas. Early testimonies of West Germanic are in Old Frankish/Old Dutch (the 5th-century Bergakker inscription), Old High German (scattered words and sentences 6th century and coherent texts 9th century), and Old English (oldest texts 650, coherent texts 10th century). North Germanic is only attested in scattered runic inscriptions, as Proto-Norse, until it evolves into Old Norse by about 800.
Longer runic inscriptions survive from the 8th and 9th centuries (Eggjum stone, Rök stone), longer texts in the Latin alphabet survive from the 12th century (Íslendingabók), and some skaldic poetry dates back to as early as the 9th century.
By about the 10th century, the varieties had diverged enough to make inter-comprehensibility difficult. The linguistic contact of the Viking settlers of the Danelaw with the Anglo-Saxons left traces in the English language and is suspected to have facilitated the collapse of Old English grammar that resulted in Middle English from the 12th century.
The East Germanic languages were marginalized from the end of the Migration Period. The Burgundians, Goths, and Vandals became linguistically assimilated by their respective neighbors by about the 7th century, with only Crimean Gothic lingering on until the 18th century.
During the early Middle Ages, the West Germanic languages were separated by the insular development of Middle English on one hand and by the High German consonant shift on the continent on the other, resulting in Upper German and Low Saxon, with graded intermediate Central German varieties. By early modern times, the span had extended into considerable differences, ranging from Highest Alemannic in the South to Northern Low Saxon in the North, and, although both extremes are considered German, they are hardly mutually intelligible. The southernmost varieties had completed the second sound shift, while the northern varieties remained unaffected by the consonant shift.
The North Germanic languages, on the other hand, remained unified until well past 1000 AD, and in fact the mainland Scandinavian languages still largely retain mutual intelligibility into modern times. The main split in these languages is between the mainland languages and the island languages to the west, especially Icelandic, which has maintained the grammar of Old Norse virtually unchanged, while the mainland languages have diverged greatly.
Germanic languages possess a number of defining features compared with other Indo-European languages.
Some of the most well-known are the following:
Other significant characteristics are:
Note that some of the above characteristics were not present in Proto-Germanic but developed later as areal features that spread from language to language:
Roughly speaking, Germanic languages differ in how conservative or how progressive each language is with respect to an overall trend toward analyticity. Some, such as Icelandic and, to a lesser extent, German, have preserved much of the complex inflectional morphology inherited from Proto-Germanic (and in turn from Proto-Indo-European). Others, such as English, Swedish, and Afrikaans, have moved toward a largely analytic type.
The subgroupings of the Germanic languages are defined by shared innovations. It is important to distinguish innovations from cases of linguistic conservatism. That is, if two languages in a family share a characteristic that is not observed in a third language, that is evidence of common ancestry of the two languages "only if" the characteristic is an innovation compared to the family's proto-language.
The following innovations are common to the Northwest Germanic languages (all but Gothic):
The following innovations are also common to the Northwest Germanic languages but represent areal changes:
The following innovations are common to the West Germanic languages:
The following innovations are common to the Ingvaeonic subgroup of the West Germanic languages, which includes English, Frisian, and in a few cases Dutch and Low German, but not High German:
The following innovations are common to the Anglo-Frisian subgroup of the Ingvaeonic languages:
The oldest Germanic languages all share a number of features, which are assumed to be inherited from Proto-Germanic. Phonologically, these include the important sound changes known as Grimm's Law and Verner's Law, which introduced a large number of fricatives; late Proto-Indo-European had only one, /s/.
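As an informal illustration, the core consonant correspondences of Grimm's Law can be sketched as a segment-by-segment mapping. This is not a linguistic tool: the ASCII transliterations below are simplified stand-ins for PIE notation, and the example root is given without diacritics or laryngeals.

```python
# Simplified sketch of Grimm's Law as a mapping over consonant segments.
GRIMM = {
    "p": "f",  "t": "th", "k": "h",   # voiceless stops -> voiceless fricatives
    "b": "p",  "d": "t",  "g": "k",   # voiced stops -> voiceless stops
    "bh": "b", "dh": "d", "gh": "g",  # voiced aspirates -> plain voiced stops
}

def apply_grimm(segments):
    """Apply the mapping to a sequence of segments; vowels and other
    segments not covered by the law are left unchanged."""
    return [GRIMM.get(s, s) for s in segments]

# PIE root *ped- "foot": p -> f and d -> t give the Germanic root
# underlying English "foot" (with later vowel developments).
print(apply_grimm(["p", "e", "d"]))  # ['f', 'e', 't']
```

Verner's Law, not shown here, conditions exceptions to this mapping depending on the position of the original PIE accent.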
The main vowel developments are the merging (in most circumstances) of long and short /a/ and /o/, producing short /a/ and long /ō/. That likewise affected the diphthongs, with PIE /ai/ and /oi/ merging into /ai/ and PIE /au/ and /ou/ merging into /au/. PIE /ei/ developed into long /ī/. PIE long /ē/ developed into a vowel denoted as /ē1/ (often assumed to be phonetically ), while a new, fairly uncommon long vowel /ē2/ developed in varied and not completely understood circumstances. Proto-Germanic had no front rounded vowels, but all Germanic languages except for Gothic subsequently developed them through the process of i-umlaut.
Proto-Germanic developed a strong stress accent on the first syllable of the root, but remnants of the original free PIE accent are visible due to Verner's Law, which was sensitive to this accent. That caused a steady erosion of vowels in unstressed syllables. In Proto-Germanic, that had progressed only to the point that absolutely-final short vowels (other than /i/ and /u/) were lost and absolutely-final long vowels were shortened, but all of the early literary languages show a more advanced state of vowel loss. This ultimately resulted in some languages (like Modern English) losing practically all vowels following the main stress and the consequent rise of a very large number of monosyllabic words.
The following table shows the main outcomes of Proto-Germanic vowels and consonants in the various older languages. For vowels, only the outcomes in stressed syllables are shown. Outcomes in unstressed syllables are quite different, vary from language to language and depend on a number of other factors (such as whether the syllable was medial or final, whether the syllable was open or closed and (in some cases) whether the preceding syllable was light or heavy).
Notes:
The oldest Germanic languages have the typical complex inflected morphology of old Indo-European languages, with four or five noun cases; verbs marked for person, number, tense and mood; multiple noun and verb classes; few or no articles; and rather free word order. The old Germanic languages are famous for having only two tenses (present and past), with three PIE past-tense aspects (imperfect, aorist, and perfect/stative) merged into one and no new tenses (future, pluperfect, etc.) developing. There were three moods: indicative, subjunctive (developed from the PIE optative mood) and imperative. Gothic verbs had a number of archaic features inherited from PIE that were lost in the other Germanic languages with few traces, including dual endings, an inflected passive voice (derived from the PIE mediopassive voice), and a class of verbs with reduplication in the past tense (derived from the PIE perfect). The complex tense system of modern English (e.g. "In three months, the house will still be being built" or "If you had not acted so stupidly, we would never have been caught") is almost entirely due to subsequent developments (although paralleled in many of the other Germanic languages).
Among the primary innovations in Proto-Germanic are the preterite present verbs, a special set of verbs whose present tense looks like the past tense of other verbs and which is the origin of most modal verbs in English; a past-tense ending (in the so-called "weak verbs", marked with "-ed" in English) that appears variously as /d/ or /t/, often assumed to be derived from the verb "to do"; and two separate sets of adjective endings, originally corresponding to a distinction between indefinite semantics ("a man", with a combination of PIE adjective and pronoun endings) and definite semantics ("the man", with endings derived from PIE "n"-stem nouns).
Note that most modern Germanic languages have lost most of the inherited inflectional morphology as a result of the steady attrition of unstressed endings triggered by the strong initial stress. (Contrast, for example, the Balto-Slavic languages, which have largely kept the Indo-European pitch accent and consequently preserved much of the inherited morphology.) Icelandic and to a lesser extent modern German best preserve the Proto-Germanic inflectional system, with four noun cases, three genders, and well-marked verbs. English and Afrikaans are at the other extreme, with almost no remaining inflectional morphology.
The following shows a typical masculine "a"-stem noun, Proto-Germanic "*fiskaz" ("fish"), and its development in the various old literary languages:
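The comparative table itself is not reproduced in this copy; as a stand-in, the well-attested nominative singular reflexes of "*fiskaz" can be listed (a sketch showing only the citation forms, not the full case paradigms):

```python
# Attested nominative singular descendants of Proto-Germanic *fiskaz
# ("fish"). Each daughter language reduces or reshapes the ending *-az.
FISKAZ_REFLEXES = {
    "Proto-Germanic": "*fiskaz",
    "Gothic": "fisks",
    "Old Norse": "fiskr",
    "Old English": "fisc",
    "Old Saxon": "fisk",
    "Old High German": "fisc",
}

for lang, form in FISKAZ_REFLEXES.items():
    print(f"{lang}: {form}")
```

The progressive erosion of the unstressed ending (*-az > -s, -r, or zero) illustrates the vowel attrition in unstressed syllables discussed above.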
Originally, adjectives in Proto-Indo-European followed the same declensional classes as nouns. The most common class (the "o/ā" class) used a combination of "o"-stem endings for masculine and neuter genders and "ā"-stem endings for the feminine gender, but other common classes (e.g. the "i" class and "u" class) used endings from a single vowel-stem declension for all genders, and various other classes existed that were based on other declensions. A quite different set of "pronominal" endings was used for pronouns, determiners, and words with related semantics (e.g., "all", "only").
An important innovation in Proto-Germanic was the development of two separate sets of adjective endings, originally corresponding to a distinction between indefinite semantics ("a man") and definite semantics ("the man"). The endings of indefinite adjectives were derived from a combination of pronominal endings with one of the common vowel-stem adjective declensions – usually the "o/ā" class (often termed the "a/ō" class in the specific context of the Germanic languages) but sometimes the "i" or "u" classes. Definite adjectives, however, had endings based on "n"-stem nouns. Originally both types of adjectives could be used by themselves, but already by Proto-Germanic times a pattern evolved whereby definite adjectives had to be accompanied by a determiner with definite semantics (e.g., a definite article, demonstrative pronoun, possessive pronoun, or the like), while indefinite adjectives were used in other circumstances (either accompanied by a word with indefinite semantics such as "a", "one", or "some" or unaccompanied).
In the 19th century, the two types of adjectives – indefinite and definite – were respectively termed "strong" and "weak", names which are still commonly used. These names were based on the appearance of the two sets of endings in modern German. In German, the distinctive case endings formerly present on nouns have largely disappeared, with the result that the load of distinguishing one case from another is almost entirely carried by determiners and adjectives. Furthermore, due to regular sound change, the various definite ("n"-stem) adjective endings coalesced to the point where only two endings ("-e" and "-en") remain in modern German to express the sixteen possible inflectional categories of the language (masculine/feminine/neuter/plural crossed with nominative/accusative/dative/genitive – modern German merges all genders in the plural). The indefinite ("a/ō"-stem) adjective endings were less affected by sound change, with six endings remaining ("-, -e, -es, -er, -em, -en"), cleverly distributed in a way that is capable of expressing the various inflectional categories without too much ambiguity. As a result, the definite endings were thought of as too "weak" to carry inflectional meaning and in need of "strengthening" by the presence of an accompanying determiner, while the indefinite endings were viewed as "strong" enough to indicate the inflectional categories even when standing alone. (This view is enhanced by the fact that modern German largely uses weak-ending adjectives when accompanying an indefinite article, and hence the indefinite/definite distinction no longer clearly applies.) By analogy, the terms "strong" and "weak" were extended to the corresponding noun classes, with "a"-stem and "ō"-stem nouns termed "strong" and "n"-stem nouns termed "weak".
However, in Proto-Germanic – and still in Gothic, the most conservative Germanic language – the terms "strong" and "weak" are not clearly appropriate. For one thing, there were a large number of noun declensions. The "a"-stem, "ō"-stem, and "n"-stem declensions were the most common and represented targets into which the other declensions were eventually absorbed, but this process occurred only gradually. Originally the "n"-stem declension was not a single declension but a set of separate declensions (e.g., "-an", "-ōn", "-īn") with related endings, and these endings were in no way any "weaker" than the endings of any other declensions. (For example, among the eight possible inflectional categories of a noun — singular/plural crossed with nominative/accusative/dative/genitive — masculine "an"-stem nouns in Gothic include seven endings, and feminine "ōn"-stem nouns include six endings, meaning there is very little ambiguity or "weakness" in these endings and in fact much less than in the German "strong" endings.) Although it is possible to group the various noun declensions into three basic categories — vowel-stem, "n"-stem, and other-consonant-stem (a.k.a. "minor declensions") — the vowel-stem nouns do not display any sort of unity in their endings that supports grouping them together with each other but separate from the "n"-stem endings.
It is only in later languages that the binary distinction between "strong" and "weak" nouns becomes more relevant. In Old English, the "n"-stem nouns form a single, clear class, but the masculine "a"-stem and feminine "ō"-stem nouns have little in common with each other, and neither has much similarity to the small class of "u"-stem nouns. Similarly, in Old Norse, the masculine "a"-stem and feminine "ō"-stem nouns have little in common with each other, and the continuations of the masculine "an"-stem and feminine "ōn/īn"-stem nouns are also quite distinct. It is only in Middle Dutch and modern German that the various vowel-stem nouns have merged to the point that a binary strong/weak distinction clearly applies.
As a result, newer grammatical descriptions of the Germanic languages often avoid the terms "strong" and "weak" except in conjunction with German itself, preferring instead to use the terms "indefinite" and "definite" for adjectives and to distinguish nouns by their actual stem class.
In English, both sets of adjective endings were lost entirely in the late Middle English period.
Note that divisions between and among subfamilies of Germanic are rarely precisely defined; most form continuous clines, with adjacent varieties being mutually intelligible and more separated ones not. The Germanic language family comprises East Germanic, West Germanic, and North Germanic; however, the East Germanic languages became extinct several centuries ago.
The table below shows the succession of the significant historical stages of each language (horizontally) and their approximate groupings in subfamilies (vertically). Vertical sequence within each group does not imply a measure of greater or lesser similarity.
All living Germanic languages belong either to the West Germanic or to the North Germanic branch.
The West Germanic group is the larger by far, further subdivided into Anglo-Frisian on one hand and Continental West Germanic on the other. Anglo-Frisian notably includes English and all its variants, while Continental West Germanic includes German (standard register and dialects), as well as Dutch (standard register and dialects).
Modern classification looks like this. For a full classification, see List of Germanic languages.
The earliest evidence of Germanic languages comes from names recorded in the 1st century by Tacitus (especially from his work "Germania"), but the earliest Germanic writing occurs in a single instance in the 2nd century BC on the Negau helmet.
From roughly the 2nd century AD, certain speakers of early Germanic varieties developed the Elder Futhark, an early form of the runic alphabet. Early runic inscriptions also are largely limited to personal names and difficult to interpret. The Gothic language was written in the Gothic alphabet developed by Bishop Ulfilas for his translation of the Bible in the 4th century. Later, Christian priests and monks who spoke and read Latin in addition to their native Germanic varieties began writing the Germanic languages with slightly modified Latin letters. However, throughout the Viking Age, runic alphabets remained in common use in Scandinavia.
In addition to the standard Latin script, many Germanic languages use a variety of accent marks and extra letters, including the ß ("Eszett"), IJ, Ø, Æ, Å, Ä, Ü, Ö, Ð, Ȝ, and the Latinized runes Þ and Ƿ (with its Latin counterpart W). In print, German was prevalently set in blackletter typefaces (e.g., fraktur or schwabacher) until the 1940s; "Kurrent" and, from the early 20th century, "Sütterlin" were the corresponding scripts used for German handwriting.
Yiddish is written using an adapted Hebrew alphabet.
Several of the terms in the table below have had semantic drift. For example, the form "Sterben" and other terms for "die" are cognates with the English word "starve". There are also at least three examples of a common borrowing from a non-Germanic source ("ounce" and "devil" and their cognates from Latin, "church" and its cognates from Greek).
German language
German (, ) is a West Germanic language that is mainly spoken in Central Europe. It is the most widely spoken and official or co-official language in Germany, Austria, Switzerland, South Tyrol in Italy, the German-speaking Community of Belgium, and Liechtenstein. It is one of the three official languages of Luxembourg and a co-official language in the Opole Voivodeship in Poland. The German language is most similar to other languages within the West Germanic language branch, including Afrikaans, Dutch, English, the Frisian languages, Low German/Low Saxon, Luxembourgish, and Yiddish. It also contains close similarities in vocabulary to Danish, Norwegian and Swedish, although they belong to the North Germanic group. German is the second most widely spoken Germanic language, after English.
One of the major languages of the world, German is a native language to almost 100 million people worldwide and the most widely spoken native language in the European Union. German is the third most commonly spoken foreign language in the EU after English and French, making it the second biggest language in the EU in terms of overall speakers. German is also the second most widely taught foreign language in the EU after English at primary school level (but third after English and French at lower secondary level), the fourth most widely taught non-English language in the US (after Spanish, French and American Sign Language), the second most commonly used scientific language and the third most widely used language on websites after English and Russian. The German-speaking countries are ranked fifth in terms of annual publication of new books, with one tenth of all books (including e-books) in the world being published in German. In the United Kingdom, German and French are the most sought-after foreign languages for businesses (with 49% and 50% of businesses identifying these two languages as the most useful, respectively).
German is an inflected language with four cases for nouns, pronouns and adjectives (nominative, accusative, genitive, dative), three genders (masculine, feminine, neuter), two numbers (singular, plural), and strong and weak verbs. It derives the majority of its vocabulary from the ancient Germanic branch of the Indo-European language family, with fewer words borrowed from French and Modern English, and some derived from Latin and Greek. German is a pluricentric language, with its three standardized variants: German, Austrian, and Swiss Standard German. It is also notable for its broad spectrum of dialects, with many unique varieties existing in Europe and other parts of the world. Italy recognizes all the German-speaking minorities in its territory as national historic minorities and protects the varieties of German spoken in several regions of Northern Italy besides South Tyrol. Due to the limited intelligibility between certain varieties and Standard German, as well as the lack of an undisputed, scientific difference between a "dialect" and a "language", some German varieties or dialect groups (e.g. Low German or Plautdietsch) can be described as either "languages" or "dialects".
Modern Standard German is a West Germanic language in the Germanic branch of the Indo-European languages. The Germanic languages are traditionally subdivided into three branches: North Germanic, East Germanic, and West Germanic. The first of these branches survives in modern Danish, Swedish, Norwegian, Faroese, and Icelandic, all of which are descended from Old Norse. The East Germanic languages are now extinct, and Gothic is the only language in this branch which survives in written texts. The West Germanic languages, however, have undergone extensive dialectal subdivision and are now represented in modern languages such as English, German, Dutch, Yiddish, Afrikaans, and others.
Within the West Germanic language dialect continuum, the Benrath and Uerdingen lines (running through Düsseldorf-Benrath and Krefeld-Uerdingen, respectively) serve to distinguish the Germanic dialects that were affected by the High German consonant shift (south of Benrath) from those that were not (north of Uerdingen). The various regional dialects spoken south of these lines are grouped as High German dialects "(nos. 29–34 on the map)", while those spoken to the north comprise the Low German/Low Saxon "(nos. 19–24)" and Low Franconian "(no. 25)" dialects. As members of the West Germanic language family, High German, Low German, and Low Franconian can be further distinguished historically as Irminonic, Ingvaeonic, and Istvaeonic, respectively. This classification indicates their historical descent from dialects spoken by the Irminones (also known as the Elbe group), Ingvaeones (or North Sea Germanic group), and Istvaeones (or Weser-Rhine group).
Standard German is based on a combination of Thuringian-Upper Saxon and Upper Franconian and Bavarian dialects, which are Central German and Upper German dialects, belonging to the Irminonic High German dialect group "(nos. 29–34)". German is therefore closely related to the other languages based on High German dialects, such as Luxembourgish (based on Central Franconian dialects – "no. 29"), and Yiddish. Also closely related to Standard German are the Upper German dialects spoken in the southern German-speaking countries, such as Swiss German (Alemannic dialects – "no. 34"), and the various Germanic dialects spoken in the French region of Grand Est, such as Alsatian (mainly Alemannic, but also Central- and Upper Franconian "(no. 32)" dialects) and Lorraine Franconian (Central Franconian – "no. 29").
After these High German dialects, standard German is less closely related to languages based on Low Franconian dialects (e.g. Dutch and Afrikaans) or Low German/Low Saxon dialects (spoken in northern Germany and southern Denmark), neither of which underwent the High German consonant shift. As has been noted, the former of these dialect types is Istvaeonic and the latter Ingvaeonic, whereas the High German dialects are all Irminonic: the differences between these languages and standard German are therefore considerable. Also related to German are the Frisian languages—North Frisian (spoken in Nordfriesland – "no. 28"), Saterland Frisian (spoken in Saterland – "no. 27"), and West Frisian (spoken in Friesland – "no. 26")—as well as the Anglic languages of English and Scots. These Anglo-Frisian dialects are all members of the Ingvaeonic family of West Germanic languages, which did not take part in the High German consonant shift.
The history of the German language begins with the High German consonant shift during the migration period, which separated Old High German (OHG) dialects from Old Saxon. This sound shift involved a drastic change in the pronunciation of both voiced and voiceless stop consonants ("b", "d", "g", and "p", "t", "k", respectively). The primary effects of the shift were the following:
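The itemized effects are not reproduced in this copy; in standard accounts, the voiceless stops "p", "t", "k" became fricatives after vowels and affricates elsewhere, while the voiced stops were devoiced less consistently (most regularly "d" > "t"). A few English–German cognate pairs illustrate this, since English preserves the unshifted West Germanic consonants (an illustrative sketch, not an exhaustive rule set):

```python
# English/German cognate pairs showing the High German consonant shift.
# English keeps the unshifted West Germanic stops; German shows the
# shifted outcomes.
COGNATES = [
    ("apple", "Apfel"),   # p > pf (affricate)
    ("pound", "Pfund"),   # p > pf
    ("water", "Wasser"),  # t > ss after a vowel
    ("two",   "zwei"),    # t > z [ts] word-initially
    ("make",  "machen"),  # k > ch after a vowel
    ("day",   "Tag"),     # d > t (second phase)
]

for english, german in COGNATES:
    print(f"{english:>6} : {german}")
```

Pairs like these are the classic diagnostic for locating a dialect north or south of the Benrath line discussed earlier.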
While there is written evidence of the Old High German language in several Elder Futhark inscriptions from as early as the 6th century AD (such as the Pforzen buckle), the Old High German period is generally seen as beginning with the "Abrogans" (written c. 765–775), a Latin-German glossary supplying over 3,000 OHG words with their Latin equivalents. After the "Abrogans", the first coherent works written in OHG appear in the 9th century, chief among them being the "Muspilli", the "Merseburg Charms", and the "Hildebrandslied", as well as a number of other religious texts (the "Georgslied", the "Ludwigslied", the "Evangelienbuch", and translated hymns and prayers). The "Muspilli" is a Christian poem written in a Bavarian dialect offering an account of the soul after the Last Judgment, and the "Merseburg Charms" are transcriptions of spells and charms from the pagan Germanic tradition. Of particular interest to scholars, however, has been the "Hildebrandslied", a secular epic poem telling the tale of an estranged father and son unknowingly meeting each other in battle. Linguistically this text is highly interesting due to the mixed use of Old Saxon and Old High German dialects in its composition. The written works of this period stem mainly from the Alamanni, Bavarian, and Thuringian groups, all belonging to the Elbe Germanic group (Irminones), which had settled in what is now southern-central Germany and Austria between the 2nd and 6th centuries during the great migration.
In general, the surviving texts of OHG show a wide range of dialectal diversity with very little written uniformity. The early written tradition of OHG survived mostly through monasteries and scriptoria as local translations of Latin originals; as a result, the surviving texts are written in highly disparate regional dialects and exhibit significant Latin influence, particularly in vocabulary. At this point monasteries, where most written works were produced, were dominated by Latin, and German saw only occasional use in official and ecclesiastical writing.
The German language through the OHG period was still predominantly a spoken language, with a wide range of dialects and a much more extensive oral tradition than a written one. Having just emerged from the High German consonant shift, OHG was also a relatively new and volatile language still undergoing a number of phonetic, phonological, morphological, and syntactic changes. The scarcity of written work, instability of the language, and widespread illiteracy of the time explain the lack of standardization up to the end of the OHG period in 1050.
While there is no complete agreement over the dates of the Middle High German (MHG) period, it is generally seen as lasting from 1050 to 1350. This was a period of significant expansion of the geographical territory occupied by Germanic tribes, and consequently of the number of German speakers. Whereas during the Old High German period the Germanic tribes extended only as far east as the Elbe and Saale rivers, the MHG period saw a number of these tribes expanding beyond this eastern boundary into Slavic territory (known as the "Ostsiedlung"). With the increasing wealth and geographic spread of the Germanic groups came greater use of German in the courts of nobles as the standard language of official proceedings and literature. A clear example of this is the "mittelhochdeutsche Dichtersprache" employed in the Hohenstaufen court in Swabia as a standardized supra-dialectal written language. While these efforts were still regionally bound, German began to be used in place of Latin for certain official purposes, leading to a greater need for regularity in written conventions.
While the major changes of the MHG period were socio-cultural, German was still undergoing significant linguistic changes in syntax, phonetics, and morphology as well (e.g. diphthongization of certain vowel sounds: "hus" (OHG "house")"→haus" (MHG), and weakening of unstressed short vowels to schwa [ə]: "taga" (OHG "days")→"tage" (MHG)).
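The weakening of unstressed short vowels to schwa can be sketched as a toy rule (an assumption-laden simplification: only word-final short vowels are handled, and the real conditioning environments are ignored):

```python
SHORT_VOWELS = "aiou"

def weaken_final_vowel(ohg_word: str) -> str:
    """Toy rule: replace a word-final unstressed short vowel with <e>
    (schwa), as in OHG taga > MHG tage. The historical change applied
    more broadly and under more complex conditions."""
    if ohg_word and ohg_word[-1] in SHORT_VOWELS:
        return ohg_word[:-1] + "e"
    return ohg_word

print(weaken_final_vowel("taga"))  # tage
```

The collapse of distinct final vowels into a single ⟨e⟩ is one step in the long attrition of unstressed endings noted for the Germanic languages generally.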
A great wealth of texts survives from the MHG period. Significantly, these texts include a number of impressive secular works, such as the "Nibelungenlied", an epic poem telling the story of the dragon-slayer Siegfried (c. 13th century), and the "Iwein," an Arthurian verse poem by Hartmann von Aue (c. 1203), as well as several lyric poems and courtly romances such as "Parzival" and "Tristan". Also noteworthy is the "Sachsenspiegel", the first book of laws written in Middle "Low" German (c. 1220). The abundance and especially the secular character of the literature of the MHG period demonstrate the beginnings of a standardized written form of German, as well as the desire of poets and authors to be understood by individuals on supra-dialectal terms.
The Middle High German period is generally seen as ending when the 1346–53 Black Death decimated Europe's population.
Modern German begins with the Early New High German (ENHG) period, which the influential German philologist Wilhelm Scherer dates 1350–1650, terminating with the end of the Thirty Years' War. This period saw the further displacement of Latin by German as the primary language of courtly proceedings and, increasingly, of literature in the German states. While these states were still under the control of the Holy Roman Empire and far from any form of unification, the desire for a cohesive written language that would be understandable across the many German-speaking principalities and kingdoms was stronger than ever. As a spoken language German remained highly fractured throughout this period, with a vast number of often mutually incomprehensible regional dialects being spoken throughout the German states; the invention of the printing press c. 1440 and the publication of Luther's vernacular translation of the Bible in 1534, however, had an immense effect on standardizing German as a supra-dialectal written language.
The ENHG period saw the rise of several important cross-regional forms of chancery German, one being "gemeine tiutsch," used in the court of the Holy Roman Emperor Maximilian I, and the other being "Meißner Deutsch", used in the Electorate of Saxony in the Duchy of Saxe-Wittenberg. Alongside these courtly written standards, the invention of the printing press led to the development of a number of printers' languages ("Druckersprachen") aimed at making printed material readable and understandable across as many diverse dialects of German as possible. The greater ease of production and increased availability of written texts brought about increased standardization in the written form of German.

One of the central events in the development of ENHG was the publication of Luther's translation of the Bible into German (the New Testament was published in 1522; the Old Testament was published in parts and completed in 1534). Luther based his translation primarily on the "Meißner Deutsch" of Saxony, spending much time among the population of Saxony researching the dialect so as to make the work as natural and accessible to German speakers as possible. Copies of Luther's Bible featured a long list of glosses for each region, translating words which were unknown in the region into the regional dialect. Luther said the following concerning his translation method:

One who would talk German does not ask the Latin how he shall do it; he must ask the mother in the home, the children on the streets, the common man in the market-place and note carefully how they talk, then translate accordingly. They will then understand what is said to them because it is German. When Christ says 'ex abundantia cordis os loquitur,' I would translate, if I followed the papists, "aus dem Überflusz des Herzens redet der Mund". But tell me is this talking German? What German understands such stuff? No, the mother in the home and the plain man would say, "Wesz das Herz voll ist, des gehet der Mund über".

With Luther's rendering of the Bible in the vernacular, German asserted itself against the dominance of Latin as a legitimate language for courtly, literary, and now ecclesiastical subject-matter. Furthermore, his Bible was ubiquitous in the German states: nearly every household possessed a copy. Nevertheless, even with the influence of Luther's Bible as an unofficial written standard, a widely accepted standard for written German did not appear until the middle of the 18th century.
German was the language of commerce and government in the Habsburg Empire, which encompassed a large area of Central and Eastern Europe. Until the mid-19th century, it was essentially the language of townspeople throughout most of the Empire. Its use indicated that the speaker was a merchant or someone from an urban area, regardless of nationality.
Some cities, such as Prague (German: "Prag") and Budapest (Buda, German: "Ofen"), were gradually Germanized in the years after their incorporation into the Habsburg domain. Others, such as Pozsony (German: "Pressburg", now Bratislava), were originally settled during the Habsburg period and were primarily German at that time. Prague, Budapest and Bratislava, as well as cities like Zagreb (German: "Agram") and Ljubljana (German: "Laibach"), contained significant German minorities.
In the eastern provinces of Banat, Bukovina, and Transylvania (German: "Siebenbürgen"), German was the predominant language not only in the larger towns – such as "Temeswar" (Timișoara), "Hermannstadt" (Sibiu) and "Kronstadt" (Brașov) – but also in many smaller localities in the surrounding areas.
The most comprehensive guide to the vocabulary of the German language is found within the "Deutsches Wörterbuch". This dictionary was created by the Brothers Grimm and is composed of 16 parts which were issued between 1852 and 1860. In 1872, grammatical and orthographic rules first appeared in the "Duden Handbook".
In 1901, the 2nd Orthographical Conference ended with a complete standardization of the German language in its written form and the "Duden Handbook" was declared its standard definition. The "Deutsche Bühnensprache" (literally, German stage language) had established conventions for German pronunciation in theatres (Bühnendeutsch) three years earlier; however, this was an artificial standard that did not correspond to any traditional spoken dialect. Rather, it was based on the pronunciation of Standard German in Northern Germany, although it was subsequently regarded often as a general prescriptive norm, despite differing pronunciation traditions especially in the Upper-German-speaking regions that still characterize the dialect of the area today – especially the pronunciation of the ending "-ig" as [ɪk] instead of [ɪç]. In Northern Germany, Standard German was a foreign language to most inhabitants, whose native dialects were subsets of Low German. It was usually encountered only in writing or formal speech; in fact, most of Standard German was a written language, not identical to any spoken dialect, throughout the German-speaking area until well into the 19th century.
Official revisions of some of the rules from 1901 were not issued until the controversial German orthography reform of 1996 was made the official standard by governments of all German-speaking countries. Media and written works are now almost all produced in Standard German (often called "Hochdeutsch", "High German"), which is understood in all areas where German is spoken.
Due to the German diaspora as well as German being the second most widely spoken language in Europe and the third most widely taught foreign language in the US and the EU (in upper secondary education) amongst others, the geographical distribution of German speakers (or "Germanophones") spans all inhabited continents. As for the number of speakers of any language worldwide, an assessment is always compromised by the lack of sufficient, reliable data. For an exact, global number of native German speakers, this is further complicated by the existence of several varieties whose status as separate "languages" or "dialects" is disputed for political and/or linguistic reasons, including quantitatively strong varieties like certain forms of Alemannic (e.g., Alsatian) and Low German/Plautdietsch. Depending on the inclusion or exclusion of certain varieties, it is estimated that approximately 90–95 million people speak German as a first language, 10–25 million as a second language, and 75–100 million as a foreign language. This would imply the existence of approximately 175–220 million German speakers worldwide. It is estimated that including every person studying German, regardless of their actual proficiency, would amount to about 280 million people worldwide with at least some knowledge of German.
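The worldwide total of 175–220 million quoted above follows directly from summing the low and high ends of the three estimate ranges, which a quick check confirms:

```python
# Estimate ranges quoted above, in millions of speakers.
native  = (90, 95)    # first-language speakers
second  = (10, 25)    # second-language speakers
foreign = (75, 100)   # foreign-language speakers

low  = native[0] + second[0] + foreign[0]
high = native[1] + second[1] + foreign[1]
print(f"{low}-{high} million German speakers")  # 175-220 million
```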
In Europe, German is the second most widely spoken mother tongue (after Russian) and the second biggest language in terms of overall speakers (after English). The area in central Europe where the majority of the population speaks German as a first language and has German as a (co-)official language is called the "German "Sprachraum"". It comprises an estimated 88 million native speakers and 10 million who speak German as a second language (e.g. immigrants). Excluding regional minority languages, German is the only official language of the following countries:
German is a co-official language of the following countries:
Although expulsions and (forced) assimilation after the two World Wars greatly diminished them, minority communities of mostly bilingual German native speakers exist in areas both adjacent to and detached from the Sprachraum.
Within Europe and Asia, German is a recognized minority language in the following countries:
In France, the High German varieties of Alsatian and Moselle Franconian are identified as "regional languages", but the European Charter for Regional and Minority Languages of 1998 has not yet been ratified by the government. In the Netherlands, the Limburgish, Frisian, and Low German languages are protected regional languages according to the European Charter for Regional and Minority Languages; however, they are widely considered separate languages and neither German nor Dutch dialects.
Namibia was a colony of the German Empire from 1884 to 1919. Mostly descending from German settlers who immigrated during this time, 25,000–30,000 people still speak German as a native tongue today. The period of German colonialism in Namibia also led to the evolution of a Standard German-based pidgin language called "Namibian Black German", which became a second language for parts of the indigenous population. Although it is nearly extinct today, some older Namibians still have some knowledge of it.
German, along with English and Afrikaans, was a co-official language of Namibia from 1984 until its independence from South Africa in 1990. At this point, the Namibian government perceived Afrikaans and German as symbols of apartheid and colonialism, and decided English would be the sole official language, stating that it was a "neutral" language as there were virtually no English native speakers in Namibia at that time. German, Afrikaans and several indigenous languages became "national languages" by law, identifying them as elements of the cultural heritage of the nation and ensuring that the state acknowledged and supported their presence in the country. Today, German is used in a wide variety of spheres, especially business and tourism, as well as the churches (most notably the German-speaking Evangelical Lutheran Church in Namibia (GELK)), schools (e.g. the ), literature (German-Namibian authors include Giselher W. Hoffmann), radio (the Namibian Broadcasting Corporation produces radio programs in German), and music (e.g. artist EES). The is one of the three biggest newspapers in Namibia and the only German-language daily in Africa.
Mostly originating from different waves of immigration during the 19th and 20th centuries, an estimated 12,000 people speak German or a German variety as a first language in South Africa. One of the largest communities consists of the speakers of "Nataler Deutsch", a variety of Low German concentrated in and around Wartburg. The small town of Kroondal in the North-West Province also has a mostly German-speaking population. The South African constitution identifies German as a "commonly used" language and the Pan South African Language Board is obligated to promote and ensure respect for it. The community is strong enough that several German International schools are supported, such as the Deutsche Schule Pretoria.
In the United States, the states of North Dakota and South Dakota are the only states where German is the most common language spoken at home after English. German geographical names can be found throughout the Midwest region of the country, such as New Ulm and many other towns in Minnesota; Bismarck (North Dakota's state capital), Munich, Karlsruhe, and Strasburg (named after a town near Odessa in Ukraine) in North Dakota; New Braunfels, Fredericksburg, Weimar, and Muenster in Texas; Corn (formerly Korn), Kiefer and Berlin in Oklahoma; and Kiel, Berlin, and Germantown in Wisconsin.
In Brazil, the largest concentrations of German speakers are in the states of Rio Grande do Sul (where Riograndenser Hunsrückisch developed), Santa Catarina, Paraná, São Paulo and Espírito Santo.
There are important concentrations of German-speaking descendants in Argentina, Chile, Paraguay, Venezuela, Peru, and Bolivia.
The impact of nineteenth century German immigration to southern Chile was such that Valdivia was for a while a Spanish-German bilingual city with "German signboards and placards alongside the Spanish". The prestige of the German language caused it to acquire qualities of a superstratum in southern Chile. The word for blackberry, a ubiquitous plant in southern Chile, is "murra", instead of the ordinary Spanish words "mora" and "zarzamora", from Valdivia to the Chiloé Archipelago and in some towns in the Aysén Region. The use of "rr" is an adaptation of guttural sounds found in German but difficult to pronounce in Spanish. Similarly, the name for marbles, a traditional children's game, is different in southern Chile compared to areas further north. From Valdivia to the Aysén Region this game is called "bochas", in contrast to the word "bolitas" used further north. The word "bocha" is likely a derivative of the German "Bocciaspiel".
In Australia, the state of South Australia experienced a pronounced wave of immigration in the 1840s from Prussia (particularly the Silesia region). With the prolonged isolation from other German speakers and contact with Australian English, a unique dialect known as Barossa German developed, spoken predominantly in the Barossa Valley near Adelaide. Usage of German sharply declined with the advent of World War I, due to the prevailing anti-German sentiment in the population and related government action. It continued to be used as a first language into the 20th century, but its use is now limited to a few older speakers.
German migration to New Zealand in the 19th century was less pronounced than migration from Britain, Ireland, and perhaps even Scandinavia. Despite this there were significant pockets of German-speaking communities which lasted until the first decades of the 20th century. German speakers settled principally in Puhoi, Nelson, and Gore. At the last census (2013), 36,642 people in New Zealand spoke German, making it the third most spoken European language after English and French and overall the ninth most spoken language.
There is also an important German creole being studied and recovered, named , spoken in the former German colony of German New Guinea, across Micronesia and in northern Australia (i.e. coastal parts of Queensland and Western Australia) by a few elderly people. The risk of its extinction is serious and efforts to revive interest in the language are being implemented by scholars.
Like French and Spanish, German has become a classic second foreign language in the western world, as English (Spanish in the US) is well established as the first foreign language. German ranks second (after English) among the best known foreign languages in the EU (on a par with French) as well as in Russia. In terms of student numbers across all levels of education, German ranks third in the EU (after English and French) as well as in the United States (after Spanish and French). In 2015, approximately 15.4 million people were in the process of learning German across all levels of education worldwide. Because this number has remained relatively stable since 2005 (± 1 million), it can be inferred that roughly 75–100 million people are able to communicate in German as a foreign language, assuming an average course duration of three years and other estimated parameters. According to a 2012 survey, 47 million people within the EU (i.e., up to two-thirds of the 75–100 million worldwide) claimed to have sufficient German skills to have a conversation. Within the EU, not counting countries where it is an official language, German as a foreign language is most popular in Eastern and Northern Europe, namely the Czech Republic, Croatia, Denmark, the Netherlands, Slovakia, Hungary, Slovenia, Sweden and Poland. German was once, and to some extent still is, a lingua franca in those parts of Europe.
The basis of Standard German is the language of the Luther Bible, Martin Luther's translation of the Bible, which in turn drew on the Saxon court language as a convenient norm. However, there are places where the traditional regional dialects have been replaced by new vernaculars based on standard German; that is the case in large stretches of Northern Germany but also in major cities in other parts of the country. It is important to note, however, that the colloquial standard German differs greatly from the formal written language, especially in grammar and syntax, in which it has been influenced by dialectal speech.
Standard German differs regionally among German-speaking countries in vocabulary and some instances of pronunciation and even grammar and orthography. This variation must not be confused with the variation of local dialects. Even though the regional varieties of standard German are only somewhat influenced by the local dialects, they are very distinct. German is thus considered a pluricentric language.
In most regions, the speakers use a continuum from more dialectal varieties to more standard varieties depending on the circumstances.
In German linguistics, German dialects are distinguished from varieties of standard German.
The "varieties of standard German" refer to the different local varieties of the pluricentric standard German. They differ only slightly in lexicon and phonology. In certain regions, they have replaced the traditional German dialects, especially in Northern Germany.
In the German-speaking parts of Switzerland, mixtures of dialect and standard are very seldom used, and the use of Standard German is largely restricted to the written language. About 11% of the Swiss residents speak "High German" (Standard German) at home, but this is mainly due to German immigrants. This situation has been called a "medial diglossia". Swiss Standard German is used in the Swiss education system, while Austrian Standard German is officially used in the Austrian education system.
A mixture of dialect and standard does not normally occur in Northern Germany either. The traditional varieties there are Low German, whereas Standard German is a High German "variety". Because their linguistic distance is greater, they do not mesh with Standard German the way that High German dialects (such as Bavarian, Swabian, and Hessian) can.
The German dialects are the traditional local varieties of the language; many of them are not mutually intelligible with standard German, and they have great differences in lexicon, phonology, and syntax. If a narrow definition of language based on mutual intelligibility is used, many German dialects are considered to be separate languages (for instance in the "Ethnologue"). However, such a point of view is unusual in German linguistics.
The German dialect continuum is traditionally divided most broadly into High German and Low German, also called Low Saxon. However, historically, High German dialects and Low Saxon/Low German dialects do not belong to the same language. Nevertheless, in today's Germany, Low Saxon/Low German is often perceived as a dialectal variation of Standard German on a functional level even by many native speakers. The same phenomenon is found in the eastern Netherlands, as the traditional dialects are not always identified with their Low Saxon/Low German origins, but with Dutch.
The variation among the German dialects is considerable, with often only neighbouring dialects being mutually intelligible. Some dialects are not intelligible to people who know only Standard German. However, all German dialects belong to the dialect continuum of High German and Low Saxon.
Middle Low German was the lingua franca of the Hanseatic League. It was the predominant language in Northern Germany until the 16th century. In 1534, the Luther Bible was published. The translation is considered to be an important step towards the evolution of the Early New High German. It aimed to be understandable to a broad audience and was based mainly on Central and Upper German varieties. The Early New High German language gained more prestige than Low German and became the language of science and literature. Around the same time, the Hanseatic League, based around northern ports, lost its importance as new trade routes to Asia and the Americas were established, and the most powerful German states of that period were located in Middle and Southern Germany.
The 18th and 19th centuries were marked by mass education in Standard German in schools. Gradually, Low German came to be politically viewed as a mere dialect spoken by the uneducated. Today, Low Saxon can be divided into two groups: Low Saxon varieties with a reasonable level of Standard German influence and varieties of Standard German with a Low Saxon influence known as . Sometimes, Low Saxon and Low Franconian varieties are grouped together because both are unaffected by the High German consonant shift. However, the proportion of the population who can understand and speak Low German has decreased continuously since World War II. The largest cities in the Low German area are Hamburg and Dortmund.
The Low Franconian dialects are the dialects that are more closely related to Dutch than to Low German. Most of the Low Franconian dialects are spoken in the Netherlands and Belgium, where they are considered as dialects of Dutch, which is itself a Low Franconian language. In Germany, Low Franconian dialects are spoken in the northwest of North Rhine-Westphalia, along the Lower Rhine. The Low Franconian dialects spoken in Germany are referred to as Meuse-Rhenish or Low Rhenish. In the north of the German Low Franconian language area, North Low Franconian dialects (also referred to as Cleverlands or as dialects of South Guelderish) are spoken. These dialects are more closely related to Dutch (also North Low Franconian) than the South Low Franconian dialects (also referred to as East Limburgish and, east of the Rhine, Bergish), which are spoken in the south of the German Low Franconian language area. The South Low Franconian dialects are more closely related to Limburgish than to Dutch, and are transitional dialects between Low Franconian and Ripuarian (Central Franconian). The East Bergish dialects are the easternmost Low Franconian dialects, and are transitional dialects between North- and South Low Franconian, and Westphalian (Low German), with most of their features being North Low Franconian. The largest cities in the German Low Franconian area are Düsseldorf and Duisburg.
The High German dialects consist of the Central German, High Franconian, and Upper German dialects. The High Franconian dialects are transitional dialects between Central and Upper German. The High German varieties spoken by the Ashkenazi Jews have several unique features and are considered as a separate language, Yiddish, written with the Hebrew alphabet.
The Central German dialects are spoken in Central Germany, from Aachen in the west to Görlitz in the east. They consist of Franconian dialects in the west (West Central German) and non-Franconian dialects in the east (East Central German). Modern Standard German is mostly based on Central German dialects.
The Franconian, West Central German dialects are the Central Franconian dialects (Ripuarian and Moselle Franconian) and the Rhine Franconian dialects (Hessian and Palatine).
Luxembourgish as well as the Transylvanian Saxon dialect spoken in Transylvania are based on Moselle Franconian dialects. The largest cities in the Franconian Central German area are Cologne and Frankfurt.
Further east, the non-Franconian, East Central German dialects are spoken (Thuringian, Upper Saxon, Ore Mountainian, and Lusatian-New Markish, and earlier, in the then German-speaking parts of Silesia also Silesian, and in then German southern East Prussia also High Prussian). The largest cities in the East Central German area are Berlin and Leipzig.
The High Franconian dialects are transitional dialects between Central and Upper German. They consist of the East and South Franconian dialects.
The East Franconian dialect branch is one of the most spoken dialect branches in Germany. These dialects are spoken in the region of Franconia and in the central parts of Saxon Vogtland. Franconia consists of the Bavarian districts of Upper, Middle, and Lower Franconia, the region of South Thuringia (Thuringia), and the eastern parts of the region of Heilbronn-Franken (Tauber Franconia and Hohenlohe) in Baden-Württemberg. The largest cities in the East Franconian area are Nuremberg and Würzburg.
South Franconian is mainly spoken in northern Baden-Württemberg in Germany, but also in the northeasternmost part of the region of Alsace in France. While these dialects are considered as dialects of German in Baden-Württemberg, they are considered as dialects of Alsatian in Alsace (most Alsatian dialects are Low Alemannic, however). The largest cities in the South Franconian area are Karlsruhe and Heilbronn.
The Upper German dialects are the Alemannic dialects in the west and the Bavarian dialects in the east.
Alemannic dialects are spoken in Switzerland (High Alemannic in the densely populated Swiss Plateau, in the south also Highest Alemannic, and Low Alemannic in Basel), Baden-Württemberg (Swabian and Low Alemannic, in the southwest also High Alemannic), Bavarian Swabia (Swabian, in the southwesternmost part also Low Alemannic), Vorarlberg (Low, High, and Highest Alemannic), Alsace (Low Alemannic, in the southernmost part also High Alemannic), Liechtenstein (High and Highest Alemannic), and in the Tyrolean district of Reutte (Swabian). The Alemannic dialects are considered as Alsatian in Alsace. The largest cities in the Alemannic area are Stuttgart and Zürich.
Bavarian dialects are spoken in Austria (Vienna, Lower and Upper Austria, Styria, Carinthia, Salzburg, Burgenland, and in most parts of Tyrol), Bavaria (Upper and Lower Bavaria as well as Upper Palatinate), South Tyrol, southwesternmost Saxony (Southern Vogtlandian), and in the Swiss village of Samnaun. The largest cities in the Bavarian area are Vienna and Munich.
German is a fusional language with a moderate degree of inflection, with three grammatical genders; as such, there can be a large number of words derived from the same root.
German nouns inflect by case, gender, and number:
This degree of inflection is considerably less than in Old High German and other old Indo-European languages such as Latin, Ancient Greek, and Sanskrit, and it is also somewhat less than, for instance, Old English, modern Icelandic, or Russian. The three genders have collapsed in the plural. With four cases and three genders plus plural, there are 16 permutations of case and gender/number of the article (not the nouns), but there are only six forms of the definite article, which together cover all 16 permutations. In nouns, inflection for case is required in the singular for strong masculine and neuter nouns only in the genitive and in the dative (only in fixed or archaic expressions), and even this is losing ground to substitutes in informal speech. The singular dative noun ending is considered archaic or at least old-fashioned in almost all contexts and is almost always dropped in writing, except in poetry, songs, proverbs, and other petrified forms. Weak masculine nouns share a common case ending for genitive, dative, and accusative in the singular. Feminine nouns are not declined in the singular. The plural has an inflection for the dative. In total, seven inflectional endings (not counting plural markers) exist in German: .
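The claim that six forms of the definite article cover all 16 case and gender/number permutations can be checked mechanically. The following sketch (illustrative only, not drawn from any standard library) lays out the full paradigm and counts the distinct forms:

```python
# Full paradigm of the German definite article.
# 4 cases x 4 gender/number columns = 16 cells, but only six
# distinct forms appear: der, die, das, den, dem, des.
DEFINITE_ARTICLE = {
    # case: (masculine, feminine, neuter, plural)
    "nominative": ("der", "die", "das", "die"),
    "accusative": ("den", "die", "das", "die"),
    "dative":     ("dem", "der", "dem", "den"),
    "genitive":   ("des", "der", "des", "der"),
}

cells = [form for row in DEFINITE_ARTICLE.values() for form in row]
distinct = set(cells)

print(len(cells))        # 16 paradigm cells
print(sorted(distinct))  # the six distinct forms
```

The heavy reuse of forms across cells is exactly why the article alone often underdetermines case, and why German relies on context and noun endings to disambiguate.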
In German orthography, nouns and most words with the syntactical function of nouns are capitalised to make it easier for readers to determine the function of a word within a sentence ( – "On Friday I went shopping."; – "One day he finally showed up.") This convention is almost unique to German today (shared perhaps only by the closely related Luxembourgish language and several insular dialects of the North Frisian language), but it was historically common in other languages such as Danish (which abolished the capitalization of nouns in 1948) and English.
Like the other Germanic languages, German forms noun compounds in which the first noun modifies the category given by the second: ("dog hut"; specifically: "dog kennel"). Unlike English, whose newer compounds or combinations of longer nouns are often written "open" with separating spaces, German (like some other Germanic languages) nearly always uses the "closed" form without spaces, for example: ("tree house"). Like English, German allows arbitrarily long compounds in theory (see also English compounds). The longest German word verified to be actually in (albeit very limited) use is , which, literally translated, is "beef labelling supervision duties assignment law" [from (cattle), (meat), (labelling), (supervision), (duties), (assignment), (law)]. However, examples like this are perceived by native speakers as excessively bureaucratic, stylistically awkward, or even satirical.
The inflection of standard German verbs includes:
The meaning of basic verbs can be expanded and sometimes radically changed through the use of a number of prefixes. Some prefixes have a specific meaning; the prefix ' refers to destruction, as in (to tear apart), (to break apart), (to cut apart). Other prefixes have only the vaguest meaning in themselves; ' is found in a number of verbs with a large variety of meanings, as in (to try) from (to seek), (to interrogate) from (to take), (to distribute) from (to share), (to understand) from (to stand).
Other examples include the following:
Many German verbs have a separable prefix, often with an adverbial function. In finite verb forms, it is split off and moved to the end of the clause and is hence considered by some to be a "resultative particle". For example, , meaning "to go along", would be split, giving (Literal: "Go you with?"; Idiomatic: "Are you going along?").
Indeed, several parenthetical clauses may occur between the prefix of a finite verb and its complement (ankommen = to arrive, er kam an = he arrived, er ist angekommen = he has arrived):
A selectively literal translation of this example to illustrate the point might look like this:
German main clauses generally follow the V2 (verb-second) word order restriction on top of an underlying SOV order. In polar questions, exclamations, and wishes, the finite verb always takes first position. In subordinate clauses, the verb occurs at the very end.
German requires a verbal element (main verb or auxiliary verb) to appear second in the sentence. The verb is preceded by the topic of the sentence. The element in focus appears at the end of the sentence. For a sentence without an auxiliary, there are several possibilities:
The position of a noun in a German sentence has no bearing on its being a subject, an object or another argument. In a declarative sentence in English, if the subject does not occur before the predicate, the sentence could well be misunderstood.
However, German's flexible word order allows one to emphasise specific words:
Normal word order:
Object in front:
Adverb of time in front:
Both time expressions in front:
Another possibility:
Swapped adverbs:
Swapped object:
The flexible word order also allows one to use language "tools" (such as poetic meter and figures of speech) more freely.
When an auxiliary verb is present, it appears in second position, and the main verb appears at the end. This occurs notably in the creation of the perfect tense. Many word orders are still possible:
The main verb may appear in first position to put stress on the action itself. The auxiliary verb is still in second position.
Sentences using modal verbs place the infinitive at the end. For example, the English sentence "Should he go home?" would be rearranged in German to say "Should he (to) home go?" (). Thus, in sentences with several subordinate or relative clauses, the infinitives are clustered at the end. Compare the similar clustering of prepositions in the following (highly contrived) English sentence: "What did you bring that book that I do not like to be read to out of up for?"
German subordinate clauses have all verbs clustered at the end. Given that auxiliaries encode future, passive, modality, and the perfect, very long chains of verbs at the end of the sentence can occur. In these constructions, the past participle formed with is often replaced by the infinitive.
The order at the end of such strings is subject to variation, but the second one in the last example is unusual.
Most German vocabulary is derived from the Germanic branch of the Indo-European language family. However, there is a significant amount of loanwords from other languages, in particular Latin, Greek, Italian, French, and most recently English. In the early 19th century, Joachim Heinrich Campe estimated that one fifth of the total German vocabulary was of French or Latin origin.
Latin words were already imported into the predecessor of the German language during the Roman Empire and underwent all the characteristic phonetic changes in German. Their origin is thus no longer recognizable for most speakers (e.g. , , , , from Latin , , , , ). Borrowing from Latin continued after the fall of the Roman Empire during Christianization, mediated by the church and monasteries. Another important influx of Latin words can be observed during Renaissance humanism. In a scholarly context, the borrowings from Latin have continued until today, in the last few decades often indirectly through borrowings from English. During the 15th to 17th centuries, the influence of Italian was great, leading to many Italian loanwords in the fields of architecture, finance, and music. The influence of the French language in the 17th to 19th centuries resulted in an even greater import of French words. The English influence was already present in the 19th century, but it did not become dominant until the second half of the 20th century.
Thus, Notker Labeo was able to translate Aristotelian treatises into pure (Old High) German in the decades after the year 1000. The tradition of loan translation was revitalized in the 18th century with linguists like Joachim Heinrich Campe, who introduced close to 300 words that are still used in modern German. Even today, there are movements that try to promote the (substitution) of foreign words that are deemed unnecessary with German alternatives. It is claimed that this would also help in spreading modern or scientific notions among the less educated, as well as democratise public life.
As in English, there are many pairs of synonyms due to the enrichment of the Germanic vocabulary with loanwords from Latin and Latinized Greek. These words often have different connotations from their Germanic counterparts and are usually perceived as more scholarly.
The size of the vocabulary of German is difficult to estimate. The ("German Dictionary") initiated by Jacob and Wilhelm Grimm already contained over 330,000 headwords in its first edition. The modern German scientific vocabulary is estimated at nine million words and word groups (based on the analysis of 35 million sentences of a corpus in Leipzig, which as of July 2003 included 500 million words in total).
The Duden is the "de facto" official dictionary of the German language, first published by Konrad Duden in 1880. The Duden is updated regularly, with new editions appearing every four or five years. Its 27th edition comprises 12 volumes, each covering different aspects such as loanwords, etymology, pronunciation, and synonyms. The first of these volumes, (German Orthography), has long been the prescriptive source for the spelling of German. The "Duden" has become the bible of the German language, being the definitive set of rules regarding grammar, spelling and usage of German.
The ("Austrian Dictionary"), abbreviated , is the official dictionary of the German language in the Republic of Austria. It is edited by a group of linguists under the authority of the Austrian Federal Ministry of Education, Arts and Culture (). It is the Austrian counterpart to the German "Duden" and contains a number of terms unique to Austrian German or more frequently used or differently pronounced there. A considerable amount of this "Austrian" vocabulary is also common in Southern Germany, especially Bavaria, and some of it is used in Switzerland as well. Since the 39th edition in 2001, its orthography has been adjusted to the German spelling reform of 1996. The dictionary is also officially used in the Italian province of South Tyrol.
This is a selection of cognates in both English and German. Instead of the usual infinitive ending "-en", German verbs are indicated by a hyphen after their stems. Words that are written with capital letters in German are nouns.
German is written in the Latin alphabet. In addition to the 26 standard letters, German has three vowels with an umlaut mark, namely "ä", "ö" and "ü", as well as the eszett or (sharp s): "ß". In Switzerland and Liechtenstein, "ss" is used instead of "ß". Since "ß" can never occur at the beginning of a word, it has no traditional uppercase form.
Written texts in German are easily recognisable as such by distinguishing features such as umlauts and certain orthographical features – German is the only major language that capitalizes all nouns, a relic of a widespread practice in Northern Europe in the early modern era (including English for a while, in the 1700s) – and the frequent occurrence of long compounds. Because legibility and convenience set certain boundaries, compounds consisting of more than three or four nouns are almost exclusively found in humorous contexts. (In contrast, although English can also string nouns together, it usually separates the nouns with spaces. For example, "toilet bowl cleaner".)
Before the German orthography reform of 1996, "ß" replaced "ss" after long vowels and diphthongs and before consonants, word-, or partial-word endings. In reformed spelling, "ß" replaces "ss" only after long vowels and diphthongs.
Since there is no traditional capital form of "ß", it was replaced by "SS" when capitalization was required. For example, (tape measure) became in capitals. An exception was the use of ß in legal documents and forms when capitalizing names. To avoid confusion with similar names, lower case "ß" was maintained (thus "" instead of ""). Capital ß (ẞ) was ultimately adopted into German orthography in 2017, ending a long orthographic debate (thus " and ").
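The traditional ß → SS substitution is still encoded in Unicode's default case mappings, which Python's string methods follow. A minimal demonstration (the example word is illustrative):

```python
# Unicode's default uppercase mapping preserves the traditional rule:
# lowercase "ß" uppercases to "SS", while the capital ẞ (U+1E9E,
# adopted into German orthography in 2017) lowercases back to "ß".
band = "Maßband"               # tape measure
print(band.upper())            # MASSBAND - ß replaced by SS

capital_eszett = "\u1e9e"      # ẞ
print(capital_eszett.lower())  # ß
```

Note that the mapping is not round-trip safe: uppercasing "ß" yields "SS", and lowercasing that gives "ss", which is one reason the capital ẞ was eventually standardized.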
Umlaut vowels (ä, ö, ü) are commonly transcribed with ae, oe, and ue if the umlauts are not available on the keyboard or other medium used. In the same manner ß can be transcribed as ss. Some operating systems use key sequences to extend the set of possible characters to include, amongst other things, umlauts; in Microsoft Windows this is done using Alt codes. German readers understand these transcriptions (although they appear unusual), but they are avoided if the regular umlauts are available, because they are a makeshift and not proper spelling. (In Westphalia and Schleswig-Holstein, city and family names exist where the extra e has a vowel lengthening effect, e.g. "Raesfeld" , "Coesfeld" and "Itzehoe" , but this use of the letter e after a/o/u does not occur in the present-day spelling of words other than proper nouns.)
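The makeshift ASCII transcription described above is simple enough to sketch directly; the helper below is illustrative, not a standard library function:

```python
# Fallback ASCII transcription for German special characters:
# umlauts become base vowel + e, and ß becomes ss.
SUBSTITUTIONS = {
    "ä": "ae", "ö": "oe", "ü": "ue", "ß": "ss",
    "Ä": "Ae", "Ö": "Oe", "Ü": "Ue",
}

def ascii_transcription(text: str) -> str:
    """Replace German special characters with their ASCII fallbacks."""
    for original, fallback in SUBSTITUTIONS.items():
        text = text.replace(original, fallback)
    return text

print(ascii_transcription("Straße"))  # Strasse
print(ascii_transcription("Müller"))  # Mueller
```

As the text notes, the reverse mapping is not reliable: a name like "Raesfeld" contains a genuine "ae" that must not be converted back to "ä", so round-tripping requires a name list rather than a rule.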
There is no general agreement on where letters with umlauts occur in the sorting sequence. Telephone directories treat them by replacing them with the base vowel followed by an e. Some dictionaries sort each umlauted vowel as a separate letter after the base vowel, but more commonly words with umlauts are ordered immediately after the same word without umlauts. As an example in a telephone book occurs after but before (because Ä is replaced by Ae). In a dictionary comes after , but in some dictionaries and all other words starting with "Ä" may occur after all words starting with "A". In some older dictionaries or indexes, initial "Sch" and "St" are treated as separate letters and are listed as separate entries after "S", but they are usually treated as S+C+H and S+T.
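The two competing conventions, umlaut as base vowel + e (telephone directories) versus umlaut as plain base vowel (most dictionaries), can be expressed as sort keys. This is a sketch using illustrative words; the exact examples vary by directory or dictionary:

```python
def phonebook_key(word: str) -> str:
    """Telephone-directory style: umlauts sort as base vowel + e."""
    table = {"ä": "ae", "ö": "oe", "ü": "ue", "ß": "ss"}
    word = word.lower()
    for orig, repl in table.items():
        word = word.replace(orig, repl)
    return word

def dictionary_key(word: str) -> str:
    """Dictionary style: umlauts sort with the plain base vowel."""
    table = {"ä": "a", "ö": "o", "ü": "u", "ß": "ss"}
    word = word.lower()
    for orig, repl in table.items():
        word = word.replace(orig, repl)
    return word

words = ["Anlage", "Ärzte", "Adresse"]
print(sorted(words, key=phonebook_key))   # ['Adresse', 'Ärzte', 'Anlage']
print(sorted(words, key=dictionary_key))  # ['Adresse', 'Anlage', 'Ärzte']
```

Under the phone-book key "Ärzte" sorts as "aerzte" and lands between "Adresse" and "Anlage"; under the dictionary key it sorts as "arzte" and follows both. Real-world collation (e.g. via the `locale` module or ICU) adds tie-breaking rules beyond this sketch.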
Written German also typically uses an alternative opening inverted comma (quotation mark), as in „Guten Morgen!“.
Until the early 20th century, German was mostly printed in blackletter typefaces (mostly in Fraktur, but also in Schwabacher) and written in corresponding handwriting (for example Kurrent and Sütterlin). These variants of the Latin alphabet are very different from the serif or sans-serif Antiqua typefaces used today, and the handwritten forms in particular are difficult for the untrained to read. The printed forms, however, were claimed by some to be more readable when used for Germanic languages. (Often, foreign names in a text were printed in an Antiqua typeface even though the rest of the text was in Fraktur.) The Nazis initially promoted Fraktur and Schwabacher because they were considered Aryan, but they abolished them in 1941, claiming that these letters were Jewish. It is also believed that the Nazi régime had banned this script as they realized that Fraktur would inhibit communication in the territories occupied during World War II.
The Fraktur script, however, remains present in everyday life on pub signs, beer brands and other forms of advertisement, where it is used to convey a certain rusticality and antiquity.
A proper use of the long s (ſ) is essential for writing German text in Fraktur typefaces, and many Antiqua typefaces also include it. A specific set of rules applies to the use of the long s in German text, but nowadays it is rarely used in Antiqua typesetting. Any lower case "s" at the beginning of a syllable would be a long s, as opposed to a terminal s or short s (the more common variant of the letter s), which marks the end of a syllable; for example, in differentiating between the words "Wachſtube" (guard-house) and "Wachstube" (tube of polish/wax). One can easily decide which "s" to use by appropriate hyphenation ("Wach-ſtube" vs. "Wachs-tube"). The long s only appears in lower case.
The orthography reform of 1996 led to public controversy and considerable dispute. The states () of North Rhine-Westphalia and Bavaria refused to accept it. At one point, the dispute reached the highest court, which quickly dismissed it, claiming that the states had to decide for themselves and that only in schools could the reform be made the official rule – everybody else could continue writing as they had learned it. After 10 years, without any intervention by the federal parliament, a major revision was installed in 2006, just in time for the coming school year. In 2007, some traditional spellings were finally invalidated; however, in 2008, many of the old comma rules were again put in force.
The most noticeable change was probably in the use of the letter "ß", called "scharfes S" ("sharp S") or "Eszett" (pronounced "ess-tsett"). Traditionally, this letter was used in three situations:
Currently, only the first of these rules is in effect: "ß" is written after long vowels and diphthongs, while "ss" is written after short vowels. The word "Fuß" ('foot') has the letter "ß" because it contains a long vowel, even though that letter occurs at the end of a syllable. The logic of this change is that an "ß" is a single letter whereas "ss" are two letters, so the same distinction applies as (for example) between the words "den" and "denn".
In German, vowels (excluding diphthongs; see below) are either "short" or "long", as follows:
Short /ɛ/ is realized as [ɛ] in stressed syllables (including secondary stress), but as [ə] in unstressed syllables. Note that stressed short /ɛ/ can be spelled either with "e" or with "ä" (for instance, "hätte" 'would have' and "Kette" 'chain' rhyme). In general, the short vowels are open and the long vowels are close. The one exception is the open sound of long "Ä"; in some varieties of standard German, /ɛː/ and /eː/ have merged into [eː], removing this anomaly. In that case, pairs like "Bären/Beeren" 'bears/berries' or "Ähre/Ehre" 'spike (of wheat)/honour' become homophonous (see: Captain Bluebear).
In many varieties of standard German, an unstressed "-er" is not pronounced [ər] but vocalised to [ɐ].
Whether any particular vowel letter represents the long or short phoneme is not completely predictable, although the following regularities exist:
Both of these rules have exceptions (e.g. "hat" 'has' is short despite the first rule; "Mond" 'moon' is long despite the second rule). For an "i" that is neither in the combination "ie" (making it long) nor followed by a double consonant or cluster (making it short), there is no general rule. In some cases, there are regional differences. In central Germany (Hesse), the "o" in the proper name "Hoffmann" is pronounced long, whereas most other Germans would pronounce it short. The same applies to the "e" in the geographical name "Mecklenburg" for people in that region. The word "Städte" 'cities' is pronounced with a short vowel by some (Jan Hofer, ARD Television) and with a long vowel by others (Marietta Slomka, ZDF Television). Finally, a vowel followed by "ch" can be short ("Fach" 'compartment', "Küche" 'kitchen') or long ("Suche" 'search', "Bücher" 'books') almost at random. Thus, "Lache" is homographous between 'puddle' and 'manner of laughing' (colloquial) or 'laugh!' (imperative).
German vowels can form the following digraphs (in writing) and diphthongs (in pronunciation); note that the pronunciation of some of them (ei, äu, eu) is very different from what one would expect when considering the component letters:
Additionally, the digraph "ie" generally represents the phoneme /iː/, which is not a diphthong. In many varieties, an /r/ at the end of a syllable is vocalised. However, a sequence of a vowel followed by such a vocalised /r/ is not a phonemic diphthong: "Bär" 'bear', "er" 'he', "wir" 'we', "Tor" 'gate', "kurz" 'short', "Wörter" 'words'.
In most varieties of standard German, syllables that begin with a vowel are preceded by a glottal stop [ʔ].
With approximately 26 phonemes, the German consonant system exhibits an average number of consonants in comparison with other languages. One of the more noteworthy ones is the unusual affricate /pf/. The consonant inventory of the standard language is shown below.
German does not have any dental fricatives (as English "th"). The "th" sound, which the English language still has, disappeared on the continent in German with the consonant shifts between the 8th and 10th centuries. It is sometimes possible to find parallels between English and German by replacing the English "th" with "d" in German: "thank" → "danken", "this" and "that" → "dies" and "das", "thou" (old 2nd person singular pronoun) → "du", "think" → "denken", "thirsty" → "durstig", and many other examples.
Likewise, the "gh" in Germanic English words, pronounced in several different ways in modern English (as an "f", or not at all), can often be linked to German "ch": "to laugh" → "lachen", "through" → "durch", "high" → "hoch", "naught" → "nichts", "light" → "leicht" or "Licht", "sight" → "Sicht", "daughter" → "Tochter", "neighbour" → "Nachbar".
German literature can be traced back to the Middle Ages, with the most notable authors of the period being Walther von der Vogelweide and Wolfram von Eschenbach.
The "Nibelungenlied", whose author remains unknown, is also an important work of the epoch. The fairy tales collected and published by Jacob and Wilhelm Grimm in the 19th century became famous throughout the world.
Reformer and theologian Martin Luther, who was the first to translate the Bible into German, is widely credited for having set the basis for the modern "High German" language. Among the best-known poets and authors in German are Lessing, Goethe, Schiller, Kleist, Hoffmann, Brecht, Heine, and Kafka. Fourteen German-speaking people have won the Nobel Prize in literature: Theodor Mommsen, Rudolf Christoph Eucken, Paul von Heyse, Gerhart Hauptmann, Carl Spitteler, Thomas Mann, Nelly Sachs, Hermann Hesse, Heinrich Böll, Elias Canetti, Günter Grass, Elfriede Jelinek, Herta Müller and Peter Handke, making it the second most awarded linguistic region (together with French) after English.
English has taken many loanwords from German, often without any change of spelling (aside from frequently eliminating umlauts and not capitalizing nouns):
Several organisations promote the use and learning of the German language:
The government-backed Goethe-Institut (named after Johann Wolfgang von Goethe) aims to enhance the knowledge of German culture and language within Europe and the rest of the world. This is done by holding exhibitions and conferences with German-related themes, and providing training and guidance in the learning and use of the German language. For example, the institute offers the "Goethe-Zertifikat" German language qualification.
The Dortmund-based Verein Deutsche Sprache (VDS), founded in 1997, supports the German language and is the largest language association of citizens in the world. The VDS has more than thirty-five thousand members in over seventy countries. Its founder, statistics professor Dr. Walter Krämer, has remained chairperson of the association since its formation.
The German state broadcaster Deutsche Welle provides radio and television broadcasts in German and 30 other languages across the globe. Its German language services are spoken slowly and thus tailored for learners. Deutsche Welle also provides a website for learning German.
https://en.wikipedia.org/wiki?curid=11884
Greek language
Greek () is an independent branch of the Indo-European family of languages, native to Greece, Cyprus, Albania and other parts of the Eastern Mediterranean and the Black Sea. It has the longest documented history of any living Indo-European language, spanning at least 3,500 years of written records. Its writing system has been the Greek alphabet for the major part of its history; other systems, such as Linear B and the Cypriot syllabary, were used previously. The alphabet arose from the Phoenician script and was in turn the basis of the Latin, Cyrillic, Armenian, Coptic, Gothic, and many other writing systems.
The Greek language holds an important place in the history of the Western world and Christianity; the canon of ancient Greek literature includes works in the Western canon such as the epic poems "Iliad" and "Odyssey". Greek is also the language in which many of the foundational texts in science (especially astronomy, mathematics and logic) and in Western philosophy, such as the Platonic dialogues and the works of Aristotle, are composed; the New Testament of the Christian Bible was written in Koiné Greek. Together with the Latin texts and traditions of the Roman world, the study of the Greek texts and society of antiquity constitutes the discipline of Classics.
During antiquity, Greek was a widely spoken lingua franca in the Mediterranean world, West Asia and many places beyond. It would eventually become the official language of the Byzantine Empire and develop into Medieval Greek. In its modern form, Greek is the official language in two countries, Greece and Cyprus, a recognized minority language in seven other countries, and is one of the 24 official languages of the European Union. The language is spoken by at least 13.4 million people today in Greece, Cyprus, Italy, Albania, and Turkey and by the Greek diaspora.
Greek roots are often used to coin new words for other languages; Greek and Latin are the predominant sources of international scientific vocabulary.
Greek has been spoken in the Balkan peninsula since around the 3rd millennium BC, or possibly earlier. The earliest written evidence is a Linear B clay tablet found in Messenia that dates to between 1450 and 1350 BC, making Greek the world's oldest recorded living language. Among the Indo-European languages, its date of earliest written attestation is matched only by the now-extinct Anatolian languages.
The Greek language is conventionally divided into the following periods:
In the modern era, the Greek language entered a state of diglossia: the coexistence of vernacular and archaizing written forms of the language. What came to be known as the Greek language question was a polarization between two competing varieties of Modern Greek: Dimotiki, the vernacular form of Modern Greek proper, and Katharevousa, meaning 'purified', a compromise between Dimotiki and Ancient Greek, which was developed in the early 19th century and was used for literary and official purposes in the newly formed Greek state. In 1976, Dimotiki was declared the official language of Greece, having incorporated features of Katharevousa and giving birth to Standard Modern Greek, which is used today for all official purposes and in education.
The historical unity and continuing identity between the various stages of the Greek language are often emphasized. Although Greek has undergone morphological and phonological changes comparable to those seen in other languages, never since classical antiquity has its cultural, literary, and orthographic tradition been interrupted to the extent that one can speak of a new language emerging. Greek speakers today still tend to regard literary works of ancient Greek as part of their own rather than a foreign language. It is also often stated that the historical changes have been relatively slight compared with some other languages. According to one estimation, "Homeric Greek is probably closer to demotic than 12th-century Middle English is to modern spoken English".
Greek is spoken today by at least 13 million people, principally in Greece and Cyprus along with a sizable Greek-speaking minority in Albania near the Greek-Albanian border. A significant percentage of Albania's population has some basic knowledge of the Greek language due in part to the Albanian wave of immigration to Greece in the 1980s and '90s. Prior to the Greco-Turkish War and the resulting population exchange in 1923, a very large population of Greek-speakers also existed in Turkey, though very few remain today. A small Greek-speaking community is also found in Bulgaria near the Greek-Bulgarian border. Greek is also spoken worldwide by the sizable Greek diaspora, which has notable communities in the United States, Australia, Canada, South Africa, Chile, Brazil, Argentina, Russia, Ukraine, and throughout the European Union, especially in the United Kingdom and Germany.
Historically, significant Greek-speaking communities and regions were found throughout the Eastern Mediterranean, in what are today Southern Italy, Turkey, Cyprus, Syria, Lebanon, Israel, Egypt, and Libya; in the area of the Black Sea, in what are today Turkey, Bulgaria, Romania, Ukraine, Russia, Georgia, Armenia, and Azerbaijan; and, to a lesser extent, in the Western Mediterranean in and around colonies such as Massalia, Monoikos, and Mainake.
Greek, in its modern form, is the official language of Greece, where it is spoken by almost the entire population. It is also the official language of Cyprus (nominally alongside Turkish). Because of the membership of Greece and Cyprus in the European Union, Greek is one of the organization's 24 official languages. Furthermore, Greek is official in Dropull and Himara (Albania), and is recognized as a minority language throughout Albania, as well as in parts of Italy, Armenia, Romania, and Ukraine as a regional or minority language in the framework of the European Charter for Regional or Minority Languages. Greeks are also a recognized ethnic minority in Hungary.
The phonology, morphology, syntax, and vocabulary of the language show both conservative and innovative tendencies across the entire attestation of the language from the ancient to the modern period. The division into conventional periods is, as with all such periodizations, relatively arbitrary, especially because at all periods, Ancient Greek has enjoyed high prestige, and the literate borrowed heavily from it.
Across its history, the syllabic structure of Greek has varied little: Greek shows a mixed syllable structure, permitting complex syllabic onsets but very restricted codas. It has only oral vowels and a fairly stable set of consonantal contrasts. The main phonological changes occurred during the Hellenistic and Roman period (see Koine Greek phonology for details):
In all its stages, the morphology of Greek shows an extensive set of productive derivational affixes, a limited but productive system of compounding and a rich inflectional system. Although its morphological categories have been fairly stable over time, morphological changes are present throughout, particularly in the nominal and verbal systems. The major change in the nominal morphology since the classical stage was the disuse of the dative case (its functions being largely taken over by the genitive). The verbal system has lost the infinitive, the synthetically formed future and perfect tenses, and the optative mood. Many have been replaced by periphrastic (analytical) forms.
Pronouns show distinctions in person (1st, 2nd, and 3rd), number (singular, dual, and plural in the ancient language; singular and plural alone in later stages), and gender (masculine, feminine, and neuter) and decline for case (from six cases in the earliest forms attested to four in the modern language). Nouns, articles, and adjectives show all the distinctions except for person. Both attributive and predicative adjectives agree with the noun.
The inflectional categories of the Greek verb have likewise remained largely the same over the course of the language's history but with significant changes in the number of distinctions within each category and their morphological expression. Greek verbs have synthetic inflectional forms for:
Many aspects of the syntax of Greek have remained constant: verbs agree with their subject only, the use of the surviving cases is largely intact (nominative for subjects and predicates, accusative for objects of most verbs and many prepositions, genitive for possessors), articles precede nouns, adpositions are largely prepositional, relative clauses follow the noun they modify and relative pronouns are clause-initial. However, the morphological changes also have their counterparts in the syntax, and there are also significant differences between the syntax of the ancient and that of the modern form of the language. Ancient Greek made great use of participial constructions and of constructions involving the infinitive, whereas the modern variety lacks the infinitive entirely (having instead a raft of new periphrastic constructions) and uses participles more restrictively. The loss of the dative led to a rise of prepositional indirect objects (and the use of the genitive to directly mark these as well). Ancient Greek tended to be verb-final, but neutral word order in the modern language is VSO or SVO.
Modern Greek inherits most of its vocabulary from Ancient Greek, which in turn is an Indo-European language, but also includes a number of borrowings from the languages of the populations that inhabited Greece before the arrival of Proto-Greeks, some documented in Mycenaean texts; they include a large number of Greek toponyms. The form and meaning of many words have evolved. Loanwords (words of foreign origin) have entered the language, mainly from Latin, Venetian, and Turkish. During the older periods of Greek, loanwords into Greek acquired Greek inflections, thus leaving only a foreign root word. Modern borrowings (from the 20th century on), especially from French and English, are typically not inflected; other modern borrowings are derived from South Slavic (Macedonian/Bulgarian) and Eastern Romance languages (Aromanian and Megleno-Romanian).
Greek words have been widely borrowed into other languages, including English: "mathematics", "physics", "astronomy", "democracy", "philosophy", "athletics, theatre, rhetoric", "baptism", "evangelist", etc. Moreover, Greek words and word elements continue to be productive as a basis for coinages: "anthropology", "photography", "telephony", "isomer", "biomechanics", "cinematography", etc. and form, with Latin words, the foundation of international scientific and technical vocabulary like all words ending with "–logy" ("discourse"). There are many English words of Greek origin.
Greek is an independent branch of the Indo-European language family. The ancient language most closely related to it may be ancient Macedonian, which many scholars suggest may have been a dialect of Greek itself, but it is so poorly attested that it is difficult to conclude anything about it. Independently of the Macedonian question, some scholars have grouped Greek into Graeco-Phrygian, as Greek and the extinct Phrygian share features that are not found in other Indo-European languages. Among living languages, some Indo-Europeanists suggest that Greek may be most closely related to Armenian (see Graeco-Armenian) or the Indo-Iranian languages (see Graeco-Aryan), but little definitive evidence has been found for grouping the living branches of the family. In addition, Albanian has also been considered somewhat related to Greek and Armenian by some linguists. If proven and recognized, the three languages would form a new Balkan sub-branch with other dead European languages.
Linear B, attested as early as the late 15th century BC, was the first script used to write Greek. It is basically a syllabary, which was finally deciphered by Michael Ventris and John Chadwick in the 1950s (its precursor, Linear A, has not been deciphered and most likely encodes a non-Greek language). The language of the Linear B texts, Mycenaean Greek, is the earliest known form of Greek.
Another similar system used to write the Greek language was the Cypriot syllabary (also a descendant of Linear A via the intermediate Cypro-Minoan syllabary), which is closely related to Linear B but uses somewhat different syllabic conventions to represent phoneme sequences. The Cypriot syllabary is attested in Cyprus from the 11th century BC until its gradual abandonment in the late Classical period, in favor of the standard Greek alphabet.
Greek has been written in the Greek alphabet since approximately the 9th century BC. It was created by modifying the Phoenician alphabet, with the innovation of adopting certain letters to represent the vowels. The variant of the alphabet in use today is essentially the late Ionic variant, introduced for writing classical Attic in 403 BC. In classical Greek, as in classical Latin, only upper-case letters existed. The lower-case Greek letters were developed much later by medieval scribes to permit a faster, more convenient cursive writing style with the use of ink and quill.
The Greek alphabet consists of 24 letters, each with an uppercase (majuscule) and lowercase (minuscule) form. The letter sigma has an additional lowercase form (ς) used in the final position:
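The context-dependent form of sigma is one of the few case rules in Greek that needs special handling in software. A minimal sketch (the helper name is hypothetical):

```python
import re

def lower_greek(text: str) -> str:
    """Lowercase Greek text, ensuring sigma takes its final form "ς" at
    the end of a word. str.lower() alone may leave a medial "σ" there,
    depending on the implementation's Unicode special-casing support."""
    lowered = text.lower()
    # "σ" followed by a word boundary (space, punctuation, end of string)
    # becomes the final form "ς"; medial sigmas are untouched.
    return re.sub(r"σ\b", "ς", lowered)

print(lower_greek("ΟΔΥΣΣΕΥΣ"))  # -> οδυσσευς (medial σσ, final ς)
```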
In addition to the letters, the Greek alphabet features a number of diacritical signs: three different accent marks (acute, grave, and circumflex), originally denoting different shapes of pitch accent on the stressed vowel; the so-called breathing marks (rough and smooth breathing), originally used to signal presence or absence of word-initial /h/; and the diaeresis, used to mark the full syllabic value of a vowel that would otherwise be read as part of a diphthong. These marks were introduced during the course of the Hellenistic period. Actual usage of the grave in handwriting saw a rapid decline in favor of uniform usage of the acute during the late 20th century, and it has only been retained in typography.
After the writing reform of 1982, most diacritics are no longer used. Since then, Greek has been written mostly in the simplified monotonic orthography (or monotonic system), which employs only the acute accent and the diaeresis. The traditional system, now called the polytonic orthography (or polytonic system), is still used internationally for the writing of Ancient Greek.
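Reducing polytonic text to the monotonic system amounts to dropping the breathings and iota subscript and collapsing the three accent marks into the acute. The sketch below is a simplification (it ignores, for example, the monotonic rule that monosyllables are usually written unaccented):

```python
import unicodedata

# Combining marks dropped outright: smooth breathing, rough breathing,
# and the iota subscript (ypogegrammeni).
DROP = {0x0313, 0x0314, 0x0345}
# Grave and circumflex (Greek perispomeni) both collapse into the acute.
TO_ACUTE = {0x0300: 0x0301, 0x0342: 0x0301}

def to_monotonic(text: str) -> str:
    """Rough polytonic-to-monotonic conversion via Unicode decomposition."""
    out = []
    for ch in unicodedata.normalize("NFD", text):
        cp = ord(ch)
        if cp in DROP:
            continue
        out.append(chr(TO_ACUTE.get(cp, cp)))
    return unicodedata.normalize("NFC", "".join(out))

print(to_monotonic("ἀδελφός"))  # -> αδελφός
```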
In Greek, the question mark is written as the English semicolon, while the functions of the colon and semicolon are performed by a raised point (•), known as the "ano teleia". In Greek the comma also functions as a silent letter in a handful of Greek words, principally distinguishing "ό,τι" ("whatever") from "ότι" ("that").
Ancient Greek texts often used "scriptio continua" ('continuous writing'), which means that ancient authors and scribes would write word after word with no spaces or punctuation between words to differentiate or mark boundaries. Boustrophedon, or bi-directional text, was also used in Ancient Greek.
Greek has occasionally been written in the Latin script, especially in areas under Venetian rule or by Greek Catholics. The term "Frankolevantinika" applies when the Latin script is used to write Greek in the cultural ambit of Catholicism (because "Frankos" is an older Greek term for Western European, dating to when most of Roman Catholic Western Europe was under the control of the Frankish Empire). "Frankochiotika" (meaning "Catholic Chiot") alludes to the significant presence of Catholic missionaries based on the island of Chios. Additionally, the term "Greeklish" is often used when the Greek language is written in a Latin script in online communications.
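Because Greeklish has no single standard, any transliteration table is only one convention among many. The sketch below uses a plain letter-for-letter mapping of my own choosing, purely for illustration:

```python
import unicodedata

# One of many informal Greeklish conventions; the table is illustrative,
# not authoritative (e.g. some writers prefer "8" for theta or "u" for ypsilon).
GREEKLISH = {
    "α": "a", "β": "v", "γ": "g", "δ": "d", "ε": "e", "ζ": "z",
    "η": "i", "θ": "th", "ι": "i", "κ": "k", "λ": "l", "μ": "m",
    "ν": "n", "ξ": "x", "ο": "o", "π": "p", "ρ": "r", "σ": "s",
    "ς": "s", "τ": "t", "υ": "y", "φ": "f", "χ": "ch", "ψ": "ps",
    "ω": "o",
}

def greeklish(text: str) -> str:
    """Lowercase, strip accents via decomposition, then map letters."""
    decomposed = unicodedata.normalize("NFD", text.lower())
    return "".join(
        GREEKLISH.get(ch, ch)
        for ch in decomposed
        if not unicodedata.combining(ch)
    )

print(greeklish("καλημέρα"))  # -> kalimera
```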
The Latin script is nowadays used by the Greek-speaking communities of Southern Italy.
The Yevanic dialect was written by Romaniote and Constantinopolitan Karaite Jews using the Hebrew alphabet.
Some Greek Muslims from Crete wrote their Cretan Greek in the Arabic alphabet; the same happened among Epirote Muslims in Ioannina. This usage is sometimes called aljamiado, as when Romance languages are written in the Arabic alphabet.
https://en.wikipedia.org/wiki?curid=11887
George Orwell
Eric Arthur Blair (25 June 1903 – 21 January 1950), known by his pen name George Orwell, was an English novelist, essayist, journalist and critic. His work is characterised by lucid prose, biting social criticism, opposition to totalitarianism, and outspoken support of democratic socialism.
As a writer, Orwell produced literary criticism and poetry, fiction and polemical journalism; and is best known for the allegorical novella "Animal Farm" (1945) and the dystopian novel "Nineteen Eighty-Four" (1949). His non-fiction works, including "The Road to Wigan Pier" (1937), documenting his experience of working-class life in the north of England, and "Homage to Catalonia" (1938), an account of his experiences soldiering for the Republican faction of the Spanish Civil War (1936–1939), are as critically respected as his essays on politics and literature, language and culture. In 2008, "The Times" ranked George Orwell second among "The 50 greatest British writers since 1945".
Orwell's work remains influential in popular culture and in political culture, and the adjective "Orwellian"—describing totalitarian and authoritarian social practices—is part of the English language, like many of his neologisms, such as "Big Brother", "Thought Police", "Two Minutes Hate", "Room 101", "memory hole", "Newspeak", "doublethink", "proles", "unperson", and "thoughtcrime".
Eric Arthur Blair was born on 25 June 1903 in Motihari, Bihar, British India. His great-grandfather, Charles Blair, was a wealthy country gentleman in Dorset who married Lady Mary Fane, daughter of the Earl of Westmorland, and had income as an absentee landlord of plantations in Jamaica. His grandfather, Thomas Richard Arthur Blair, was a clergyman. Eric Blair described his family as "lower-upper-middle class".
His father, Richard Walmesley Blair, worked in the Opium Department of the Indian Civil Service. His mother, Ida Mabel Blair ("née" Limouzin), grew up in Moulmein, Burma, where her French father was involved in speculative ventures. Eric had two sisters: Marjorie, five years older; and Avril, five years younger. When Eric was one year old, his mother took him and Marjorie to England. His birthplace and ancestral house in Motihari has been declared a protected monument of historical importance.
In 1904 Ida Blair settled with her children at Henley-on-Thames in Oxfordshire. Eric was brought up in the company of his mother and sisters, and apart from a brief visit in mid-1907, the family did not see Richard Blair until 1912. His mother's diary from 1905 describes a lively round of social activity and artistic interests.
Aged five, Eric was sent as a day-boy to a convent school in Henley-on-Thames, which Marjorie also attended. It was a Roman Catholic convent run by French Ursuline nuns, who had been exiled from France after Catholic education was banned in 1903 due to the Dreyfus Affair. His mother wanted him to have a public school education, but his family could not afford the fees, and he needed to earn a scholarship. Ida Blair's brother Charles Limouzin recommended St Cyprian's School, Eastbourne, East Sussex. Limouzin, who was a proficient golfer, knew of the school and its headmaster through the Royal Eastbourne Golf Club, where he won several competitions in 1903 and 1904. The headmaster undertook to help Blair to win a scholarship, and made a private financial arrangement that allowed Blair's parents to pay only half the normal fees. In September 1911, Eric arrived at St Cyprian's. He boarded at the school for the next five years, returning home only for school holidays. During this period, while working for the Ministry of Pensions, his mother lived at 23 Cromwell Crescent, Earls Court. He knew nothing of the reduced fees, although he "soon recognised that he was from a poorer home". Blair hated the school and many years later wrote an essay "Such, Such Were the Joys", published posthumously, based on his time there. At St Cyprian's, Blair first met Cyril Connolly, who became a writer. Many years later, as the editor of "Horizon", Connolly published several of Orwell's essays.
Before the First World War, the family moved to Shiplake, Oxfordshire where Eric became friendly with the Buddicom family, especially their daughter Jacintha. When they first met, he was standing on his head in a field. On being asked why, he said, "You are noticed more if you stand on your head than if you are right way up." Jacintha and Eric read and wrote poetry, and dreamed of becoming famous writers. He said that he might write a book in the style of H. G. Wells's "A Modern Utopia". During this period, he also enjoyed shooting, fishing and birdwatching with Jacintha's brother and sister.
While at St Cyprian's, Blair wrote two poems that were published in the "Henley and South Oxfordshire Standard". He came second to Connolly in the Harrow History Prize, had his work praised by the school's external examiner, and earned scholarships to Wellington and Eton. But inclusion on the Eton scholarship roll did not guarantee a place, and none was immediately available for Blair. He chose to stay at St Cyprian's until December 1916, in case a place at Eton became available.
In January, Blair took up the place at Wellington, where he spent the Spring term. In May 1917 a place became available as a King's Scholar at Eton. At this time the family lived at Mall Chambers, Notting Hill Gate. Blair remained at Eton until December 1921, when he left midway between his 18th and 19th birthday. Wellington was "beastly", Orwell told his childhood friend Jacintha Buddicom, but he said he was "interested and happy" at Eton. His principal tutor was A. S. F. Gow, Fellow of Trinity College, Cambridge, who also gave him advice later in his career. Blair was briefly taught French by Aldous Huxley. Steven Runciman, who was at Eton with Blair, noted that he and his contemporaries appreciated Huxley's linguistic flair. Cyril Connolly followed Blair to Eton, but because they were in separate years, they did not associate with each other.
Blair's academic performance reports suggest that he neglected his academic studies, but during his time at Eton he worked with Roger Mynors to produce a College magazine, "The Election Times", joined in the production of other publications—"College Days" and "Bubble and Squeak"—and participated in the Eton Wall Game. His parents could not afford to send him to a university without another scholarship, and they concluded from his poor results that he would not be able to win one. Runciman noted that he had a romantic idea about the East, and the family decided that Blair should join the Imperial Police, the precursor of the Indian Police Service. For this he had to pass an entrance examination. In December 1921 he left Eton and travelled to join his retired father, mother, and younger sister Avril, who that month had moved to 40 Stradbroke Road, Southwold, Suffolk, the first of their four homes in the town. Blair was enrolled at a crammer there called Craighurst, and brushed up on his Classics, English, and History. He passed the entrance exam, coming seventh out of the 26 candidates who exceeded the pass mark.
Blair's maternal grandmother lived at Moulmein, so he chose a posting in Burma, then still a province of British India. In October 1922 he sailed on board SS "Herefordshire" via the Suez Canal and Ceylon to join the Indian Imperial Police in Burma. A month later, he arrived at Rangoon and travelled to the police training school in Mandalay. He was appointed an Assistant District Superintendent (on probation) on 29 November 1922, with effect from 27 November and at a base salary of Rs. 325 per month, with an overseas supplement of Rs. 125/month and a "Burma Allowance" of Rs. 75/month (a total of Rs. 525, or approximately £52–10s per month at prevailing exchange rates). After a short posting at Maymyo, Burma's principal hill station, he was posted to the frontier outpost of Myaungmya in the Irrawaddy Delta at the beginning of 1924.
Working as an imperial police officer gave him considerable responsibility while most of his contemporaries were still at university in England. When he was posted farther east in the Delta to Twante as a sub-divisional officer, he was responsible for the security of some 200,000 people. At the end of 1924, he was posted to Syriam, closer to Rangoon. Syriam had the refinery of the Burmah Oil Company, "the surrounding land a barren waste, all vegetation killed off by the fumes of sulphur dioxide pouring out day and night from the stacks of the refinery." But the town was near Rangoon, a cosmopolitan seaport, and Blair went into the city as often as he could, "to browse in a bookshop; to eat well-cooked food; to get away from the boring routine of police life". In September 1925 he went to Insein, the home of Insein Prison, the second largest prison in Burma. In Insein, he had "long talks on every conceivable subject" with Elisa Maria Langford-Rae (who later married Kazi Lhendup Dorjee). She noted his "sense of utter fairness in minutest details". By this time, Blair had completed his training and was receiving a monthly salary of Rs. 740, including allowances (approximately £74 per month).
In Burma, Blair acquired a reputation as an outsider. He spent much of his time alone, reading or pursuing non-"pukka" activities, such as attending the churches of the Karen ethnic group. A colleague, Roger Beadon, recalled (in a 1969 recording for the BBC) that Blair was fast to learn the language and that before he left Burma, "was able to speak fluently with Burmese priests in 'very high-flown Burmese.'" Blair made changes to his appearance in Burma that remained for the rest of his life. This included adopting a pencil moustache, a thin line above the lip (he previously had a toothbrush moustache). Emma Larkin writes in the introduction to "Burmese Days", "While in Burma, he acquired a moustache similar to those worn by officers of the British regiments stationed there. [He] also acquired some tattoos; on each knuckle he had a small untidy blue circle. Many Burmese living in rural areas still sport tattoos like this—they are believed to protect against bullets and snake bites." Later, he wrote that he felt guilty about his role in the work of empire and he "began to look more closely at his own country and saw that England also had its oppressed."
In April 1926 he moved to Moulmein, where his maternal grandmother lived. At the end of that year, he was assigned to Katha in Upper Burma, where he contracted dengue fever in 1927. Entitled to a leave in England that year, he was allowed to return in July due to his illness. While on leave in England and on holiday with his family in Cornwall in September 1927, he reappraised his life. Deciding against returning to Burma, he resigned from the Indian Imperial Police to become a writer, with effect from 12 March 1928 after five-and-a-half years of service. He drew on his experiences in the Burma police for the novel "Burmese Days" (1934) and the essays "A Hanging" (1931) and "Shooting an Elephant" (1936).
In England, he settled back in the family home at Southwold, renewing acquaintance with local friends and attending an Old Etonian dinner. He visited his old tutor Gow at Cambridge for advice on becoming a writer. In 1927 he moved to London. Ruth Pitter, a family acquaintance, helped him find lodgings, and by the end of 1927 he had moved into rooms in Portobello Road; a blue plaque commemorates his residence there. Pitter's involvement in the move "would have lent it a reassuring respectability in Mrs Blair's eyes." Pitter had a sympathetic interest in Blair's writing, pointed out weaknesses in his poetry, and advised him to write about what he knew. In fact he decided to write of "certain aspects of the present that he set out to know" and ventured into the East End of London—the first of the occasional sorties he would make to discover for himself the world of poverty and the down-and-outers who inhabit it. He had found a subject. These sorties, explorations, expeditions, tours or immersions were made intermittently over a period of five years.
In imitation of Jack London, whose writing he admired (particularly "The People of the Abyss"), Blair started to explore the poorer parts of London. On his first outing he set out to Limehouse Causeway, spending his first night in a common lodging house, possibly George Levy's 'kip'. For a while he "went native" in his own country, dressing like a tramp, adopting the name P.S. Burton and making no concessions to middle-class "mores" and expectations; he recorded his experiences of the low life for use in "The Spike", his first published essay in English, and in the second half of his first book, "Down and Out in Paris and London" (1933).
In early 1928 he moved to Paris. He lived in the rue du Pot de Fer, a working-class district in the 5th arrondissement. His aunt Nellie Limouzin also lived in Paris and gave him social and, when necessary, financial support. He began to write novels, including an early version of "Burmese Days", but nothing else survives from that period. He was more successful as a journalist and published articles in "Monde", a political/literary journal edited by Henri Barbusse (his first article as a professional writer, "La Censure en Angleterre", appeared in that journal on 6 October 1928); "G. K.'s Weekly", where his first article to appear in England, "A Farthing Newspaper", was printed on 29 December 1928; and "Le Progrès Civique" (founded by the left-wing coalition Le Cartel des Gauches). Three pieces appeared in successive weeks in "Le Progrès Civique": discussing unemployment, a day in the life of a tramp, and the beggars of London, respectively. "In one or another of its destructive forms, poverty was to become his obsessive subject—at the heart of almost everything he wrote until "Homage to Catalonia"."
He fell seriously ill in February 1929 and was taken to the Hôpital Cochin in the 14th arrondissement, a free hospital where medical students were trained. His experiences there were the basis of his essay "How the Poor Die", published in 1946. He chose not to identify the hospital, and indeed was deliberately misleading about its location. Shortly afterwards, he had all his money stolen from his lodging house. Whether through necessity or to collect material, he undertook menial jobs such as dishwashing in a fashionable hotel on the rue de Rivoli, which he later described in "Down and Out in Paris and London". In August 1929, he sent a copy of "The Spike" to John Middleton Murry's "New Adelphi" magazine in London. The magazine was edited by Max Plowman and Sir Richard Rees, and Plowman accepted the work for publication.
In December 1929 after nearly two years in Paris, Blair returned to England and went directly to his parents' house in Southwold, a coastal town in Suffolk, which remained his base for the next five years. The family was well established in the town, and his sister Avril was running a tea-house there. He became acquainted with many local people, including Brenda Salkeld, the clergyman's daughter who worked as a gym-teacher at St Felix Girls' School in the town. Although Salkeld rejected his offer of marriage, she remained a friend and regular correspondent for many years. He also renewed friendships with older friends, such as Dennis Collings, whose girlfriend Eleanor Jacques was also to play a part in his life.
In early 1930 he stayed briefly in Bramley, Leeds, with his sister Marjorie and her husband Humphrey Dakin, who was as unappreciative of Blair as when they knew each other as children. Blair was writing reviews for "Adelphi" and acting as a private tutor to a disabled child at Southwold. He then became tutor to three young brothers, one of whom, Richard Peters, later became a distinguished academic. "His history in these years is marked by dualities and contrasts. There is Blair leading a respectable, outwardly eventless life at his parents' house in Southwold, writing; then in contrast, there is Blair as Burton (the name he used in his down-and-out episodes) in search of experience in the kips and spikes, in the East End, on the road, and in the hop fields of Kent." He went painting and bathing on the beach, and there he met Mabel and Francis Fierz, who later influenced his career. Over the next year he visited them in London, often meeting their friend Max Plowman. He also often stayed at the homes of Ruth Pitter and Richard Rees, where he could "change" for his sporadic tramping expeditions. One of his jobs was domestic work at a lodgings for half a crown (two shillings and sixpence, or one-eighth of a pound) a day.
Blair now contributed regularly to "Adelphi", with "A Hanging" appearing in August 1931. From August to September 1931 his explorations of poverty continued, and, like the protagonist of "A Clergyman's Daughter", he followed the East End tradition of working in the Kent hop fields. He kept a diary about his experiences there. Afterwards, he lodged in the Tooley Street kip, but could not stand it for long, and with financial help from his parents moved to Windsor Street, where he stayed until Christmas. "Hop Picking", by Eric Blair, appeared in the October 1931 issue of "New Statesman", whose editorial staff included his old friend Cyril Connolly. Mabel Fierz put him in contact with Leonard Moore, who became his literary agent.
At this time Jonathan Cape rejected "A Scullion's Diary", the first version of "Down and Out". On the advice of Richard Rees, he offered it to Faber and Faber, but their editorial director, T. S. Eliot, also rejected it. Blair ended the year by deliberately getting himself arrested, so that he could experience Christmas in prison, but the authorities did not regard his "drunk and disorderly" behaviour as imprisonable, and he returned home to Southwold after two days in a police cell.
In April 1932 Blair became a teacher at The Hawthorns High School, a school for boys, in Hayes, West London. This was a small school offering private schooling for children of local tradesmen and shopkeepers, and had only 14 or 16 boys aged between ten and sixteen, and one other master. While at the school he became friendly with the curate of the local parish church and became involved with activities there. Mabel Fierz had pursued matters with Moore, and at the end of June 1932, Moore told Blair that Victor Gollancz was prepared to publish "A Scullion's Diary" for a £40 advance, through his recently founded publishing house, Victor Gollancz Ltd, which was an outlet for radical and socialist works.
At the end of the summer term in 1932, Blair returned to Southwold, where his parents had used a legacy to buy their own home. Blair and his sister Avril spent the holidays making the house habitable while he also worked on "Burmese Days". He was also spending time with Eleanor Jacques, but her attachment to Dennis Collings remained an obstacle to his hopes of a more serious relationship.
"Clink", an essay describing his failed attempt to get sent to prison, appeared in the August 1932 number of "Adelphi". He returned to teaching at Hayes and prepared for the publication of his book, now known as "Down and Out in Paris and London". He wished to publish under a different name to avoid any embarrassment to his family over his time as a "tramp". In a letter to Moore (dated 15 November 1932), he left the choice of pseudonym to Moore and to Gollancz. Four days later, he wrote to Moore, suggesting the pseudonyms P. S. Burton (a name he used when tramping), Kenneth Miles, George Orwell, and H. Lewis Allways. He finally adopted the "nom de plume" George Orwell because "It is a good round English name." "Down and Out in Paris and London" was published on 9 January 1933 as Orwell continued to work on "Burmese Days". "Down and Out" was modestly successful and was next published by Harper & Brothers in New York.
In mid-1933 Blair left Hawthorns to become a teacher at Frays College, in Uxbridge, Middlesex. This was a much larger establishment with 200 pupils and a full complement of staff. He acquired a motorcycle and took trips through the surrounding countryside. On one of these expeditions he became soaked and caught a chill that developed into pneumonia. He was taken to Uxbridge Cottage Hospital, where for a time his life was believed to be in danger. When he was discharged in January 1934, he returned to Southwold to convalesce and, supported by his parents, never returned to teaching.
He was disappointed when Gollancz turned down "Burmese Days", mainly on the grounds of potential suits for libel, but Harper were prepared to publish it in the United States. Meanwhile, Blair started work on the novel "A Clergyman's Daughter", drawing upon his life as a teacher and on life in Southwold. Eleanor Jacques was now married and had gone to Singapore and Brenda Salkeld had left for Ireland, so Blair was relatively isolated in Southwold—working on the allotments, walking alone and spending time with his father. Eventually in October, after sending "A Clergyman's Daughter" to Moore, he left for London to take a job that had been found for him by his aunt Nellie Limouzin.
This job was as a part-time assistant in Booklovers' Corner, a second-hand bookshop in Hampstead run by Francis and Myfanwy Westrope, who were friends of Nellie Limouzin in the Esperanto movement. The Westropes were friendly and provided him with comfortable accommodation at Warwick Mansions, Pond Street. He was sharing the job with Jon Kimche, who also lived with the Westropes. Blair worked at the shop in the afternoons and had his mornings free to write and his evenings free to socialise. These experiences provided background for the novel "Keep the Aspidistra Flying" (1936). As well as the various guests of the Westropes, he was able to enjoy the company of Richard Rees and the "Adelphi" writers and Mabel Fierz. The Westropes and Kimche were members of the Independent Labour Party, although at this time Blair was not seriously politically active. He was writing for the "Adelphi" and preparing "A Clergyman's Daughter" and "Burmese Days" for publication.
At the beginning of 1935 he had to move out of Warwick Mansions, and Mabel Fierz found him a flat in Parliament Hill. "A Clergyman's Daughter" was published on 11 March 1935. In early 1935 Blair met his future wife Eileen O'Shaughnessy, when his landlady, Rosalind Obermeyer, who was studying for a master's degree in psychology at University College London, invited some of her fellow students to a party. One of these students, Elizaveta Fen, a biographer and future translator of Chekhov, recalled Blair and his friend Richard Rees "draped" at the fireplace, looking, she thought, "moth-eaten and prematurely aged." Around this time, Blair had started to write reviews for "The New English Weekly".
In June, "Burmese Days" was published and Cyril Connolly's review in the "New Statesman" prompted Blair (as he then became known) to re-establish contact with his old friend. In August, he moved into a flat in Kentish Town, which he shared with Michael Sayers and Rayner Heppenstall. The relationship was sometimes awkward and Blair and Heppenstall even came to blows, though they remained friends and later worked together on BBC broadcasts. Blair was now working on "Keep the Aspidistra Flying", and also tried unsuccessfully to write a serial for the "News Chronicle". By October 1935 his flatmates had moved out and he was struggling to pay the rent on his own. He remained until the end of January 1936, when he stopped working at Booklovers' Corner.
At this time, Victor Gollancz suggested Orwell spend a short time investigating social conditions in economically depressed northern England. Two years earlier, J. B. Priestley had written about England north of the Trent, sparking an interest in reportage. The depression had also introduced a number of working-class writers from the North of England to the reading public. It was one of these working-class authors, Jack Hilton, whom Orwell sought for advice. Orwell had written to Hilton seeking lodging and asking for recommendations on his route. Hilton was unable to provide him lodging, but suggested that he travel to Wigan rather than Rochdale, "for there are the colliers and they're good stuff."
On 31 January 1936, Orwell set out by public transport and on foot, reaching Manchester via Coventry, Stafford, the Potteries and Macclesfield. Arriving in Manchester after the banks had closed, he had to stay in a common lodging-house. The next day he picked up a list of contacts sent by Richard Rees. One of these, the trade union official Frank Meade, suggested Wigan, where Orwell spent February staying in dirty lodgings over a tripe shop. At Wigan, he visited many homes to see how people lived, took detailed notes of housing conditions and wages earned, went down Bryn Hall coal mine, and used the local public library to consult public health records and reports on working conditions in mines.
During this time, he was distracted by concerns about style and possible libel in "Keep the Aspidistra Flying". He made a quick visit to Liverpool and during March, stayed in south Yorkshire, spending time in Sheffield and Barnsley. As well as visiting mines, including Grimethorpe, and observing social conditions, he attended meetings of the Communist Party and of Oswald Mosley ("his speech the usual claptrap—The blame for everything was put upon mysterious international gangs of Jews") where he saw the tactics of the Blackshirts ("...one is liable to get both a hammering and a fine for asking a question which Mosley finds it difficult to answer."). He also made visits to his sister at Headingley, during which he visited the Brontë Parsonage at Haworth, where he was "chiefly impressed by a pair of Charlotte Brontë's cloth-topped boots, very small, with square toes and lacing up at the sides."
Orwell needed somewhere he could concentrate on writing his book, and once again help was provided by Aunt Nellie, who was living at Wallington, Hertfordshire in a very small 16th-century cottage called the "Stores". Wallington was a tiny village 35 miles north of London, and the cottage had almost no modern facilities. Orwell took over the tenancy and moved in on 2 April 1936. He started work on "The Road to Wigan Pier" by the end of April, but also spent hours working on the garden and testing the possibility of reopening the Stores as a village shop. "Keep the Aspidistra Flying" was published by Gollancz on 20 April 1936. On 4 August, Orwell gave a talk at the Adelphi Summer School held at Langham, entitled "An Outsider Sees the Distressed Areas"; others who spoke at the school included John Strachey, Max Plowman, Karl Polanyi and Reinhold Niebuhr.
The result of his journeys through the north was "The Road to Wigan Pier", published by Gollancz for the Left Book Club in 1937. The first half of the book documents his social investigations of Lancashire and Yorkshire, including an evocative description of working life in the coal mines. The second half is a long essay on his upbringing and the development of his political conscience, which includes an argument for socialism (although he goes to lengths to balance the concerns and goals of socialism with the barriers it faced from the movement's own advocates at the time, such as "priggish" and "dull" socialist intellectuals and "proletarian" socialists with little grasp of the actual ideology). Gollancz feared the second half would offend readers and added a disculpatory preface to the book while Orwell was in Spain.
Orwell's research for "The Road to Wigan Pier" led to him being placed under surveillance by the Special Branch from 1936, for 12 years, until one year before the publication of "Nineteen Eighty-Four".
Orwell married Eileen O'Shaughnessy on 9 June 1936. Shortly afterwards, the political crisis began in Spain and Orwell followed developments there closely. At the end of the year, concerned by Francisco Franco's military uprising (supported by Nazi Germany, Fascist Italy and local groups such as Falange), Orwell decided to go to Spain to take part in the Spanish Civil War on the Republican side. Under the erroneous impression that he needed papers from some left-wing organisation to cross the frontier, on John Strachey's recommendation he applied unsuccessfully to Harry Pollitt, leader of the British Communist Party. Pollitt was suspicious of Orwell's political reliability; he asked him whether he would undertake to join the International Brigade and advised him to get a safe-conduct from the Spanish Embassy in Paris. Not wishing to commit himself until he had seen the situation "in situ", Orwell instead used his Independent Labour Party contacts to get a letter of introduction to John McNair in Barcelona.
Orwell set out for Spain on about 23 December 1936, dining with Henry Miller in Paris on the way. The American writer told Orwell that going to fight in the Civil War out of some sense of obligation or guilt was "sheer stupidity" and that the Englishman's ideas "about combating Fascism, defending democracy, etc., etc., were all baloney." A few days later in Barcelona, Orwell met John McNair of the Independent Labour Party (ILP) Office, who later quoted him as saying: "I've come to fight against Fascism". Orwell stepped into a complex political situation in Catalonia. The Republican government was supported by a number of factions with conflicting aims, including the Workers' Party of Marxist Unification (POUM – Partido Obrero de Unificación Marxista), the anarcho-syndicalist Confederación Nacional del Trabajo (CNT) and the Unified Socialist Party of Catalonia (a wing of the Spanish Communist Party, which was backed by Soviet arms and aid). The ILP was linked to the POUM so Orwell joined the POUM.
After a time at the Lenin Barracks in Barcelona he was sent to the relatively quiet Aragon Front under Georges Kopp. By January 1937 he was at Alcubierre, in the depth of winter. There was very little military action and Orwell was shocked by the lack of munitions, food and firewood as well as other extreme deprivations. With his Cadet Corps and police training, Orwell was quickly made a corporal. On the arrival of a British ILP Contingent about three weeks later, Orwell and the other English militiaman, Williams, were sent with them to Monte Oscuro. The newly arrived ILP contingent included Bob Smillie, Bob Edwards, Stafford Cottman and Jack Branthwaite. The unit was then sent on to Huesca.
Meanwhile, back in England, Eileen had been handling the issues relating to the publication of "The Road to Wigan Pier" before setting out for Spain herself, leaving Nellie Limouzin to look after The Stores. Eileen volunteered for a post in John McNair's office and with the help of Georges Kopp paid visits to her husband, bringing him English tea, chocolate and cigars. Orwell had to spend some days in hospital with a poisoned hand and had most of his possessions stolen by the staff. He returned to the front and saw some action in a night attack on the Nationalist trenches where he chased an enemy soldier with a bayonet and bombed an enemy rifle position.
In April, Orwell returned to Barcelona. Wanting to be sent to the Madrid front, which meant he "must join the International Column", he approached a Communist friend attached to the Spanish Medical Aid and explained his case. "Although he did not think much of the Communists, Orwell was still ready to treat them as friends and allies. That would soon change." This was the time of the Barcelona May Days and Orwell was caught up in the factional fighting. He spent much of the time on a roof, with a stack of novels, but encountered Jon Kimche from his Hampstead days during the stay. The subsequent campaign of lies and distortion carried out by the Communist press, in which the POUM was accused of collaborating with the fascists, had a dramatic effect on Orwell. Instead of joining the International Brigades as he had intended, he decided to return to the Aragon Front. Once the May fighting was over, he was approached by a Communist friend who asked if he still intended transferring to the International Brigades. Orwell expressed surprise that they should still want him, because according to the Communist press he was a fascist. "No one who was in Barcelona then, or for months later, will forget the horrible atmosphere produced by fear, suspicion, hatred, censored newspapers, crammed jails, enormous food queues and prowling gangs of armed men."
After his return to the front, he was wounded in the throat by a sniper's bullet. At 6 ft 2 in (1.88 m), Orwell was considerably taller than the Spanish fighters and had been warned against standing against the trench parapet. Unable to speak, and with blood pouring from his mouth, Orwell was carried on a stretcher to Siétamo, loaded on an ambulance and after a bumpy journey via Barbastro arrived at the hospital at Lérida. He recovered sufficiently to get up and on 27 May 1937 was sent on to Tarragona and two days later to a POUM sanatorium in the suburbs of Barcelona. The bullet had missed his main artery by the barest margin and his voice was barely audible. The shot had been so clean that the wound immediately cauterised. He received electrotherapy treatment and was declared medically unfit for service.
By the middle of June the political situation in Barcelona had deteriorated and the POUM—painted by the pro-Soviet Communists as a Trotskyist organisation—was outlawed and under attack. The Communist line was that the POUM were "objectively" Fascist, hindering the Republican cause. "A particularly nasty poster appeared, showing a head with a POUM mask being ripped off to reveal a Swastika-covered face beneath." Members, including Kopp, were arrested and others were in hiding. Orwell and his wife were under threat and had to lie low, although they broke cover to try to help Kopp.
Finally with their passports in order, they escaped from Spain by train, diverting to Banyuls-sur-Mer for a short stay before returning to England. In the first week of July 1937 Orwell arrived back at Wallington; on 13 July 1937 a deposition was presented to the Tribunal for Espionage & High Treason in Valencia, charging the Orwells with "rabid Trotskyism", and being agents of the POUM. The trial of the leaders of the POUM and of Orwell (in his absence) took place in Barcelona in October and November 1938. Observing events from French Morocco, Orwell wrote that they were "only a by-product of the Russian Trotskyist trials and from the start every kind of lie, including flagrant absurdities, has been circulated in the Communist press." Orwell's experiences in the Spanish Civil War gave rise to "Homage to Catalonia" (1938).
Orwell returned to England in June 1937, and stayed at the O'Shaughnessy home at Greenwich. He found his views on the Spanish Civil War out of favour. Kingsley Martin rejected two of his works and Gollancz was equally cautious. At the same time, the communist "Daily Worker" was running an attack on "The Road to Wigan Pier", quoting out of context Orwell's remark that "the working classes smell"; a letter to Gollancz from Orwell threatening libel action brought a stop to this. Orwell was also able to find a more sympathetic publisher for his views in Fredric Warburg of Secker & Warburg. Orwell returned to Wallington, which he found in disarray after his absence. He acquired goats, a cockerel (rooster) he called Henry Ford and a poodle puppy he called Marx; and settled down to animal husbandry and writing "Homage to Catalonia".
There were thoughts of going to India to work on the Pioneer, a newspaper in Lucknow, but by March 1938 Orwell's health had deteriorated. He was admitted to Preston Hall Sanatorium at Aylesford, Kent, a British Legion hospital for ex-servicemen to which his brother-in-law Laurence O'Shaughnessy was attached. He was thought initially to be suffering from tuberculosis and stayed in the sanatorium until September. A stream of visitors came to see him, including Common, Heppenstall, Plowman and Cyril Connolly. Connolly brought with him Stephen Spender, a cause of some embarrassment as Orwell had referred to Spender as a "pansy friend" some time earlier. "Homage to Catalonia" was published by Secker & Warburg and was a commercial flop. In the latter part of his stay at the clinic, Orwell was able to go for walks in the countryside and study nature.
The novelist L. H. Myers secretly funded a trip to French Morocco for half a year for Orwell to avoid the English winter and recover his health. The Orwells set out in September 1938 via Gibraltar and Tangier to avoid Spanish Morocco and arrived at Marrakech. They rented a villa on the road to Casablanca and during that time Orwell wrote "Coming Up for Air". They arrived back in England on 30 March 1939 and "Coming Up for Air" was published in June. Orwell spent time in Wallington and Southwold working on a Dickens essay and it was in June 1939 that Orwell's father, Richard Blair, died.
At the outbreak of the Second World War, Orwell's wife Eileen started working in the Censorship Department of the Ministry of Information in central London, staying during the week with her family in Greenwich. Orwell also submitted his name to the Central Register for war work, but nothing transpired. "They won't have me in the army, at any rate at present, because of my lungs", Orwell told Geoffrey Gorer. He returned to Wallington, and in late 1939 he wrote material for his first collection of essays, "Inside the Whale". For the next year he was occupied writing reviews for plays, films and books for "The Listener", "Time and Tide" and "New Adelphi". On 29 March 1940 his long association with "Tribune" began with a review of a sergeant's account of Napoleon's retreat from Moscow. At the beginning of 1940, the first edition of Connolly's "Horizon" appeared, and this provided a new outlet for Orwell's work as well as new literary contacts. In May the Orwells took a lease on a flat in London at Dorset Chambers, Chagford Street, Marylebone. It was the time of the Dunkirk evacuation and the death in France of Eileen's brother Laurence caused her considerable grief and long-term depression. Throughout this period Orwell kept a wartime diary.
Orwell was declared "unfit for any kind of military service" by the Medical Board in June, but soon afterwards found an opportunity to become involved in war activities by joining the Home Guard. He shared Tom Wintringham's socialist vision for the Home Guard as a revolutionary People's Militia. His lecture notes for instructing platoon members include advice on street fighting, field fortifications, and the use of mortars of various kinds. Sergeant Orwell managed to recruit Fredric Warburg to his unit. During the Battle of Britain he used to spend weekends with Warburg and his new Zionist friend, Tosco Fyvel, at Warburg's house at Twyford, Berkshire. At Wallington he worked on "England Your England" and in London wrote reviews for various periodicals. Visiting Eileen's family in Greenwich brought him face-to-face with the effects of the Blitz on East London. In mid-1940, Warburg, Fyvel and Orwell planned Searchlight Books. Eleven volumes eventually appeared, of which Orwell's "The Lion and the Unicorn: Socialism and the English Genius", published on 19 February 1941, was the first.
Early in 1941 he began to write for the American "Partisan Review" which linked Orwell with The New York Intellectuals who were also anti-Stalinist, and contributed to the Gollancz anthology "The Betrayal of the Left", written in the light of the Molotov–Ribbentrop Pact (although Orwell referred to it as the Russo-German Pact and the Hitler-Stalin Pact). He also applied unsuccessfully for a job at the Air Ministry. Meanwhile, he was still writing reviews of books and plays and at this time met the novelist Anthony Powell. He also took part in a few radio broadcasts for the Eastern Service of the BBC. In March the Orwells moved to a seventh-floor flat at Langford Court, St John's Wood, while at Wallington Orwell was "digging for victory" by planting potatoes.
In August 1941, Orwell finally obtained "war work" when he was taken on full-time by the BBC's Eastern Service. When interviewed for the job he indicated that he "accept[ed] absolutely the need for propaganda to be directed by the government and stressed his view that, in wartime, discipline in the execution of government policy was essential". He supervised cultural broadcasts to India to counter propaganda from Nazi Germany designed to undermine Imperial links. This was Orwell's first experience of the rigid conformity of life in an office, and it gave him an opportunity to create cultural programmes with contributions from T. S. Eliot, Dylan Thomas, E. M. Forster, Ahmed Ali, Mulk Raj Anand, and William Empson among others.
At the end of August he had a dinner with H. G. Wells which degenerated into a row because Wells had taken offence at observations Orwell made about him in a "Horizon" article. In October Orwell had a bout of bronchitis and the illness recurred frequently. David Astor was looking for a provocative contributor for "The Observer" and invited Orwell to write for him—the first article appearing in March 1942. In early 1942 Eileen changed jobs to work at the Ministry of Food and in mid-1942 the Orwells moved to a larger flat, a ground floor and basement, 10a Mortimer Crescent in Maida Vale/Kilburn—"the kind of lower-middle-class ambience that Orwell thought was London at its best." Around the same time Orwell's mother and sister Avril, who had found work in a sheet-metal factory behind King's Cross Station, moved into a flat close to George and Eileen.
At the BBC, Orwell introduced "Voice", a literary programme for his Indian broadcasts, and by now was leading an active social life with literary friends, particularly on the political left. Late in 1942, he started writing regularly for the left-wing weekly "Tribune" directed by Labour MPs Aneurin Bevan and George Strauss. In March 1943 Orwell's mother died and around the same time he told Moore he was starting work on a new book, which turned out to be "Animal Farm".
In September 1943, Orwell resigned from the BBC post that he had occupied for two years. His resignation followed a report confirming his fears that few Indians listened to the broadcasts, but he was also keen to concentrate on writing "Animal Farm". On 24 November 1943, just six days before his last day of service, his adaptation of Hans Christian Andersen's fairy tale "The Emperor's New Clothes" was broadcast. It was a genre in which he was greatly interested and which appeared on "Animal Farm"'s title page. At this time he also resigned from the Home Guard on medical grounds.
In November 1943, Orwell was appointed literary editor at "Tribune", where his assistant was his old friend Jon Kimche. Orwell was on staff until early 1945, writing over 80 book reviews and on 3 December 1943 started his regular personal column, "As I Please", usually addressing three or four subjects in each. He was still writing reviews for other magazines, including "Partisan Review", "Horizon", and the New York "Nation" and becoming a respected pundit among left-wing circles but also a close friend of people on the right such as Powell, Astor and Malcolm Muggeridge. By April 1944 "Animal Farm" was ready for publication. Gollancz refused to publish it, considering it an attack on the Soviet regime which was a crucial ally in the war. A similar fate was met from other publishers (including T. S. Eliot at Faber and Faber) until Jonathan Cape agreed to take it.
In May the Orwells had the opportunity to adopt a child, thanks to the contacts of Eileen's sister-in-law Gwen O'Shaughnessy, then a doctor in Newcastle upon Tyne. In June a V-1 flying bomb struck Mortimer Crescent and the Orwells had to find somewhere else to live. Orwell had to scrabble around in the rubble for his collection of books, which he had finally managed to transfer from Wallington, carting them away in a wheelbarrow.
Another blow was Cape's reversal of his plan to publish "Animal Farm". The decision followed his personal visit to Peter Smollett, an official at the Ministry of Information. Smollett was later identified as a Soviet agent.
The Orwells spent some time in the North East, near Carlton, County Durham, dealing with matters in the adoption of a boy whom they named Richard Horatio Blair. By September 1944 they had set up home in Islington, at 27b Canonbury Square. Baby Richard joined them there, and Eileen gave up her work at the Ministry of Food to look after her family. Secker & Warburg had agreed to publish "Animal Farm", planned for the following March, although it did not appear in print until August 1945. By February 1945 David Astor had invited Orwell to become a war correspondent for the "Observer". Orwell had been looking for the opportunity throughout the war, but his failed medical reports prevented him from being allowed anywhere near action. He went to Paris after the liberation of France and to Cologne once it had been occupied by the Allies.
It was while he was there that Eileen went into hospital for a hysterectomy and died under anaesthetic on 29 March 1945. She had not given Orwell much notice about this operation because of worries about the cost and because she expected to make a speedy recovery. Orwell returned home for a while and then went back to Europe. He returned finally to London to cover the 1945 general election at the beginning of July. "Animal Farm: A Fairy Story" was published in Britain on 17 August 1945, and a year later in the US, on 26 August 1946.
"Animal Farm" had particular resonance in the post-war climate and its worldwide success made Orwell a sought-after figure. For the next four years, Orwell mixed journalistic work—mainly for "Tribune", "The Observer" and the "Manchester Evening News", though he also contributed to many small-circulation political and literary magazines—with writing his best-known work, "Nineteen Eighty-Four", which was published in 1949.
In the year following Eileen's death he published around 130 articles and a selection of his "Critical Essays", while remaining active in various political lobbying campaigns. He employed a housekeeper, Susan Watson, to look after his adopted son at the Islington flat, which visitors now described as "bleak". In September he spent a fortnight on the island of Jura in the Inner Hebrides and saw it as a place to escape from the hassle of London literary life. David Astor was instrumental in arranging a place for Orwell on Jura. Astor's family owned Scottish estates in the area and a fellow Old Etonian, Robin Fletcher, had a property on the island. In late 1945 and early 1946 Orwell made several hopeless and unwelcome marriage proposals to younger women, including Celia Kirwan (who later became Arthur Koestler's sister-in-law), Ann Popham who happened to live in the same block of flats and Sonia Brownell, one of Connolly's coterie at the "Horizon" office. Orwell suffered a tubercular haemorrhage in February 1946 but disguised his illness. In 1945 or early 1946, while still living at Canonbury Square, Orwell wrote an article on "British Cookery", complete with recipes, commissioned by the British Council. Given the post-war shortages, both parties agreed not to publish it. His sister Marjorie died of kidney disease in May and shortly after, on 22 May 1946, Orwell set off to live on the Isle of Jura at a house known as Barnhill.
This was an abandoned farmhouse with outbuildings near the northern end of the island, situated at the end of a five-mile (8 km), heavily rutted track from Ardlussa, where the owners lived. Conditions at the farmhouse were primitive but the natural history and the challenge of improving the place appealed to Orwell. His sister Avril accompanied him there and young novelist Paul Potts made up the party. In July Susan Watson arrived with Orwell's son Richard. Tensions developed and Potts departed after one of his manuscripts was used to light the fire. Orwell meanwhile set to work on "Nineteen Eighty-Four". Later Susan Watson's boyfriend David Holbrook arrived. A fan of Orwell since school days, he found the reality very different, with Orwell hostile and disagreeable probably because of Holbrook's membership of the Communist Party. Susan Watson could no longer stand being with Avril and she and her boyfriend left.
Orwell returned to London in late 1946 and picked up his literary journalism again. Now a well-known writer, he was swamped with work. Apart from a visit to Jura in the new year he stayed in London for one of the coldest British winters on record, during a national fuel shortage so severe that he burnt his furniture and his child's toys. The heavy smog in the days before the Clean Air Act 1956 did little to help his health, about which he was reticent, keeping clear of medical attention. Meanwhile, he had to cope with rival claims of publishers Gollancz and Warburg for publishing rights. About this time he co-edited a collection titled "British Pamphleteers" with Reginald Reynolds. As a result of the success of "Animal Farm", Orwell was expecting a large bill from the Inland Revenue and he contacted a firm of accountants of which the senior partner was Jack Harrison. The firm advised Orwell to establish a company to own his copyright and to receive his royalties and to set up a "service agreement" so that he could draw a salary. Such a company, "George Orwell Productions Ltd" (GOP Ltd), was set up on 12 September 1947, although the service agreement was not then put into effect. Jack Harrison left the details at this stage to junior colleagues.
Orwell left London for Jura on 10 April 1947. In July he ended the lease on the Wallington cottage. Back on Jura he worked on "Nineteen Eighty-Four" and made good progress. During that time his sister's family visited, and Orwell led a disastrous boating expedition, on 19 August, which nearly led to loss of life whilst trying to cross the notorious Gulf of Corryvreckan and gave him a soaking which was not good for his health. In December a chest specialist was summoned from Glasgow who pronounced Orwell seriously ill and a week before Christmas 1947 he was in Hairmyres Hospital in East Kilbride, then a small village in the countryside, on the outskirts of Glasgow. Tuberculosis was diagnosed and the request for permission to import streptomycin to treat Orwell went as far as Aneurin Bevan, then Minister of Health. David Astor helped with supply and payment and Orwell began his course of streptomycin on 19 or 20 February 1948. By the end of July 1948 Orwell was able to return to Jura and by December he had finished the manuscript of "Nineteen Eighty-Four". In January 1949, in a very weak condition, he set off for a sanatorium at Cranham, Gloucestershire, escorted by Richard Rees.
The sanatorium at Cranham consisted of a series of small wooden chalets or huts in a remote part of the Cotswolds near Stroud. Visitors were shocked by Orwell's appearance and concerned by the shortcomings and ineffectiveness of the treatment. Friends were worried about his finances, but by now he was comparatively well-off. He was writing to many of his friends, including Jacintha Buddicom, who had "rediscovered" him, and in March 1949, was visited by Celia Kirwan. Kirwan had just started working for a Foreign Office unit, the Information Research Department, set up by the Labour government to publish anti-communist propaganda, and Orwell gave her a list of people he considered to be unsuitable as IRD authors because of their pro-communist leanings. Orwell's list, not published until 2003, consisted mainly of writers but also included actors and Labour MPs. Orwell received more streptomycin treatment and improved slightly. In June 1949 "Nineteen Eighty-Four" was published to immediate critical and popular acclaim.
Orwell's health had continued to decline since the diagnosis of tuberculosis in December 1947. In mid-1949, he courted Sonia Brownell, and they announced their engagement in September, shortly before he was removed to University College Hospital in London. Sonia took charge of Orwell's affairs and attended him diligently in the hospital. In September 1949, Orwell invited his accountant Harrison to visit him in hospital, and Harrison claimed that Orwell then asked him to become director of GOP Ltd and to manage the company, but there was no independent witness. Orwell's wedding took place in the hospital room on 13 October 1949, with David Astor as best man. Orwell was in decline and was visited by an assortment of visitors including Muggeridge, Connolly, Lucian Freud, Stephen Spender, Evelyn Waugh, Paul Potts, Anthony Powell, and his Eton tutor Andrew Gow. Plans to go to the Swiss Alps were mooted. Further meetings were held with his accountant, at which Harrison and Mr and Mrs Blair were confirmed as directors of the company, and at which Harrison claimed that the "service agreement" was executed, giving copyright to the company. Orwell's health was in decline again by Christmas. On the evening of 20 January 1950, Potts visited Orwell and slipped away on finding him asleep. Jack Harrison visited later and claimed that Orwell gave him 25% of the company. Early on the morning of 21 January, an artery burst in Orwell's lungs, killing him at age 46.
Orwell had requested to be buried in accordance with the Anglican rite in the graveyard of the closest church to wherever he happened to die. The graveyards in central London had no space, and so in an effort to ensure his last wishes could be fulfilled, his widow appealed to his friends to see whether any of them knew of a church with space in its graveyard.
David Astor lived in Sutton Courtenay, Oxfordshire, and arranged for Orwell to be interred in the churchyard of All Saints' there. Orwell's gravestone bears the epitaph: "Here lies Eric Arthur Blair, born June 25th 1903, died January 21st 1950"; no mention is made on the gravestone of his more famous pen name.
Orwell's son, Richard Horatio Blair, was brought up by Orwell's sister Avril. He is patron of the Orwell Society.
In 1979, Sonia Brownell brought a High Court action against Harrison when he declared an intention to subdivide his 25 percent share of the company between his three children. For Sonia, this manoeuvre would have made gaining overall control of the company three times more difficult. She was considered to have a strong case, but was becoming increasingly ill and eventually was persuaded to settle out of court on 2 November 1980. She died on 11 December 1980, aged 62.
During most of his career, Orwell was best known for his journalism, in essays, reviews, columns in newspapers and magazines and in his books of reportage: "Down and Out in Paris and London" (describing a period of poverty in these cities), "The Road to Wigan Pier" (describing the living conditions of the poor in northern England, and class division generally) and "Homage to Catalonia". According to Irving Howe, Orwell was "the best English essayist since Hazlitt, perhaps since Dr Johnson."
Modern readers are more often introduced to Orwell as a novelist, particularly through his enormously successful titles "Animal Farm" and "Nineteen Eighty-Four". The former is often thought to reflect degeneration in the Soviet Union after the Russian Revolution and the rise of Stalinism; the latter, life under totalitarian rule. "Nineteen Eighty-Four" is often compared to "Brave New World" by Aldous Huxley; both are powerful dystopian novels warning of a future world where the state machine exerts complete control over social life. In 1984, "Nineteen Eighty-Four" and Ray Bradbury's "Fahrenheit 451" were honoured with the Prometheus Award for their contributions to dystopian literature. In 2011, "Animal Farm" received the same award.
"Coming Up for Air", his last novel before World War II, is the most "English" of his novels; alarms of war mingle with images of the idyllic Thames-side Edwardian childhood of protagonist George Bowling. The novel is pessimistic; industrialism and capitalism have killed the best of Old England, and great, new external threats loom. In homely terms, Bowling posits the totalitarian hypotheses of Franz Borkenau, Orwell, Ignazio Silone and Koestler: "Old Hitler's something different. So's Joe Stalin. They aren't like these chaps in the old days who crucified people and chopped their heads off and so forth, just for the fun of it ... They're something quite new—something that's never been heard of before".
In an autobiographical piece that Orwell sent to the editors of "Twentieth Century Authors" in 1940, he wrote: "The writers I care about most and never grow tired of are: Shakespeare, Swift, Fielding, Dickens, Charles Reade, Flaubert and, among modern writers, James Joyce, T. S. Eliot and D. H. Lawrence. But I believe the modern writer who has influenced me most is W. Somerset Maugham, whom I admire immensely for his power of telling a story straightforwardly and without frills." Elsewhere, Orwell strongly praised the works of Jack London, especially his book "The Road". Orwell's investigation of poverty in "The Road to Wigan Pier" strongly resembles that of Jack London's "The People of the Abyss", in which the American journalist disguises himself as an out-of-work sailor to investigate the lives of the poor in London. In his essay "Politics vs. Literature: An Examination of Gulliver's Travels" (1946) Orwell wrote: "If I had to make a list of six books which were to be preserved when all others were destroyed, I would certainly put "Gulliver's Travels" among them."
Orwell was an admirer of Arthur Koestler and became a close friend during the three years that Koestler and his wife Mamaine spent at the cottage of Bwlch Ocyn, a secluded farmhouse that belonged to Clough Williams-Ellis, in the Vale of Ffestiniog. Orwell reviewed Koestler's "Darkness at Noon" for the "New Statesman" in 1941, saying:
Brilliant as this book is as a novel, and a piece of brilliant literature, it is probably most valuable as an interpretation of the Moscow "confessions" by someone with an inner knowledge of totalitarian methods. What was frightening about these trials was not the fact that they happened—for obviously such things are necessary in a totalitarian society—but the eagerness of Western intellectuals to justify them.
Other writers admired by Orwell included: Ralph Waldo Emerson, George Gissing, Graham Greene, Herman Melville, Henry Miller, Tobias Smollett, Mark Twain, Joseph Conrad, and Yevgeny Zamyatin. He was both an admirer and a critic of Rudyard Kipling, praising Kipling as a gifted writer and a "good bad poet" whose work is "spurious" and "morally insensitive and aesthetically disgusting," but undeniably seductive and able to speak to certain aspects of reality more effectively than more enlightened authors. He had a similarly ambivalent attitude to G. K. Chesterton, whom he regarded as a writer of considerable talent who had chosen to devote himself to "Roman Catholic propaganda", and to Evelyn Waugh, who was, he wrote, "ab[ou]t as good a novelist as one can be (i.e. as novelists go today) while holding untenable opinions".
In 1980, English Heritage honoured Orwell with a blue plaque at 50 Lawford Road, Kentish Town, London, where he lived from August 1935 until January 1936.
Throughout his life Orwell continually supported himself as a book reviewer. His reviews are well known and have had an influence on literary criticism. He wrote in the conclusion to his 1940 essay on Charles Dickens:
George Woodcock suggested that the last two sentences also describe Orwell.
Orwell wrote a critique of George Bernard Shaw's play "Arms and the Man". He considered this Shaw's best play and the most likely to remain socially relevant, because of its theme that war is not, generally speaking, a glorious romantic adventure. His 1945 essay "In Defence of P. G. Wodehouse" contains an amusing assessment of Wodehouse's writing and also argues that his broadcasts from Germany (during the war) did not really make him a traitor. He accused the Ministry of Information of exaggerating Wodehouse's actions for propaganda purposes.
In 1946, the British Council commissioned Orwell to write an essay on British food as part of a drive to promote British relations abroad. In the essay, titled "British Cookery", Orwell described the British diet as "a simple, rather heavy, perhaps slightly barbarous diet" in which "hot drinks are acceptable at most hours of the day". He discusses the ritual of breakfast in the UK: "this is not a snack but a serious meal. The hour at which people have their breakfast is of course governed by the time at which they go to work." He wrote that high tea in the United Kingdom consisted of a variety of savoury and sweet dishes, but "no tea would be considered a good one if it did not include at least one kind of cake." Orwell also added a recipe for marmalade, a popular British spread on bread. However, the British Council declined to publish the essay on the grounds that it was too problematic to write about food at the time of strict rationing in the UK. In 2019, the essay was discovered in the British Council's archives along with the rejection letter. The British Council issued an official apology to Orwell over the rejection of the commissioned essay.
Arthur Koestler said that Orwell's "uncompromising intellectual honesty made him appear almost inhuman at times." Ben Wattenberg stated: "Orwell's writing pierced intellectual hypocrisy wherever he found it." According to historian Piers Brendon, "Orwell was the saint of common decency who would in earlier days, said his BBC boss Rushbrook Williams, 'have been either canonised—or burnt at the stake'". Raymond Williams describes Orwell as a "successful impersonation of a plain man who bumps into experience in an unmediated way and tells the truth about it." Christopher Norris declared that Orwell's "homespun empiricist outlook—his assumption that the truth was just there to be told in a straightforward common-sense way—now seems not merely naïve but culpably self-deluding". The American scholar Scott Lucas has described Orwell as an enemy of the Left. John Newsinger has argued that Lucas could only do this by portraying "all of Orwell's attacks on Stalinism [–] as if they were attacks on socialism, despite Orwell's continued insistence that they were not."
Orwell's work has taken a prominent place in the school literature curriculum in England, with "Animal Farm" a regular examination topic at the end of secondary education (GCSE), and "Nineteen Eighty-Four" a topic for subsequent examinations below university level (A Levels). A 2016 UK poll saw "Animal Farm" ranked the nation's favourite book from school.
Historian John Rodden stated: "John Podhoretz did claim that if Orwell were alive today, he'd be standing with the neo-conservatives and against the Left. And the question arises, to what extent can you even begin to predict the political positions of somebody who's been dead three decades and more by that time?"
In "Orwell's Victory", Christopher Hitchens argues: "In answer to the accusation of inconsistency Orwell as a writer was forever taking his own temperature. In other words, here was someone who never stopped testing and adjusting his intelligence".
John Rodden points out the "undeniable conservative features in the Orwell physiognomy" and remarks on how "to some extent Orwell facilitated the kinds of uses and abuses by the Right that his name has been put to. In other ways there has been the politics of selective quotation." Rodden refers to the essay "Why I Write", in which Orwell refers to the Spanish Civil War as being his "watershed political experience", saying: "The Spanish War and other events in 1936–37, turned the scale. Thereafter I knew where I stood. Every line of serious work that I have written since 1936 has been written directly or indirectly against totalitarianism and "for" Democratic Socialism as I understand it." (emphasis in original) Rodden goes on to explain how, during the McCarthy era, the introduction to the Signet edition of "Animal Farm", which sold more than 20 million copies, makes use of "the politics of ellipsis":
If the book itself, "Animal Farm", had left any doubt of the matter, Orwell dispelled it in his essay "Why I Write": 'Every line of serious work that I've written since 1936 has been written directly or indirectly against Totalitarianism ... dot, dot, dot, dot.' "For Democratic Socialism" is vaporised, just like Winston Smith did it at the Ministry of Truth, and that's very much what happened at the beginning of the McCarthy era and just continued, Orwell being selectively quoted.
Fyvel wrote about Orwell: "His crucial experience [...] was his struggle to turn himself into a writer, one which led through long periods of poverty, failure and humiliation, and about which he has written almost nothing directly. The sweat and agony was less in the slum-life than in the effort to turn the experience into literature."
In October 2015 Finlay Publisher, for the Orwell Society, published "George Orwell: The Complete Poetry", compiled and presented by Dione Venables.
In his essay "Politics and the English Language" (1946), Orwell wrote about the importance of precise and clear language, arguing that vague writing can be used as a powerful tool of political manipulation because it shapes the way we think. In that essay, Orwell provides six rules for writers:

1. Never use a metaphor, simile, or other figure of speech which you are used to seeing in print.
2. Never use a long word where a short one will do.
3. If it is possible to cut a word out, always cut it out.
4. Never use the passive where you can use the active.
5. Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent.
6. Break any of these rules sooner than say anything outright barbarous.
Andrew N. Rubin argues that "Orwell claimed that we should be attentive to how the use of language has limited our capacity for critical thought just as we should be equally concerned with the ways in which dominant modes of thinking have reshaped the very language that we use."
The adjective "Orwellian" connotes an attitude and a policy of control by propaganda, surveillance, misinformation, denial of truth and manipulation of the past. In "Nineteen Eighty-Four", Orwell described a totalitarian government that controlled thought by controlling language, making certain ideas literally unthinkable. Several words and phrases from "Nineteen Eighty-Four" have entered popular language. "Newspeak" is a simplified and obfuscatory language designed to make independent thought impossible. "Doublethink" means holding two contradictory beliefs simultaneously. The "Thought Police" are those who suppress all dissenting opinion. "Prolefeed" is homogenised, manufactured superficial literature, film and music used to control and indoctrinate the populace through docility. "Big Brother" is a supreme dictator who watches everyone.
Orwell may have been the first to use the term "cold war" to refer to the state of tension between powers in the Western Bloc and the Eastern Bloc that followed World War II in his essay, "You and the Atom Bomb", published in "Tribune" on 19 October 1945. He wrote:
We may be heading not for general breakdown but for an epoch as horribly stable as the slave empires of antiquity. James Burnham's theory has been much discussed, but few people have yet considered its ideological implications—that is, the kind of world-view, the kind of beliefs, and the social structure that would probably prevail in a State which was at once unconquerable and in a permanent state of 'cold war' with its neighbours.
In 2014, a play written by playwright Joe Sutton titled "Orwell in America" was first performed by the Northern Stage theatre company in White River Junction, Vermont. It is a fictitious account of Orwell doing a book tour in the United States (something he never did in his lifetime). It moved to Off-Broadway in 2016.
Orwell's birthplace, a bungalow in Motihari, Bihar, India, was opened as a museum in May 2015.
A statue of George Orwell, sculpted by the British sculptor Martin Jennings, was unveiled on 7 November 2017 outside Broadcasting House, the headquarters of the BBC. The wall behind the statue is inscribed with the following phrase: "If liberty means anything at all, it means the right to tell people what they do not want to hear". These are words from his proposed preface to "Animal Farm" and a rallying cry for the idea of free speech in an open society.
Jacintha Buddicom's account, "Eric & Us", provides an insight into Blair's childhood. She quoted his sister Avril that "he was essentially an aloof, undemonstrative person" and said herself of his friendship with the Buddicoms: "I do not think he needed any other friends beyond the schoolfriend he occasionally and appreciatively referred to as 'CC'". She could not recall his having schoolfriends to stay and exchange visits as her brother Prosper often did in holidays. Cyril Connolly provides an account of Blair as a child in "Enemies of Promise". Years later, Blair mordantly recalled his prep school in the essay "Such, Such Were the Joys", claiming among other things that he "was made to study like a dog" to earn a scholarship, which he alleged was solely to enhance the school's prestige with parents. Jacintha Buddicom repudiated Orwell's schoolboy misery described in the essay, stating that "he was a specially happy child". She noted that he did not like his name because it reminded him of a book he greatly disliked—"Eric, or, Little by Little", a Victorian boys' school story.
Connolly remarked of him as a schoolboy, "The remarkable thing about Orwell was that alone among the boys he was an intellectual and not a parrot for he thought for himself". At Eton, John Vaughan Wilkes, his former headmaster's son recalled that "he was extremely argumentative—about anything—and criticising the masters and criticising the other boys [...] We enjoyed arguing with him. He would generally win the arguments—or think he had anyhow." Roger Mynors concurs: "Endless arguments about all sorts of things, in which he was one of the great leaders. He was one of those boys who thought for himself."
Blair liked to carry out practical jokes. Buddicom recalls him swinging from the luggage rack in a railway carriage like an orangutan to frighten a woman passenger out of the compartment. At Eton, he played tricks on John Crace, his Master in College, among which was to enter a spoof advertisement in a College magazine implying pederasty. Gow, his tutor, said he "made himself as big a nuisance as he could" and "was a very unattractive boy". Later Blair was expelled from the crammer at Southwold for sending a dead rat as a birthday present to the town surveyor. In one of his "As I Please" essays he refers to a protracted joke when he answered an advertisement for a woman who claimed a cure for obesity.
Blair had an interest in natural history which stemmed from his childhood. In letters from school he wrote about caterpillars and butterflies, and Buddicom recalls his keen interest in ornithology. He also enjoyed fishing and shooting rabbits, and conducting experiments such as cooking a hedgehog or shooting down a jackdaw from the Eton roof to dissect it. His zeal for scientific experiments extended to explosives—again Buddicom recalls a cook giving notice because of the noise. Later in Southwold, his sister Avril recalled him blowing up the garden. When teaching he enthused his students with his nature-rambles both at Southwold and Hayes. His adult diaries are permeated with his observations on nature.
Buddicom and Blair lost touch shortly after he went to Burma and she became unsympathetic towards him. She wrote that it was because of the letters he wrote complaining about his life, but an addendum to "Eric & Us" by Venables reveals that he may have lost her sympathy through an incident which was, at best, a clumsy attempt at seduction.
Mabel Fierz, who later became Blair's confidante, said: "He used to say the one thing he wished in this world was that he'd been attractive to women. He liked women and had many girlfriends I think in Burma. He had a girl in Southwold and another girl in London. He was rather a womaniser, yet he was afraid he wasn't attractive."
Brenda Salkield (Southwold) preferred friendship to any deeper relationship and maintained a correspondence with Blair for many years, particularly as a sounding board for his ideas. She wrote: "He was a great letter writer. Endless letters, and I mean when he wrote you a letter he wrote pages." His correspondence with Eleanor Jacques (London) was more prosaic, dwelling on a closer relationship and referring to past rendezvous or planning future ones in London and Burnham Beeches.
When Orwell was in the sanatorium in Kent, his wife's friend Lydia Jackson visited. He invited her for a walk and out of sight "an awkward situation arose." Jackson was to be the most critical of Orwell's marriage to Eileen O'Shaughnessy, but their later correspondence hints at a complicity. Eileen at the time was more concerned about Orwell's closeness to Brenda Salkield. Orwell had an affair with his secretary at "Tribune" which caused Eileen much distress, and others have been mooted. In a letter to Ann Popham he wrote: "I was sometimes unfaithful to Eileen, and I also treated her badly, and I think she treated me badly, too, at times, but it was a real marriage, in the sense that we had been through awful struggles together and she understood all about my work, etc." Similarly he suggested to Celia Kirwan that they had both been unfaithful. There are several testaments that it was a well-matched and happy marriage.
In June 1944, Orwell and Eileen adopted a three-week-old boy they named Richard Horatio. According to Richard, Orwell was a wonderful father who gave him devoted, if rather rugged, attention and a great degree of freedom. After Orwell's death Richard went to live with Orwell's sister and her husband.
Blair was very lonely after Eileen's death in 1945, and desperate for a wife, both as companion for himself and as mother for Richard. He proposed marriage to four women, including Celia Kirwan, and eventually Sonia Brownell accepted. Orwell had met her when she was assistant to Cyril Connolly, at "Horizon" literary magazine. They were married on 13 October 1949, only three months before Orwell's death. Some maintain that Sonia was the model for Julia in "Nineteen Eighty-Four".
Orwell was noted for very close and enduring friendships with a few friends, but these were generally people with a similar background or with a similar level of literary ability. Ungregarious, he was out of place in a crowd and his discomfort was exacerbated when he was outside his own class. Though representing himself as a spokesman for the common man, he often appeared out of place with real working people. His brother-in-law Humphrey Dakin, a "Hail fellow, well met" type, who took him to a local pub in Leeds, said that he was told by the landlord: "Don't bring that bugger in here again." Adrian Fierz commented "He wasn't interested in racing or greyhounds or pub crawling or shove ha'penny. He just did not have much in common with people who did not share his intellectual interests." Awkwardness attended many of his encounters with working-class representatives, as with Pollitt and McNair, but his courtesy and good manners were often commented on. Jack Common observed on meeting him for the first time, "Right away manners, and more than manners—breeding—showed through."
In his tramping days, he did domestic work for a time. His extreme politeness was recalled by a member of the family he worked for; she declared that the family referred to him as "Laurel" after the film comedian. With his gangling figure and awkwardness, Orwell's friends often saw him as a figure of fun. Geoffrey Gorer commented "He was awfully likely to knock things off tables, trip over things. I mean, he was a gangling, physically badly co-ordinated young man. I think his feeling [was] that even the inanimate world was against him." When he shared a flat with Heppenstall and Sayer, he was treated in a patronising manner by the younger men. At the BBC in the 1940s, "everybody would pull his leg" and Spender described him as having real entertainment value "like, as I say, watching a Charlie Chaplin movie." A friend of Eileen's reminisced about her tolerance and humour, often at Orwell's expense.
One biography of Orwell accused him of having had an authoritarian streak. In Burma, he struck out at a Burmese boy who, while "fooling around" with his friends, had "accidentally bumped into him" at a station, resulting in Orwell falling "heavily" down some stairs. One of his former pupils recalled being beaten so hard he could not sit down for a week. When sharing a flat with Orwell, Heppenstall came home late one night in an advanced stage of loud inebriation. The upshot was that Heppenstall ended up with a bloody nose and was locked in a room. When he complained, Orwell hit him across the legs with a shooting stick and Heppenstall then had to defend himself with a chair. Years later, after Orwell's death, Heppenstall wrote a dramatic account of the incident called "The Shooting Stick" and Mabel Fierz confirmed that Heppenstall came to her in a sorry state the following day.
Orwell got on well with young people. The pupil he beat considered him the best of teachers and the young recruits in Barcelona tried to drink him under the table without success. His nephew recalled Uncle Eric laughing louder than anyone in the cinema at a Charlie Chaplin film.
In the wake of his most famous works, he attracted many uncritical hangers-on, but many others who sought him found him aloof and even dull. With his soft voice, he was sometimes shouted down or excluded from discussions. At this time, he was severely ill; it was wartime or the austerity period after it; during the war his wife suffered from depression; and after her death he was lonely and unhappy. In addition to that, he always lived frugally and seemed unable to care for himself properly. As a result of all this, people found his circumstances bleak. Some, like Michael Ayrton, called him "Gloomy George," but others developed the idea that he was an "English secular saint."
Although Orwell was frequently heard on the BBC for panel discussion and one-man broadcasts, no recorded copy of his voice is known to exist.
Orwell was a heavy smoker, who rolled his own cigarettes from strong shag tobacco, despite his bronchial condition. His penchant for the rugged life often took him to cold and damp situations, both in the long term, as in Catalonia and Jura, and short term, for example, motorcycling in the rain and suffering a shipwreck. Described by "The Economist" as "perhaps the 20th century's best chronicler of English culture", Orwell considered fish and chips, association football, the pub, strong tea, cut price chocolate, the movies, and radio among the chief comforts for the working class.
Orwell enjoyed strong tea—he had Fortnum & Mason's tea brought to him in Catalonia. His 1946 essay, "A Nice Cup of Tea", published in the "Evening Standard", set out how to make tea, with Orwell writing, "tea is one of the mainstays of civilisation in this country and causes violent disputes over how it should be made", with the main issue being whether to put tea in the cup first and add the milk afterward, or the other way round, on which he states, "in every family in Britain there are probably two schools of thought on the subject". He appreciated English beer, taken regularly and moderately, despised drinkers of lager and wrote about an imagined, ideal British pub in his 1946 "Evening Standard" article, "The Moon Under Water". Not as particular about food, he enjoyed the wartime "Victory Pie" and extolled canteen food at the BBC. He preferred traditional English dishes, such as roast beef and kippers. His 1945 essay, "In Defence of English Cooking", included Yorkshire pudding, crumpets, muffins, innumerable biscuits, Christmas pudding, shortbread, various British cheeses and Oxford marmalade. Reports of his Islington days refer to the cosy afternoon tea table.
His dress sense was unpredictable and usually casual. In Southwold, he had the best cloth from the local tailor but was equally happy in his tramping outfit. His attire in the Spanish Civil War, along with his size-12 boots, was a source of amusement. David Astor described him as looking like a prep school master, while according to the Special Branch dossier, Orwell's tendency to dress "in Bohemian fashion" revealed that the author was "a Communist".
Orwell's confusing approach to matters of social decorum—on the one hand expecting a working-class guest to dress for dinner, and on the other, slurping tea out of a saucer at the BBC canteen—helped stoke his reputation as an English eccentric.
Orwell was an atheist who identified himself with the humanist outlook on life. Despite this, and despite his criticisms of both religious doctrine and of religious organisations, he nevertheless regularly participated in the social and civic life of the church, including by attending Church of England Holy Communion. Acknowledging this contradiction, he once said: "It seems rather mean to go to HC [Holy Communion] when one doesn't believe, but I have passed myself off for pious & there is nothing for it but to keep up with the deception." Orwell was also extremely well-read in Biblical literature and could quote lengthy passages from the Book of Common Prayer from memory. His extensive knowledge of the Bible came coupled with unsparing criticism of its philosophy, and as an adult he could not bring himself to believe in its tenets. He said in part V of his essay, "Such, Such Were the Joys", that "Till about the age of fourteen I believed in God, and believed that the accounts given of him were true. But I was well aware that I did not love him." Despite this, he had two Anglican marriages and left instructions for an Anglican funeral. Orwell directly contrasted Christianity with secular humanism in his essay "Lear, Tolstoy and the Fool", finding the latter philosophy more palatable and less "self-interested." Literary critic James Wood wrote that in the struggle, as he saw it, between Christianity and humanism, "Orwell was on the humanist side, of course—basically an unmetaphysical, English version of Camus's philosophy of perpetual godless struggle."
Orwell's writing was often explicitly critical of religion, and Christianity in particular. He found the church to be a "selfish [...] church of the landed gentry" with its establishment "out of touch" with the majority of its communicants and altogether a pernicious influence on public life. In their 1972 study, "The Unknown Orwell", the writers Peter Stansky and William Abrahams noted that at Eton Blair displayed a "sceptical attitude" to Christian belief. Crick observed that Orwell displayed "a pronounced anti-Catholicism". Evelyn Waugh, writing in 1946, acknowledged Orwell's high moral sense and respect for justice but believed "he seems never to have been touched at any point by a conception of religious thought and life." His contradictory and sometimes ambiguous views about the social benefits of religious affiliation mirrored the dichotomies between his public and private lives: Stephen Ingle wrote that it was as if the writer George Orwell "vaunted" his unbelief while Eric Blair the individual retained "a deeply ingrained religiosity".
Orwell liked to provoke arguments by challenging the status quo, but he was also a traditionalist with a love of old English values. He criticised and satirised, from the inside, the various social milieux in which he found himself—provincial town life in "A Clergyman's Daughter"; middle-class pretension in "Keep the Aspidistra Flying"; preparatory schools in "Such, Such Were the Joys"; colonialism in "Burmese Days", and some socialist groups in "The Road to Wigan Pier". In his "Adelphi" days, he described himself as a "Tory-anarchist".
In 1928, Orwell began his career as a professional writer in Paris at a journal owned by the French Communist Henri Barbusse. His first article, "La Censure en Angleterre" ("Censorship in England"), was an attempt to account for the "extraordinary and illogical" moral censorship of plays and novels then practised in Britain. His own explanation was that the rise of the "puritan middle class", who had stricter morals than the aristocracy, tightened the rules of censorship in the 19th century. Orwell's first published article in his home country, "A Farthing Newspaper", was a critique of the new French daily the "Ami de Peuple". This paper was sold much more cheaply than most others, and was intended for ordinary people to read. Orwell pointed out that its proprietor François Coty also owned the right-wing dailies "Le Figaro" and "Le Gaulois", which the "Ami de Peuple" was supposedly competing against. Orwell suggested that cheap newspapers were no more than a vehicle for advertising and anti-leftist propaganda, and predicted the world might soon see free newspapers which would drive legitimate dailies out of business.
Writing for "Le Progrès Civique", Orwell described the British colonial government in Burma and India:
The Spanish Civil War played the most important part in defining Orwell's socialism. He wrote to Cyril Connolly from Barcelona on 8 June 1937: "I have seen wonderful things and at last really believe in Socialism, which I never did before." Having witnessed the success of the anarcho-syndicalist communities, for example in Anarchist Catalonia, and the subsequent brutal suppression of the anarcho-syndicalists, anti-Stalin communist parties and revolutionaries by the Soviet Union-backed Communists, Orwell returned from Catalonia a staunch anti-Stalinist and joined the British Independent Labour Party, his card being issued on 13 June 1938. Although he was never a Trotskyist, he was strongly influenced by the Trotskyist and anarchist critiques of the Soviet regime, and by the anarchists' emphasis on individual freedom. In Part 2 of "The Road to Wigan Pier", published by the Left Book Club, Orwell stated that "a real Socialist is one who wishes—not merely conceives it as desirable, but actively wishes—to see tyranny overthrown". Orwell stated in "Why I Write" (1946): "Every line of serious work that I have written since 1936 has been written, directly or indirectly, against totalitarianism and for democratic socialism, as I understand it." Orwell was a proponent of a federal socialist Europe, a position outlined in his 1947 essay "Toward European Unity," which first appeared in "Partisan Review". According to biographer John Newsinger:
In his 1938 essay "Why I joined the Independent Labour Party," published in the ILP-affiliated "New Leader", Orwell wrote:
Towards the end of the essay, he wrote: "I do not mean I have lost all faith in the Labour Party. My most earnest hope is that the Labour Party will win a clear majority in the next General Election."
Orwell was opposed to rearmament against Nazi Germany and at the time of the Munich Agreement he signed a manifesto entitled "If War Comes We Shall Resist"—but he changed his view after the Molotov–Ribbentrop Pact and the outbreak of the war. He left the ILP because of its opposition to the war and adopted a political position of "revolutionary patriotism". In December 1940 he wrote in "Tribune" (the Labour left's weekly): "We are in a strange period of history in which a revolutionary has to be a patriot and a patriot has to be a revolutionary." During the war, Orwell was highly critical of the popular idea that an Anglo-Soviet alliance would be the basis of a post-war world of peace and prosperity. In 1942, commenting on journalist E. H. Carr's pro-Soviet views, Orwell stated that "all the appeasers, e.g. Professor E.H. Carr, have switched their allegiance from Hitler to Stalin".
On anarchism, Orwell wrote in "The Road to Wigan Pier": "I worked out an anarchistic theory that all government is evil, that the punishment always does more harm than the crime and the people can be trusted to behave decently if you will only let them alone." He continued and argued that "it is always necessary to protect peaceful people from violence. In any state of society where crime can be profitable you have got to have a harsh criminal law and administer it ruthlessly."
In his reply (dated 15 November 1943) to an invitation from the Duchess of Atholl to speak for the British League for European Freedom, he stated that he did not agree with their objectives. He admitted that what they said was "more truthful than the lying propaganda found in most of the press", but added that he could not "associate himself with an essentially Conservative body" that claimed to "defend democracy in Europe" but had "nothing to say about British imperialism". His closing paragraph stated: "I belong to the Left and must work inside it, much as I hate Russian totalitarianism and its poisonous influence in this country."
Orwell joined the staff of "Tribune" as literary editor, and from then until his death, was a left-wing (though hardly orthodox) Labour-supporting democratic socialist. On 1 September 1944, writing in "Tribune" about the Warsaw uprising, Orwell expressed his hostility to the influence that the alliance with the USSR exerted over the Allies: "Do remember that dishonesty and cowardice always have to be paid for. Do not imagine that for years on end you can make yourself the boot-licking propagandist of the sovietic regime, or any other regime, and then suddenly return to honesty and reason. Once a whore, always a whore." According to Newsinger, although Orwell "was always critical of the 1945–51 Labour government's moderation, his support for it began to pull him to the right politically. This did not lead him to embrace conservatism, imperialism or reaction, but to defend, albeit critically, Labour reformism." Between 1945 and 1947, with A. J. Ayer and Bertrand Russell, he contributed a series of articles and essays to "Polemic", a short-lived British "Magazine of Philosophy, Psychology, and Aesthetics" edited by the ex-Communist Humphrey Slater.
Writing in early 1945 a long essay titled "Antisemitism in Britain," for the "Contemporary Jewish Record", Orwell stated that antisemitism was on the increase in Britain and that it was "irrational and will not yield to arguments". He argued that it would be useful to discover why anti-Semites could "swallow such absurdities on one particular subject while remaining sane on others". He wrote: "For quite six years the English admirers of Hitler contrived not to learn of the existence of Dachau and Buchenwald. ... Many English people have heard almost nothing about the extermination of German and Polish Jews during the present war. Their own anti-Semitism has caused this vast crime to bounce off their consciousness." In "Nineteen Eighty-Four", written shortly after the war, Orwell portrayed the Party as enlisting anti-Semitic passions against their enemy, Goldstein.
Orwell publicly defended P. G. Wodehouse against charges of being a Nazi sympathiser—occasioned by his agreement to do some broadcasts over the German radio in 1941—a defence based on Wodehouse's lack of interest in and ignorance of politics.
Special Branch, the intelligence division of the Metropolitan Police, maintained a file on Orwell for more than 20 years of his life. The dossier, published by The National Archives, states that, according to one investigator, Orwell had "advanced Communist views and several of his Indian friends say that they have often seen him at Communist meetings". MI5, the intelligence department of the Home Office, noted: "It is evident from his recent writings—'The Lion and the Unicorn'—and his contribution to Gollancz's symposium "The Betrayal of the Left" that he does not hold with the Communist Party nor they with him."
Sexual politics plays an important role in "Nineteen Eighty-Four". In the novel, people's intimate relationships are strictly governed by the party's Junior Anti-Sex League, which opposes sexual relations and instead encourages artificial insemination. Personally, Orwell disliked what he saw as misguided middle-class revolutionary emancipatory views, expressing disdain for "every fruit-juice drinker, nudist, sandal-wearer, sex-maniac".
Orwell was also openly against homosexuality, at a time when such prejudice was common. Speaking at the 2003 George Orwell Centenary Conference, Daphne Patai said: "Of course he was homophobic. That has nothing to do with his relations with his homosexual friends. Certainly, he had a negative attitude and a certain kind of anxiety, a denigrating attitude towards homosexuality. That is definitely the case. I think his writing reflects that quite fully."
Orwell used the homophobic epithets "nancy" and "pansy", as in his expressions of contempt for what he called the "pansy Left" and "nancy poets", i.e. left-wing homosexual or bisexual writers and intellectuals such as Stephen Spender and W. H. Auden. The protagonist of "Keep the Aspidistra Flying", Gordon Comstock, conducts an internal critique of his customers when working in a bookshop, and there is an extended passage of several pages in which he concentrates on a homosexual male customer, and sneers at him for his "nancy" characteristics, including a lisp, which he identifies in detail, with some disgust. Stephen Spender "thought Orwell's occasional homophobic outbursts were part of his rebellion against the public school".
Orwell's will requested that no biography of him be written, and his widow, Sonia Brownell, repelled every attempt by those who tried to persuade her to let them write about him. Various recollections and interpretations were published in the 1950s and '60s, but Sonia saw the 1968 "Collected Works" as the record of his life. She did appoint Malcolm Muggeridge as official biographer, but later biographers have seen this as deliberate spoiling as Muggeridge eventually gave up the work. In 1972, two American authors, Peter Stansky and William Abrahams, produced "The Unknown Orwell", an unauthorised account of his early years that lacked any support or contribution from Sonia Brownell.
Sonia Brownell then commissioned Bernard Crick, a professor of politics at the University of London, to complete a biography and asked Orwell's friends to co-operate. Crick collated a considerable amount of material in his work, which was published in 1980, but his questioning of the factual accuracy of Orwell's first-person writings led to conflict with Brownell, and she tried to suppress the book. Crick concentrated on the facts of Orwell's life rather than his character, and presented primarily a political perspective on Orwell's life and work.
After Sonia Brownell's death, other works on Orwell were published in the 1980s, particularly in 1984. These included collections of reminiscences by Coppard and Crick, and by Stephen Wadhams.
In 1991, Michael Shelden, an American professor of literature, published a biography. More concerned with the literary nature of Orwell's work, he sought explanations for Orwell's character and treated his first-person writings as autobiographical. Shelden introduced new information that sought to build on Crick's work. Shelden speculated that Orwell possessed an obsessive belief in his failure and inadequacy.
Peter Davison's publication of the "Complete Works of George Orwell", completed in 2000, made most of the Orwell Archive accessible to the public. Jeffrey Meyers, a prolific American biographer, was first to take advantage of this and published a book in 2001 that investigated the darker side of Orwell and questioned his saintly image. "Why Orwell Matters" (released in the United Kingdom as "Orwell's Victory") was published by Christopher Hitchens in 2002.
In 2003, the centenary of Orwell's birth resulted in biographies by Gordon Bowker and D. J. Taylor, both academics and writers in the United Kingdom. Taylor notes the stage management which surrounds much of Orwell's behaviour and Bowker highlights the essential sense of decency which he considers to have been Orwell's main motivation.
Gambling
Gambling (also known as betting) is the wagering of money or something of value (referred to as "the stakes") on an event with an uncertain outcome, with the primary intent of winning money or material goods. Gambling thus requires three elements to be present: consideration (an amount wagered), risk (chance), and a prize. The outcome of the wager is often immediate, such as a single roll of dice, a spin of a roulette wheel, or a horse crossing the finish line, but longer time frames are also common, allowing wagers on the outcome of a future sports contest or even an entire sports season.
The term "gaming" in this context typically refers to instances in which the activity has been specifically permitted by law. The two words are not mutually exclusive; "i.e.", a "gaming" company offers (legal) "gambling" activities to the public and may be regulated by one of many gaming control boards, for example, the Nevada Gaming Control Board. However, this distinction is not universally observed in the English-speaking world. For instance, in the United Kingdom, the regulator of gambling activities is called the Gambling Commission (not the Gaming Commission). The word "gaming" is used more frequently since the rise of computer and video games to describe activities that do not necessarily involve wagering, especially online gaming, with the new usage still not having displaced the old usage as the primary definition in common dictionaries. "Gaming" has also been used to circumvent laws against "gambling". The media and others have used one term or the other to frame conversations around the subjects, resulting in a shift of perceptions among their audiences.
Gambling is also a major international commercial activity, with the legal gambling market totaling an estimated $335 billion in 2009. In other forms, gambling can be conducted with materials which have a value, but are not real money. For example, players of marbles games might wager marbles, and likewise games of "Pogs" or collectible card games can be played with the game pieces (small discs and trading cards, respectively) as stakes, resulting in a meta-game regarding the value of a player's collection of pieces.
Gambling dates back to the Paleolithic period, before written history. In Mesopotamia the earliest six-sided dice date to about 3000 BC. However, they were based on astragali dating back thousands of years earlier. In China, gambling houses were widespread in the first millennium BC, and betting on fighting animals was common. Lotto games and dominoes (precursors of Pai Gow) appeared in China as early as the 10th century.
Playing cards appeared in the 9th century in China. Records trace gambling in Japan back at least as far as the 14th century.
Poker, the most popular U.S. card game associated with gambling, derives from the Persian game As-Nas, dating back to the 17th century.
The first known casino, the Ridotto, started operating in 1638 in Venice, Italy.
Gambling has been a main recreational activity in Great Britain for centuries. Horseracing has been a favorite theme for over three centuries and has been heavily regulated. Historically, much of the opposition has come from evangelical Protestants and from social reformers.
Gambling has been a popular activity in the United States for centuries. It has also been suppressed by law in many areas for almost as long. By the early 20th century, gambling was almost uniformly outlawed throughout the U.S. and thus became a largely illegal activity, helping to spur the growth of the mafia and other criminal organizations. The late 20th century saw a softening in attitudes towards gambling and a relaxation of laws against it.
Many jurisdictions, local as well as national, either ban gambling or heavily control it by licensing the vendors. Such regulation generally leads to gambling tourism and illegal gambling in the areas where it is not allowed. The involvement of governments, through regulation and taxation, has led to a close connection between many governments and gaming organizations, where legal gambling provides significant government revenue, such as in Monaco and Macau, China.
There is generally legislation requiring that gaming devices be statistically random, to prevent manufacturers from making some high-payoff results impossible. Since these high payoffs have very low probability, a house bias can quite easily be missed unless the devices are checked carefully.
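The check described above amounts to a goodness-of-fit test on the device's outcome frequencies. The sketch below is purely illustrative — the six-sided device, its face weights, the sample size, and the critical value are invented for the example rather than taken from any real gaming board's procedure — but it shows the basic idea, and why a bias hidden in a low-probability payoff only becomes visible with a large sample.

```python
import random

def chi_square_stat(counts, expected):
    """Pearson chi-square statistic for observed vs expected counts."""
    return sum((o - e) ** 2 / e for o, e in zip(counts, expected))

def roll_counts(n_rolls, weights):
    """Simulate n_rolls of a six-outcome device with the given (hypothetical) weights."""
    outcomes = random.choices(range(6), weights=weights, k=n_rolls)
    counts = [0] * 6
    for outcome in outcomes:
        counts[outcome] += 1
    return counts

random.seed(42)
n = 60_000
expected = [n / 6] * 6  # what a statistically random device should produce

# One fair device, and one where outcome 5 (imagine it pays the jackpot)
# is quietly starved by about 10% -- invisible to casual inspection.
fair = roll_counts(n, [1, 1, 1, 1, 1, 1])
biased = roll_counts(n, [1.02, 1.02, 1.02, 1.02, 1.02, 0.90])

# Chi-square critical value, 5 degrees of freedom, 0.1% significance: ~20.5.
CRITICAL = 20.5
print("fair device stat:  ", round(chi_square_stat(fair, expected), 1))
print("biased device stat:", round(chi_square_stat(biased, expected), 1))
```

With tens of thousands of trials the starved outcome pushes the statistic far past the critical value, but with only a few hundred trials the same bias would usually pass, which is the point made above: low-probability, high-payoff outcomes demand careful, large-sample checking.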
Most jurisdictions that allow gambling require participants to be above a certain age. In some jurisdictions, the gambling age differs depending on the type of gambling. For example, in many American states one must be over 21 to enter a casino, but may buy a lottery ticket after turning 18.
Because contracts of insurance have many features in common with wagers, insurance contracts are often distinguished in law as agreements in which either party has an interest in the "bet-upon" outcome "beyond" the specific financial terms. For example, a "bet" with an insurer on whether one's house will burn down is not gambling, but rather "insurance", as the homeowner has an obvious interest in the continued existence of his or her home "independent of" the purely financial aspects of the "bet" (i.e. the insurance policy). Nonetheless, both insurance and gambling contracts are typically considered aleatory contracts under most legal systems, though they are subject to different types of regulation.
Under common law, particularly English Law (English unjust enrichment), a gambling contract may not give a casino "bona fide" purchaser status, permitting the recovery of stolen funds in some situations. In "Lipkin Gorman v Karpnale Ltd", where a solicitor used stolen funds to gamble at a casino, the House of Lords overruled the High Court's previous verdict, adjudicating that the casino return the stolen funds less those subject to any change of position defence. U.S. Law precedents are somewhat similar. For case law on recovery of gambling losses where the loser had stolen the funds see "Rights of owner of stolen money as against one who won it in gambling transaction from thief".
An interesting question is what happens when the person trying to make recovery is the gambler's spouse, and the money or property lost was either the spouse's, or was community property. This was a minor plot point in a Perry Mason novel, "The Case of the Singing Skirt", and it cites an actual case "Novo v. Hotel Del Rio".
Ancient Hindu poems like the Gambler's Lament and the "Mahabharata" testify to the popularity of gambling among ancient Indians. However, the text "Arthashastra" (c. 4th century BC) recommends taxation and control of gambling.
Ancient Jewish authorities frowned on gambling, even disqualifying professional gamblers from testifying in court.
The Catholic Church holds the position that there is no moral impediment to gambling, so long as it is fair, all bettors have a reasonable chance of winning, there is no fraud involved, and the parties involved do not have actual knowledge of the outcome of the bet (unless they have disclosed this knowledge), and as long as the following conditions are met: the gambler can afford to lose the bet, and stops when the limit is reached, and the motivation is entertainment and not personal gain leading to the "love of money" or making a living. In general, Catholic bishops have opposed casino gambling on the grounds that it too often tempts people into problem gambling or addiction, and has particularly negative effects on poor people; they sometimes also cite secondary effects such as increases in loan sharking, prostitution, corruption, and general public immorality. Some parish pastors have also opposed casinos for the additional reason that they would take customers away from church bingo and annual festivals where games such as blackjack, roulette, craps, and poker are used for fundraising. St. Thomas Aquinas wrote that gambling should be especially forbidden where the losing bettor is underage or otherwise not able to consent to the transaction. Gambling has often been seen as having social consequences, as satirized by Balzac. For these social and religious reasons, most legal jurisdictions limit gambling, as advocated by Pascal.
Gambling views among Protestants vary, with some either discouraging or forbidding their members from participation in gambling. Methodists, in accordance with the doctrine of outward holiness, oppose gambling which they believe is a sin that feeds on greed; examples are the United Methodist Church, the Free Methodist Church, the Evangelical Wesleyan Church, the Salvation Army, and the Church of the Nazarene.
Other Protestants that oppose gambling include many Mennonites, Quakers, the Christian Reformed Church in North America, the Church of the Lutheran Confession, the Southern Baptist Convention, the Assemblies of God, and the Seventh-day Adventist Church.
Other churches that oppose gambling include the Jehovah's Witnesses, The Church of Jesus Christ of Latter-day Saints, the Iglesia Ni Cristo, and the Members Church of God International.
Although different interpretations of Shari‘ah (Islamic Law) exist in the Muslim world, there is a consensus among the "‘Ulema’" (scholars of Islam) that gambling is "haraam" (sinful or forbidden). In assertions made during its prohibition, Muslim jurists describe gambling as being both un-Qur’anic and as being generally harmful to the Muslim Ummah (community). The Arabic terminology for gambling is "Maisir".
In parts of the world that implement full Shari‘ah, such as Aceh, punishments for Muslim gamblers can range up to 12 lashes or a one-year prison term and a fine for those who provide a venue for such practises. Some Islamic nations prohibit gambling; most other countries regulate it.
While almost any game can be played for money, and any game typically played for money can also be played just for fun, some games are generally offered in a casino setting.
Gambling games that take place outside of casinos include Bingo (as played in the US and UK), dead pool, lotteries, pull-tab games and scratchcards, and Mahjong.
Other non-casino gambling games include:
*Although coin tossing is not usually played in a casino, it has been an official gambling game in some Australian casinos.
Fixed-odds betting and parimutuel betting frequently occur at many types of sporting events and at political elections. In addition, many bookmakers offer fixed odds on a number of non-sports-related outcomes, for example the direction and extent of movement of various financial indices, the winner of television competitions such as "Big Brother", and election results. Interactive prediction markets also offer trading on these outcomes, with "shares" of results trading on an open market.
One of the most widespread forms of gambling involves betting on horse or greyhound racing. Wagering may take place through parimutuel pools, or bookmakers may take bets personally. Parimutuel wagers pay off at prices determined by support in the wagering pools, while bookmakers pay off either at the odds offered at the time of accepting the bet; or at the median odds offered by track bookmakers at the time the race started.
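The parimutuel calculation described above is simple to sketch: winning bettors divide the pool, less the track's takeout, in proportion to their stakes. The 15% takeout and the pool figures below are illustrative, not taken from any actual track:

```python
def parimutuel_payout(pools, winner, takeout=0.15):
    """Return the payout per unit staked on the winning runner.

    pools   -- dict mapping runner name to total amount wagered on it
    winner  -- name of the winning runner
    takeout -- fraction retained by the track (15% here is illustrative)
    """
    total = sum(pools.values())
    net_pool = total * (1 - takeout)   # pool remaining after the track's cut
    return net_pool / pools[winner]    # shared among winning tickets

# Example: $1,000 total wagered, $250 of it on the winner, 15% takeout:
# net pool = $850, so each $1 on the winner returns $850 / $250 = $3.40.
print(parimutuel_payout({"A": 250, "B": 450, "C": 300}, "A"))
```

Note that, unlike fixed-odds betting, the price is unknown until the pools close, since it depends on how the rest of the public has wagered.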
Betting on team sports has become an important service industry in many countries. For example, millions of people play the football pools every week in the United Kingdom. In addition to organized sports betting, both legal and illegal, there are many side-betting games played by casual groups of spectators, such as NCAA Basketball Tournament Bracket Pools, Super Bowl Squares, Fantasy Sports Leagues with monetary entry fees and winnings, and in-person spectator games like Moundball.
Virtual sports, a derivative of sports betting, are software-generated fictional sporting events that are never actually played; they can be bet on at any time and are unaffected by external factors such as weather conditions.
Arbitrage betting is a theoretically risk-free betting system in which every outcome of an event is bet upon so that a known profit will be made by the bettor upon completion of the event, regardless of the outcome. Arbitrage betting is a combination of the ancient art of arbitrage trading and gambling, which has been made possible by the large numbers of bookmakers in the marketplace, creating occasional opportunities for arbitrage.
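The mechanics of an arbitrage bet can be made concrete with decimal odds: if the reciprocals of the best available odds on all mutually exclusive outcomes sum to less than one, stakes can be split so that the payout is identical whichever outcome occurs. The odds and bankroll below are hypothetical:

```python
def arbitrage(odds, bankroll):
    """Check a set of decimal odds for an arbitrage opportunity.

    odds     -- list of decimal odds, one per mutually exclusive outcome,
                each taken from whichever bookmaker offers the best price
    bankroll -- total amount to spread across the outcomes
    Returns (stakes, guaranteed_profit), or None if no arbitrage exists.
    """
    margin = sum(1 / o for o in odds)   # below 1 means a risk-free profit
    if margin >= 1:
        return None
    stakes = [bankroll / (o * margin) for o in odds]
    payout = bankroll / margin          # the same whichever outcome wins
    return stakes, payout - bankroll

# Hypothetical two-way market where different books both offer 2.10:
# 1/2.1 + 1/2.1 is about 0.952 < 1, so backing both sides locks in ~5%.
result = arbitrage([2.10, 2.10], 100)
```

In practice such windows are small and close quickly, which is why the text notes they arise only occasionally among large numbers of competing bookmakers.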
One can also bet with another person that a statement is true or false, or that a specified event will happen (a "back bet") or will not happen (a "lay bet") within a specified time. This occurs in particular when two people have opposing but strongly held views on truth or events. Not only do the parties hope to gain from the bet, they place the bet also to demonstrate their certainty about the issue. Some means of determining the issue at stake must exist. Sometimes the amount bet remains nominal, demonstrating the outcome as one of principle rather than of financial importance.
Betting exchanges allow consumers to both back and lay at odds of their choice. Similar in some ways to a stock exchange, a bettor may want to back a horse (hoping it will win) or lay a horse (hoping it will lose, effectively acting as bookmaker).
Spread betting allows gamblers to wager on the outcome of an event where the pay-off is based on the accuracy of the wager, rather than a simple "win or lose" outcome. For example, a wager can be based on when a point is scored in the game, measured in minutes, with each minute away from the prediction increasing or reducing the payout.
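Settlement of such a wager can be sketched with a simple per-unit convention (an illustration only; real spread-betting firms quote a buy/sell spread around their prediction rather than a single number):

```python
def spread_bet_settle(prediction, actual, stake_per_unit):
    """Settle a simple spread bet: the bettor gains or loses stake_per_unit
    for every unit the actual result finishes above or below the prediction.
    A positive return is a win; a negative return is a loss."""
    return (actual - prediction) * stake_per_unit

# "Buying" total points at 42.5 for $10 per point; the game ends on 48:
print(spread_bet_settle(42.5, 48, 10))   # wins $55.00
# The same position loses if the game ends on 40 points:
print(spread_bet_settle(42.5, 40, 10))   # loses $25.00
```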
Many betting systems have been created in an attempt to "beat the house" but no system can make a mathematically unprofitable bet in terms of expected value profitable over time. Widely used systems include:
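The claim that no staking system can make a negative-expectation bet profitable can be checked by simulation. The sketch below plays the classic doubling ("martingale") progression against the house edge of American roulette; the bankroll, base stake, and session length are arbitrary choices for illustration:

```python
import random

def martingale_session(p_win=18/38, base=1, bankroll=1000,
                       max_rounds=200, rng=None):
    """Play a doubling ('martingale') progression on an even-money bet
    with win probability 18/38 (a red-number bet in American roulette).
    Returns the session's final profit (negative for a loss)."""
    rng = rng or random.Random()
    profit, stake = 0, base
    for _ in range(max_rounds):
        if stake > bankroll + profit:   # cannot cover the next doubled bet
            break
        if rng.random() < p_win:
            profit += stake
            stake = base                # reset the progression after a win
        else:
            profit -= stake
            stake *= 2                  # double after a loss
    return profit

# Averaging many sessions shows the mean result stays negative: the
# staking pattern reshapes the distribution of outcomes (many small wins,
# occasional ruinous losses) but cannot change the expected value.
rng = random.Random(42)
results = [martingale_session(rng=rng) for _ in range(2000)]
print(sum(results) / len(results))
```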
Many risk-return choices are sometimes referred to colloquially as "gambling." Whether this terminology is acceptable is a matter of debate:
Investments are also usually not considered gambling, although some investments can involve significant risk. Examples of investments include stocks, bonds and real estate. Starting a business can also be considered a form of investment. Investments are generally not considered gambling when they meet the following criteria:
Some speculative investment activities are particularly risky, but are sometimes perceived to be different from gambling:
Studies show that though many people participate in gambling as a form of recreation or even as a means to gain an income, gambling, like any behavior that involves variation in brain chemistry, can become a harmful, behavioral addiction. Behavioral addiction can occur with all the negative consequences in a person's life minus the physical issues faced by people who compulsively engage in drug and alcohol abuse. Reinforcement schedules may also make gamblers persist in gambling even after repeated losses. This is where organized crime often ends up making large profits, extending lines of credit to gamblers and charging high percentage rates, known as vigs, to be paid weekly under threat of crime-family enforcement.
The Russian writer and problem gambler Fyodor Dostoevsky portrays in his novella "The Gambler" the psychological implications of gambling and how gambling can affect gamblers. He also associates gambling and the idea of "getting rich quick", suggesting that Russians may have a particular affinity for gambling. Dostoevsky shows the effect of betting money for the chance of gaining more in 19th-century Europe. The association between Russians and gambling has fed legends of the origins of Russian roulette.
There are many symptoms and reasons for gambling. Gamblers gamble more money to try to win back money that they have lost and some gamble to relieve feelings of helplessness and anxiety.
The Advertising Standards Authority has censured several betting firms for advertisements disguised as news articles suggesting falsely a person had cleared debts and paid for medical expenses by online gambling. The firms face possible fines.
Gamblers exhibit a number of cognitive and motivational biases that distort the perceived odds of events and that influence their preferences for gambles.
Game theory
Game theory is the study of mathematical models of strategic interaction among rational decision-makers. It has applications in all fields of social science, as well as in logic, systems science and computer science. Originally, it addressed zero-sum games, in which each participant's gains or losses are exactly balanced by those of the other participants. Today, game theory applies to a wide range of behavioral relations, and is now an umbrella term for the science of logical decision making in humans, animals, and computers.
Modern game theory began with the idea of mixed-strategy equilibria in two-person zero-sum games and its proof by John von Neumann. Von Neumann's original proof used the Brouwer fixed-point theorem on continuous mappings into compact convex sets, which became a standard method in game theory and mathematical economics. His paper was followed by the 1944 book "Theory of Games and Economic Behavior", co-written with Oskar Morgenstern, which considered cooperative games of several players. The second edition of this book provided an axiomatic theory of expected utility, which allowed mathematical statisticians and economists to treat decision-making under uncertainty.
Game theory was developed extensively in the 1950s by many scholars. It was explicitly applied to biology in the 1970s, although similar developments go back at least as far as the 1930s. Game theory has been widely recognized as an important tool in many fields. As of 2014, with the Nobel Memorial Prize in Economic Sciences going to game theorist Jean Tirole, eleven game theorists have won the economics Nobel Prize. John Maynard Smith was awarded the Crafoord Prize for his application of game theory to biology.
Discussions of two-person games began long before the rise of modern, mathematical game theory.
The first known discussion of game theory occurred in a letter believed to be written in 1713 by Charles Waldegrave, an active Jacobite and uncle to James Waldegrave, a British diplomat. The true identity of the original correspondent is somewhat elusive given the limited details and evidence available and the subjective nature of its interpretation. One theory postulates Francis Waldegrave as the true correspondent, but this has yet to be proven. In this letter, Waldegrave provides a minimax mixed strategy solution to a two-person version of the card game le Her, and the problem is now known as the Waldegrave problem. In his 1838 "Recherches sur les principes mathématiques de la théorie des richesses" ("Researches into the Mathematical Principles of the Theory of Wealth"), Antoine Augustin Cournot considered a duopoly and presented a solution that is the Nash equilibrium of the game.
In 1913, Ernst Zermelo published "Über eine Anwendung der Mengenlehre auf die Theorie des Schachspiels" ("On an Application of Set Theory to the Theory of the Game of Chess"), which proved that the optimal chess strategy is strictly determined. This paved the way for more general theorems.
In 1938, the Danish mathematical economist Frederik Zeuthen proved that the mathematical model had a winning strategy by using Brouwer's fixed point theorem. In his 1938 book "Applications aux Jeux de Hasard" and earlier notes, Émile Borel proved a minimax theorem for two-person zero-sum matrix games only when the pay-off matrix was symmetric and provides a solution to a non-trivial infinite game (known in English as Blotto game). Borel conjectured the non-existence of mixed-strategy equilibria in finite two-person zero-sum games, a conjecture that was proved false by von Neumann.
Game theory did not really exist as a unique field until John von Neumann published the paper "On the Theory of Games of Strategy" in 1928. Von Neumann's original proof used Brouwer's fixed-point theorem on continuous mappings into compact convex sets, which became a standard method in game theory and mathematical economics. His paper was followed by his 1944 book "Theory of Games and Economic Behavior" co-authored with Oskar Morgenstern. The second edition of this book provided an axiomatic theory of utility, which reincarnated Daniel Bernoulli's old theory of utility (of money) as an independent discipline. Von Neumann's work in game theory culminated in this 1944 book. This foundational work contains the method for finding mutually consistent solutions for two-person zero-sum games. Subsequent work focused primarily on cooperative game theory, which analyzes optimal strategies for groups of individuals, presuming that they can enforce agreements between them about proper strategies.
In 1950, the first mathematical discussion of the prisoner's dilemma appeared, and an experiment was undertaken by notable mathematicians Merrill M. Flood and Melvin Dresher, as part of the RAND Corporation's investigations into game theory. RAND pursued the studies because of possible applications to global nuclear strategy. Around this same time, John Nash developed a criterion for mutual consistency of players' strategies known as the Nash equilibrium, applicable to a wider variety of games than the criterion proposed by von Neumann and Morgenstern. Nash proved that every finite n-player, non-zero-sum (not just two-player zero-sum) non-cooperative game has what is now known as a Nash equilibrium in mixed strategies.
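For finite two-player games, pure-strategy Nash equilibria can be found by checking every strategy profile for a profitable unilateral deviation. The sketch below applies this to the prisoner's dilemma with one common choice of illustrative payoffs (higher is better):

```python
from itertools import product

def pure_nash(payoffs):
    """Return the pure-strategy Nash equilibria of a two-player game.

    payoffs maps (row_strategy, col_strategy) -> (row_payoff, col_payoff).
    A profile is an equilibrium if neither player can gain by deviating
    unilaterally while the other player's strategy is held fixed.
    """
    rows = {r for r, _ in payoffs}
    cols = {c for _, c in payoffs}
    equilibria = []
    for r, c in product(rows, cols):
        u1, u2 = payoffs[(r, c)]
        best_row = all(payoffs[(r2, c)][0] <= u1 for r2 in rows)
        best_col = all(payoffs[(r, c2)][1] <= u2 for c2 in cols)
        if best_row and best_col:
            equilibria.append((r, c))
    return equilibria

# Prisoner's dilemma with standard illustrative payoffs:
pd = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}
print(pure_nash(pd))   # the unique equilibrium is mutual defection
```

This makes the dilemma's structure visible: mutual cooperation gives both players more than the equilibrium does, yet it is not stable against unilateral deviation.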
Game theory experienced a flurry of activity in the 1950s, during which the concepts of the core, the extensive form game, fictitious play, repeated games, and the Shapley value were developed. The 1950s also saw the first applications of game theory to philosophy and political science.
In 1979 Robert Axelrod tried setting up computer programs as players and found that in tournaments between them the winner was often a simple "tit-for-tat" program--submitted by Anatol Rapoport--that cooperates on the first step, then, on subsequent steps, does whatever its opponent did on the previous step. The same winner was also often obtained by natural selection, a fact that is widely taken to explain cooperation phenomena in evolutionary biology and the social sciences.
In 1965, Reinhard Selten introduced his solution concept of subgame perfect equilibria, which further refined the Nash equilibrium. Later he would introduce trembling hand perfection as well. In 1994 Nash, Selten and Harsanyi became Economics Nobel Laureates for their contributions to economic game theory.
In the 1970s, game theory was extensively applied in biology, largely as a result of the work of John Maynard Smith and his evolutionarily stable strategy. In addition, the concepts of correlated equilibrium, trembling hand perfection, and common knowledge were introduced and analyzed.
In 2005, game theorists Thomas Schelling and Robert Aumann followed Nash, Selten, and Harsanyi as Nobel Laureates. Schelling worked on dynamic models, early examples of evolutionary game theory. Aumann contributed more to the equilibrium school, introducing equilibrium coarsening and correlated equilibria, and developing an extensive formal analysis of the assumption of common knowledge and of its consequences.
In 2007, Leonid Hurwicz, Eric Maskin, and Roger Myerson were awarded the Nobel Prize in Economics "for having laid the foundations of mechanism design theory". Myerson's contributions include the notion of proper equilibrium, and an important graduate text: "Game Theory, Analysis of Conflict". Hurwicz introduced and formalized the concept of incentive compatibility.
In 2012, Alvin E. Roth and Lloyd S. Shapley were awarded the Nobel Prize in Economics "for the theory of stable allocations and the practice of market design". In 2014, the Nobel went to game theorist Jean Tirole.
A game is "cooperative" if the players are able to form binding commitments externally enforced (e.g. through contract law). A game is "non-cooperative" if players cannot form alliances or if all agreements need to be self-enforcing (e.g. through credible threats).
Cooperative games are often analyzed through the framework of "cooperative game theory", which focuses on predicting which coalitions will form, the joint actions that groups take, and the resulting collective payoffs. It is opposed to the traditional "non-cooperative game theory" which focuses on predicting individual players' actions and payoffs and analyzing Nash equilibria.
Cooperative game theory provides a high-level approach as it describes only the structure, strategies, and payoffs of coalitions, whereas non-cooperative game theory also looks at how bargaining procedures will affect the distribution of payoffs within each coalition. As non-cooperative game theory is more general, cooperative games can be analyzed through the approach of non-cooperative game theory (the converse does not hold) provided that sufficient assumptions are made to encompass all the possible strategies available to players due to the possibility of external enforcement of cooperation. While it would thus be optimal to have all games expressed under a non-cooperative framework, in many instances insufficient information is available to accurately model the formal procedures available during the strategic bargaining process, or the resulting model would be too complex to offer a practical tool in the real world. In such cases, cooperative game theory provides a simplified approach that allows analysis of the game at large without having to make any assumption about bargaining powers.
A symmetric game is a game where the payoffs for playing a particular strategy depend only on the other strategies employed, not on who is playing them. That is, if the identities of the players can be changed without changing the payoff to the strategies, then a game is symmetric. Many of the commonly studied 2×2 games are symmetric. The standard representations of chicken, the prisoner's dilemma, and the stag hunt are all symmetric games. Some scholars would consider certain asymmetric games as examples of these games as well. However, the most common payoffs for each of these games are symmetric.
Most commonly studied asymmetric games are games where there are not identical strategy sets for both players. For instance, the ultimatum game and similarly the dictator game have different strategies for each player. It is possible, however, for a game to have identical strategies for both players, yet be asymmetric. For example, the game pictured to the right is asymmetric despite having identical strategy sets for both players.
Zero-sum games are a special case of constant-sum games in which choices by players can neither increase nor decrease the available resources. In zero-sum games, the total benefit to all players in the game, for every combination of strategies, always adds to zero (more informally, a player benefits only at the equal expense of others). Poker exemplifies a zero-sum game (ignoring the possibility of the house's cut), because one wins exactly the amount one's opponents lose. Other zero-sum games include matching pennies and most classical board games including Go and chess.
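The zero-sum property is easy to state programmatically: every cell of the payoff table sums to zero. A minimal check for matching pennies:

```python
# Matching pennies: the row player wins 1 if the coins match, loses 1
# otherwise; payoffs[(row, col)] = (row player's payoff, column player's).
matching_pennies = {
    ("heads", "heads"): (1, -1),
    ("heads", "tails"): (-1, 1),
    ("tails", "heads"): (-1, 1),
    ("tails", "tails"): (1, -1),
}

# Zero-sum: for every strategy combination, the payoffs sum to zero,
# so one player's gain is exactly the other's loss.
assert all(u1 + u2 == 0 for u1, u2 in matching_pennies.values())
```

The same check fails for a game like the prisoner's dilemma, whose outcomes have totals that differ from cell to cell, which is what makes it non-zero-sum.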
Many games studied by game theorists (including the famed prisoner's dilemma) are non-zero-sum games, because the outcome has net results greater or less than zero. Informally, in non-zero-sum games, a gain by one player does not necessarily correspond with a loss by another.
Constant-sum games correspond to activities like theft and gambling, but not to the fundamental economic situation in which there are potential gains from trade. It is possible to transform any game into a (possibly asymmetric) zero-sum game by adding a dummy player (often called "the board") whose losses compensate the players' net winnings.
Simultaneous games are games where both players move simultaneously, or if they do not move simultaneously, the later players are unaware of the earlier players' actions (making them "effectively" simultaneous). Sequential games (or dynamic games) are games where later players have some knowledge about earlier actions. This need not be perfect information about every action of earlier players; it might be very little knowledge. For instance, a player may know that an earlier player did not perform one particular action, while they do not know which of the other available actions the first player actually performed.
The difference between simultaneous and sequential games is captured in the different representations discussed above. Often, normal form is used to represent simultaneous games, while extensive form is used to represent sequential ones. The transformation of extensive to normal form is one way, meaning that multiple extensive form games correspond to the same normal form. Consequently, notions of equilibrium for simultaneous games are insufficient for reasoning about sequential games; see subgame perfection.
In short, the differences between sequential and simultaneous games are as follows:
An important subset of sequential games consists of games of perfect information. A game is one of perfect information if all players know the moves previously made by all other players. Most games studied in game theory are imperfect-information games. Examples of perfect-information games include tic-tac-toe, checkers, infinite chess, and Go.
Many card games are games of imperfect information, such as poker and bridge. Perfect information is often confused with complete information, which is a similar concept. Complete information requires that every player know the strategies and payoffs available to the other players but not necessarily the actions taken. Games of incomplete information can be reduced, however, to games of imperfect information by introducing "moves by nature".
Games in which the difficulty of finding an optimal strategy stems from the multiplicity of possible moves are called combinatorial games. Examples include chess and go. Games that involve imperfect information may also have a strong combinatorial character, for instance backgammon. There is no unified theory addressing combinatorial elements in games. There are, however, mathematical tools that can solve particular problems and answer general questions.
Games of perfect information have been studied in combinatorial game theory, which has developed novel representations, e.g. surreal numbers, as well as combinatorial and algebraic (and sometimes non-constructive) proof methods to solve games of certain types, including "loopy" games that may result in infinitely long sequences of moves. These methods address games with higher combinatorial complexity than those usually considered in traditional (or "economic") game theory. A typical game that has been solved this way is Hex. A related field of study, drawing from computational complexity theory, is game complexity, which is concerned with estimating the computational difficulty of finding optimal strategies.
Research in artificial intelligence has addressed both perfect and imperfect information games that have very complex combinatorial structures (like chess, go, or backgammon) for which no provable optimal strategies have been found. The practical solutions involve computational heuristics, like alpha–beta pruning or use of artificial neural networks trained by reinforcement learning, which make games more tractable in computing practice.
Games, as studied by economists and real-world game players, are generally finished in finitely many moves. Pure mathematicians are not so constrained, and set theorists in particular study games that last for infinitely many moves, with the winner (or other payoff) not known until "after" all those moves are completed.
The focus of attention is usually not so much on the best way to play such a game, but whether one player has a winning strategy. (It can be proven, using the axiom of choice, that there are games, even with perfect information and where the only outcomes are "win" or "lose", for which "neither" player has a winning strategy.) The existence of such strategies, for cleverly designed games, has important consequences in descriptive set theory.
Much of game theory is concerned with finite, discrete games that have a finite number of players, moves, events, outcomes, etc. Many concepts can be extended, however. Continuous games allow players to choose a strategy from a continuous strategy set. For instance, Cournot competition is typically modeled with players' strategies being any non-negative quantities, including fractional quantities.
Differential games such as the continuous pursuit and evasion game are continuous games where the evolution of the players' state variables is governed by differential equations. The problem of finding an optimal strategy in a differential game is closely related to the optimal control theory. In particular, there are two types of strategies: the open-loop strategies are found using the Pontryagin maximum principle while the closed-loop strategies are found using Bellman's Dynamic Programming method.
A particular case of differential games are the games with a random time horizon. In such games, the terminal time is a random variable with a given probability distribution function. Therefore, the players maximize the mathematical expectation of the cost function. It was shown that the modified optimization problem can be reformulated as a discounted differential game over an infinite time interval.
Evolutionary game theory studies players who adjust their strategies over time according to rules that are not necessarily rational or farsighted. In general, the evolution of strategies over time according to such rules is modeled as a Markov chain with a state variable such as the current strategy profile or how the game has been played in the recent past. Such rules may feature imitation, optimization, or survival of the fittest.
In biology, such models can represent (biological) evolution, in which offspring adopt their parents' strategies and parents who play more successful strategies (i.e. corresponding to higher payoffs) have a greater number of offspring. In the social sciences, such models typically represent strategic adjustment by players who play a game many times within their lifetime and, consciously or unconsciously, occasionally adjust their strategies.
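One standard formalization of this adjustment process is replicator dynamics, in which strategies earning above the population-average payoff grow in frequency. A discrete-time sketch for a symmetric two-strategy game; the hawk-dove payoffs below are illustrative, with resource value 4 and fighting cost 6:

```python
def replicator_step(shares, payoff_matrix, dt=0.1):
    """One discrete step of replicator dynamics.

    shares        -- current population fractions, summing to 1
    payoff_matrix -- payoff_matrix[i][j]: payoff to strategy i against j
    Strategies earning above the population-average payoff grow in
    frequency; those earning below it shrink.
    """
    n = len(shares)
    fitness = [sum(payoff_matrix[i][j] * shares[j] for j in range(n))
               for i in range(n)]
    avg = sum(shares[i] * fitness[i] for i in range(n))
    return [shares[i] + dt * shares[i] * (fitness[i] - avg)
            for i in range(n)]

# Hawk-dove game with illustrative payoffs (value V=4, fight cost C=6):
hawk_dove = [[-1, 4],   # hawk vs hawk, hawk vs dove
             [0,  2]]   # dove vs hawk, dove vs dove
x = [0.9, 0.1]          # start with mostly hawks
for _ in range(500):
    x = replicator_step(x, hawk_dove)
print(x)   # approaches the stable mix of V/C = 2/3 hawks
```

The fixed point at two-thirds hawks is the evolutionarily stable strategy for these payoffs: at that mix, hawks and doves earn equal payoffs, so neither type can spread at the other's expense.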
Individual decision problems with stochastic outcomes are sometimes considered "one-player games". These situations are not considered game theoretical by some authors. They may be modeled using similar tools within the related disciplines of decision theory, operations research, and areas of artificial intelligence, particularly AI planning (with uncertainty) and multi-agent system. Although these fields may have different motivators, the mathematics involved are substantially the same, e.g. using Markov decision processes (MDP).
Stochastic outcomes can also be modeled in terms of game theory by adding a randomly acting player who makes "chance moves" ("moves by nature"). This player is not typically considered a third player in what is otherwise a two-player game, but merely serves to provide a roll of the dice where required by the game.
For some problems, different approaches to modeling stochastic outcomes may lead to different solutions. For example, the difference in approach between MDPs and the minimax solution is that the latter considers the worst-case over a set of adversarial moves, rather than reasoning in expectation about these moves given a fixed probability distribution. The minimax approach may be advantageous where stochastic models of uncertainty are not available, but may also be overestimating extremely unlikely (but costly) events, dramatically swaying the strategy in such scenarios if it is assumed that an adversary can force such an event to happen. (See Black swan theory for more discussion on this kind of modeling issue, particularly as it relates to predicting and limiting losses in investment banking.)
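The contrast between the two approaches can be made concrete: an expected-value planner needs an assumed probability distribution over the adversary's (or nature's) moves, while a minimax planner ranks actions by their worst case alone. All numbers below are illustrative:

```python
def expected_value_choice(actions, probs):
    """Pick the action with the best expected payoff, given a fixed,
    assumed probability distribution over the adversary's moves."""
    return max(actions,
               key=lambda a: sum(p * u for p, u in zip(probs, actions[a])))

def minimax_choice(actions):
    """Pick the action whose worst-case payoff is largest, assuming an
    adversary who can force the least favourable outcome."""
    return max(actions, key=lambda a: min(actions[a]))

# Payoffs per adversary move: 'risky' is better on average, but 'safe'
# wins if the adversary can deliberately choose the worst column.
actions = {"risky": [10, 10, -100], "safe": [2, 2, 2]}
probs = [0.49, 0.49, 0.02]   # assumed distribution over adversary moves

print(expected_value_choice(actions, probs))  # 'risky': EV = 9.8 - 2 = 7.8
print(minimax_choice(actions))                # 'safe': worst case 2 vs -100
```

The disagreement between the two answers is exactly the modeling issue described above: minimax guards against the costly rare event at the price of expected performance.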
General models that include all elements of stochastic outcomes, adversaries, and partial or noisy observability (of moves by other players) have also been studied. The "gold standard" is considered to be partially observable stochastic game (POSG), but few realistic problems are computationally feasible in POSG representation.
These are games the play of which is the development of the rules for another game, the target or subject game. Metagames seek to maximize the utility value of the rule set developed. The theory of metagames is related to mechanism design theory.
The term metagame analysis is also used to refer to a practical approach developed by Nigel Howard, whereby a situation is framed as a strategic game in which stakeholders try to realize their objectives by means of the options available to them. Subsequent developments have led to the formulation of confrontation analysis.
These are games prevailing over all forms of society. Pooling games are repeated plays with a payoff table that changes, in general, over an experienced path; their equilibrium strategies usually take the form of evolutionary social and economic conventions. Pooling game theory emerged to formally recognize the interaction between optimal choice in one play and the emergence of the forthcoming payoff-table update path, to identify the existence and robustness of invariance, and to predict variance over time. The theory is based upon a topological transformation classification of payoff-table updates over time to predict variance and invariance, and is also within the jurisdiction of the computational law of reachable optimality for ordered systems.
Mean field game theory is the study of strategic decision making in very large populations of small interacting agents. This class of problems was considered in the economics literature by Boyan Jovanovic and Robert W. Rosenthal, in the engineering literature by Peter E. Caines, and by mathematician Pierre-Louis Lions and Jean-Michel Lasry.
The games studied in game theory are well-defined mathematical objects. To be fully defined, a game must specify the following elements: the "players" of the game, the "information" and "actions" available to each player at each decision point, and the "payoffs" for each outcome. (Eric Rasmusen refers to these four "essential elements" by the acronym "PAPI".) A game theorist typically uses these elements, along with a solution concept of their choosing, to deduce a set of equilibrium strategies for each player such that, when these strategies are employed, no player can profit by unilaterally deviating from their strategy. These equilibrium strategies determine an equilibrium to the game—a stable state in which either one outcome occurs or a set of outcomes occur with known probability.
Most cooperative games are presented in the characteristic function form, while the extensive and the normal forms are used to define noncooperative games.
The extensive form can be used to formalize games with a time sequencing of moves. Games here are played on trees (as pictured here). Here each vertex (or node) represents a point of choice for a player. The player is specified by a number listed by the vertex. The lines out of the vertex represent a possible action for that player. The payoffs are specified at the bottom of the tree. The extensive form can be viewed as a multi-player generalization of a decision tree. To solve any extensive form game, backward induction must be used. It involves working backward up the game tree to determine what a rational player would do at the last vertex of the tree, what the player with the previous move would do given that the player with the last move is rational, and so on until the first vertex of the tree is reached.
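Backward induction as described above is a short recursion over the game tree. The tree below is hypothetical (an ultimatum-style game chosen to echo the payoffs discussed in this section), and the node encoding is an assumption of this sketch, not a standard library format:

```python
def backward_induction(node):
    """Solve a perfect-information extensive-form game by backward induction.

    A leaf is a tuple of payoffs (p1, p2, ...). An internal node is a dict
    {'player': i, 'moves': {label: subtree}}; player i (0-indexed) picks
    the move maximizing their own component of the resulting payoffs.
    Returns (payoffs, list of moves along the equilibrium path).
    """
    if isinstance(node, tuple):        # leaf: nothing left to choose
        return node, []
    i = node["player"]
    best = None
    for label, subtree in node["moves"].items():
        payoffs, path = backward_induction(subtree)
        if best is None or payoffs[i] > best[0][i]:
            best = (payoffs, [label] + path)
    return best

# Hypothetical tree: player 0 proposes a fair or unfair split; player 1
# then accepts or rejects (rejection leaves both players with nothing).
tree = {"player": 0, "moves": {
    "fair":   {"player": 1, "moves": {"accept": (5, 5), "reject": (0, 0)}},
    "unfair": {"player": 1, "moves": {"accept": (8, 2), "reject": (0, 0)}},
}}
print(backward_induction(tree))   # ((8, 2), ['unfair', 'accept'])
```

Working from the last vertices upward, player 1 accepts either offer (something beats nothing), so player 0 anticipates this and proposes the unfair split, just as the description of backward induction above prescribes.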
The game pictured consists of two players. The way this particular game is structured (i.e., with sequential decision making and perfect information), "Player 1" "moves" first by choosing either the fair or the unfair option. Next in the sequence, "Player 2", who has now seen "Player 1"'s move, chooses one of their own two moves in response. Once "Player 2" has made their choice, the game is considered finished and each player gets their respective payoff. Suppose the players make the choices leading to the pictured outcome: "Player 1" then gets a payoff of "eight" (which in real-world terms can be interpreted in many ways, the simplest of which is in terms of money but could mean things such as eight days of vacation or eight countries conquered or even eight more opportunities to play the same game against other players) and "Player 2" gets a payoff of "two".
The extensive form can also capture simultaneous-move games and games with imperfect information. To represent imperfect information, either a dotted line connects the vertices belonging to the same information set (i.e. the players do not know which of those points they are at), or a closed line is drawn around them. (See the example in the imperfect information section.)
The normal (or strategic form) game is usually represented by a matrix which shows the players, strategies, and payoffs (see the example to the right). More generally it can be represented by any function that associates a payoff for each player with every possible combination of actions. In the accompanying example there are two players; one chooses the row and the other chooses the column. Each player has two strategies, which are specified by the number of rows and the number of columns. The payoffs are provided in the interior. The first number is the payoff received by the row player (Player 1 in our example); the second is the payoff for the column player (Player 2 in our example). Suppose that Player 1 plays "Up" and that Player 2 plays "Left". Then Player 1 gets a payoff of 4, and Player 2 gets 3.
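The matrix description above can be written directly as a lookup table from strategy profiles to payoff pairs. Only the (Up, Left) payoff of (4, 3) is stated in the text; the other cells below are illustrative assumptions.

```python
# A 2x2 normal-form game stored as a dict mapping a (row, column) strategy
# profile to the (row player, column player) payoffs. Only the
# (Up, Left) -> (4, 3) cell comes from the text; the rest is illustrative.

payoffs = {
    ("Up", "Left"):    (4, 3),
    ("Up", "Right"):   (1, 1),
    ("Down", "Left"):  (0, 0),
    ("Down", "Right"): (3, 4),
}

# Player 1 plays "Up" and Player 2 plays "Left":
row_payoff, col_payoff = payoffs[("Up", "Left")]
print(row_payoff, col_payoff)  # 4 3
```

Any function from action combinations to payoff vectors works the same way; the matrix is just the two-player, finite-strategy special case.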
When a game is presented in normal form, it is presumed that each player acts simultaneously or, at least, without knowing the actions of the other. If players have some information about the choices of other players, the game is usually presented in extensive form.
Every extensive-form game has an equivalent normal-form game; however, the transformation to normal form may result in an exponential blowup in the size of the representation, making it computationally impractical.
In games that possess transferable utility, separate rewards are not given; rather, the characteristic function determines the payoff of each coalition. The idea is that the coalition that is 'empty', so to speak, does not receive a reward at all.
The origin of this form is to be found in John von Neumann and Oskar Morgenstern's book; looking at these instances, they observed that when a coalition formula_1 forms, it plays against the complementary coalition
formula_2
as if two individuals were playing a normal game. The equilibrium payoff of formula_1 is characteristic. Although there are different models for deriving coalitional values from normal games, not every game in characteristic function form can be derived from a normal game.
Formally, a characteristic function form game is given as a pair (N, v), where N represents the set of players and formula_3 is the characteristic function, which assigns a worth to every coalition of players.
Such characteristic functions have been expanded to describe games in which there is no transferable utility.
Alternative game representation forms exist and are used for some subclasses of games or adjusted to the needs of interdisciplinary research. In addition to the classical game representations, some of the alternative representations also encode time-related aspects.
As a method of applied mathematics, game theory has been used to study a wide variety of human and animal behaviors. It was initially developed in economics to understand a large collection of economic behaviors, including behaviors of firms, markets, and consumers. The first use of game-theoretic analysis was by Antoine Augustin Cournot in 1838 with his solution of the Cournot duopoly. The use of game theory in the social sciences has expanded, and game theory has been applied to political, sociological, and psychological behaviors as well.
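Cournot's duopoly solution mentioned above can be reproduced numerically. Under linear inverse demand P = a - b(q1 + q2) and constant marginal cost c (parameter values below are illustrative), iterated best responses converge to the closed-form Cournot quantities (a - c)/(3b).

```python
# Cournot duopoly: two firms choose quantities; the market price is
# a - b*(q1 + q2) and each firm has constant marginal cost c. Iterating
# best responses converges to the Cournot-Nash quantity (a - c) / (3b).
# The demand and cost numbers are illustrative.

a, b, c = 100.0, 1.0, 10.0

def best_response(q_other):
    # Maximize profit (a - b*(q + q_other) - c) * q over q; the first-order
    # condition gives q = (a - c - b*q_other) / (2b).
    return max(0.0, (a - c - b * q_other) / (2 * b))

q1 = q2 = 0.0
for _ in range(100):
    q1, q2 = best_response(q2), best_response(q1)

print(round(q1, 4), round(q2, 4))  # both approach (a - c)/(3b) = 30.0
```

With these numbers each firm produces 30 units, less than half the competitive output of 90, which is the classic Cournot result of prices above marginal cost.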
Although pre-twentieth century naturalists such as Charles Darwin made game-theoretic kinds of statements, the use of game-theoretic analysis in biology began with Ronald Fisher's studies of animal behavior during the 1930s. This work predates the name "game theory", but it shares many important features with this field. The developments in economics were later applied to biology largely by John Maynard Smith in his book "Evolution and the Theory of Games".
In addition to being used to describe, predict, and explain behavior, game theory has also been used to develop theories of ethical or normative behavior and to prescribe such behavior. In economics and philosophy, scholars have applied game theory to help in the understanding of good or proper behavior. Game-theoretic arguments of this type can be found as far back as Plato. An alternative version of game theory, called chemical game theory, represents the player's choices as metaphorical chemical reactant molecules called “knowlecules”. Chemical game theory then calculates the outcomes as equilibrium solutions to a system of chemical reactions.
The primary use of game theory is to describe and model how human populations behave. Some scholars believe that by finding the equilibria of games they can predict how actual human populations will behave when confronted with situations analogous to the game being studied. This particular view of game theory has been criticized. It is argued that the assumptions made by game theorists are often violated when applied to real-world situations. Game theorists usually assume players act rationally, but in practice human behavior often deviates from this model. Game theorists respond by comparing their assumptions to those used in physics. Thus while their assumptions do not always hold, they can treat game theory as a reasonable scientific ideal akin to the models used by physicists. However, empirical work has shown that in some classic games, such as the centipede game, guess 2/3 of the average game, and the dictator game, people regularly do not play Nash equilibria. There is an ongoing debate regarding the importance of these experiments and whether the analysis of the experiments fully captures all aspects of the relevant situation.
Some game theorists, following the work of John Maynard Smith and George R. Price, have turned to evolutionary game theory in order to resolve these issues. These models presume either no rationality or bounded rationality on the part of players. Despite the name, evolutionary game theory does not necessarily presume natural selection in the biological sense. Evolutionary game theory includes both biological as well as cultural evolution and also models of individual learning (for example, fictitious play dynamics).
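As a minimal sketch of such a non-rationality-based model, the snippet below runs discrete-time replicator dynamics on the classic Hawk-Dove game; the value and cost parameters are illustrative. The population share of hawks converges to V/C without any player reasoning about equilibria.

```python
# Discrete-time replicator dynamics for the Hawk-Dove game, a standard
# evolutionary game theory model. With resource value V and fight cost C
# (V < C), the population converges to a hawk share of V / C.
# V and C below are illustrative.

V, C = 2.0, 4.0
# payoff[i][j]: payoff to strategy i (0 = Hawk, 1 = Dove) against j
payoff = [[(V - C) / 2, V],
          [0.0,         V / 2]]

x = 0.1                       # initial share of hawks in the population
for _ in range(1000):
    f_hawk = payoff[0][0] * x + payoff[0][1] * (1 - x)
    f_dove = payoff[1][0] * x + payoff[1][1] * (1 - x)
    f_mean = x * f_hawk + (1 - x) * f_dove
    # Replicator update: a strategy grows in proportion to its fitness
    # relative to the population average.
    x = x * f_hawk / f_mean if f_mean != 0 else x

print(round(x, 4))            # approaches V / C = 0.5
```

The rest point x = V/C is the evolutionarily stable mix: hawks and doves earn equal average payoffs there, so neither strategy can invade.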
Some scholars see game theory not as a predictive tool for the behavior of human beings, but as a suggestion for how people ought to behave. Since a strategy corresponding to a Nash equilibrium of a game constitutes one's best response to the actions of the other players – provided they are in (the same) Nash equilibrium – playing a strategy that is part of a Nash equilibrium seems appropriate. This normative use of game theory has also come under criticism.
Game theory is a major method used in mathematical economics and business for modeling competing behaviors of interacting agents. Applications include a wide array of economic phenomena and approaches, such as auctions, bargaining, mergers and acquisitions pricing, fair division, duopolies, oligopolies, social network formation, agent-based computational economics, general equilibrium, mechanism design, and voting systems; and across such broad areas as experimental economics, behavioral economics, information economics, industrial organization, and political economy.
This research usually focuses on particular sets of strategies known as "solution concepts" or "equilibria". A common assumption is that players act rationally. In non-cooperative games, the most famous of these is the Nash equilibrium. A set of strategies is a Nash equilibrium if each represents a best response to the other strategies. If all the players are playing the strategies in a Nash equilibrium, they have no unilateral incentive to deviate, since their strategy is the best they can do given what others are doing.
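The no-unilateral-deviation condition can be checked by brute force for pure strategies in a small game. The 2x2 payoff table below is illustrative except for the (Up, Left) payoff of (4, 3) used in the normal-form example above.

```python
# Brute-force search for pure-strategy Nash equilibria: a profile is an
# equilibrium if no player can gain by unilaterally switching strategies.
# Only the (Up, Left) -> (4, 3) cell comes from the text; the other
# payoffs are illustrative.
from itertools import product

payoffs = {
    ("Up", "Left"):    (4, 3),
    ("Up", "Right"):   (1, 1),
    ("Down", "Left"):  (0, 0),
    ("Down", "Right"): (3, 4),
}
rows, cols = ("Up", "Down"), ("Left", "Right")

def is_nash(r, c):
    # No profitable unilateral deviation for either player.
    no_row_dev = all(payoffs[(r2, c)][0] <= payoffs[(r, c)][0] for r2 in rows)
    no_col_dev = all(payoffs[(r, c2)][1] <= payoffs[(r, c)][1] for c2 in cols)
    return no_row_dev and no_col_dev

equilibria = [p for p in product(rows, cols) if is_nash(*p)]
print(equilibria)  # [('Up', 'Left'), ('Down', 'Right')]
```

This illustrative game has two pure equilibria, a reminder that the Nash concept need not pick out a unique prediction.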
The payoffs of the game are generally taken to represent the utility of individual players.
A prototypical paper on game theory in economics begins by presenting a game that is an abstraction of a particular economic situation. One or more solution concepts are chosen, and the author demonstrates which strategy sets in the presented game are equilibria of the appropriate type. Naturally one might wonder to what use this information should be put. Economists and business professors suggest two primary uses (noted above): "descriptive" and "prescriptive".
Sensible decision-making is critical for the success of projects. In project management, game theory is used to model the decision-making process of players, such as investors, project managers, contractors, sub-contractors, governments and customers. Quite often, these players have competing interests, and sometimes their interests are directly detrimental to other players, making project management scenarios well-suited to be modeled by game theory.
Piraveenan (2019) in his review provides several examples where game theory is used to model project management scenarios. For instance, an investor typically has several investment options, and each option will likely result in a different project, and thus one of the investment options has to be chosen before the project charter can be produced. Similarly, any large project involving subcontractors, for instance, a construction project, has a complex interplay between the main contractor (the project manager) and subcontractors, or among the subcontractors themselves, which typically has several decision points. For example, if there is an ambiguity in the contract between the contractor and subcontractor, each must decide how hard to push their case without jeopardizing the whole project, and thus their own stake in it. Similarly, when projects from competing organizations are launched, the marketing personnel have to decide what is the best timing and strategy to market the project, or its resultant product or service, so that it can gain maximum traction in the face of competition. In each of these scenarios, the required decisions depend on the decisions of other players who, in some way, have competing interests to the interests of the decision-maker, and thus can ideally be modeled using game theory.
Piraveenan summarises that two-player games are predominantly used to model project management scenarios, and based on the identity of these players, five distinct types of games are used in project management.
In terms of types of games, both cooperative as well as non-cooperative games, normal-form as well as extensive-form games, and zero-sum as well as non-zero-sum games are used to model various project management scenarios.
The application of game theory to political science is focused in the overlapping areas of fair division, political economy, public choice, war bargaining, positive political theory, and social choice theory. In each of these areas, researchers have developed game-theoretic models in which the players are often voters, states, special interest groups, and politicians.
Early examples of game theory applied to political science are provided by Anthony Downs. In his book "An Economic Theory of Democracy", he applies the Hotelling firm location model to the political process. In the Downsian model, political candidates commit to ideologies on a one-dimensional policy space. Downs first shows how the political candidates will converge to the ideology preferred by the median voter if voters are fully informed, but then argues that voters choose to remain rationally ignorant, which allows for candidate divergence. Game theory was applied in 1962 to the Cuban Missile Crisis during the presidency of John F. Kennedy.
It has also been proposed that game theory explains the stability of any form of political government. Taking the simplest case of a monarchy, for example, the king, being only one person, does not and cannot maintain his authority by personally exercising physical control over all or even any significant number of his subjects. Sovereign control is instead explained by the recognition by each citizen that all other citizens expect each other to view the king (or other established government) as the person whose orders will be followed. Coordinating communication among citizens to replace the sovereign is effectively barred, since conspiracy to replace the sovereign is generally punishable as a crime. Thus, in a process that can be modeled by variants of the prisoner's dilemma, during periods of stability no citizen will find it rational to move to replace the sovereign, even if all the citizens know they would be better off if they were all to act collectively.
A game-theoretic explanation for democratic peace is that public and open debate in democracies sends clear and reliable information regarding their intentions to other states. In contrast, it is difficult to know the intentions of nondemocratic leaders, what effect concessions will have, and if promises will be kept. Thus there will be mistrust and unwillingness to make concessions if at least one of the parties in a dispute is a non-democracy.
On the other hand, game theory predicts that two countries may still go to war even if their leaders are cognizant of the costs of fighting. War may result from asymmetric information; two countries may have incentives to misrepresent the amount of military resources they have on hand, rendering them unable to settle disputes agreeably without resorting to fighting. Moreover, war may arise because of commitment problems: if two countries wish to settle a dispute via peaceful means, but each wishes to go back on the terms of that settlement, they may have no choice but to resort to warfare. Finally, war may result from issue indivisibilities.
Game theory could also help predict a nation's responses when there is a new rule or law to be applied to that nation. One example is Peter John Wood's (2013) research into what nations could do to help reduce climate change. Wood thought this could be accomplished by making treaties with other nations to reduce greenhouse gas emissions. However, he concluded that this idea could not work because it would create a prisoner's dilemma for the nations.
Unlike those in economics, the payoffs for games in biology are often interpreted as corresponding to fitness. In addition, the focus has been less on equilibria that correspond to a notion of rationality and more on ones that would be maintained by evolutionary forces. The best-known equilibrium in biology is the "evolutionarily stable strategy" (ESS), first introduced by John Maynard Smith and George R. Price in 1973. Although its initial motivation did not involve any of the mental requirements of the Nash equilibrium, every ESS is a Nash equilibrium.
In biology, game theory has been used as a model to understand many different phenomena. It was first used to explain the evolution (and stability) of the approximate 1:1 sex ratios. Ronald Fisher suggested that the 1:1 sex ratios are a result of evolutionary forces acting on individuals who could be seen as trying to maximize their number of grandchildren.
Additionally, biologists have used evolutionary game theory and the ESS to explain the emergence of animal communication. The analysis of signaling games and other communication games has provided insight into the evolution of communication among animals. For example, the mobbing behavior of many species, in which a large number of prey animals attack a larger predator, seems to be an example of spontaneous emergent organization. Ants have also been shown to exhibit feed-forward behavior akin to fashion (see Paul Ormerod's "Butterfly Economics").
Biologists have used the game of chicken to analyze fighting behavior and territoriality.
According to Maynard Smith, in the preface to "Evolution and the Theory of Games", "paradoxically, it has turned out that game theory is more readily applied to biology than to the field of economic behaviour for which it was originally designed". Evolutionary game theory has been used to explain many seemingly incongruous phenomena in nature.
One such phenomenon is known as biological altruism. This is a situation in which an organism appears to act in a way that benefits other organisms and is detrimental to itself. This is distinct from traditional notions of altruism because such actions are not conscious, but appear to be evolutionary adaptations to increase overall fitness. Examples can be found in species ranging from vampire bats that regurgitate blood they have obtained from a night's hunting and give it to group members who have failed to feed, to worker bees that care for the queen bee for their entire lives and never mate, to vervet monkeys that warn group members of a predator's approach, even when it endangers that individual's chance of survival. All of these actions increase the overall fitness of a group, but occur at a cost to the individual.
Evolutionary game theory explains this altruism with the idea of kin selection. Altruists discriminate between the individuals they help and favor relatives. Hamilton's rule explains the evolutionary rationale behind this selection with the inequality c < rb, where the cost c to the altruist must be less than the benefit b to the recipient multiplied by the coefficient of relatedness r. The more closely related two organisms are, the higher the incidence of altruism, because they share many of the same alleles. This means that the altruistic individual, by ensuring that the alleles of its close relative are passed on through the survival of its offspring, can forgo having offspring itself because the same number of alleles is passed on. For example, helping a sibling (in diploid animals) has a coefficient of 1/2, because (on average) an individual shares half of the alleles in its sibling's offspring. Ensuring that enough of a sibling's offspring survive to adulthood precludes the necessity of the altruistic individual producing offspring. The coefficient values depend heavily on the scope of the playing field; for example, if the choice of whom to favor includes all genetic living things, not just all relatives, and we assume the discrepancy between all humans accounts for only approximately 1% of the diversity in the playing field, then a coefficient that was 1/2 in the smaller field becomes 0.995. Similarly, if information other than that of a genetic nature (e.g. epigenetics, religion, science, etc.) is considered to persist through time, the playing field becomes larger still, and the discrepancies smaller.
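Hamilton's rule is simple enough to sketch directly; the fitness numbers below are illustrative, and only the c < rb condition and the sibling coefficient of 1/2 come from the discussion above.

```python
# Hamilton's rule: kin selection favors an altruistic act when the cost c
# to the altruist is less than the benefit b to the recipient weighted by
# the coefficient of relatedness r, i.e. c < r * b.
# The numeric fitness values are illustrative.

def altruism_favored(c, b, r):
    """True if kin selection favors the altruistic act (c < r * b)."""
    return c < r * b

# Helping a full sibling in a diploid species (r = 1/2): sacrificing 1 unit
# of fitness is favored only when the sibling gains more than 2 units.
print(altruism_favored(c=1.0, b=3.0, r=0.5))   # True:  1 < 1.5
print(altruism_favored(c=1.0, b=1.5, r=0.5))   # False: 1 is not < 0.75
```

Widening the playing field raises r toward 1 (e.g. the 0.995 figure above) and makes ever more costly acts satisfy the rule.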
Game theory has come to play an increasingly important role in logic and in computer science. Several logical theories have a basis in game semantics. In addition, computer scientists have used games to model interactive computations. Also, game theory provides a theoretical basis to the field of multi-agent systems.
Separately, game theory has played a role in online algorithms; in particular, the k-server problem, which has in the past been referred to as "games with moving costs" and "request-answer games". Yao's principle is a game-theoretic technique for proving lower bounds on the computational complexity of randomized algorithms, especially online algorithms.
The emergence of the Internet has motivated the development of algorithms for finding equilibria in games, markets, computational auctions, peer-to-peer systems, and security and information markets. Algorithmic game theory and within it algorithmic mechanism design combine computational algorithm design and analysis of complex systems with economic theory.
Game theory has been put to several uses in philosophy. Responding to two papers by W.V.O. Quine, David Lewis used game theory to develop a philosophical account of convention. In so doing, he provided the first analysis of common knowledge and employed it in analyzing play in coordination games. In addition, he first suggested that one can understand meaning in terms of signaling games. This later suggestion has been pursued by several philosophers since Lewis. Following Lewis's game-theoretic account of conventions, Edna Ullmann-Margalit (1977) and Bicchieri (2006) have developed theories of social norms that define them as Nash equilibria that result from transforming a mixed-motive game into a coordination game.
Game theory has also challenged philosophers to think in terms of interactive epistemology: what it means for a collective to have common beliefs or knowledge, and what are the consequences of this knowledge for the social outcomes resulting from the interactions of agents. Philosophers who have worked in this area include Bicchieri (1989, 1993), Skyrms (1990), and Stalnaker (1999).
In ethics, some authors (most notably David Gauthier, Gregory Kavka, and Jean Hampton) have attempted to pursue Thomas Hobbes' project of deriving morality from self-interest. Since games like the prisoner's dilemma present an apparent conflict between morality and self-interest, explaining why cooperation is required by self-interest is an important component of this project. This general strategy is a component of the general social contract view in political philosophy.
Other authors have attempted to use evolutionary game theory in order to explain the emergence of human attitudes about morality and corresponding animal behaviors. These authors look at several games, including the prisoner's dilemma, stag hunt, and the Nash bargaining game, as providing an explanation for the emergence of attitudes about morality.
Game theory applications are used heavily in the pricing strategies of retail and consumer markets, particularly for the sale of inelastic goods. With retailers constantly competing against one another for consumer market share, it has become a fairly common practice for retailers to discount certain goods, intermittently, in the hopes of increasing foot traffic in brick-and-mortar locations (website visits for e-commerce retailers) or increasing sales of ancillary or complementary products.
Black Friday, a popular shopping holiday in the US, is when many retailers focus on optimal pricing strategies to capture the holiday shopping market. In the Black Friday scenario, retailers using game theory applications typically ask "what is the dominant competitor's reaction to me?" In such a scenario, the game has two players: the retailer and the consumer. The retailer is focused on an optimal pricing strategy, while the consumer is focused on the best deal. In this closed system, there is often no dominant strategy, as both players have alternative options: retailers can find a different customer, and consumers can shop at a different retailer. Given the market competition that day, however, the dominant strategy for retailers lies in outperforming competitors. The open system assumes multiple retailers selling similar goods and a finite number of consumers demanding the goods at an optimal price. A blog by a Cornell University professor provided an example of such a strategy, when Amazon priced a Samsung TV $100 below retail value, effectively undercutting competitors. Amazon made up part of the difference by increasing the price of HDMI cables, as it has been found that consumers are less price discriminatory when it comes to the sale of secondary items.
Retail markets continue to evolve strategies and applications of game theory when it comes to pricing consumer goods. The key insights found between simulations in a controlled environment and real-world retail experiences show that the applications of such strategies are more complex, as each retailer has to find an optimal balance between pricing, supplier relations, brand image, and the potential to cannibalize the sale of more profitable items.
https://en.wikipedia.org/wiki?curid=11924
Demographics of Germany
The demography of Germany is monitored by the "Statistisches Bundesamt" (Federal Statistical Office of Germany). According to the first census since reunification, Germany's population was 80,219,695 (9 May 2011), making it the sixteenth-most populous country in the world and the most populous in the European Union. The total fertility rate was rated at 1.57 in 2018. In 2008, fertility was related to educational achievement (women with lower levels of education were having more children than women who had completed higher education). In 2011, this was no longer true for Eastern Germany, where more highly educated women now had a somewhat higher fertility rate than the rest of the population. Persons who said they had no religion tend to have fewer children than those who identify as Christians, and studies also found that conservative-leaning Christians had more children compared to liberal-leaning Christians.
The United Nations Population Fund lists Germany as host to the third-highest number of international migrants worldwide, behind the United States and Saudi Arabia. More than 16 million people are descended from immigrants (first and second generation, including mixed heritage and ethnic German repatriates and their descendants). 96.1% of those reside in western Germany and Berlin. About 7,000,000 of these 16,000,000 are foreign residents, defined as those without German citizenship. The largest ethnic group of non-German origin are the Turkish. Since the 1960s, West and later reunified Germany has attracted immigrants primarily from Southern and Eastern Europe as well as Turkey, many of whom (or their children) have acquired German citizenship over time. While most of these immigrants initially arrived as guest workers, Germany has also been a prime destination for refugees who have applied for asylum in Germany, in part because the German constitution has long had a clause guaranteeing political asylum as a human right; but restrictions over the years have since limited the scope of this guarantee.
Germany has one of the world's highest levels of education, technological development, and economic productivity. Since the end of World War II, the number of students entering university has more than tripled, and the trade and technical schools are among the world's best. With a per capita income of about €40,883 in 2018, Germany is a broadly middle-class society. However, there has been a strong increase in the number of children living in poverty. In 1965, one in 75 children was on the welfare rolls; but by 2007 this had increased to one child in six. These children live in relative poverty, but not necessarily in absolute poverty. Germans are typically well-travelled, with millions travelling overseas each year. The social welfare system provides for universal health care, unemployment compensation, child benefits and other social programmes. Germany's ageing population and struggling economy strained the welfare system in the 1990s, so the government adopted a wide-ranging programme of belt-tightening reforms, Agenda 2010, including the labour-market reforms known as Hartz concept.
The contemporary demographics of Germany are also measured by a series of full censuses, with the most recent held in 1987. Since reunification, German authorities have relied on a "micro census".
The total fertility rate is the number of children born per woman. It is based on fairly good data for the entire period. Sources: Our World In Data and Gapminder Foundation.
Sources: Our World In Data and the United Nations.
1875–1950
1950–2015
Source: "UN World Population Prospects"
Population statistics since 1900. Territorial changes of Germany occurred in 1918/1919, 1921/1922, 1945/1946 and in 1990.
In 2018, 598,364 children were born to mothers with German citizenship, while 189,159 children were born to mothers with foreign citizenship.
After the World War II border shifts and expulsions, the Germans from Central and Eastern Europe and the former eastern territories moved westward to post-war Germany. During the partition of Germany, many Germans from East Germany fled to West Germany for political and economic reasons. Since Germany's reunification, there are ongoing migrations from the eastern "New Länder" to the western "Old Länder" for economic reasons.
The Federal Republic of Germany and the German Democratic Republic followed different paths when it came to demographics. The family policy of the German Democratic Republic was pronatalist, while that of the Federal Republic was compensatory.
Fertility in the GDR was higher than that in the FRG, and demographic policy was only one of the reasons. Women in the GDR had fewer "biographic options"; young motherhood was expected of them. State-funded, cost-free childcare was available to all mothers.
Note: Berlin is included in East Germany for the years 2002 and 2008. Source: Kreyenfeld (2002); Kreyenfeld et al. (2010); HFD Germany (2010)
About 1.7 million people have left the new federal states (the East) since the fall of the Berlin Wall, or 12% of the population; a disproportionately high number of them were women under 35.
After 1990, the total fertility rate (TFR) in the East dropped to 0.772 in 1994. This has been attributed to a "demographic shock": people not only had fewer children, they were also less likely to marry or divorce after the end of the GDR; the biographic options of the citizens of the former GDR had increased. Young motherhood seemed to be less attractive and the age of the first birth rose sharply.
In the following years, the TFR in the East started to rise again, surpassing 1.0 in 1997 and 1.3 in 2004, and reaching the West's TFR (1.37) in 2007. In 2010, the East's fertility rate (1.459) clearly exceeded that of the West (1.385), while Germany's overall TFR had risen to 1.393, the highest value since 1990, which was still far below the natural replacement rate of 2.1 and the birth rates seen under communism. In 2016, the TFR was 1.64 in the East and 1.60 in the West.
Between 1989 and 2009, about 2,000 schools closed because there were fewer children.
In some regions the number of women between the ages of 20 and 30 has dropped by more than 30%. In 2004, in the age group 18-29 (statistically important for starting families) there were only 90 women for every 100 men in the new federal states (the East, including Berlin).
Until 2007, family policy in the Federal Republic was compensatory, meaning that poor families received more family benefits (such as the "Erziehungsgeld") than rich ones. In 2007 the so-called "Elterngeld" was introduced. According to Christoph Butterwegge, the Elterngeld was meant to "motivate highly educated women to have more children"; the poor, on the other hand, were disadvantaged by the "Elterngeld" and now received lower child benefits than the middle classes. The very well-off (who earn more than €250,000 per annum) and those on welfare receive no Elterngeld payments.
In 2013, the following developments were observed:
In the new federal states the fertility rate of college-educated women is now higher than that of those without college degrees. Differences in value priorities and the better availability of childcare in the eastern states are discussed as possible reasons.
Muslims are younger and have more children than non-Muslims in Germany, although their fertility rate is still below replacement level.
In 2019, the non-profit Austrian Institute of Economic Research and the Bertelsmann Stiftung published a study about the economic impact of demographics. The researchers project a reduction in per capita income of €3,700 by 2040.
Demographic statistics according to the World Population Review.
Demographic statistics according to the CIA World Factbook, unless otherwise indicated.
While most childbirths in Germany happen within marriage, a growing number of children are born out of wedlock. In 2010 the out-of-wedlock rate was 33%, more than twice what it was in 1990.
The Mikrozensus conducted in 2008 revealed that the number of children a German woman aged 40 to 75 had was closely linked to her educational achievement.
In Western Germany the most educated women were the most likely to be childless. 26% of those groups stated they were childless, while 16% of those having an intermediate education, and 11% of those having compulsory education, stated the same.
In Eastern Germany however, 9% of the most educated women of that age group and 7% of those who had an intermediary education were childless, while 12% of those having only compulsory education were childless.
The reason for this east-west difference is that the GDR had an "educated mother scheme" and actively tried to encourage first births among the more educated. It did so by propagandizing the opinion that every educated woman should "present at least one child to socialism" and by financially rewarding its more educated citizens for becoming parents. The government especially tried to persuade students to become parents while still in college, and it was quite successful in doing so. In 1986, 38% of all women who were about to graduate from college were mothers of at least one child and an additional 14% were pregnant, while 43% of all men who were about to graduate from college were fathers of at least one child. There was a sharp decline in the birth rate, especially among the educated, after the fall of the Berlin Wall. Nowadays, 5% of those about to graduate from college are parents.
The more educated a Western German mother aged 40 to 75 was in 2008, the less likely she was to have a big family.
The same was true for a mother living in Eastern Germany in 2008.
A study done in 2005 in the western German state of Nordrhein-Westfalen by the HDZ revealed that childlessness was especially widespread among scientists: 78% of female scientists and 71% of male scientists working in that state were childless.
The Federal Statistical Office defines persons with a migrant background as all persons who migrated to the present area of the Federal Republic of Germany after 1949, plus all foreign nationals born in Germany and all persons born in Germany as German nationals with at least one parent who migrated to Germany or was born in Germany as a foreign national. The figures presented here are based on this definition only.
In 2010, 2.3 million families with children under 18 years were living in Germany in which at least one parent had foreign roots. They represented 29% of the total of 8.1 million families with minor children. Compared with 2005 – the year when the microcensus started to collect detailed information on the population with a migrant background – the proportion of migrant families has risen by 2 percentage points. In 2015, 36% of children under 5 years old had a migrant background (a figure that includes ethnic German repatriates).
Most of the families with a migrant background live in the western part of Germany. In 2010, the proportion of migrant families in all families was 32% in the former territory of the Federal Republic. This figure was more than double that in the new Länder (incl. Berlin) where it stood at 15%.
Families with a migrant background more often have three or more minor children in the household than families without a migrant background. In 2010, about 15% of the families with a migrant background contained three or more minor children, as compared with just 9% of the families without a migrant background.
In 2009, 3.0 million of the persons of immigrant background had Turkish roots, 2.9 million had their roots in the successor states of the Soviet Union (including a large number of Russian-speaking ethnic Germans), 1.5 million had their roots in the successor states of Yugoslavia (including 200,000 Albanians) and 1.5 million had Polish roots.
In 2008, 18.4% of Germans of any age group and 30% of German children had at least one parent born abroad. The median age for Germans with at least one parent born abroad was 33.8 years, while that for Germans who had two parents born in Germany was 44.6 years.
Germany is home to the third-highest number of international migrants worldwide after the United States and Saudi Arabia.
The population by background was as follows:
Four other sizable groups of people are referred to as "national minorities" ("nationale Minderheiten") because they have lived in their respective regions for centuries: Danes, Frisians, Roma and Sinti, and Sorbs. There is a Danish minority (about 50,000, according to government sources) in the northernmost state of Schleswig-Holstein. Eastern and Northern Frisians live at Schleswig-Holstein's western coast and in the north-western part of Lower Saxony. They are part of a wider community (Frisia) stretching from Germany to the northern Netherlands. The Sorbs, a Slavic people with about 60,000 members (according to government sources), live in the Lusatia region of Saxony and Brandenburg. They are the last remnants of the Slavs who have lived in central and eastern Germany since the 7th century to have kept their traditions and not been completely integrated into the wider German nation.
Until World War II the Poles were recognized as one of the national minorities. In 1924 the Union of Poles in Germany had initiated cooperation between all national minorities in Germany under the umbrella organization Association of National Minorities in Germany. Some of the union members wanted the Polish communities in easternmost Germany (now Poland) to join the newly established Polish nation after World War I. Even before the German invasion of Poland, leading anti-Nazi members of the Polish minority were deported to concentration camps; some were executed at the Piaśnica murder site. Minority rights for Poles in Germany were revoked by Hermann Göring's World War II decree of 27 February 1940, and their property was confiscated.
After the war ended, the German government did not re-implement national minority rights for ethnic Poles. The reason for this is that the areas of Germany which formerly had a native Polish minority were annexed to Poland and the Soviet Union, while almost all of the native German populations (formerly the ethnic majority) in these areas subsequently fled or were expelled by force. With the mixed German-Polish territories now lost, the German government subsequently regarded ethnic Poles residing in what remained of Germany as immigrants, just like any other ethnic population with a recent history of arrival. In contrast, Germans living in Poland are recognized as a national minority and are granted seats in the Polish Parliament. An overwhelming number of Germans in Poland, however, have centuries-old historical ties to the lands they now inhabit, whether from living in territory that once belonged to the German state or from centuries-old communities. By contrast, most Poles in present-day Germany are recent immigrants, though there are some communities which have been present since the 19th and perhaps even the 18th centuries. Despite protests by some in the older Polish-German communities, and despite Germany now being a signatory to the Framework Convention for the Protection of National Minorities, Germany has so far refused to re-implement minority rights for ethnic Poles, on the grounds that almost all areas of historically mixed German-Polish heritage (where the minority rights formerly existed) are no longer part of Germany and that the vast majority of ethnic Poles now residing in Germany are recent immigrants.
Roma people have been in Germany since the Middle Ages. They were persecuted by the Nazis, and thousands of Roma living in Germany were killed by the Nazi regime. Nowadays they are spread all over Germany, mostly living in major cities. It is difficult to estimate their exact number, as the German government counts them as "persons without migrant background" in its statistics. There are also many assimilated Sinti and Roma. A rough figure given by the German Department of the Interior is about 70,000. In contrast to the old-established Roma population, the majority of more recently arrived Roma do not have German citizenship; they are classified as immigrants or refugees.
After World War II, 14 million ethnic Germans were expelled from the eastern territories of Germany and homelands outside the former German Empire. The accommodation and integration of these "Heimatvertriebene" in the remaining part of Germany, in which many cities and millions of apartments had been destroyed, was a major effort in the post-war occupation zones and later states of Germany.
Since the 1960s, ethnic Germans from the People's Republic of Poland and the Soviet Union (especially from Kazakhstan, Russia, and Ukraine) have come to Germany. During the time of Perestroika, and after the dissolution of the Soviet Union, the number of immigrants increased heavily. Some of these immigrants are of mixed ancestry. Between 1987 and 2001, a total of 1,981,732 ethnic Germans from the former Soviet Union (FSU) immigrated to Germany, along with more than a million of their non-German relatives. After 1997, however, ethnic Slavs and those of mixed Slavic-Germanic origin outnumbered those of purely Germanic descent among the immigrants. The total number of people currently living in Germany with an FSU connection is around 4 to 4.5 million (including Germans, Slavs, Jews and those of mixed origins), of whom more than 50% are of German descent.
Germany now has Europe's third-largest Jewish population. In 2004, twice as many Jews from former Soviet republics settled in Germany as in Israel, bringing the total inflow to more than 100,000 since 1991. Jews have a voice in German public life through the Central Council of Jews in Germany (Zentralrat der Juden in Deutschland). Some Jews from the former Soviet Union are of mixed heritage.
In 2000 there were also around 300,000–500,000 Afro-Germans (those who have German citizenship) and more than 150,000 African nationals. Most of them live in Berlin and Hamburg. Numerous persons from Tunisia and Morocco also live in Germany. While they are considered members of a minority group, for the most part they do not consider themselves "Afro-Germans", nor are most of them perceived as such by the German people. However, Germany does not keep any statistics regarding ethnicity or race, so the exact number of Germans of African descent is unknown.
Germany's biggest East Asian minority are the Vietnamese people in Germany. About 40,000 Vietnamese live in Berlin and surroundings. Also there are about 20,000 to 25,000 Japanese people residing in Germany. Some South Asian and Southeast Asian immigration has taken place. Nearly 50,000 Indians live in Germany. As of 2008, there were 68,000 Filipino residents and an unknown number of Indonesians residing in Germany.
Numerous descendants of the so-called "Gastarbeiter" live in Germany. The "Gastarbeiter" mostly came from Chile, Greece, Italy, Morocco, Portugal, Spain, Tunisia, Turkey and the former Yugoslavia.
East Germany, which existed until reunification in 1990, also recruited guest workers, mainly from Vietnam, Mongolia, North Korea, Angola, Mozambique and Cuba. The (socialist) German Democratic Republic, however, housed its guest workers in single-sex dormitories. Female guest workers had to sign contracts stating that they were not allowed to become pregnant during their stay; if they became pregnant nevertheless, they faced forced abortion or deportation. This is one of the reasons why the vast majority of ethnic minorities today live in western Germany, and also why minorities such as the Vietnamese have a most unusual population pyramid, with nearly all second-generation Vietnamese Germans born after 1989.
The numbers of selected groups of resident foreign nationals (non-naturalized residents) in Germany were as follows:
This list does not include foreigners with German nationality and foreign nationals without resident status.
The most common Y chromosome haplogroups among German males are Haplogroup R1b, followed by Haplogroup I1, and Haplogroup R1a.
With an estimated more than 81.8 million inhabitants in late 2011, Germany is the most populous country in the European Union and ranks as the 16th largest country in the world in terms of population. Its population density stands at 229.4 inhabitants per square kilometer.
Germany comprises sixteen states that are collectively referred to as "Länder". Due to differences in size and population, the subdivision of these states varies, especially between city-states ("Stadtstaaten") and states with larger territories ("Flächenländer"). For regional administrative purposes five states, namely Baden-Württemberg, Bavaria, Hesse, North Rhine-Westphalia and Saxony, consist of a total of 22 Government Districts ("Regierungsbezirke"). As of 2009, Germany is divided into 403 districts ("Kreise") at the municipal level; these consist of 301 rural districts and 102 urban districts.
Germany officially has eleven metropolitan regions. In 2005, Germany had 82 cities with more than 100,000 inhabitants.
Germany signed special visa agreements with several countries in times of severe labour shortages or when particular skills were in short supply within the country. During the 1960s and 1970s, agreements were signed with the governments of Turkey, Yugoslavia, Italy and Spain to help Germany overcome its severe labour shortage.
As of 2012, the largest sources of net immigration to Germany are other European countries, most importantly Poland, Romania, Bulgaria, Hungary, Italy, Spain, and Greece; notably, in the case of Turkey, German Turks moving to Turkey slightly outnumber new immigrants.
In 2015, there were 476,649 asylum applications.
Responsibility for educational oversight in Germany lies primarily with the individual federated states. Since the 1960s, a reform movement has attempted to unify secondary education into a "Gesamtschule" (comprehensive school); several West German states later simplified their school systems to two or three tiers. A system of apprenticeship called "Duale Ausbildung" ("dual education") allows pupils in vocational training to learn in a company as well as in a state-run vocational school.
Optional kindergarten education is provided for all children between three and six years old, after which school attendance is compulsory for at least nine years. Primary education usually lasts for four years and public schools are not stratified at this stage. In contrast, secondary education includes three traditional types of schools focused on different levels of academic ability: the "Gymnasium" enrols the most academically promising children and prepares students for university studies; the "Realschule" for intermediate students lasts six years; the "Hauptschule" prepares pupils for vocational education.
In addition, Germany has a comprehensive school known as the "Gesamtschule". While some German schools such as the Gymnasium and the Realschule have rather strict entrance requirements, the Gesamtschule does not. It offers college preparatory classes for students who are doing well, general education classes for average students, and remedial courses for those who are struggling. In most cases students attending a Gesamtschule may graduate with the Hauptschulabschluss, the Realschulabschluss or the Abitur, depending on how well they did in school.
The percentage of students attending a Gesamtschule varies by Bundesland. In 2007, in the State of Brandenburg, more than 50% of all students attended a Gesamtschule, while in the State of Bavaria less than 1% did.
The general entrance requirement for university is the Abitur, a qualification normally based on continuous assessment during the last few years at school and on final examinations; however, there are a number of exceptions, and precise requirements vary depending on the state, the university and the subject. Germany's universities are recognised internationally; in the Academic Ranking of World Universities (ARWU) for 2008, six of the top 100 universities in the world were in Germany, and 18 of the top 200. Nearly all German universities are public institutions; tuition fees in the range of €500 were introduced in some states after 2006, but had been abolished again everywhere by 2014.
"Percentage of jobholders holding Hauptschulabschluss, Realschulabschluss or Abitur in Germany"
Over 99% of those of age 15 and above are estimated to be able to read and write. However, a growing number of inhabitants are functionally illiterate. The young are much more likely to be functionally illiterate than the old. According to a study done by the University of Bremen in cooperation with the "Bundesverband Alphabetisierung e.V.", 10% of youngsters living in Germany are functionally illiterate and one quarter are able to understand only basic level texts. Illiteracy rates of youngsters vary by ethnic group and parents' socioeconomic class.
The life expectancy in Germany is 81.1 years (78.7 years males, 83.6 years females, 2020 est.).
The principal cause of death was cardiovascular disease, at 42%, followed by malignant tumours, at 25%.
About 82,000 Germans had been infected with HIV/AIDS and 26,000 had died from the disease (cumulatively, since 1982).
According to a 2005 survey, 27% of German adults are smokers.
A 2009 study shows Germany is near the median in terms of overweight and obese people in Europe.
The national constitutions of 1919 and 1949 guarantee freedom of faith and religion; earlier, these freedoms were mentioned only in state constitutions. The modern constitution of 1949 also states that no one may be discriminated against due to their faith or religious opinions. A state church does not exist in Germany (see Freedom of religion in Germany).
According to a 1990s poll by "Der Spiegel", 45% of Germans believe in God, and a quarter in Jesus Christ. According to the Eurobarometer Poll 2010, 44% of German citizens responded that "they believe there is a God", 25% responded that "they believe there is some sort of spirit or life force" and 27% responded that "they don't believe there is any sort of spirit, God or life force". 4% gave no response.
Christianity is the largest religion in Germany, comprising an estimated 57.9% of the country's population.
Smaller religious groups (less than 1%) include Judaism, Buddhism and Hinduism.
The two largest churches, the Roman Catholic Church and the Protestant Evangelical Church in Germany (EKD), have lost significant numbers of adherents. In 2016 the Catholic Church accounted for 28.5% and the Evangelical Church for 26.5% of the population. Other Christian churches and groups together accounted for 3.3%, with estimates for the Orthodox Church between 1.3% and 1.9%.
Since the reunification of Germany, the number of non-religious people has grown and an estimated 36.2% of the country's population are not affiliated with any church or religion.
Other religions each make up less than 1% of the population. Buddhism has around 200,000 adherents (0.2%), Judaism around 200,000 (0.2%), Hinduism 90,000 (0.1%), Sikhism 75,000 (0.1%) and the Yazidi religion 45,000–60,000. All other religious communities in Germany have fewer than 50,000 (<0.1%) adherents.
Protestantism is concentrated in the north and east and Roman Catholicism is concentrated in the south and west. According to the last nationwide census, Protestantism is more widespread among the population with German citizenship; there are slightly more Catholics total because of the Catholic immigrant population (including such groups as Poles and Italians). The former Pope, Benedict XVI, was born in Bavaria. Non-religious people, including atheists and agnostics, might make up as many as 55% of the total population, and are especially numerous in the former East Germany and major metropolitan areas.
Of the roughly 4 million Muslims, most are Sunnis and Alevites from Turkey, but there are a small number of Shi'ites and other denominations. 1.3% of the country's overall population declare themselves Orthodox Christians, with Serbs, Greeks, Montenegrins, Ukrainians and Russians being the most numerous. Germany has Europe's third-largest Jewish population (after France and the United Kingdom). In 2004, twice as many Jews from former Soviet republics settled in Germany as in Israel, bringing the total Jewish population to more than 200,000, compared to 30,000 prior to German reunification. Large cities with significant Jewish populations include Berlin, Frankfurt and Munich. Around 250,000 active Buddhists live in Germany; 50% of them are Asian immigrants.
Census results were as follows:
German is the only official and most widely spoken language. Standard German is understood throughout the country.
Danish, Low German, Low Rhenish, the Sorbian languages (Lower Sorbian and Upper Sorbian), and the two Frisian languages, Saterfrisian and North Frisian, are officially recognized and protected as minority languages by the European Charter for Regional or Minority Languages in their respective regions. With speakers of Romany living in all parts of Germany, the federal government has promised to take action to protect the language. Until now, only Hesse has followed Berlin's announcement, and agreed on implementing concrete measures to support Romany speakers.
Implementation of the Charter is poor. The monitoring reports on charter implementation in Germany show many provisions unfulfilled.
German dialects – some quite distinct from the standard language – are used in everyday speech, especially in rural regions. Many dialects, for example the Upper German varieties, are to some degree cultivated as symbols of regional identity and have their own literature, theaters and some TV programming. While speaking a dialect outside its native region might be frowned upon, in their native regions some dialects are spoken by all social classes. Nevertheless, partly due to the prevalence of Standard German in media, the use of dialects has declined over the past century, especially among the younger population.
The social status of different German dialects can vary greatly. The Alemannic and Bavarian dialects of the south are positively valued by their speakers and can be used in almost all social circumstances. The Saxonian and Thuringian dialects have less prestige and are subject to derision. While Bavarian and Alemannic have kept much of their distinctiveness, the Middle German dialects, which are closer to Standard German, have lost some of their distinctive lexical and grammatical features and tend to be only pronunciation variants of Standard German.
Low Saxon is officially recognized as a language in its own right, but despite this there is little official action taken to foster it. Historically, one third of Germany's territory and population was Low Saxon speaking. No data has ever been collected on the actual number of speakers, but today it is estimated at around 5 million persons. Despite this relatively high number of speakers there is very little coverage in the media (mostly on NDR TV, with no regular programming) and very little education in or on the language. Low Saxon is not a fixed part of the school curriculum, and it is used as a medium of instruction in only one school in all of Germany (as a "model project" in a primary school, alongside education in Standard German). As a consequence, the younger generation has largely not adopted the native language of their parents. Language prevalence dropped from more than 90% (depending on the exact region) in the 1930s to less than 5% today, which accounts for a massive intergenerational gap in language use. Older people regularly use the language and take private initiative to maintain it, but the lack of uptake among the younger generation hinders language maintenance. The language also has its own literature (around 150 books published every year) and there are many theatres (mostly lay stages, but some professional ones, such as the Ohnsorg-Theater).
Use of Low Saxon is mainly restricted to communication among acquaintances, such as family members, neighbours and friends. A meeting of a village council can be held almost completely in Low Saxon if all participants know each other (as long as the written minutes are kept in Standard German), but the presence of a single outsider can cause the whole meeting to switch to Standard German.
The Low Saxon dialects also differ in status, with a north-south gradient in language maintenance. The southern dialects of Westphalian, Eastphalian and Brandenburgish have suffered much heavier speaker losses than the northern coastal dialects of Northern Low Saxon. While Eastphalian has lost speakers to Standard German, Westphalian has lost speakers to Standard German and to the Standard German-based regiolect of the Rhine-Ruhr area. Brandenburgish speakers have mostly switched to the Standard German-based regiolect of Berlin, which has almost completely replaced Brandenburgish, while Northern Low Saxon speakers have switched mostly to pure Standard German.
English is the most common foreign language and almost universally taught by the secondary level; it is also taught at elementary level in some states. Other commonly-taught languages are French, Italian, Spanish, Portuguese, and Russian. Dutch is taught in states bordering the Netherlands, and Polish in the eastern states bordering Poland. Latin and Ancient Greek are part of the classical education syllabus offered in many secondary schools.
According to a 2004 survey, two-thirds of Germany's citizens have at least basic knowledge of English. About 20% consider themselves to be competent speakers of French, followed by speakers of Russian (7%), Italian (6.1%), and Spanish (5.6%). The relatively high number of Russian speakers is a result of immigration from the former Soviet Union to Germany for almost 10 consecutive years, plus its having been learned in school by many older former East Germans.
Economy of Germany
The economy of Germany is a highly developed social market economy. It has the largest national economy in Europe, the fourth-largest by nominal GDP in the world, and fifth by GDP (PPP). In 2017, the country accounted for 28% of the euro area economy according to the IMF. Germany is a founding member of the European Union and the Eurozone.
In 2016, Germany recorded the highest trade surplus in the world, worth $310 billion, making it the biggest capital exporter globally. Germany is one of the largest exporters globally, with $1,448.17 billion worth of goods and services exported in 2017. The service sector contributes around 70% of the total GDP, industry 29.1%, and agriculture 0.9%. Exports account for 41% of national output. The top 10 exports of Germany are vehicles, machinery, chemical goods, electronic products, electrical equipment, pharmaceuticals, transport equipment, basic metals, food products, and rubber and plastics. Germany has the largest manufacturing economy in Europe, which makes it less likely to be affected by financial downturns. The country conducts applied research with practical industrial value, seeing itself as a bridge between the latest university insights and industry-specific product and process improvements, and generates a great deal of knowledge in its own laboratories. In July 2017, the International Monetary Fund gave the country's economy "yet another bill of good health" and some advice on steps it might take to maintain this level in the long run.
Germany is rich in timber, lignite, potash and salt. Some minor sources of natural gas are being exploited in the state of Lower Saxony. Until reunification, the German Democratic Republic mined uranium in the Ore Mountains (see also: SAG/SDAG Wismut). Energy in Germany is sourced predominantly from fossil fuels (30%), followed by wind, then nuclear power, gas, solar, biomass (wood and biofuels) and hydro. Germany is the first major industrialized nation to commit to the renewable energy transition, called the Energiewende, and is the leading producer of wind turbines in the world. Renewables produced 46% of the electricity consumed in Germany in 2019.
99 percent of all German companies belong to the German "Mittelstand": small and medium-sized enterprises, which are mostly family-owned. Of the world's 2,000 largest publicly listed companies measured by revenue, the Forbes Global 2000, 53 are headquartered in Germany, the top 10 being Allianz, Daimler, Volkswagen, Siemens, BMW, Deutsche Telekom, Bayer, BASF, Munich Re and SAP.
Germany is the world's top location for trade fairs. Around two thirds of the world's leading trade fairs take place in Germany. The largest annual international trade fairs and congresses are held in several German cities such as Hanover, Frankfurt, Cologne, Leipzig and Düsseldorf.
The Industrial Revolution in Germany got underway approximately a century later than in the United Kingdom, France, and Belgium, partly because Germany only became a unified country in 1871.
The establishment of the Deutscher Zollverein (German Customs Union) in 1834 and the expansion of railway systems were the main drivers of Germany's industrial development and political union. From 1834, tariff barriers between increasing numbers of the Kleindeutschland German states were eliminated. In 1835 the first German railway linked the Franconian cities of Nuremberg and Fürth – it proved so successful that the decade of the 1840s saw "railway mania" in all the German states. Between 1845 and 1870 an extensive rail network had been built, and in 1850 Germany was building its own locomotives. Over time, other German states joined the customs union and started linking their railroads, which began to connect the corners of Germany together. The growth of free trade and of a rail system across Germany intensified economic development, which opened up new markets for local products, created a pool of middle managers, increased the demand for engineers, architects and skilled machinists, and stimulated investments in coal and iron.
Another factor which propelled German industry forward was the unification of the monetary system, made possible in part by political unification. The Mark, a new coinage system backed by gold, was introduced in 1871. However, this system did not fully come into use, as silver coins retained their value until 1907.
The victory of Prussia and her allies over Napoleon III of France in the Franco-Prussian War of 1870-1871 marked the end of French hegemony in Europe and resulted in the proclamation of the German Empire in 1871. The establishment of the empire inherently presented Europe with the reality of a new populous and industrializing polity possessing a considerable, and undeniably increasing, economic and diplomatic presence. The influence of French economic principles produced important institutional reforms in Germany, including the abolition of feudal restrictions on the sale of large landed estates, the reduction of the power of the guilds in the cities, and the introduction of a new, more efficient commercial law. Nonetheless, political decisions about the economy of the empire were still largely controlled by a coalition of "rye and iron", that is the Prussian Junker landowners of the east and the Ruhr heavy industry of the west.
Regarding politics and society, between 1881 and 1889 Chancellor Otto von Bismarck promoted laws that provided social insurance and improved working conditions. He instituted the world's first welfare state: Germany was the first country to introduce social insurance programs, including universal healthcare, compulsory education, sickness insurance, accident insurance, disability insurance, and a retirement pension. Moreover, the government's universal education policy bore fruit, with Germany achieving the highest literacy rate in the world – 99% – education levels that provided the nation with more people skilled at handling numbers, more engineers, chemists, opticians, skilled workers for its factories, skilled managers, knowledgeable farmers and skilled military personnel.
By 1900 Germany surpassed Britain and the United States in steel production. The German economic miracle was also intensified by an unprecedented population growth from 35 million in 1850 to 67 million in 1913. From 1895 to 1907, the number of workers engaged in machine building doubled from half a million to well over a million. Only 40 percent of Germans lived in rural areas by 1910, a drop from 67% at the birth of the Empire. Industry accounted for 60 percent of the gross national product in 1913. The German chemical industry became the most advanced in the world, and by 1914 the country was producing half the world's electrical equipment.
The rapid advance to industrial maturity led to a drastic shift in Germany's economic position – from a largely rural economy to a major exporter of finished goods. The ratio of finished products to total exports jumped from 38% in 1872 to 63% in 1912. By 1913 Germany had come to dominate the European markets, and by 1914 it had become one of the biggest exporters in the world.
The Nazis rose to power while unemployment was very high, but later achieved full employment thanks to massive public works programs such as the Reichsbahn, Reichspost and Reichsautobahn projects. From 1935, rearmament in contravention of the Treaty of Versailles added further stimulus to the economy.
The expansionary fiscal policies adopted after the 1931 financial crisis (made possible because Germany had left the gold standard) were advised by the non-Nazi Minister of Economics, Hjalmar Schacht, who in 1933 also became president of the central bank. Schacht resigned from the post in 1938 and was replaced by Hermann Göring.
The Third Reich's trade policy aimed at self-sufficiency, but since Germany lacked raw materials, the country had to maintain trade links, structured around bilateral preferences, foreign exchange controls, import quotas and export subsidies under what was called the "New Plan" (Neuer Plan) of 19 September 1934. The "New Plan" was based on trade with less developed countries that would exchange raw materials for German industrial goods, conserving foreign currency. Southern Europe was preferred to Western Europe and North America because those trade routes could not be blockaded. This policy became known as the Grosswirtschaftsraum ("greater economic area") policy.
Eventually, the Nazi party developed strong relationships with big business and abolished trade unions in 1933, replacing them with state-run organizations: the National Labor Service (RAD); the German Labor Front (DAF), which set working hours; Beauty of Labour (SDA), which set working conditions; and Strength through Joy (KDF), which provided sports clubs for workers.
Beginning with the replacement of the Reichsmark with the Deutsche Mark as legal tender, a lasting period of low inflation and rapid industrial growth was overseen by the government led by German Chancellor Konrad Adenauer and his minister of economics, Ludwig Erhard, raising West Germany from total wartime devastation to one of the most developed nations in modern Europe.
In 1953 it was decided that Germany was to repay $1.1 billion of the aid it had received. The last repayment was made in June 1971.
Apart from these factors, hard work and long hours at full capacity among the population in the 1950s, 1960s and early 1970s, along with extra labor supplied by thousands of Gastarbeiter ("guest workers"), provided a vital base for the economic upturn.
By the early 1950s the Soviet Union had seized reparations in the form of agricultural and industrial products and demanded further heavy reparation payments. Silesia with the Upper Silesian Coal Basin, and Stettin, a prominent natural port, were lost to Poland.
Exports from West Germany exceeded $323 billion in 1988. In the same year, East Germany exported $30.7 billion worth of goods, 65% of it to other communist states. East Germany officially reported zero unemployment.
In 1976 the average annual GDP growth was roughly 5.9%.
The German economy practically stagnated in the beginning of the 2000s. The worst growth figures were recorded in 2002 (+1.4%), 2003 (+1.0%) and 2005 (+1.4%). Unemployment was also chronically high. Together with Germany's aging population, these problems put the welfare system under considerable strain. This led the government to push through a wide-ranging program of belt-tightening reforms, Agenda 2010, including the labor market reforms known as Hartz I–IV.
In the latter part of the 2000s the world economy experienced high growth, from which Germany as a leading exporter also profited. Some credit the Hartz reforms with achieving high growth and declining unemployment, while others contend that they caused a massive decrease in living standards and that their effects were limited and temporary.
The nominal GDP of Germany contracted in the second and third quarters of 2008, putting the country in a technical recession as part of a global and European recession cycle. German industrial output dropped by 3.6% in September compared with August. In January 2009 the German government under Angela Merkel approved a €50 billion ($70 billion) economic stimulus plan to protect several sectors from the downturn and a subsequent rise in unemployment. Germany exited the recession in the second and third quarters of 2009, mostly due to rebounding manufacturing orders and exports – primarily from outside the eurozone – and relatively steady consumer demand.
Germany is a founding member of the EU, the G8 and the G20, and was the world's largest exporter from 2003 to 2008. In 2011 it remained the third largest exporter and third largest importer. Most of the country's exports are in engineering, especially machinery, automobiles, chemical goods and metals. Germany is a leading producer of wind turbines and solar-power technology. Annual trade fairs and congresses are held in cities throughout Germany.
2011 was a record-breaking year for the German economy. German companies exported goods worth over €1 trillion ($1.3 trillion), the highest figure in history. The number of people in work rose to 41.6 million, the highest figure on record.
Through 2012, Germany's economy remained stronger than those of its neighboring nations.
As of December 2017, the unemployment rate was at 5.5 percent.
More recently, the unemployment rate stood at 4.8 percent and the CPI inflation rate at 0.6 percent.
The following table shows the main economic indicators in 1980–2018. Inflation below 2% is in green.
1980 to 1995
1996 to 2010
2011 to 2018
Of the world's 500 largest stock-market-listed companies measured by revenue in 2010, the Fortune Global 500, 37 are headquartered in Germany. 30 Germany-based companies are included in the DAX, the German stock market index. Well-known global brands are Mercedes-Benz, BMW, SAP, Siemens, Volkswagen, Adidas, Audi, Allianz, Porsche, Bayer, BASF, Bosch, and Nivea.
Germany is recognised for its specialised small and medium enterprises, known as the Mittelstand model. Around 1,000 of these companies are global market leaders in their segment and are labelled hidden champions.
From 1991 to 2010, 40,301 mergers and acquisitions involving German firms, with a total known value of €2,422 billion, were announced. The largest transactions since 1991 were the acquisition of Mannesmann by Vodafone for €204.8 billion in 1999 and the merger of Daimler-Benz with Chrysler to form DaimlerChrysler in 1998, valued at €36.3 billion.
Berlin (see Economy of Berlin) has developed an international startup ecosystem and become a leading location for venture-capital-funded firms in the European Union.
The list includes the largest German companies by revenue in 2011:
Since German reunification there have been 52,258 merger or acquisition deals inbound to or outbound from Germany. The most active year in terms of value was 1999, with a cumulative value of €48 billion, twice as much as the runner-up, 2006, with €24 billion (see graphic "M&A in Germany").
Here is a list of the top 10 deals (ranked by value) involving a German company. The Vodafone-Mannesmann deal remains the biggest deal in global history.
Germany as a federation is a polycentric country and does not have a single economic center. The stock exchange is located in Frankfurt am Main; the largest media company (Bertelsmann SE & Co. KGaA) is headquartered in Gütersloh; and the largest car manufacturers are in Wolfsburg (Volkswagen), Stuttgart (Mercedes-Benz and Porsche), and Munich (Audi and BMW).
Germany is an advocate of closer European economic and political integration. Its commercial policies are increasingly determined by agreements among European Union (EU) members and EU single market legislation. Germany introduced the common European currency, the euro, on 1 January 1999. Its monetary policy is set by the European Central Bank in Frankfurt.
The southern states ("Bundesländer"), especially Bayern, Baden-Württemberg and Hessen, are economically stronger than the northern states. One of Germany's traditionally strongest (and at the same time oldest) economic regions is the Ruhr area in the west, between Duisburg and Dortmund. 27 of the country's 100 largest companies are located there. In recent years, however, the area, whose economy is based on natural resources and heavy industry, has seen a substantial rise in unemployment (2010: 8.7%).
The economy of Bayern and Baden-Württemberg, the states with the lowest unemployment rates (2018: 2.7% and 3.1% respectively), is by contrast based on high-value products. Important sectors include automobiles, electronics, aerospace and biomedicine. Baden-Württemberg is an industrial center, especially for the automobile and machine building industries, and the home of brands like Mercedes-Benz (Daimler), Porsche and Bosch.
With the reunification on 3 October 1990, Germany began the major task of reconciling the economic systems of the two former republics. Interventionist economic planning ensured gradual development in eastern Germany up to the level of the former West Germany, but the standard of living and annual income remain significantly higher in the western German states. The modernisation and integration of the eastern German economy continues to be a long-term process scheduled to last until 2019, with annual transfers from west to east amounting to roughly $80 billion. The overall unemployment rate has consistently fallen since 2005 and reached a 20-year low in 2012. In July 2014 the country began legislating to introduce a federally mandated minimum wage, which would come into effect on 1 January 2015.
The following top 10 list of German billionaires is based on an annual assessment of wealth and assets compiled and published by "Forbes" magazine on 1 March 2016.
Wolfsburg is the city in Germany with the country's highest per capita GDP, at $128,000. The following top 10 list of German cities with the highest per capita GDP is based on a study by the Cologne Institute for Economic Research on 31 July 2013.
Germany has a social market economy characterised by a highly qualified labor force, a developed infrastructure, a large capital stock, a low level of corruption, and a high level of innovation. It has the largest national economy in Europe, the fourth largest by nominal GDP in the world, and ranked fifth by GDP (PPP) in 2015.
The service sector contributes around 70% of the total GDP, industry 29.1%, and agriculture 0.9%.
In 2010 agriculture, forestry, and mining accounted for only 0.9% of Germany's gross domestic product (GDP) and employed only 2.4% of the population, down from 4% in 1991. Agriculture is extremely productive, and Germany is able to cover 90% of its nutritional needs with domestic production. Germany is the third largest agricultural producer in the European Union after France and Italy. Germany's principal agricultural products are potatoes, wheat, barley, sugar beets, fruit, and cabbages.
Despite the country's high level of industrialization, almost one-third of its territory is covered by forest. The forestry industry provides for about two-thirds of domestic consumption of wood and wood products, so Germany is a net importer of these items.
The German soil is relatively poor in raw materials. Only lignite (brown coal) and potash salt (Kalisalz) are available in significant quantities. However, the former GDR's Wismut mining company produced a total of 230,400 tonnes of uranium between 1947 and 1990 and made East Germany the fourth largest producer of uranium ore worldwide (largest in USSR's sphere of control) at the time. Oil, natural gas and other resources are, for the most part, imported from other countries.
Potash salt is mined in the center of the country (Niedersachsen, Sachsen-Anhalt and Thüringen). The most important producer is K+S AG (formerly Kali und Salz AG).
Germany's bituminous coal deposits were created more than 300 million years ago from swamps that extended from present-day southern England, over the Ruhr area, to Poland. Lignite deposits developed in a similar way, but during a later period, about 66 million years ago. Because the wood has not yet been completely transformed into coal, brown coal contains less energy than bituminous coal.
Lignite is extracted in the extreme western and eastern parts of the country, mainly in Nordrhein-Westfalen, Sachsen and Brandenburg. Considerable amounts are burned in coal plants near the mining areas to produce electricity. Because transporting lignite over long distances is not economically feasible, the plants are located practically next to the extraction sites. Bituminous coal is mined in Nordrhein-Westfalen and Saarland. Most power plants burning bituminous coal operate on imported material, so these plants are located not only near the mining sites but throughout the country.
Industry and construction accounted for 30.7% of gross domestic product in 2017, and employed 24.2% of the workforce. Germany excels in the production of automobiles, machinery, electrical equipment and chemicals. With the manufacture of 5.2 million vehicles in 2009, Germany was the world's fourth largest producer and largest exporter of automobiles. German automotive companies enjoy an extremely strong position in the so-called premium segment, with a combined world market share of about 90%.
Small- to medium-sized manufacturing firms (Mittelstand companies) which specialize in technologically advanced niche products and are often family-owned form a major part of the German economy. It is estimated that about 1500 German companies occupy a top three position in their respective market segment worldwide. In about two thirds of all industry sectors German companies belong to the top three competitors.
Germany is the only country among the top five arms exporters that is not a permanent member of the United Nations Security Council.
In 2017 services constituted 68.6% of gross domestic product (GDP), and the sector employed 74.3% of the workforce. The subcomponents of services are financial, renting, and business activities (30.5%); trade, hotels and restaurants, and transport (18%); and other service activities (21.7%).
Germany is the seventh most visited country in the world, with a total of 407 million overnight stays during 2012. This number includes 68.83 million nights by foreign visitors. In 2012, over 30.4 million international tourists arrived in Germany. Berlin has become the third most visited city destination in Europe. Additionally, more than 30% of Germans spend their holiday in their own country, with the biggest share going to Mecklenburg-Vorpommern. Domestic and international travel and tourism combined directly contribute over €43.2 billion to German GDP. Including indirect and induced impacts, the industry contributes 4.5% of German GDP and supports 2 million jobs (4.8% of total employment). The largest annual international trade fairs and congresses are held in several German cities such as Hannover, Frankfurt, and Berlin.
The debt-to-GDP ratio of Germany had its peak in 2010 when it stood at 80.3% and decreased since then. According to Eurostat, the government gross debt of Germany amounts to €2,152.0 billion or 71.9% of its GDP in 2015. The federal government achieved a budget surplus of €12.1 billion ($13.1 billion) in 2015. Germany's credit rating by credit rating agencies Standard & Poor's, Moody's and Fitch Ratings stands at the highest possible rating "AAA" with a stable outlook in 2016.
Germany's "debt clock" ("Schuldenuhr") reversed for the first time in 20 years in January 2018. It is now running backwards at €78 per second.
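The fiscal figures above lend themselves to a couple of back-of-the-envelope checks. The following sketch simply recombines the numbers quoted in the text (2015 debt and debt-to-GDP ratio, the 2018 debt-clock rate) and is illustrative arithmetic, not an official calculation:

```python
# Sanity checks on the fiscal figures quoted above (illustrative only).

debt_eur_bn = 2152.0   # gross government debt, 2015 (Eurostat figure from the text)
debt_to_gdp = 0.719    # 71.9% of GDP in 2015

# Implied nominal GDP for 2015:
gdp_eur_bn = debt_eur_bn / debt_to_gdp
print(f"Implied 2015 GDP: ~{gdp_eur_bn:,.0f} bn EUR")  # roughly 2,993 bn EUR

# The debt clock running backwards at 78 EUR per second (January 2018)
# corresponds to an annual debt reduction of:
seconds_per_year = 365 * 24 * 3600
annual_reduction_bn = 78 * seconds_per_year / 1e9
print(f"Annual reduction: ~{annual_reduction_bn:.2f} bn EUR")  # roughly 2.46 bn EUR
```

At that pace, paying down the full 2015 debt stock would take centuries, which puts the symbolic nature of the reversed debt clock in perspective.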
Economists generally see Germany's current account surplus as undesirable.
Germany is the world's fifth largest consumer of energy, and two-thirds of its primary energy was imported in 2002. In the same year, Germany was Europe's largest consumer of electricity, totaling 512.9 terawatt-hours. Government policy promotes energy conservation and the development of renewable energy sources, such as solar, wind, biomass, hydroelectric, and geothermal energy. As a result of energy-saving measures, energy efficiency has been improving since the beginning of the 1970s. The government has set the goal of meeting half the country's energy demands from renewable sources by 2050.
In 2000, the red-green coalition under Chancellor Schröder and the German nuclear power industry agreed to phase out all nuclear power plants by 2021. The conservative coalition under Chancellor Merkel reversed this decision in January 2010, electing to keep plants open. The March 2011 disaster at Japan's Fukushima nuclear plant, however, fundamentally changed the political climate: older nuclear plants have been shut down, and a general phase-out by 2020 or 2022 is now probable. Renewable energy still plays a comparatively modest role in energy consumption, though the German solar and wind power industries play a leading role worldwide.
In 2009, Germany's total energy consumption (not just electricity) came from the following sources:
Oil 34.6%, Natural gas 21.7%, Lignite 11.4%, Bituminous coal 11.1%, Nuclear power 11.0%, Hydro and wind power 1.5%, Others 9.0%.
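As a quick plausibility check, the shares quoted above should sum to roughly 100%; the small excess reflects rounding of the individual figures:

```python
# Shares of Germany's 2009 primary energy consumption, as quoted above.
shares = {
    "Oil": 34.6, "Natural gas": 21.7, "Lignite": 11.4,
    "Bituminous coal": 11.1, "Nuclear power": 11.0,
    "Hydro and wind power": 1.5, "Others": 9.0,
}
total = sum(shares.values())
print(f"Total: {total:.1f}%")  # 100.3% -- consistent up to rounding
```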
There are 3 major entry points for oil pipelines: in the northeast (the Druzhba pipeline, coming from Gdańsk), west (coming from Rotterdam) and southeast (coming from Nelahozeves). The oil pipelines of Germany do not constitute a proper network, and sometimes only connect two different locations. Major oil refineries are located in or near the following cities: Schwedt, Spergau, Vohburg, Burghausen, Karlsruhe, Cologne, Gelsenkirchen, Lingen, Wilhelmshaven, Hamburg and Heide.
Germany's network of natural gas pipelines, on the other hand, is dense and well-connected. Imported pipeline gas comes mostly from Russia, the Netherlands and the United Kingdom. Although gas imports from Russia have been historically reliable, even during the cold war, recent price disputes between Gazprom and the former Soviet states, such as Ukraine, have also affected Germany. As a result, high political importance is placed on the construction of the Nord Stream pipeline, running from Vyborg in Russia along the Baltic sea to Greifswald in Germany. This direct connection avoids third-party transit countries. Germany imports 50% to 75% of its natural gas from Russia.
With its central position in Europe, Germany is an important transportation hub. This is reflected in its dense and modern transportation networks. The extensive motorway (Autobahn) network ranks third worldwide in total length and features no blanket speed limit on the majority of routes.
Germany has established a polycentric network of high-speed trains. The InterCityExpress or "ICE" is the most advanced service category of the Deutsche Bahn and serves major German cities as well as destinations in neighbouring countries. Maximum train speeds vary between 200 km/h and 320 km/h (125-200 mph). Connections are offered at 30-minute, hourly, or two-hourly intervals. German railways are heavily subsidised, receiving €17.0 billion in 2014.
The largest German airports are the Frankfurt International Airport and the Munich International Airport, both are global hubs of Lufthansa. Other major airports are Berlin Tegel, Berlin Schönefeld, Düsseldorf, Hamburg, Hanover, Cologne-Bonn, Leipzig/Halle and in the future Berlin Brandenburg International Airport.
Germany's achievements in sciences have been significant, and research and development efforts form an integral part of the economy.
Germany is also one of the leading countries in developing and using green technologies. Companies specializing in green technology have an estimated turnover of €200 billion. German expertise in engineering, science and research is highly respected.
The lead markets of Germany's green technology industry are power generation, sustainable mobility, material efficiency, energy efficiency, waste management and recycling, and sustainable water management.
With regard to triadic patents Germany is in third place after the US and Japan. With more than 26,500 registrations for patents submitted to the European Patent Office, Germany is the leading European nation. Siemens, Bosch and BASF, with almost 5,000 registrations for patents between them in 2008, are among the Top 5 of more than 35,000 companies registering patents. Together with the US and Japan, with regard to patents for nano, bio and new technologies Germany is one of the world's most active nations. With around one third of triadic patents Germany leads the way worldwide in the field of vehicle emission reduction.
According to Winfried Kretschmann, who is premier of the region where Daimler is based, "China dominates the production of solar cells. Tesla is ahead in electric cars and Germany has lost the first round of digitalization to Google, Apple and the like. Whether Germany has a future as an industrial economy will depend on whether we can manage the ecological and digital transformation of our economy".
Despite its economic prosperity, the biggest threat to Germany's future economic development is the nation's declining birthrate, which is among the lowest in the world. This is particularly prevalent among the more highly educated parts of society. As a result, the number of workers is expected to decrease, and the government spending needed to support pensioners and healthcare will increase if the trend is not reversed.
Transport in Germany
As a densely populated country in a central location in Europe and with a developed economy, Germany has a dense and modern transport infrastructure.
The German Autobahn, the first highway system of its kind to be built, famously has no general speed limit for light vehicles (although advisory speed limits apply on most sections today, and there is a blanket 80 km/h limit for trucks). The country's most important waterway is the river Rhine. The largest port is that of Hamburg. Frankfurt Airport is a major international airport and European transport hub. Air travel is used for greater distances within Germany but faces competition from the state-owned Deutsche Bahn's rail network. High-speed trains called ICE connect cities for passenger travel with speeds up to 300 km/h. Many German cities have rapid transit systems, and public transport is available in most areas. Buses have historically played only a marginal role in long-distance passenger service, as all routes directly competing with rail services were effectively outlawed by a law dating to 1935 (during the Nazi era). Only in 2012 was this law officially amended, and a long-distance bus market has since emerged in Germany.
Since German reunification substantial effort has been made to improve and expand transport infrastructure in what was formerly East Germany.
The volume of traffic in Germany, especially goods transportation, is at a very high level due to its central location in Europe.
In the past few decades, much freight traffic shifted from rail to road, which led the Federal Government to introduce a motor toll for trucks in 2005. Individual road usage also increased, resulting in a relatively high traffic density compared to other nations. A further increase in traffic is expected in the future.
High-speed vehicular traffic has a long tradition in Germany: the first freeway (Autobahn) in the world, the AVUS, and the world's first automobile were developed and built in Germany. Germany possesses one of the densest road networks in the world. German motorways have no blanket speed limit for light vehicles. However, posted limits are in place on many dangerous or congested stretches, as well as where traffic noise or pollution poses a problem (as of 2015, 20.8% of the network was under static or temporary limits and an average of 2.6% under variable traffic control limits).
The German government has struggled with the upkeep of the country's autobahn network, having had to rebuild the eastern portion's transport system since the reunification of the German Democratic Republic (East Germany) and the Federal Republic of Germany (West Germany). As a result, numerous construction projects were put on hold in the west while vigorous reconstruction has gone on for almost 20 years. However, since the formation of the European Union, overall streamlining and changes of route plans have occurred, as faster and more direct links to former Soviet bloc countries now exist or are in the works, with intense co-operation among European countries.
Intercity bus service within Germany fell out of favour as post-war prosperity increased, and became almost extinct when legislation was introduced in the 1980s to protect the national railway. After that market was deregulated in 2012, some 150 new intercity bus lines have been established, leading to a significant shift from rail to bus for long journeys. The market has since consolidated with Flixbus controlling over 90% of it and also expanding into neighboring countries.
Germany has approximately 650,000 km of roads, of which 231,000 km are non-local roads. The road network is extensively used with nearly 2 trillion km travelled by car in 2005, in comparison to just 70 billion km travelled by rail and 35 billion km travelled by plane.
The Autobahn is the German federal highway system. The official German term is "Bundesautobahn" (plural "Bundesautobahnen", abbreviated 'BAB'), which translates as 'federal motorway'. Where no local speed limit is posted, the advisory limit ("Richtgeschwindigkeit") is 130 km/h. The "Autobahn" network had a total length of about 13,000 km in 2016, which ranks it among the densest and longest systems in the world. Only federally built controlled-access highways meeting certain construction standards, including at least two lanes per direction, are called "Bundesautobahn". They have their own blue-coloured signs and their own numbering system. All "Autobahnen" are named using the capital letter A, followed by a space and a number (for example A 8).
The main "Autobahnen" going all across Germany have single digit numbers. Shorter highways of regional importance have double digit numbers (like A 24, connecting Berlin and Hamburg). Very short stretches built for heavy local traffic (for example ring roads or the A 555 from Cologne to Bonn) usually have three digits, where the first digit depends on the region.
East-west routes are usually even-numbered, north-south routes are usually odd-numbered. The numbers of the north-south "Autobahnen" increase from west to east; that is to say, the more easterly roads are given higher numbers. Similarly, the east-west routes use increasing numbers from north to south.
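The numbering conventions described above can be summarized in a small, illustrative classifier. The function below is only a sketch of the general rules stated in the text; real designations include historical exceptions:

```python
def classify_autobahn(number: int) -> str:
    """Rough classification of a Bundesautobahn by its number,
    following the conventions described above (exceptions exist)."""
    digits = len(str(number))
    if digits == 1:
        scope = "national (crosses all of Germany)"
    elif digits == 2:
        scope = "regional"
    else:
        scope = "local (e.g. ring road or short connector)"
    orientation = "east-west" if number % 2 == 0 else "north-south"
    return f"A {number}: {scope}, {orientation}"

print(classify_autobahn(8))    # single digit, even: a national east-west route
print(classify_autobahn(24))   # double digit: regional (Berlin-Hamburg)
print(classify_autobahn(555))  # triple digit, odd: local north-south stretch (Cologne-Bonn)
```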
The autobahns are considered the safest category of German roads: for example, in 2012, while carrying 31% of all motorized road traffic, they only accounted for 11% of Germany's traffic fatalities.
German autobahns are still toll-free for light vehicles, but on 1 January 2005, a blanket mandatory toll on heavy trucks was introduced.
The national roads in Germany are called "Bundesstraßen" (federal roads). Their numbers are usually well known to local road users, as they appear (written in black digits on a yellow rectangle with black border) on direction traffic signs and on street maps. A Bundesstraße is often referred to as "B" followed by its number, for example "B1", one of the main east-west routes. More important routes have lower numbers. Odd numbers are usually applied to north-south oriented roads, and even numbers for east-west routes. Bypass routes are referred to with an appended "a" (alternative) or "n" (new alignment), as in "B 56n".
Other main public roads are maintained by the "Bundesländer" (states), called "Landesstraße" (country road) or "Staatsstraße" (state road). The numbers of these roads are prefixed with "L", "S" or "St", but are usually not seen on direction signs or written on maps. They appear on the kilometre posts on the roadside. Numbers are unique only within one state.
The "Landkreise" (districts) and municipalities are in charge of the minor roads and streets within villages, towns and cities. These roads have the number prefix "K" indicating a "Kreisstraße".
Germany features a total of 43,468 km railways, of which at least 19,973 km are electrified (2014).
Deutsche Bahn (German Rail) is the major German railway infrastructure and service operator. Though Deutsche Bahn is organised as a private company, the government holds all of its shares, so it can still be considered state-owned. Since its reformation under private law in 1994, Deutsche Bahn AG (DB AG) no longer publishes details of the tracks it owns; in addition to the DB AG system there are about 280 privately or locally owned railway companies which own approximately 3,000 km to 4,000 km of the total tracks and use DB tracks in "open access".
Railway subsidies amounted to €17.0 billion in 2014 and there are significant differences between the financing of long-distance and short-distance (or local) trains in Germany. While long-distance trains can be run by any railway company, the companies also receive no subsidies from the government. Local trains however are subsidised by the German states, which pay the operating companies to run these trains and indeed in 2013, 59% of the cost of short-distance passenger rail transport was covered by subsidies. This resulted in many private companies offering to run local train services as they can provide cheaper service than the state-owned Deutsche Bahn. Track construction is entirely and track maintenance partly government financed both for long and short range trains. On the other hand, all rail vehicles are charged track access charges by DB Netz which in turn delivers (part of) its profits to the federal budget.
High-speed rail started in the early 1990s with the introduction of the Inter City Express (ICE) into revenue service, after the first plans to modernize the rail system had been drawn up under the government of Willy Brandt. While the high-speed network is not as dense as those of France or Spain, ICE trains or the slightly slower (max. 200 km/h) Intercity (IC) trains serve most major cities. Several extensions or upgrades to high-speed lines are under construction or planned for the near future, some of them after decades of planning.
The fastest high-speed train operated by Deutsche Bahn, the InterCityExpress or ICE connects major German and neighbouring international centres such as Zurich, Vienna, Copenhagen, Paris, Amsterdam and Brussels. The rail network throughout Germany is extensive and provides excellent service in most areas. On regular lines, at least one train every two hours will call even in the smallest of villages during the day. Nearly all larger metropolitan areas are served by S-Bahn, U-Bahn, Straßenbahn and/or bus networks.
The German government on 13 February 2018 announced plans to make public transportation free as a means to reduce road traffic and decrease air pollution to EU-mandated levels. The new policy will be put to the test by the end of the year in the cities of Bonn, Essen, Herrenberg, Reutlingen and Mannheim. Issues remain concerning the costs of such a move as ticket sales for public transportation constitute a major source of income for cities.
While Germany and most of contiguous Europe use standard gauge, differences in signalling, rules and regulations, electrification voltages, etc. create obstacles for freight operations across borders. These obstacles are slowly being overcome, with international (incoming and outgoing) and transit (through) traffic responsible for a large part of the recent uptick in rail freight volume. EU regulations have done much to harmonize standards, making cross-border operations easier. Maschen Marshalling Yard near Hamburg is the second biggest in the world and the biggest in Europe. It serves as a freight hub distributing goods from Scandinavia to southern Europe and from Central Europe to the port of Hamburg and overseas. As a densely populated, prosperous country in the center of Europe, Germany hosts many important transit routes. The Mannheim–Karlsruhe–Basel railway has undergone upgrades and refurbishments since the 1980s and will likely undergo further upgrades for decades to come, as it is the main route from the North Sea ports to northern Italy via the Gotthard Base Tunnel.
Almost all major metro areas of Germany have suburban rail systems called S-Bahnen ("Schnellbahnen"). These usually connect larger agglomerations to their suburbs and often to other regional towns, although the Rhein-Ruhr S-Bahn connects several large cities. An S-Bahn does not skip stations and runs more frequently than other trains. In Berlin and Hamburg the S-Bahn provides a U-Bahn-like service and uses a third rail, whereas all other S-Bahn services rely on regular catenary power supply.
Relatively few cities have a full-fledged underground U-Bahn system; S-Bahn (suburban commuter railway) systems are far more common. In some cities the distinction between U-Bahn and S-Bahn systems is blurred: some S-Bahn systems run underground, have frequencies similar to a U-Bahn, and form part of the same integrated transport network. A larger number of cities have upgraded their tramways to light rail standards. These systems are called Stadtbahn (not to be confused with the S-Bahn, which runs on main-line rails).
Cities with U-Bahn systems are:
With the exception of Hamburg, all of those aforementioned cities also have a tram system, often with new lines built to light rail standards.
Cities with "Stadtbahn" systems can be found in the article Trams in Germany.
Germany was among the first countries to have electric street-running railways, and Berlin has one of the longest tram networks in the world. Many West German cities abandoned their tram systems in the 1960s and 1970s, while others upgraded them to "Stadtbahn" (light rail) standard, often including underground sections. In the East, most cities retained or even expanded their tram systems, and since reunification a trend towards new tram construction can be observed in most of the country. Today the only major German city without a tram or light rail system is Hamburg. Tram-train systems like the Karlsruhe model first came to prominence in Germany in the early 1990s and are implemented or under discussion in several cities, providing coverage far into the rural areas surrounding cities.
Short distances and the extensive network of motorways and railways make airplanes uncompetitive for travel within Germany; only about 1% of all distance travelled was by plane in 2002. However, with the fall in prices brought by the introduction of low-fare airlines, domestic air travel has become more attractive. In 2013 Germany had the fifth-largest passenger air market in the world with 105,016,346 passengers. The advent of new, faster rail lines, however, often leads to cuts in airline service or even total abandonment of routes such as Frankfurt–Cologne, Berlin–Hannover or Berlin–Hamburg.
Germany's largest airline is Lufthansa, which was privatised in the 1990s. Lufthansa also operates two regional subsidiaries under the Lufthansa Regional brand and a low-cost subsidiary, Eurowings, which operates independently. Lufthansa flies a dense network of domestic, European and intercontinental routes. Germany's second-largest airline was Air Berlin, which also operated a network of domestic and European destinations with a focus on leisure routes as well as some long-haul services. Air Berlin declared bankruptcy in 2017 with the last flight under its own name in October of that year.
Charter and leisure carriers include Condor, TUIfly, MHS Aviation and Sundair. Major German cargo operators are Lufthansa Cargo, European Air Transport Leipzig (which is a subsidiary of DHL) and AeroLogic (which is jointly owned by DHL and Lufthansa Cargo).
Frankfurt Airport is Germany's largest airport, a major transportation hub in Europe and the world's twelfth busiest airport. It is one of the airports with the largest number of international destinations served worldwide. Depending on whether total passengers, flights or cargo traffic are used as a measure, it ranks first, second or third in Europe alongside London Heathrow Airport and Paris-Charles de Gaulle Airport. Germany's second biggest international airport is Munich Airport followed by Düsseldorf Airport.
There are several more scheduled passenger airports throughout Germany, mainly serving European metropolitan and leisure destinations. Intercontinental long-haul routes are operated to and from the airports in Frankfurt, Munich, Düsseldorf, Berlin-Tegel, Cologne/Bonn, Hamburg and Stuttgart.
Berlin Brandenburg Airport is expected to become the third largest German airport by annual passengers once it opens, serving as single airport for Berlin. Originally planned to be completed in 2011, the new airport has been delayed several times due to poor construction management and technical difficulties. As of September 2014, it is not yet known when the new airport will become operational. In 2017 it was announced that the airport wouldn't open before 2019. In the same year a non-binding referendum to keep Tegel Airport open even after the new airport opens was passed by Berlin voters.
Airports — with paved runways:
Airports — with unpaved runways:
Heliports: 23 (2013 est.)
Waterways: 7,467 km (2013). Major rivers include the Rhine and Elbe. The Kiel Canal is an important connection between the Baltic Sea and the North Sea and one of the busiest waterways in the world; the Rhine-Main-Danube Canal links Rotterdam on the North Sea with the Black Sea and passes through the highest point reachable by ocean-going vessels from the sea. The canal has gained importance for leisure cruises in addition to cargo traffic.
Pipelines: oil 2,400 km (2013)
Ports and harbours: Berlin, Bonn, Brake, Bremen, Bremerhaven, Cologne, Dortmund, Dresden, Duisburg, Emden, Fürth, Hamburg, Karlsruhe, Kiel, Lübeck, Magdeburg, Mannheim, Nuremberg, Oldenburg, Rostock, Stuttgart, Wilhelmshaven
The port of Hamburg is the largest seaport in Germany and ranks third in Europe (after Rotterdam and Antwerp) and 17th worldwide (2016) in total container traffic.
Merchant marine:
total: 427 ships
Ships by type: barge carrier 2, bulk carrier 6, cargo ship 51, chemical tanker 15, container ship 298, liquefied gas carrier 6, passenger ship 4, petroleum tanker 10, refrigerated cargo ship 3, roll-on/roll-off ship 6 (2010 est.)
Ferries operate mostly between mainland Germany and its islands, serving both tourism and freight transport. Car ferries also operate across the Baltic Sea to the Nordic countries, Russia and the Baltic states. Rail ferries operate across the Fehmarn Belt, from Rostock to Sweden (both carrying passenger trains) and from the Mukran port in Sassnitz on the island of Rügen to numerous Baltic Sea destinations (freight only).
https://en.wikipedia.org/wiki?curid=11932
Foreign relations of Germany
The Federal Republic of Germany (FRG) is a Central European country and member of the European Union, G4, G8, the G20, the Organisation for Economic Co-operation and Development and the North Atlantic Treaty Organization (NATO). It maintains a network of 229 diplomatic missions abroad and holds relations with more than 190 countries. As one of the world's leading industrialized countries it is recognized as a major power in European and global affairs.
The three cabinet-level ministries responsible for guiding Germany's foreign policy are the Ministry of Defense, the Ministry of Economic Cooperation and Development and the Federal Foreign Office. In practice, most German federal departments play some role in shaping foreign policy in the sense that there are few policy areas left that remain outside of international jurisdiction. The bylaws of the Federal Cabinet (as delineated in Germany's Basic Law), however, assign the Federal Foreign Office a coordinating function. Accordingly, other ministries may only invite foreign guests or participate in treaty negotiations with the approval of the Federal Foreign Office.
With respect to foreign policy, the Bundestag acts in a supervisory capacity. Each of its committees – most notably the foreign relations committee – oversees the country's foreign policy. The consent of the Bundestag (and, insofar as the Länder are affected, the Bundesrat) is required to ratify foreign treaties. If treaty legislation passes its first reading, it is referred to the Committee on Foreign Affairs, which can delay ratification and prejudice the decision through its report to the Bundestag.
In 1994, a full EU Committee was also created to handle the large flow of EU-related topics and legislation. The committee also has the mandate to speak on behalf of the Bundestag and represent it when deciding an EU policy position. A case in point was the committee's involvement in the European Union's eastern enlargement, in which the Committee on Foreign Affairs was responsible for relations with the East-Central European states while the EU Committee was tasked with the negotiations.
There is a raft of NGOs in Germany that engage with foreign policy issues. These NGOs include think-tanks (German Council on Foreign Relations), single-issue lobbying organizations (Amnesty International), as well as organizations that promote stronger bilateral ties between Germany and other countries (Atlantic Bridge). While the budgets and methods of these NGOs differ, they share the overarching goal of persuading decision-makers of the wisdom of their views. In 2004, a new German governance framework emerged, particularly in the foreign and security policy areas, in which NGOs are integrated into actual policymaking. The idea is that cooperation between the state and civil society groups improves the quality of conflict resolution, development cooperation and humanitarian aid for fragile states. The framework seeks to benefit from the expertise of the NGOs in exchange for giving these groups a chance to influence foreign policy.
In 2001, the discovery that the terrorist cell which carried out the attacks against the United States on 11 September 2001 had been based in Hamburg sent shock waves through the country.
The government of Chancellor Gerhard Schröder backed the subsequent U.S. military actions, sending Bundeswehr troops to Afghanistan to lead a joint NATO program to provide security in the country after the ousting of the Taliban.
Nearly all of the public was strongly opposed to America's 2003 invasion of Iraq and to any deployment of German troops. This position was shared by the SPD/Green government, which led to some friction with the United States.
In August 2006, the German government disclosed a botched plot to bomb two German trains. The attack was to occur in July 2006 and involved a 21-year-old Lebanese man, identified only as Youssef Mohammed E. H. Prosecutors said Youssef and another man left suitcases stuffed with crude propane-gas bombs on the trains.
As of February 2007, Germany had about 3,000 troops in the NATO-led International Security Assistance Force (ISAF) in Afghanistan as part of the War on Terrorism, the third-largest contingent after the United States (14,000) and the United Kingdom (5,200). German forces are mostly stationed in the more secure north of the country.
However, Germany, along with some other larger European countries (with the exception of the UK and the Netherlands), has been criticised by the UK and Canada for not sharing the burden of the more intensive combat operations in southern Afghanistan.
Germany is the largest net contributor to the United Nations and has several development agencies working in Africa and the Middle East. The development policy of the Federal Republic of Germany is an independent area of German foreign policy. It is formulated by the Federal Ministry for Economic Cooperation and Development (BMZ) and carried out by the implementing organisations. The German government sees development policy as a joint responsibility of the international community. Germany is the world's third biggest aid donor after the United States and France. It spent 0.37 per cent of its gross domestic product (GDP) on development, below the government's target of increasing aid to 0.51 per cent of GDP by 2010. The international target of 0.7 per cent of GNP would not have been reached either.
Germany is a member of the Council of Europe, European Union, European Space Agency, G4, G8, International Monetary Fund, NATO, OECD, Organization for Security and Co-operation in Europe, UN, World Bank Group and the World Trade Organization.
European integration has come a long way since the European Coal and Steel Community (ECSC) and the Elysée Treaty. Peaceful collaboration with its neighbors remains one of Germany's biggest political objectives, and Germany has been at the forefront of most achievements made in European integration:
Most of the social issues facing European countries in general – immigration, aging populations, strained social-welfare and pension systems – are also important in Germany.
Germany seeks to maintain peace through the "deepening" of integration among the current member states of the European Union.
Germany has been the largest net contributor to EU budgets for decades (in absolute terms – given Germany's comparatively large population – not per capita) and seeks to limit the growth of these net payments in the enlarged union.
Under the doctrine introduced by the 2003 Defense Policy Guidelines, Germany continues to give priority to the transatlantic partnership with the United States through the North Atlantic Treaty Organization. However, Germany is giving increasing attention to coordinating its policies with the European Union through the Common Foreign and Security Policy.
The German Federal Government began an initiative to obtain a permanent seat in the United Nations Security Council, as part of the Reform of the United Nations. This would require approval of a two-thirds majority of the member states and approval of all five Security Council veto powers.
This aspiration could be successful thanks to Germany's good relations with the People's Republic of China and the Russian Federation. Germany is a stable, democratic republic and a G7 country, which are also favourable attributes. The United Kingdom and France support German accession to the supreme body. The U.S. is sending mixed signals.
NATO member states, including Germany, decided not to sign the UN Treaty on the Prohibition of Nuclear Weapons, a binding agreement on negotiations for the total elimination of nuclear weapons, supported by more than 120 nations.
The German government was a strong supporter of the enlargement of NATO.
Germany was one of the first nations to recognize Croatia and Slovenia as independent nations, rejecting the concept of Yugoslavia as the only legitimate political order in the Balkans (unlike other European powers, who first proposed a pro-Belgrade policy). This is why Serb authorities sometimes referred to "new German imperialism" as one of the main reasons for Yugoslavia's collapse. German troops participate in the multinational efforts to bring "peace and stability" to the Balkans.
Through the Weimar Triangle (France, Germany and Poland), Germany continues to be economically active in the states of Central Europe and to actively support the development of democratic institutions there. In the 2000s, Germany has arguably been the centerpiece of the European Union (though the importance of France cannot be overlooked in this connection).
https://en.wikipedia.org/wiki?curid=11934
Politics of Germany
Germany is a democratic, federal parliamentary republic, where federal legislative power is vested in the Bundestag (the parliament of Germany) and the Bundesrat (the representative body of the Länder, Germany's regional states).
The multiparty system has, since 1949, been dominated by the Christian Democratic Union (CDU) and the Social Democratic Party of Germany (SPD). The judiciary of Germany is independent of the executive and the legislature, while it is common for leading members of the executive to be members of the legislature as well. The political system is laid out in the 1949 constitution, the "Grundgesetz" (Basic Law), which has remained in effect with minor amendments after German reunification in 1990.
The constitution emphasizes the protection of individual liberty in an extensive catalogue of human and civil rights and divides powers both between the federal and state levels and between the legislative, executive and judicial branches.
West Germany was a founding member of the European Community in 1958, which became the EU in 1993. It is part of the Schengen Area, and has been a member of the eurozone since 1999. It is a member of the United Nations, NATO, the G7, the G20 and the OECD.
After 1949, the Federal Republic of Germany had Christian Democratic chancellors for 20 years until a coalition between the Social Democrats and the Liberals took over. From 1982, Christian Democratic leader Helmut Kohl was chancellor in a coalition with the Liberals for 16 years. This period saw the reunification of Germany in 1990: the German Democratic Republic joined the Federal Republic. In the former GDR's territory, five "Länder" (states) were established or re-established, and the two parts of Berlin united as one "Land" (state).
The political system of the Federal Republic remained more or less unchanged. Specific provisions for the former GDR territory were enabled via the unification treaty between the Federal Republic and the GDR prior to the unification day of 3 October 1990. However, in the following years Germany saw two distinct party systems: the Green party and the Liberals remained mostly West German parties, while in the East the former socialist state party, now called the PDS, flourished along with the Christian Democrats and Social Democrats.
After 16 years of the Christian–Liberal coalition led by Helmut Kohl, the Social Democratic Party of Germany (SPD) together with the Greens won the Bundestag elections of 1998. SPD vice chairman Gerhard Schröder positioned himself as a centrist candidate, in contrast to the leftist SPD chairman Oskar Lafontaine. The Kohl government was hurt at the polls by slower economic growth in the East in the previous two years and by persistently high unemployment. The final margin of victory was sufficiently high to permit a "red–green" coalition of the SPD with Alliance 90/The Greens ("Bündnis '90/Die Grünen"), bringing the Greens into a national government for the first time.
Initial problems of the new government, marked by policy disputes between the moderate and traditional left wings of the SPD, resulted in some voter disaffection. Lafontaine left the government (and later his party) in early 1999. The CDU won in some important state elections but was hit in 2000 by a party donation scandal from the Kohl years. As a result of this Christian Democratic Union (CDU) crisis, Angela Merkel became chair.
The next election for the "Bundestag" was held on 22 September 2002. Gerhard Schröder led the coalition of SPD and Greens to an eleven-seat victory over the Christian Democrat challengers headed by Edmund Stoiber (CSU). Three factors are generally cited as enabling Schröder to win despite poor approval ratings a few months before and a weaker economy: good handling of the 100-year flood, firm opposition to the planned US invasion of Iraq, and Stoiber's unpopularity in the east, which cost the CDU crucial seats there.
In its second term, the red–green coalition lost several very important state elections, for example in Lower Saxony where Schröder was the prime minister from 1990 to 1998. On 20 April 2003, chancellor Schröder announced massive labor market reforms, called Agenda 2010, that cut unemployment benefits. Although these reforms sparked massive protests, they are now credited with being in part responsible for the relatively strong economic performance of Germany during the euro-crisis and the decrease in unemployment in Germany in the years 2006-2007.
On 22 May 2005, the SPD suffered a devastating defeat in its former heartland, North Rhine-Westphalia. Half an hour after the election results were announced, SPD chairman Franz Müntefering declared that the chancellor would clear the way for new federal elections.
This took the republic by surprise, especially because the SPD was below 20% in polls at the time. The CDU quickly announced Angela Merkel as Christian Democrat candidate for chancellor, aspiring to be the first female chancellor in German history.
New for the 2005 election was the alliance between the newly formed Electoral Alternative for Labor and Social Justice (WASG) and the PDS, which planned to merge into a common party (see Left Party.PDS). With former SPD chairman Oskar Lafontaine for the WASG and Gregor Gysi for the PDS as prominent figures, this alliance soon attracted interest from the media and the population. Polls in July saw them as high as 12%.
Whereas in May and June 2005 victory of the Christian Democrats seemed highly likely, with some polls giving them an absolute majority, this picture changed shortly before the election on 18 September 2005.
The election results of 18 September were surprising because they differed widely from the polls of the previous weeks. The Christian Democrats even lost votes compared to 2002, narrowly taking first place with only 35.2% and failing to get a majority for a "black–yellow" government of CDU/CSU and the liberal FDP. But the red–green coalition also failed to get a majority, with the SPD losing votes but polling 34.2% and the Greens staying at 8.1%. The Left reached 8.7% and entered the "Bundestag", whereas the far-right NPD got only 1.6%.
The most likely outcome of coalition talks was a so-called grand coalition between the Christian Democrats (CDU/CSU) and the Social Democrats (SPD). Three-party coalitions and coalitions involving The Left had been ruled out by all interested parties (including The Left itself). On 22 November 2005, Angela Merkel was sworn in by President Horst Köhler as Bundeskanzlerin.
The existence of the grand coalition at the federal level helped smaller parties' electoral prospects in state elections. After the CSU lost its absolute majority in Bavaria in 2008 and formed a coalition with the FDP, the grand coalition had no majority in the "Bundesrat" and depended on FDP votes on important issues. In November 2008, the SPD re-elected its former chair Franz Müntefering and made Frank-Walter Steinmeier its leading candidate for the federal election in September 2009.
As a result of that federal election, the grand coalition, which had brought losses for both parties, came to an end. The SPD suffered the heaviest losses in its history and was unable to form a coalition government. The CDU/CSU suffered only small losses but also reached a new historic low with its worst result since 1949. The three smaller parties thus had more seats in the German "Bundestag" than ever before, with the liberal FDP winning 14.6% of the votes.
The CDU/CSU and FDP together held 332 of the 622 seats and were in coalition from 27 October 2009. Angela Merkel was re-elected as chancellor, and Guido Westerwelle served as foreign minister and vice chancellor of Germany. After entering the federal government, the FDP suffered heavy losses in the following state elections. The FDP had promised to lower taxes in the electoral campaign, but after joining the coalition it had to concede that this was not possible due to the economic crisis of 2008. Because of the losses, Guido Westerwelle resigned as chair of the FDP in favor of Philipp Rösler, federal minister of health, who was subsequently appointed vice chancellor. Shortly afterwards, Philipp Rösler changed office and became federal minister of economics and technology.
After their electoral fall, the Social Democrats were led by Sigmar Gabriel, a former federal minister and prime minister of Lower Saxony, with Frank-Walter Steinmeier as head of the parliamentary group. Gabriel resigned on 16 January 2017 and proposed his longtime friend, president of the European Parliament Martin Schulz, as his successor and chancellor candidate.
Germany has seen increased political activity by citizens outside the established political parties with respect to local and environmental issues such as the location of Stuttgart 21, a railway hub, and construction of Berlin Brandenburg Airport.
The 18th federal elections in Germany resulted in the re-election of Angela Merkel and her Christian democratic parliamentary group of the CDU and CSU, which received 41.5% of all votes. Following Merkel's first two historically low results, her third campaign marked the CDU/CSU's best result since 1994 and, for only the second time in German history, brought it close to an absolute majority. Their former coalition partner, the FDP, narrowly failed to reach the 5% threshold and did not gain any seats in the Bundestag.
Not having reached an absolute majority, the CDU/CSU formed a grand coalition with the social-democratic SPD after the longest coalition talks in history, making party leader Sigmar Gabriel vice-chancellor and federal Minister for Economic Affairs and Energy. Together they held 504 of a total of 631 seats (CDU/CSU 311, SPD 193). The only two opposition parties were The Left (64 seats) and Alliance '90/The Greens (63 seats), a situation widely acknowledged as critical because the opposition parties did not even have enough seats to use the special supervisory powers of the opposition.
The 19th federal elections in Germany took place on 24 September 2017. The two big parties, the conservative parliamentary group CDU/CSU and the social democratic SPD, were in a similar situation as in 2009, after the previous grand coalition had ended, and both suffered severe losses, with the CDU/CSU reaching its second-worst and the SPD its worst result in 2017.
Many votes in the 2017 elections went to smaller parties, bringing the right-wing populist party AfD (Alternative for Germany) into the Bundestag, which marked a big shift in German politics: it was the first far-right party to win seats in parliament since the 1950s.
With Merkel's candidacy for a fourth term, the CDU/CSU only reached 33.0% of the votes, but won the highest number of seats, leaving no realistic coalition option without the CDU/CSU. As all parties in the Bundestag strictly ruled out a coalition with the AfD, the only options for a majority coalition were a so-called "Jamaican" coalition (CDU/CSU, FDP, Greens; named after the party colors resembling those of the Jamaican flag) and a grand coalition with the SPD, which was at first opposed by the Social Democrats and their leader Martin Schulz.
Coalition talks between the three parties of the "Jamaican" coalition were held, but the final proposal was rejected by the liberals of the FDP, leaving the government in limbo. In this unprecedented situation, minority governments or even snap elections were, for the first time in German history, heavily discussed. At this point, Federal President Steinmeier invited the leaders of all parties for talks about forming a government, the first President in the history of the Federal Republic to do so.
Official coalition talks between CDU/CSU and SPD started in January 2018 and led to a renewal of the grand coalition on 12 March 2018 as well as the subsequent re-election of Angela Merkel as chancellor.
The "Basic Law for the Federal Republic of Germany" (Grundgesetz der Bundesrepublik Deutschland) is the Constitution of Germany. It was formally approved on 8 May 1949, and, with the signature of the Allies of World War II on 12 May, came into effect on 23 May, as the constitution of those states of West Germany that were initially included within the Federal Republic. The 1949 Basic Law is a response to the perceived flaws of the 1919 Weimar Constitution, which failed to prevent the rise of the Nazi party in 1933. Since 1990, in the course of the reunification process after the fall of the Berlin Wall, the Basic Law also applies to the eastern states of the former German Democratic Republic.
The German head of state is the Federal President. As Germany has a parliamentary system of government in which the Federal Chancellor runs the government and day-to-day politics, the role of the Federal President is mostly ceremonial. The Federal President, by their actions and public appearances, represents the state itself, its existence, legitimacy, and unity. Their office involves an integrative role. Nearly all actions of the Federal President become valid only after a countersignature by a government member.
The President is not obliged by the constitution to refrain from political views. He or she is expected to give direction to general political and societal debates, but not in a way that ties him or her to party politics. Most German Presidents were active politicians and party members prior to taking office, which means that they have to change their political style when becoming President. The office comes with an official residence, Bellevue Palace.
Under Article 59 (1) of the Basic Law, the Federal President represents the Federal Republic of Germany in matters of international law, concludes treaties with foreign states on its behalf and accredits diplomats.
All federal laws must be signed by the President before they can come into effect; he or she does not have a veto, but the conditions for refusing to sign a law on the basis of unconstitutionality are the subject of debate. The office is currently held by Frank-Walter Steinmeier (since 2017).
The Federal President does have a role in the political system, especially at the establishment of a new government and the dissolution of the Bundestag (parliament). This role is usually nominal but can become significant in times of political instability. Additionally, the Federal President, together with the Federal Council, can support the government in a "legislative state of emergency" to enable laws against the will of the Bundestag (Article 81 of the Basic Law). However, the Federal President has so far never had to use these "reserve powers".
The "Bundeskanzler" (federal chancellor) heads the "Bundesregierung" (federal government) and thus the executive branch of the federal government. They are elected by and responsible to the "Bundestag", Germany's parliament. The other members of the government are the Federal Ministers; they are chosen by the Chancellor. Germany, like the United Kingdom, can thus be classified as a parliamentary system. The office is currently held by Angela Merkel (since 2005).
The Chancellor cannot be removed from office during a four-year term unless the "Bundestag" has agreed on a successor. This constructive vote of no confidence is intended to avoid a similar situation to that of the Weimar Republic in which the executive did not have enough support in the legislature to govern effectively, but the legislature was too divided to name a successor. The current system also prevents the Chancellor from calling a snap election.
Except in the periods 1969–1972 and 1976–1982, when the Social Democratic Party of Chancellors Brandt and Schmidt came second in the elections, the chancellor has always been the candidate of the largest party, usually supported by a coalition of two parties with a majority in the parliament. The chancellor appoints one of the federal ministers as their deputy, who holds the unofficial title of Vice Chancellor. The office is currently held by Olaf Scholz (since March 2018).
The German Cabinet (Bundeskabinett or Bundesregierung) is the chief executive body of the Federal Republic of Germany. It consists of the chancellor and the cabinet ministers. The fundamentals of the cabinet's organization are set down in articles 62–69 of the Basic Law. The current cabinet is Merkel IV (since 2018).
Agencies of the German government include:
Federal legislative power is divided between the "Bundestag" and the "Bundesrat". The "Bundestag" is directly elected by the German people, while the "Bundesrat" represents the governments of the regional states ("Länder"). The federal legislature has powers of exclusive jurisdiction and concurrent jurisdiction with the states in areas specified in the constitution.
The "Bundestag" is more powerful than the "Bundesrat" and only needs the latter's consent for proposed legislation related to revenue shared by the federal and state governments, and the imposition of responsibilities on the states. In practice, however, the agreement of the "Bundesrat" in the legislative process is often required, since federal legislation frequently has to be executed by state or local agencies. In the event of disagreement between the "Bundestag" and the "Bundesrat", either side can appeal to the "Vermittlungsausschuss" (mediation committee) to find a compromise.
The "Bundestag" (Federal Diet) is elected for a four-year term and consists of 598 or more members elected by means of mixed-member proportional representation, which Germans call "personalised proportional representation". 299 members represent single-seat constituencies and are elected by a first-past-the-post electoral system. Parties that obtain fewer constituency seats than their national share of the vote are allotted seats from party lists to make up the difference. In contrast, parties that obtain more constituency seats than their national share of the vote are allowed to keep these so-called overhang seats. In the parliament that was elected in 2009, there were 24 overhang seats, giving the "Bundestag" a total of 622 members. Since the 2013 election, other parties have obtained extra seats ("balance seats") that offset the advantages of their rivals' overhang seats. The current "Bundestag" is the largest in German history, with 709 members.
A party must receive either five percent of the national vote or win at least three directly elected seats to be eligible for non-constituency seats in the "Bundestag". This rule, often called the "five percent hurdle", was incorporated into Germany's election law to prevent political fragmentation and disproportionately influential minority parties. The first "Bundestag" elections were held in the Federal Republic of Germany ("West Germany") on 14 August 1949. Following reunification, elections for the first all-German "Bundestag" were held on 2 December 1990. The last federal election was held on 24 September 2017.
Germany follows the civil law tradition. The judicial system comprises three types of courts.
The main difference between the Federal Constitutional Court and the Federal Court of Justice is that the Federal Constitutional Court may only be called if a constitutional matter within a case is in question (e.g. a possible violation of human rights in a criminal trial), while the Federal Court of Justice may be called in any case.
Germany maintains a network of 229 diplomatic missions abroad and holds relations with more than 190 countries. It is the largest contributor to the budget of the European Union (providing 27%) and third largest contributor to the United Nations (providing 8%). Germany is a member of the NATO defence alliance, the Organisation for Economic Co-operation and Development (OECD), the G8, the G20, the World Bank and the International Monetary Fund (IMF).
Germany has played a leading role in the European Union since its inception and has maintained a strong alliance with France since the end of World War II. The alliance was especially close in the late 1980s and early 1990s under the leadership of Christian Democrat Helmut Kohl and Socialist François Mitterrand. Germany is at the forefront of European states seeking to advance the creation of a more unified European political, defence, and security apparatus. For a number of decades after WWII, the Federal Republic of Germany kept a notably low profile in international relations, because of both its recent history and its occupation by foreign powers.
During the Cold War, Germany's partition by the Iron Curtain made it a symbol of East–West tensions and a political battleground in Europe. However, Willy Brandt's "Ostpolitik" was a key factor in the "détente" of the 1970s. In 1999, Chancellor Gerhard Schröder's government defined a new basis for German foreign policy by taking a full part in the decisions surrounding the NATO war against Yugoslavia and by sending German troops into combat for the first time since World War II.
The governments of Germany and the United States are close political allies. The 1948 Marshall Plan and strong cultural ties have crafted a strong bond between the two countries, although Schröder's very vocal opposition to the Iraq War had suggested the end of Atlanticism and a relative cooling of German–American relations. The two countries are also economically interdependent: 5.0% of German exports in goods are US-bound and 3.5% of German imported goods originate from the US with a trade deficit of -63,678.5 million dollars for the United States (2017). Other signs of the close ties include the continuing position of German–Americans as the largest reported ethnic group in the US, and the status of Ramstein Air Base (near Kaiserslautern) as the largest US military community outside the US.
The policy on foreign aid is an important area of German foreign policy. It is formulated by the Federal Ministry for Economic Cooperation and Development (BMZ) and carried out by the implementing organisations. The German government sees development policy as a joint responsibility of the international community. It is the world's fourth biggest aid donor after the United States, the United Kingdom and France. Germany spent 0.37 per cent of its gross domestic product (GDP) on development, which is below the government's target of increasing aid to 0.51 per cent of GDP by 2010.
Germany comprises sixteen states that are collectively referred to as "Länder". Due to differences in size and population, the subdivision of these states varies, especially between city-states ("Stadtstaaten") and states with larger territories ("Flächenländer"). For regional administrative purposes five states, namely Baden-Württemberg, Bavaria, Hesse, North Rhine-Westphalia and Saxony, consist of a total of 22 Government Districts ("Regierungsbezirke"). As of 2009, Germany is divided into 403 districts ("Kreise") at the municipal level; these consist of 301 rural districts and 102 urban districts.
History of geometry
Geometry (from the Ancient Greek "geo-", "earth", and "-metron", "measurement") arose as the field of knowledge dealing with spatial relationships. Geometry was one of the two fields of pre-modern mathematics, the other being the study of numbers (arithmetic).
Classic geometry was focused on compass and straightedge constructions. Geometry was revolutionized by Euclid, who introduced mathematical rigor and the axiomatic method still in use today. His book, "The Elements", is widely considered the most influential textbook of all time, and was known to all educated people in the West until the middle of the 20th century.
In modern times, geometric concepts have been generalized to a high level of abstraction and complexity, and have been subjected to the methods of calculus and abstract algebra, so that many modern branches of the field are barely recognizable as the descendants of early geometry. (See Areas of mathematics and Algebraic geometry.)
The earliest recorded beginnings of geometry can be traced to early peoples, who discovered obtuse triangles in the ancient Indus Valley (see Harappan mathematics) and ancient Babylonia (see Babylonian mathematics) from around 3000 BC. Early geometry was a collection of empirically discovered principles concerning lengths, angles, areas, and volumes, which were developed to meet some practical need in surveying, construction, astronomy, and various crafts. Among these were some surprisingly sophisticated principles, and a modern mathematician might be hard put to derive some of them without the use of calculus and algebra. For example, both the Egyptians and the Babylonians were aware of versions of the Pythagorean theorem about 1500 years before Pythagoras, and the Indian Sulba Sutras of around 800 BC contain some of the first statements of the theorem; the Egyptians also had a correct formula for the volume of a frustum of a square pyramid.
The ancient Egyptians knew that they could approximate the area of a circle as follows:
Problem 30 of the Ahmes papyrus uses these methods to calculate the area of a circle, according to a rule that the area is equal to the square of 8/9 of the circle's diameter. This implies that π is 4×(8/9)² (or 3.160493...), with an error of slightly over 0.6 percent. This value was slightly less accurate than the calculations of the Babylonians (25/8 = 3.125, within 0.53 percent), but was not otherwise surpassed until Archimedes' approximation of 211875/67441 = 3.14163, which had an error of just over 1 in 10,000.
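These two ancient estimates, and their errors against the modern value of π, can be checked with a short Python sketch (the function names here are ours, purely for illustration):

```python
import math

def egyptian_pi():
    # Ahmes' rule equates the circle's area with the square on 8/9 of its
    # diameter: (8d/9)^2 = pi * (d/2)^2, which gives pi = 4 * (8/9)^2.
    return 4 * (8 / 9) ** 2

def babylonian_pi():
    # The Babylonian value 3 1/8.
    return 25 / 8

def percent_error(approx):
    return abs(approx - math.pi) / math.pi * 100

print(f"Egyptian:   {egyptian_pi():.6f} ({percent_error(egyptian_pi()):.2f}% error)")
print(f"Babylonian: {babylonian_pi():.6f} ({percent_error(babylonian_pi()):.2f}% error)")
```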
Ahmes knew of the modern 22/7 as an approximation for π, and used it to split a hekat: hekat × 22/7 × 7/22 = hekat; however, Ahmes continued to use the traditional 256/81 value for π when computing his hekat volume found in a cylinder.
Problem 48 involved using a square with side 9 units. This square was cut into a 3×3 grid. The diagonals of the corner squares were used to make an irregular octagon with an area of 63 units. This gave a second value for π of 3.111...
The two problems together indicate a range of values for π between 3.11 and 3.16.
Problem 14 in the Moscow Mathematical Papyrus gives the only known ancient example of finding the volume of a frustum of a pyramid, describing the correct formula:

V = (h/3)(a² + ab + b²),

where "a" and "b" are the base and top side lengths of the truncated pyramid and "h" is the height.
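The papyrus formula is easy to verify numerically; a minimal sketch (the helper name is ours):

```python
def frustum_volume(a, b, h):
    """Volume of a frustum of a square pyramid with base side a,
    top side b, and height h: V = (h / 3) * (a^2 + a*b + b^2)."""
    return h * (a * a + a * b + b * b) / 3

# Problem 14 of the Moscow papyrus uses a = 4, b = 2, h = 6
# and arrives at a volume of 56.
print(frustum_volume(4, 2, 6))  # 56.0
```

Setting b = 0 recovers the familiar pyramid volume h·a²/3, and b = a gives a box of volume h·a², both consistent with the formula.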
The Babylonians may have known the general rules for measuring areas and volumes. They measured the circumference of a circle as three times the diameter and the area as one-twelfth the square of the circumference, which would be correct if "π" is estimated as 3. The volume of a cylinder was taken as the product of the base and the height, however, the volume of the frustum of a cone or a square pyramid was incorrectly taken as the product of the height and half the sum of the bases. The Pythagorean theorem was also known to the Babylonians. Also, there was a recent discovery in which a tablet used "π" as 3 and 1/8. The Babylonians are also known for the Babylonian mile, which was a measure of distance equal to about seven miles today. This measurement for distances eventually was converted to a time-mile used for measuring the travel of the Sun, therefore, representing time. There have been recent discoveries showing that ancient Babylonians may have discovered astronomical geometry nearly 1400 years before Europeans did.
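The Babylonian circle rules above (circumference three times the diameter, area one-twelfth the square of the circumference) are internally consistent with π taken as 3, since c²/12 = (3d)²/12 = 3r². A small sketch makes the comparison with modern values explicit:

```python
import math

def babylonian_circle(diameter):
    """Babylonian rules: circumference = 3 * d, area = c^2 / 12."""
    c = 3 * diameter
    return c, c * c / 12

c, area = babylonian_circle(2)   # a circle of radius 1
print(c, area)                   # 6 3.0 -- exactly the pi = 3 values
print(2 * math.pi, math.pi)      # modern circumference and area
```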
The Indian Vedic period had a tradition of geometry, mostly expressed in the construction of elaborate altars.
Early Indian texts (1st millennium BC) on this topic include the "Satapatha Brahmana" and the "Śulba Sūtras".
The "Śulba Sūtras" contain "the earliest extant verbal expression of the Pythagorean Theorem in the world, although it had already been known to the Old Babylonians": "The diagonal rope of an oblong (rectangle) produces both which the flank ("pārśvamāni") and the horizontal produce separately."
They contain lists of Pythagorean triples, which are particular cases of Diophantine equations.
They also contain statements (that with hindsight we know to be approximate) about squaring the circle and "circling the square."
The "Baudhayana Sulba Sutra", the best-known and oldest of the "Sulba Sutras" (dated to the 8th or 7th century BC) contains examples of simple Pythagorean triples, such as: formula_7, formula_8, formula_9, formula_10, and formula_11 as well as a statement of the Pythagorean theorem for the sides of a square: "The rope which is stretched across the diagonal of a square produces an area double the size of the original square." It also contains the general statement of the Pythagorean theorem (for the sides of a rectangle): "The rope stretched along the length of the diagonal of a rectangle makes an area which the vertical and horizontal sides make together."
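Such triples are straightforward to check against the theorem; a brief Python sketch with a commonly cited illustrative list (the specific triples below are examples, not a transcription of the formulas above):

```python
def is_pythagorean(a, b, c):
    # A triple (a, b, c) is Pythagorean when a^2 + b^2 = c^2.
    return a * a + b * b == c * c

# Simple triples of the kind recorded in the Sulba Sutras.
examples = [(3, 4, 5), (5, 12, 13), (8, 15, 17), (7, 24, 25), (12, 35, 37)]
for triple in examples:
    assert is_pythagorean(*triple)
print("all", len(examples), "triples satisfy a^2 + b^2 = c^2")
```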
According to mathematician S. G. Dani, the Babylonian cuneiform tablet Plimpton 322 written c. 1850 BC "contains fifteen Pythagorean triples with quite large entries, including (13500, 12709, 18541) which is a primitive triple, indicating, in particular, that there was sophisticated understanding on the topic" in Mesopotamia in 1850 BC. "Since these tablets predate the Sulbasutras period by several centuries, taking into account the contextual appearance of some of the triples, it is reasonable to expect that similar understanding would have been there in India." Dani goes on to say:
would not correspond directly to the overall knowledge on the topic at that time. Since, unfortunately, no other contemporaneous sources have been found it may never be possible to settle this issue satisfactorily."
In all, three "Sulba Sutras" were composed. The remaining two, the "Manava Sulba Sutra" composed by Manava (fl. 750-650 BC) and the "Apastamba Sulba Sutra", composed by Apastamba (c. 600 BC), contained results similar to the "Baudhayana Sulba Sutra".
For the ancient Greek mathematicians, geometry was the crown jewel of their sciences, reaching a completeness and perfection of methodology that no other branch of their knowledge had attained. They expanded the range of geometry to many new kinds of figures, curves, surfaces, and solids; they changed its methodology from trial-and-error to logical deduction; they recognized that geometry studies "eternal forms", or abstractions, of which physical objects are only approximations; and they developed the idea of the "axiomatic method", still in use today.
Thales (635-543 BC) of Miletus (now in southwestern Turkey), was the first to whom deduction in mathematics is attributed. There are five geometric propositions for which he wrote deductive proofs, though his proofs have not survived. Pythagoras (582-496 BC) of Ionia, and later, Italy, then colonized by Greeks, may have been a student of Thales, and traveled to Babylon and Egypt. The theorem that bears his name may not have been his discovery, but he was probably one of the first to give a deductive proof of it. He gathered a group of students around him to study mathematics, music, and philosophy, and together they discovered most of what high school students learn today in their geometry courses. In addition, they made the profound discovery of incommensurable lengths and irrational numbers.
Plato (427-347 BC) was a philosopher, highly esteemed by the Greeks. There is a story that he had inscribed above the entrance to his famous school, "Let none ignorant of geometry enter here." However, the story is considered to be untrue. Though he was not a mathematician himself, his views on mathematics had great influence. Mathematicians thus accepted his belief that geometry should use no tools but compass and straightedge – never measuring instruments such as a marked ruler or a protractor, because these were a workman's tools, not worthy of a scholar. This dictum led to a deep study of possible compass and straightedge constructions, and three classic construction problems: how to use these tools to trisect an angle, to construct a cube twice the volume of a given cube, and to construct a square equal in area to a given circle. The proofs of the impossibility of these constructions, finally achieved in the 19th century, led to important principles regarding the deep structure of the real number system. Aristotle (384-322 BC), Plato's greatest pupil, wrote a treatise on methods of reasoning used in deductive proofs (see Logic) which was not substantially improved upon until the 19th century.
Euclid (c. 325-265 BC), of Alexandria, probably a student at the Academy founded by Plato, wrote a treatise in 13 books (chapters), titled "The Elements of Geometry", in which he presented geometry in an ideal axiomatic form, which came to be known as Euclidean geometry. The treatise is not a compendium of all that the Hellenistic mathematicians knew at the time about geometry; Euclid himself wrote eight more advanced books on geometry. We know from other references that Euclid's was not the first elementary geometry textbook, but it was so much superior that the others fell into disuse and were lost. He was brought to the university at Alexandria by Ptolemy I, King of Egypt.
"The Elements" began with definitions of terms, fundamental geometric principles (called "axioms" or "postulates"), and general quantitative principles (called "common notions") from which all the rest of geometry could be logically deduced. Following are his five axioms, somewhat paraphrased to make the English easier to read.
Concepts that are now understood as algebra were expressed geometrically by Euclid, a method referred to as Greek geometric algebra.
Archimedes (287-212 BC), of Syracuse, Sicily, when it was a Greek city-state, is often considered to be the greatest of the Greek mathematicians, and occasionally even named as one of the three greatest of all time (along with Isaac Newton and Carl Friedrich Gauss). Had he not been a mathematician, he would still be remembered as a great physicist, engineer, and inventor. In his mathematics, he developed methods very similar to the coordinate systems of analytic geometry, and the limiting process of integral calculus. The only element lacking for the creation of these fields was an efficient algebraic notation in which to express his concepts.
After Archimedes, Hellenistic mathematics began to decline. There were a few minor stars yet to come, but the golden age of geometry was over. Proclus (410-485), author of "Commentary on the First Book of Euclid", was one of the last important players in Hellenistic geometry. He was a competent geometer, but more importantly, he was a superb commentator on the works that preceded him. Much of that work did not survive to modern times, and is known to us only through his commentary. The Roman Republic and Empire that succeeded and absorbed the Greek city-states produced excellent engineers, but no mathematicians of note.
The great Library of Alexandria was later burned. There is a growing consensus among historians that the Library of Alexandria likely suffered from several destructive events, but that the destruction of Alexandria's pagan temples in the late 4th century was probably the most severe and final one. The evidence for that destruction is the most definitive and secure. Caesar's invasion may well have led to the loss of some 40,000-70,000 scrolls in a warehouse adjacent to the port (as Luciano Canfora argues, they were likely copies produced by the Library intended for export), but it is unlikely to have affected the Library or Museum, given that there is ample evidence that both existed later.
Civil wars, decreasing investments in maintenance and acquisition of new scrolls and generally declining interest in non-religious pursuits likely contributed to a reduction in the body of material available in the Library, especially in the 4th century. The Serapeum was certainly destroyed by Theophilus in 391, and the Museum and Library may have fallen victim to the same campaign.
In the Bakhshali manuscript, there is a handful of geometric problems (including problems about volumes of irregular solids). The Bakhshali manuscript also "employs a decimal place value system with a dot for zero." Aryabhata's "Aryabhatiya" (499) includes the computation of areas and volumes.
Brahmagupta wrote his astronomical work "" in 628. Chapter 12, containing 66 Sanskrit verses, was divided into two sections: "basic operations" (including cube roots, fractions, ratio and proportion, and barter) and "practical mathematics" (including mixture, mathematical series, plane figures, stacking bricks, sawing of timber, and piling of grain). In the latter section, he stated his famous theorem on the diagonals of a cyclic quadrilateral:
Brahmagupta's theorem: If a cyclic quadrilateral has diagonals that are perpendicular to each other, then the perpendicular line drawn from the point of intersection of the diagonals to any side of the quadrilateral always bisects the opposite side.
Chapter 12 also included a formula for the area of a cyclic quadrilateral (a generalization of Heron's formula), as well as a complete description of rational triangles ("i.e." triangles with rational sides and rational areas).
Brahmagupta's formula: The area, "A", of a cyclic quadrilateral with sides of lengths "a", "b", "c", "d", respectively, is given by

A = √((s − a)(s − b)(s − c)(s − d)),

where "s", the semiperimeter, is given by s = (a + b + c + d)/2.
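Brahmagupta's formula can be exercised directly; a minimal sketch (the function name is ours):

```python
import math

def brahmagupta_area(a, b, c, d):
    """Area of a cyclic quadrilateral with side lengths a, b, c, d:
    sqrt((s - a)(s - b)(s - c)(s - d)), where s is the semiperimeter."""
    s = (a + b + c + d) / 2
    return math.sqrt((s - a) * (s - b) * (s - c) * (s - d))

# A square of side 2 is cyclic and has area 4.
print(brahmagupta_area(2, 2, 2, 2))  # 4.0
# Letting one side shrink to zero degenerates the quadrilateral into a
# triangle, recovering Heron's formula: the 3-4-5 triangle has area 6.
print(brahmagupta_area(3, 4, 5, 0))  # 6.0
```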
Brahmagupta's Theorem on rational triangles: A triangle with rational sides formula_18 and rational area is of the form:
for some rational numbers formula_20 and formula_21.
The first definitive work (or at least oldest existent) on geometry in China was the "Mo Jing", the Mohist canon of the early philosopher Mozi (470-390 BC). It was compiled years after his death by his followers around the year 330 BC. Although the "Mo Jing" is the oldest existent book on geometry in China, there is the possibility that even older written material existed. However, due to the infamous Burning of the Books in a political maneuver by the Qin Dynasty ruler Qin Shihuang (r. 221-210 BC), multitudes of written literature created before his time were purged. In addition, the "Mo Jing" presents geometrical concepts in mathematics that are perhaps too advanced not to have had a previous geometrical base or mathematical background to work upon.
The "Mo Jing" described various aspects of many fields associated with physical science, and provided a small wealth of information on mathematics as well. It provided an 'atomic' definition of the geometric point, stating that a line is separated into parts, and the part which has no remaining parts (i.e. cannot be divided into smaller parts) and thus forms the extreme end of a line is a point. Much like Euclid's first and third definitions and Plato's 'beginning of a line', the "Mo Jing" stated that "a point may stand at the end (of a line) or at its beginning like a head-presentation in childbirth. (As to its invisibility) there is nothing similar to it." Similar to the atomists of Democritus, the "Mo Jing" stated that a point is the smallest unit, and cannot be cut in half, since 'nothing' cannot be halved. It stated that two lines of equal length will always finish at the same place, while providing definitions for the "comparison of lengths" and for "parallels", along with principles of space and bounded space. It also described the fact that planes without the quality of thickness cannot be piled up since they cannot mutually touch. The book provided definitions for circumference, diameter, and radius, along with the definition of volume.
The Han Dynasty (202 BC-220 AD) period of China witnessed a new flourishing of mathematics. One of the oldest Chinese mathematical texts to present geometric progressions was the "Suàn shù shū" of 186 BC, during the Western Han era. The mathematician, inventor, and astronomer Zhang Heng (78-139 AD) used geometrical formulas to solve mathematical problems. Although rough estimates for pi (π) were given in the "Zhou Li" (compiled in the 2nd century BC), it was Zhang Heng who was the first to make a concerted effort at creating a more accurate formula for pi. Zhang Heng approximated pi as 730/232 (or approx 3.1466), although he used another formula of pi in finding a spherical volume, using the square root of 10 (or approx 3.162) instead. Zu Chongzhi (429-500 AD) improved the accuracy of the approximation of pi to between 3.1415926 and 3.1415927, with 355⁄113 (密率, Milü, detailed approximation) and 22⁄7 (约率, Yuelü, rough approximation) being his other notable approximations. In comparison to later works, the formula for pi given by the French mathematician Franciscus Vieta (1540-1603) fell halfway between Zu's approximations.
"The Nine Chapters on the Mathematical Art", the title of which first appeared by 179 AD on a bronze inscription, was edited and commented on by the 3rd century mathematician Liu Hui from the Kingdom of Cao Wei. This book included many problems where geometry was applied, such as finding surface areas for squares and circles, the volumes of solids in various three-dimensional shapes, and included the use of the Pythagorean theorem. The book provided an illustrated proof of the Pythagorean theorem, contained a written dialogue between the earlier Duke of Zhou and Shang Gao on the properties of the right angle triangle and the Pythagorean theorem, while also referring to the astronomical gnomon, the circle and square, as well as measurements of heights and distances. The editor Liu Hui listed pi as 3.141014 by using a 192-sided polygon, and then calculated pi as 3.14159 using a 3072-sided polygon. This was more accurate than the value of Liu Hui's contemporary Wang Fan, a mathematician and astronomer from Eastern Wu, who rendered pi as 3.1555 by using 142⁄45. Liu Hui also wrote of mathematical surveying to calculate distance measurements of depth, height, width, and surface area. In terms of solid geometry, he figured out that a wedge with rectangular base and both sides sloping could be broken down into a pyramid and a tetrahedral wedge. He also figured out that a wedge with trapezoid base and both sides sloping could be made to give two tetrahedral wedges separated by a pyramid. Furthermore, Liu Hui described Cavalieri's principle on volume, as well as Gaussian elimination. The "Nine Chapters" also listed the following geometrical formulas that were known by the time of the Former Han Dynasty (202 BCE–9 CE).
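The side-doubling idea behind these polygon approximations can be echoed in modern floating point. The sketch below approximates π by the perimeter of a polygon inscribed in a unit circle, starting from a hexagon; it illustrates the polygon method in general rather than reconstructing Liu Hui's exact procedure, which worked with areas:

```python
import math

def pi_by_polygons(doublings):
    """Half-perimeter of a regular polygon inscribed in a unit circle,
    starting from a hexagon and doubling the number of sides."""
    n, s = 6, 1.0  # a hexagon inscribed in a unit circle has side length 1
    for _ in range(doublings):
        s = math.sqrt(2 - math.sqrt(4 - s * s))  # side length after doubling
        n *= 2
    return n * s / 2

print(pi_by_polygons(5))  # 192-gon: about 3.14145
print(pi_by_polygons(9))  # 3072-gon: about 3.1415921
```

Since inscribed polygons approach the circle from inside, each doubling gives a larger lower bound on π.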
Areas for the
Volumes for the
Continuing the geometrical legacy of ancient China, there were many later figures to come, including the famed astronomer and mathematician Shen Kuo (1031-1095 CE), Yang Hui (1238-1298) who discovered Pascal's Triangle, Xu Guangqi (1562-1633), and many others.
By the beginning of the 9th century, the "Islamic Golden Age" flourished, the establishment of the House of Wisdom in Baghdad marking a separate tradition of science in the medieval Islamic world, building not only on Hellenistic but also on Indian sources.
Although the Islamic mathematicians are most famed for their work on algebra, number theory and number systems, they also made considerable contributions to geometry, trigonometry and mathematical astronomy, and were responsible for the development of algebraic geometry.
Al-Mahani (born 820) conceived the idea of reducing geometrical problems such as duplicating the cube to problems in algebra. Al-Karaji (born 953) completely freed algebra from geometrical operations and replaced them with the arithmetical type of operations which are at the core of algebra today.
Thābit ibn Qurra (known as Thebit in Latin) (born 836) contributed to a number of areas in mathematics, where he played an important role in preparing the way for such important mathematical discoveries as the extension of the concept of number to (positive) real numbers, integral calculus, theorems in spherical trigonometry, analytic geometry, and non-Euclidean geometry. In astronomy Thabit was one of the first reformers of the Ptolemaic system, and in mechanics he was a founder of statics. An important geometrical aspect of Thabit's work was his book on the composition of ratios. In this book, Thabit deals with arithmetical operations applied to ratios of geometrical quantities. The Greeks had dealt with geometric quantities but had not thought of them in the same way as numbers to which the usual rules of arithmetic could be applied. By introducing arithmetical operations on quantities previously regarded as geometric and non-numerical, Thabit started a trend which led eventually to the generalisation of the number concept.
In some respects, Thabit is critical of the ideas of Plato and Aristotle, particularly regarding motion. It would seem that here his ideas are based on an acceptance of using arguments concerning motion in his geometrical arguments. Another important contribution Thabit made to geometry was his generalization of the Pythagorean theorem, which he extended from special right triangles to all triangles in general, along with a general proof.
Ibrahim ibn Sinan ibn Thabit (born 908), who introduced a method of integration more general than that of Archimedes, and al-Quhi (born 940) were leading figures in a revival and continuation of Greek higher geometry in the Islamic world. These mathematicians, and in particular Ibn al-Haytham, studied optics and investigated the optical properties of mirrors made from conic sections.
Astronomy, time-keeping and geography provided other motivations for geometrical and trigonometrical research. For example, Ibrahim ibn Sinan and his grandfather Thabit ibn Qurra both studied curves required in the construction of sundials. Abu'l-Wafa and Abu Nasr Mansur both applied spherical geometry to astronomy.
A 2007 paper in the journal "Science" suggested that girih tiles possessed properties consistent with self-similar fractal quasicrystalline tilings such as the Penrose tilings.
The transmission of the Greek Classics to medieval Europe via the Arabic literature of the 9th to 10th century "Islamic Golden Age" began in the 10th century and culminated in the Latin translations of the 12th century.
A copy of Ptolemy's "Almagest" was brought back to Sicily by Henry Aristippus (d. 1162), as a gift from the Emperor to King William I (r. 1154–1166). An anonymous student at Salerno travelled to Sicily and translated the "Almagest" as well as several works by Euclid from Greek to Latin. Although the Sicilians generally translated directly from the Greek, when Greek texts were not available, they would translate from Arabic. Eugenius of Palermo (d. 1202) translated Ptolemy's "Optics" into Latin, drawing on his knowledge of all three languages in the task.
The rigorous deductive methods of geometry found in Euclid's "Elements of Geometry" were relearned, and further development of geometry in the styles of both Euclid (Euclidean geometry) and Khayyam (algebraic geometry) continued, resulting in an abundance of new theorems and concepts, many of them very profound and elegant.
Advances in the treatment of perspective were made in Renaissance art of the 14th to 15th century which went beyond what had been achieved in antiquity.
In Renaissance architecture of the "Quattrocento", concepts of architectural order were explored and rules were formulated. A prime example is the Basilica di San Lorenzo in Florence by Filippo Brunelleschi (1377–1446).
In c. 1413 Filippo Brunelleschi demonstrated the geometrical method of perspective, used today by artists, by painting the outlines of various Florentine buildings onto a mirror.
Soon after, nearly every artist in Florence and in Italy used geometrical perspective in their paintings, notably Masolino da Panicale and Donatello. Melozzo da Forlì first used the technique of upward foreshortening (in Rome, Loreto, Forlì and others), and was celebrated for that. Not only was perspective a way of showing depth, it was also a new method of composing a painting. Paintings began to show a single, unified scene, rather than a combination of several.
As shown by the quick proliferation of accurate perspective paintings in Florence, Brunelleschi likely understood (with help from his friend the mathematician Toscanelli), but did not publish, the mathematics behind perspective. Decades later, his friend Leon Battista Alberti wrote "De pictura" (1435/1436), a treatise on proper methods of showing distance in painting based on Euclidean geometry. Alberti was also trained in the science of optics through the school of Padua and under the influence of Biagio Pelacani da Parma, who studied Alhazen's "Optics".
Piero della Francesca elaborated on Della Pittura in his "De Prospectiva Pingendi" in the 1470s. Alberti had limited himself to figures on the ground plane and giving an overall basis for perspective. Della Francesca fleshed it out, explicitly covering solids in any area of the picture plane. Della Francesca also started the now common practice of using illustrated figures to explain the mathematical concepts, making his treatise easier to understand than Alberti's. Della Francesca was also the first to accurately draw the Platonic solids as they would appear in perspective.
Perspective remained, for a while, the domain of Florence. Jan van Eyck, among others, was unable to create a consistent structure for the converging lines in paintings, as in London's "The Arnolfini Portrait", because he was unaware of the theoretical breakthrough just then occurring in Italy. However, he achieved very subtle effects by manipulations of scale in his interiors. Gradually, and partly through the movement of academies of the arts, the Italian techniques became part of the training of artists across Europe, and later other parts of the world.
The culmination of these Renaissance traditions finds its ultimate synthesis in the research of the architect, geometer, and optician Girard Desargues on perspective, optics and projective geometry.
The "Vitruvian Man" by Leonardo da Vinci(c. 1490) depicts a man in two superimposed positions with his arms and legs apart and inscribed in a circle and square. The drawing is based on the correlations of ideal human proportions with geometry described by the ancient Roman architect Vitruvius in Book III of his treatise "De Architectura".
In the early 17th century, there were two important developments in geometry. The first and most important was the creation of analytic geometry, or geometry with coordinates and equations, by René Descartes (1596–1650) and Pierre de Fermat (1601–1665). This was a necessary precursor to the development of calculus and a precise quantitative science of physics. The second geometric development of this period was the systematic study of projective geometry by Girard Desargues (1591–1661). Projective geometry is the study of geometry without measurement, just the study of how points align with each other. There had been some early work in this area by Hellenistic geometers, notably Pappus (c. 340). The greatest flowering of the field occurred with Jean-Victor Poncelet (1788–1867).
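In the analytic framework of Descartes and Fermat, a geometric locus becomes the solution set of an equation in coordinates; a standard modern example (not their original notation) is the circle of radius $r$ centered at $(a, b)$:

```latex
(x-a)^{2} + (y-b)^{2} = r^{2}.
```

Questions about intersections and tangencies then reduce to solving simultaneous equations, the algebraization that made a precise, quantitative treatment of curves possible.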
In the late 17th century, calculus was developed independently and almost simultaneously by Isaac Newton (1642–1727) and Gottfried Wilhelm Leibniz (1646–1716). This was the beginning of a new field of mathematics now called analysis. Though not itself a branch of geometry, it is applicable to geometry, and it solved two families of problems that had long been almost intractable: finding tangent lines to odd curves, and finding areas enclosed by those curves. The methods of calculus reduced these problems mostly to straightforward matters of computation.
The very old problem of proving Euclid's Fifth Postulate, the "Parallel Postulate", from his first four postulates had never been forgotten. Beginning not long after Euclid, many attempted demonstrations were given, but all were later found to be faulty, through allowing into the reasoning some principle which itself had not been proved from the first four postulates. Though Omar Khayyám was also unsuccessful in proving the parallel postulate, his criticisms of Euclid's theories of parallels and his proof of properties of figures in non-Euclidean geometries contributed to the eventual development of non-Euclidean geometry. By 1700 a great deal had been discovered about what can be proved from the first four, and what the pitfalls were in attempting to prove the fifth. Saccheri, Lambert, and Legendre each did excellent work on the problem in the 18th century, but still fell short of success. In the early 19th century, Gauss, János Bolyai, and Lobachevsky, each independently, took a different approach. Beginning to suspect that it was impossible to prove the Parallel Postulate, they set out to develop a self-consistent geometry in which that postulate was false. In this they were successful, thus creating the first non-Euclidean geometry. By 1854, Bernhard Riemann, a student of Gauss, had applied methods of calculus in a ground-breaking study of the intrinsic (self-contained) geometry of all smooth surfaces, and thereby found a different non-Euclidean geometry. This work of Riemann later became fundamental for Einstein's theory of relativity.
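A concrete way to see the divergence from Euclid, in modern terms rather than the pioneers' own notation: in the hyperbolic geometry of Bolyai and Lobachevsky, the angles of a triangle sum to less than two right angles, with the defect proportional to the triangle's area. For a hyperbolic plane of curvature $-1/k^{2}$,

```latex
\alpha + \beta + \gamma \;=\; \pi - \frac{A}{k^{2}},
```

where $A$ is the triangle's area. As triangles shrink, the angle sum approaches the Euclidean value $\pi$, which is why the difference escapes measurement at everyday scales.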
It remained to be proved mathematically that the non-Euclidean geometry was just as self-consistent as Euclidean geometry, and this was first accomplished by Beltrami in 1868. With this, non-Euclidean geometry was established on an equal mathematical footing with Euclidean geometry.
While it was now known that different geometric theories were mathematically possible, the question remained, "Which one of these theories is correct for our physical space?" The mathematical work revealed that this question must be answered by physical experimentation, not mathematical reasoning, and uncovered the reason why the experimentation must involve immense (interstellar, not earth-bound) distances. With the development of relativity theory in physics, this question became vastly more complicated.
All the work related to the Parallel Postulate revealed that it was quite difficult for a geometer to separate his logical reasoning from his intuitive understanding of physical space, and, moreover, revealed the critical importance of doing so. Careful examination had uncovered some logical inadequacies in Euclid's reasoning, and some unstated geometric principles to which Euclid sometimes appealed. This critique paralleled the crisis occurring in calculus and analysis regarding the meaning of infinite processes such as convergence and continuity. In geometry, there was a clear need for a new set of axioms, which would be complete, and which in no way relied on pictures we draw or on our intuition of space. Such axioms, now known as Hilbert's axioms, were given by David Hilbert in 1899 in his book "Grundlagen der Geometrie" ("Foundations of Geometry"). Some other complete sets of axioms had been given a few years earlier, but did not match Hilbert's in economy, elegance, and similarity to Euclid's axioms.
In the 19th century, it became apparent that certain progressions of mathematical reasoning recurred when similar ideas were studied on the number line, in two dimensions, and in three dimensions. Thus the general concept of a metric space was created so that the reasoning could be done in more generality, and then applied to special cases. This method of studying calculus- and analysis-related concepts came to be known as analysis situs, and later as topology. The important topics in this field were properties of more general figures, such as connectedness and boundaries, rather than properties like straightness, and precise equality of length and angle measurements, which had been the focus of Euclidean and non-Euclidean geometry. Topology soon became a separate field of major importance, rather than a sub-field of geometry or analysis.
Developments in algebraic geometry included the study of curves and surfaces over finite fields as demonstrated by the works of among others André Weil, Alexander Grothendieck, and Jean-Pierre Serre as well as over the real or complex numbers. Finite geometry itself, the study of spaces with only finitely many points, found applications in coding theory and cryptography. With the advent of the computer, new disciplines such as computational geometry or digital geometry deal with geometric algorithms, discrete representations of geometric data, and so forth.
George H. W. Bush
George Herbert Walker Bush (June 12, 1924 – November 30, 2018) was an American politician and businessman who served as the 41st president of the United States from 1989 to 1993. A member of the Republican Party, Bush also served in the U.S. House of Representatives, as U.S. Ambassador to the United Nations, as Director of Central Intelligence, and as the 43rd vice president.
Bush was raised in Greenwich, Connecticut and attended Phillips Academy before serving in the United States Navy during World War II. After the war, he graduated from Yale University and moved to West Texas, where he established a successful oil company. After an unsuccessful run for the United States Senate, he won election to the 7th congressional district of Texas in 1966. President Richard Nixon appointed Bush to the position of Ambassador to the United Nations in 1971 and to the position of chairman of the Republican National Committee in 1973. In 1974, President Gerald Ford appointed him as the Chief of the Liaison Office to the People's Republic of China, and in 1976 Bush became the Director of Central Intelligence. Bush ran for president in 1980, but was defeated in the Republican presidential primaries by Ronald Reagan. He was then elected vice president in 1980 and 1984 as Reagan's running mate.
In the 1988 presidential election, Bush defeated Democrat Michael Dukakis, becoming the first incumbent vice president to be elected president since Martin Van Buren in 1836. Foreign policy drove the Bush presidency, as he navigated the final years of the Cold War and played a key role in the reunification of Germany. Bush presided over the invasion of Panama and the Gulf War, ending the Iraqi occupation of Kuwait in the latter conflict. Though the agreement was not ratified until after he left office, Bush negotiated and signed the North American Free Trade Agreement (NAFTA), which created a trade bloc consisting of the United States, Canada, and Mexico. Domestically, Bush reneged on his 1988 campaign promise of "no new taxes" by signing a bill that increased taxes and helped reduce the federal budget deficit. He also signed the Americans with Disabilities Act of 1990 and appointed David Souter and Clarence Thomas to the Supreme Court. Bush lost the 1992 presidential election to Democrat Bill Clinton following an economic recession and the decreased emphasis of foreign policy in a post–Cold War political climate.
After leaving office in 1993, Bush was active in humanitarian activities, often working alongside Clinton, his former opponent. With George W. Bush's victory in the 2000 presidential election, Bush and his son became the second father–son pair to serve as the nation's president, following John Adams and John Quincy Adams. Another son, Jeb Bush, unsuccessfully sought the Republican presidential nomination in the 2016 Republican primaries. After a long battle with vascular Parkinson's disease, Bush died at his home on November 30, 2018. Historians generally rank Bush as an above average president.
George Herbert Walker Bush was born in Milton, Massachusetts on June 12, 1924. He was the second son of Prescott Bush and Dorothy (Walker) Bush. His paternal grandfather, Samuel P. Bush, worked as an executive for a railroad parts company in Columbus, Ohio, and his maternal grandfather, George Herbert Walker, led Wall Street investment bank W. A. Harriman & Co. Bush was named after his maternal grandfather, who was known as "Pop", and young Bush was called "Poppy" as a tribute to his namesake. The Bush family moved to Greenwich, Connecticut in 1925, and Prescott took a position with W. A. Harriman & Co. (which later merged into Brown Brothers Harriman & Co.) the following year.
Bush spent most of his childhood in Greenwich, at the family vacation home in Kennebunkport, Maine, or at his maternal grandparents' plantation in South Carolina. Because of the family's wealth, Bush was largely unaffected by the Great Depression, a major economic downturn that led to high levels of unemployment for much of the 1930s. He attended Greenwich Country Day School from 1929 to 1937 and Phillips Academy, an elite private academy in Andover, Massachusetts, from 1937 to 1942. While at Phillips Academy, he served as president of the senior class, secretary of the student council, president of the community fund-raising group, a member of the editorial board of the school newspaper, and captain of the varsity baseball and soccer teams.
On December 7, 1941, Japan attacked the American naval base of Pearl Harbor, marking the start of U.S. participation in World War II. On his 18th birthday, immediately after graduating from Phillips Academy, Bush enlisted in the United States Navy as a naval aviator. After a period of training, he was commissioned as an ensign in the Naval Reserve at Naval Air Station Corpus Christi on June 9, 1943, becoming one of the youngest aviators in the Navy. Beginning in 1944, Bush served in the Pacific theater of World War II, where he flew a Grumman TBF Avenger, a torpedo bomber capable of taking off from aircraft carriers. His squadron was assigned to the USS "San Jacinto" as a member of Air Group 51, where his lanky physique earned him the nickname "Skin".
Bush flew his first combat mission in May 1944, bombing Japanese-held Wake Island, and was promoted to lieutenant (junior grade) on August 1, 1944. During an attack on a Japanese installation in Chichijima, Bush's aircraft successfully attacked several targets but was downed by enemy fire. Though both of Bush's fellow crew members died, Bush bailed out of the aircraft and was rescued by the submarine USS "Finback". Several of the aviators shot down during the attack were captured and executed, and their livers were eaten by their captors. Bush's near-death experience shaped him profoundly, leading him to ask, "Why had I been spared and what did God have for me?" He was later awarded the Distinguished Flying Cross for his role in the mission.
Bush returned to "San Jacinto" in November 1944, participating in operations in the Philippines. In early 1945, he was assigned to a new combat squadron, VT-153, where he trained to take part in an invasion of mainland Japan. On September 2, 1945, before any invasion took place, Japan formally surrendered following the atomic bombings of Hiroshima and Nagasaki. Bush was released from active duty that same month, but was not formally discharged from the navy until October 1955, at which point he had reached the rank of lieutenant. By the end of his period of active service, Bush had flown 58 missions, completed 128 carrier landings, and recorded 1228 hours of flight time.
Bush met Barbara Pierce at a Christmas dance in Greenwich in December 1941, and, after a period of courtship, they became engaged in December 1943. While Bush was on leave from the navy, they married in Rye, New York, on January 6, 1945. The Bushes enjoyed a strong marriage, and Barbara would later be a popular First Lady, seen by many as "a kind of national grandmother". The marriage produced six children: George W. (b. 1946), Robin (b. 1949), Jeb (b. 1953), Neil (b. 1955), Marvin (b. 1956), and Doro (b. 1959). Their oldest daughter, Robin, died of leukemia in 1953.
Following the end of World War II, Bush enrolled at Yale University, where he took part in an accelerated program that enabled him to graduate in two and a half years rather than the usual four. He was a member of the Delta Kappa Epsilon fraternity and was elected its president. He also captained the Yale baseball team and played in the first two College World Series as a left-handed first baseman. Like his father, he was a member of the Yale cheerleading squad and was initiated into the Skull and Bones secret society. He graduated Phi Beta Kappa in 1948 with a Bachelor of Arts degree, majoring in economics and minoring in sociology.
After graduating from Yale, Bush moved his young family to West Texas. Biographer Jon Meacham writes that Bush's relocation to Texas allowed him to move out of the "daily shadow of his Wall Street father and Grandfather Walker, two dominant figures in the financial world", but would still allow Bush to "call on their connections if he needed to raise capital." His first position in Texas was as an oil field equipment salesman for Dresser Industries, which was led by family friend Neil Mallon. While working for Dresser, Bush lived in various places with his family: Odessa, Texas; Ventura, Bakersfield and Compton, California; and Midland, Texas. In 1952, he volunteered for the successful presidential campaign of Republican candidate Dwight D. Eisenhower. That same year, his father won election to represent Connecticut in the United States Senate as a member of the Republican Party.
With support from Mallon and Bush's uncle, George Herbert Walker Jr., Bush and John Overbey launched the Bush-Overbey Oil Development Company in 1951. In 1953 he co-founded the Zapata Petroleum Corporation, an oil company that drilled in the Permian Basin in Texas. In 1954, he was named president of the Zapata Offshore Company, a subsidiary which specialized in offshore drilling. Shortly after the subsidiary became independent in 1959, Bush moved the company and his family from Midland to Houston. In Houston, he befriended James Baker, a prominent attorney who later became an important political ally. Bush remained involved with Zapata until the mid-1960s, when he sold his stock in the company for approximately $1 million. In 1988, "The Nation" published an article alleging that Bush worked as an operative of the Central Intelligence Agency (CIA) during the 1960s; Bush denied this allegation.
By the early 1960s, Bush was widely regarded as an appealing political candidate, and some leading Democrats attempted to convince Bush to become a Democrat. He declined to leave the Republican Party, later citing his belief that the national Democratic Party favored "big, centralized government". The Democratic Party had historically dominated Texas, but Republicans scored their first major victory in the state with John G. Tower's victory in a 1961 special election to the United States Senate. Motivated by Tower's victory, and hoping to prevent the far-right John Birch Society from coming to power, Bush ran for the chairmanship of the Harris County, Texas Republican Party, winning election in February 1963. Like most other Texas Republicans, Bush supported conservative Senator Barry Goldwater over the more centrist Nelson Rockefeller in the 1964 Republican Party presidential primaries.
In 1964, Bush sought to unseat liberal Democrat Ralph W. Yarborough in Texas's U.S. Senate election. Bolstered by superior fundraising, Bush won the Republican primary by defeating former gubernatorial nominee Jack Cox in a run-off election. In the general election, Bush attacked Yarborough's vote for the Civil Rights Act of 1964, which banned racial and gender discrimination in public institutions and in many privately owned businesses. Bush argued that the act unconstitutionally expanded the powers of the federal government, but he was privately uncomfortable with the racial politics of opposing the act. He lost the election 56 percent to 44 percent, though he did run well ahead of Goldwater, the Republican presidential nominee. Despite the loss, the "New York Times" reported that Bush was "rated by political friend and foe alike as the Republicans' best prospect in Texas because of his attractive personal qualities and the strong campaign he put up for the Senate".
In 1966, Bush ran for the United States House of Representatives in Texas's 7th congressional district, a newly redistricted seat in the Greater Houston area. Initial polling showed him trailing his Democratic opponent, Harris County District Attorney Frank Briscoe, but he ultimately won the race with 57 percent of the vote. In an effort to woo potential candidates in the South and Southwest, House Republicans secured Bush an appointment to the powerful United States House Committee on Ways and Means, making Bush the first freshman to serve on the committee since 1904. His voting record in the House was generally conservative. He supported the Nixon administration's Vietnam policies, but broke with Republicans on the issue of birth control, which he supported. He also voted for the Civil Rights Act of 1968, although it was generally unpopular in his district. In 1968, Bush joined several other Republicans in issuing the party's Response to the State of the Union address; Bush's part of the address focused on a call for fiscal responsibility.
Though most other Texas Republicans supported Ronald Reagan in the 1968 Republican Party presidential primaries, Bush endorsed Richard Nixon, who went on to win the party's nomination. Nixon considered selecting Bush as his running mate in the 1968 presidential election, but he ultimately chose Spiro Agnew instead. Bush won re-election to the House unopposed, while Nixon defeated Hubert Humphrey in the presidential election. In 1970, with President Nixon's support, Bush gave up his seat in the House to run for the Senate against Yarborough. Bush easily won the Republican primary, but Yarborough was defeated by the more centrist Lloyd Bentsen in the Democratic primary. Ultimately, Bentsen defeated Bush, taking 53.5 percent of the vote.
After the 1970 Senate election, Bush was offered a position as a senior adviser to the president, but he convinced Nixon to instead appoint him as the U.S. Ambassador to the United Nations. The position represented Bush's first foray into foreign policy, as well as his first major experiences with the Soviet Union and the People's Republic of China, the two major U.S. rivals in the Cold War. During Bush's tenure, the Nixon administration pursued a policy of détente, seeking to ease tensions with both the Soviet Union and China. Bush's ambassadorship was marked by a defeat on the China question, as the United Nations General Assembly voted to expel the Republic of China and replace it with the People's Republic of China in October 1971. In the 1971 crisis in Pakistan, Bush supported an Indian motion at the UN General Assembly to condemn the Pakistani government of Yahya Khan for waging genocide in East Pakistan (modern Bangladesh), referring to the "tradition which we have supported that the human rights question transcended domestic jurisdiction and should be freely debated". Bush's support for India at the UN put him into conflict with Nixon, who was supporting Pakistan, partly because Yahya Khan was a useful intermediary in his attempts to reach out to China and partly because the president was fond of Yahya Khan.
After Nixon won a landslide victory in the 1972 presidential election, he appointed Bush as chair of the Republican National Committee (RNC). In that position, he was charged with fundraising, candidate recruitment, and making appearances on behalf of the party in the media.
When Agnew was being investigated for corruption, Bush assisted, at the request of Nixon and Agnew, in pressuring John Glenn Beall Jr., the U.S. Senator from Maryland, to rein in his brother, George Beall, the U.S. Attorney for Maryland, who was supervising the investigation into Agnew. Beall ignored the pressure.
During Bush's tenure at the RNC, the Watergate scandal emerged into public view; the scandal originated from the June 1972 break-in of the Democratic National Committee, but also involved later efforts to cover up the break-in by Nixon and other members of the White House. Bush initially defended Nixon steadfastly, but as Nixon's complicity became clear he focused more on defending the Republican Party.
Following the resignation of Vice President Agnew in 1973 for a scandal unrelated to Watergate, Bush was considered for the position of vice president, but the appointment instead went to Gerald Ford. After the public release of an audio recording that confirmed that Nixon had plotted to use the CIA to cover up the Watergate break-in, Bush joined other party leaders in urging Nixon to resign. When Nixon resigned on August 9, 1974, Bush noted in his diary that "There was an aura of sadness, like somebody died... The [resignation] speech was vintage Nixon—a kick or two at the press—enormous strains. One couldn't help but look at the family and the whole thing and think of his accomplishments and then think of the shame... [President Gerald Ford's swearing-in offered] indeed a new spirit, a new lift."
Upon his ascension to the presidency, Ford strongly considered Bush, Donald Rumsfeld, and Nelson Rockefeller for the vacant position of vice president. Ford ultimately chose Rockefeller, partly because of the publication of a news report claiming that Bush's 1970 campaign had benefited from a secret fund set up by Nixon; Bush was later cleared of any suspicion by a special prosecutor. Bush accepted appointment as Chief of the U.S. Liaison Office in the People's Republic of China, making him the de facto ambassador to China. According to biographer Jon Meacham, Bush's time in China convinced him that American engagement abroad was needed to ensure global stability, and that the United States "needed to be visible but not pushy, muscular but not domineering."
In January 1976, Ford brought Bush back to Washington to become the Director of Central Intelligence (DCI), placing him in charge of the CIA. In the aftermath of the Watergate scandal and the Vietnam War, the CIA's reputation had been damaged for its role in various covert operations, and Bush was tasked with restoring the agency's morale and public reputation. During Bush's year in charge of the CIA, the U.S. national security apparatus actively supported Operation Condor operations and right-wing military dictatorships in Latin America. Meanwhile, Ford decided to drop Rockefeller from the ticket for the 1976 presidential election; he considered Bush as his running mate, but ultimately chose Bob Dole. In his capacity as DCI, Bush gave national security briefings to Jimmy Carter both as a presidential candidate and as president-elect.
Bush's tenure at the CIA ended after Carter narrowly defeated Ford in the 1976 presidential election. Out of public office for the first time since the 1960s, Bush became chairman of the Executive Committee of the First International Bank in Houston. He also spent a year as a part-time professor of Administrative Science at Rice University's Jones School of Business, continued his membership in the Council on Foreign Relations, and joined the Trilateral Commission. Meanwhile, he began to lay the groundwork for his candidacy in the 1980 Republican Party presidential primaries. In the 1980 Republican primary campaign, Bush would face Ronald Reagan, who was widely regarded as the front-runner, as well as other contenders like Senator Bob Dole, Senator Howard Baker, Texas Governor John Connally, Congressman Phil Crane, and Congressman John B. Anderson.
Bush's campaign cast him as a youthful, "thinking man's candidate" who would emulate the pragmatic conservatism of President Eisenhower. In the midst of the Soviet–Afghan War, which brought an end to a period of détente, and the Iran hostage crisis, in which 52 Americans were taken hostage, the campaign highlighted Bush's foreign policy experience. At the outset of the race, Bush focused heavily on winning the January 21 Iowa caucuses, making 31 visits to the state. Ultimately, he won a close victory in Iowa with 31.5% to Reagan's 29.4%. After the win, Bush stated that his campaign was full of momentum, or "the Big Mo", and Reagan reorganized his campaign. Partly in response to the Bush campaign's frequent questioning of Reagan's age (Reagan turned 69 in 1980), the Reagan campaign stepped up attacks on Bush, painting him as an elitist who was not truly committed to conservatism. Prior to the New Hampshire primary, Bush and Reagan agreed to a two-person debate, organized by "The Nashua Telegraph" but paid for by the Reagan campaign.
Days before the debate, Reagan announced that he would invite four other candidates to the debate; Bush, who had hoped that the one-on-one debate would allow him to emerge as the main alternative to Reagan in the primaries, refused to debate the other candidates. All six candidates took the stage, but Bush refused to speak in the presence of the other candidates. Ultimately, the other four candidates left the stage and the debate continued, but Bush's refusal to debate anyone other than Reagan badly damaged his campaign in New Hampshire. He ended up decisively losing New Hampshire's primary to Reagan, winning just 23 percent of the vote. Bush revitalized his campaign with a victory in Massachusetts, but lost the next several primaries. As Reagan built up a commanding delegate lead, Bush refused to end his campaign, but the other candidates dropped out of the race. Criticizing his more conservative rival's policy proposals, Bush famously labeled Reagan's supply-side-influenced plans for massive tax cuts as "voodoo economics". Though he favored lower taxes, Bush feared that dramatic reductions in taxation would lead to deficits and, in turn, cause inflation.
After Reagan clinched a majority of delegates in late May, Bush reluctantly dropped out of the race. At the 1980 Republican National Convention, Reagan made the last-minute decision to select Bush as his vice presidential nominee after negotiations with Ford regarding a Reagan-Ford ticket collapsed. Though Reagan had resented many of the Bush campaign's attacks during the primary campaign, and several conservative leaders had actively opposed Bush's nomination, Reagan ultimately decided that Bush's popularity with moderate Republicans made him the best and safest pick. Bush, who had believed his political career might be over following the primaries, eagerly accepted the position and threw himself into campaigning for the Reagan-Bush ticket. The 1980 general election campaign between Reagan and Carter was conducted amid a multitude of domestic concerns and the ongoing Iran hostage crisis, and Reagan sought to focus the race on Carter's handling of the economy. Though the race was widely regarded as a close contest for most of the campaign, Reagan ultimately won over the large majority of undecided voters. Reagan took 50.7 percent of the popular vote and 489 of the 538 electoral votes, while Carter won 41% of the popular vote and John Anderson, running as an independent candidate, won 6.6% of the popular vote.
As vice president, Bush generally maintained a low profile, recognizing the constitutional limits of the office; he avoided decision-making or criticizing Reagan in any way. This approach helped him earn Reagan's trust, easing tensions left over from their earlier rivalry. Bush also generally enjoyed a good relationship with Reagan staffers, including his close friend Jim Baker, who served as Reagan's initial chief of staff. His understanding of the vice presidency was heavily influenced by Vice President Walter Mondale, who enjoyed a strong relationship with President Carter in part because of his ability to avoid confrontations with senior staff and Cabinet members, and by Vice President Nelson Rockefeller's difficult relationship with some members of the White House staff during the Ford administration. The Bushes attended a large number of public and ceremonial events in their positions, including many state funerals, which became a common joke for comedians. As the President of the Senate, Bush also stayed in contact with members of Congress and kept the president informed on occurrences on Capitol Hill.
On March 30, 1981, while Bush was in Texas, Reagan was shot and seriously wounded by John Hinckley Jr. Bush immediately flew back to Washington, D.C.; when his plane landed, his aides advised him to proceed directly to the White House by helicopter in order to show that the government was still functioning. Bush rejected the idea, as he feared that such a dramatic scene risked giving the impression that he sought to usurp Reagan's powers and prerogatives. During Reagan's short period of incapacity, Bush presided over Cabinet meetings, met with congressional leaders and foreign leaders, and briefed reporters, but he consistently rejected the possibility of invoking the Twenty-fifth Amendment. Bush's handling of the attempted assassination and its aftermath made a positive impression on Reagan, who recovered and returned to work within two weeks of the shooting. From then on, the two men would have regular Thursday lunches in the Oval Office.
Bush was assigned by Reagan to chair two special task forces, one on deregulation and one on international drug smuggling. Both were popular issues with conservatives, and Bush, largely a moderate, began courting them through his work. The deregulation task force reviewed hundreds of rules, making specific recommendations on which ones to amend or revise, in order to curb the size of the federal government. The Reagan administration's deregulation push had a strong impact on broadcasting, finance, resource extraction, and other economic activities, and the administration eliminated numerous government positions. Bush also oversaw the administration's national security crisis management organization, which had traditionally been the responsibility of the National Security Advisor. In 1983, Bush toured Western Europe as part of the Reagan administration's ultimately successful efforts to convince skeptical NATO allies to support the deployment of Pershing II missiles.
Reagan's approval ratings fell after his first year in office, but they bounced back when the United States began to emerge from recession in 1983. Former Vice President Walter Mondale was nominated by the Democratic Party in the 1984 presidential election. Down in the polls, Mondale selected Congresswoman Geraldine Ferraro as his running mate in hopes of galvanizing support for his campaign, thus making Ferraro the first female major party vice presidential nominee in U.S. history. She and Bush squared off in a single televised vice presidential debate. Public opinion polling consistently showed a Reagan lead in the 1984 campaign, and Mondale was unable to shake up the race. In the end, Reagan won re-election, winning 49 of 50 states and receiving 59% of the popular vote to Mondale's 41%.
Mikhail Gorbachev came to power in the Soviet Union in 1985; less ideologically rigid than his predecessors, Gorbachev believed that the Soviet Union urgently needed economic and political reforms. At the 1987 Washington Summit, Gorbachev and Reagan signed the Intermediate-Range Nuclear Forces Treaty, which committed both signatories to the total abolition of their respective short-range and medium-range missile stockpiles. The treaty marked the beginning of a new era of trade, openness, and cooperation between the two powers. Though President Reagan and Secretary of State George Shultz took the lead in these negotiations, Bush sat in on many meetings and promised Gorbachev that he would seek to continue improving Soviet-U.S. relations if he succeeded Reagan. On July 13, 1985, Bush became the first vice president to serve as acting president when Reagan underwent surgery to remove polyps from his colon; Bush served as the acting president for approximately eight hours.
In 1986, the Reagan administration was shaken by a scandal when it was revealed that administration officials had secretly arranged weapon sales to Iran during the Iran–Iraq War. The officials had used the proceeds to fund the anti-communist Contras in Nicaragua, which was a direct violation of law. When news of the affair broke to the media, Bush, like Reagan, stated that he had been "out of the loop" and unaware of the diversion of funds, although this assertion has since been challenged. Biographer Jon Meacham writes that "no evidence was ever produced proving Bush was aware of the diversion to the contras," but he criticizes Bush's "out of the loop" characterization, writing that the "record is clear that Bush was aware that the United States, in contravention of its own stated policy, was trading arms for hostages". The Iran–Contra scandal, as it became known, did serious damage to the Reagan presidency, raising questions about Reagan's competency. Congress established the Tower Commission to investigate the scandal, and, at Reagan's request, a panel of federal judges appointed Lawrence Walsh as a special prosecutor charged with investigating the Iran–Contra scandal. The investigations continued after Reagan left office and, though Bush was never charged with a crime, the Iran–Contra scandal would remain a political liability for him.
Bush began planning for a presidential run after the 1984 election, and he officially entered the 1988 Republican Party presidential primaries in October 1987. He put together a campaign led by Reagan staffer Lee Atwater that also included his son, George W. Bush, and media consultant Roger Ailes. Though he had moved to the right during his time as vice president, endorsing a Human Life Amendment and repudiating his earlier comments on "voodoo economics," Bush still faced opposition from many conservatives in the Republican Party. His major rivals for the Republican nomination were Senate Minority Leader Bob Dole of Kansas, Congressman Jack Kemp of New York, and Christian televangelist Pat Robertson. Reagan did not publicly endorse any candidate, but he privately expressed support for Bush.
Though considered the early front-runner for the nomination, Bush came in third in the Iowa caucus, behind Dole and Robertson. Much as Reagan had done in 1980, Bush reorganized his staff and concentrated on the New Hampshire primary. With help from Governor John H. Sununu and an effective campaign attacking Dole for raising taxes, Bush overcame an initial polling deficit and won New Hampshire with 39 percent of the vote. After Bush won South Carolina and 16 of the 17 states holding a primary on Super Tuesday, his competitors dropped out of the race.
Bush, occasionally criticized for his lack of eloquence when compared to Reagan, delivered a well-received speech at the Republican convention. Known as the "thousand points of light" speech, it described Bush's vision of America: he endorsed the Pledge of Allegiance, prayer in schools, capital punishment, and gun rights. Bush also pledged not to raise taxes, stating: "Congress will push me to raise taxes, and I'll say no, and they'll push, and I'll say no, and they'll push again. And all I can say to them is: read my lips. No new taxes." Bush selected little-known Senator Dan Quayle of Indiana as his running mate. Though Quayle had compiled an unremarkable record in Congress, he was popular among many conservatives, and the campaign hoped that Quayle's youth would appeal to younger voters.
Meanwhile, the Democratic Party nominated Governor Michael Dukakis, who was known for presiding over an economic turnaround in Massachusetts. Leading in the general election polls against Bush, Dukakis ran an ineffective, low-risk campaign. The Bush campaign attacked Dukakis as an unpatriotic liberal extremist and seized on the case of Willie Horton, a convicted felon from Massachusetts who had raped a woman while on a prison furlough. The Bush campaign charged that Dukakis presided over a "revolving door" that allowed dangerous convicted felons to leave prison. Dukakis damaged his own campaign with a widely mocked ride in an M1 Abrams tank and a poor performance at the second presidential debate. Bush also attacked Dukakis for opposing a law that would require all students to recite the Pledge of Allegiance. The election is widely considered to have had a high level of negative campaigning, though political scientist John Geer has argued that the share of negative ads was in line with previous presidential elections.
Bush defeated Dukakis by a margin of 426 to 111 in the Electoral College, and he took 53.4 percent of the national popular vote. Bush ran well in all the major regions of the country, but especially in the South. He became the first sitting vice president to be elected president since Martin Van Buren in 1836, and the first person to succeed a president from his own party via election since Herbert Hoover in 1929. In the concurrent congressional elections, Democrats retained control of both houses of Congress.
Bush was inaugurated on January 20, 1989, succeeding Ronald Reagan. In his inaugural address, Bush said:
Bush's first major appointment was that of James Baker as Secretary of State. Leadership of the Department of Defense went to Dick Cheney, who had previously served as Gerald Ford's chief of staff and would later serve as vice president under George W. Bush. Jack Kemp joined the administration as Secretary of Housing and Urban Development, while Elizabeth Dole, the wife of Bob Dole and a former Secretary of Transportation, became Secretary of Labor. Bush retained several Reagan officials, including Secretary of the Treasury Nicholas F. Brady, Attorney General Dick Thornburgh, and Secretary of Education Lauro Cavazos. New Hampshire Governor John Sununu, a strong supporter of Bush during the 1988 campaign, became chief of staff. Brent Scowcroft was appointed as the National Security Advisor, a role he had also held under Ford.
During the first year of his tenure, Bush pursued what Soviets referred to as the "pauza", a pause in Reagan's détente policies. Bush and his advisers were initially divided on Gorbachev; some administration officials saw him as a democratic reformer, but others suspected him of trying to make the minimum changes necessary to restore the Soviet Union to a competitive position with the United States. In 1989, Communist governments fell in Poland, Hungary, and Czechoslovakia; the governments of Bulgaria and Romania instituted major reforms; and the government of East Germany opened the Berlin Wall, which was subsequently demolished by Berliners. Many Soviet leaders urged Gorbachev to crush the dissidents in Eastern Europe, but Gorbachev declined to send in the Soviet military, effectively abandoning the Brezhnev Doctrine. The U.S. was not directly involved in these upheavals, but the Bush administration avoided the appearance of gloating over the demise of the Eastern Bloc to avoid undermining further democratic reforms. Bush also helped convince Polish leaders to allow democratic elections and became the first sitting U.S. president to visit Hungary.
By mid-1989, as unrest blanketed Eastern Europe, Bush requested a meeting with Gorbachev, and the two agreed to hold the December 1989 Malta Summit. Though many on the right remained wary of Gorbachev, Bush came away from the Malta Summit with the belief that Gorbachev would negotiate in good faith. For the remainder of his term, Bush sought cooperative relations with Gorbachev, believing that the Soviet leader was the key to peacefully ending the Soviet domination of Eastern Europe. The key issue at the Malta Summit was the potential reunification of Germany. While Britain and France were wary of a reunified Germany, Bush joined West German Chancellor Helmut Kohl in pushing for German reunification. Bush believed that a reunified Germany would serve U.S. interests, but he also saw reunification as providing a final symbolic end to World War II. After extensive negotiations, Gorbachev agreed to allow a reunified Germany to be a part of NATO, and Germany officially reunified in October 1990.
Though Gorbachev acquiesced to the democratization of Soviet satellite states, he suppressed nationalist movements within the Soviet Union itself. A crisis in Lithuania left Bush in a difficult position, as he needed Gorbachev's cooperation in the reunification of Germany and feared that the collapse of the Soviet Union could leave nuclear arms in dangerous hands. The Bush administration mildly protested Gorbachev's suppression of Lithuania's independence movement, but took no action to directly intervene. Bush warned independence movements of the disorder that could come with secession from the Soviet Union; in a 1991 address that critics labeled the "Chicken Kiev speech", he cautioned against "suicidal nationalism". In July 1991, Bush and Gorbachev signed the Strategic Arms Reduction Treaty (START I), in which both countries agreed to cut their strategic nuclear weapons by 30 percent.
In August 1991, hard-line Communists launched a coup against Gorbachev; while the coup quickly fell apart, it broke the remaining power of Gorbachev and the central Soviet government. Later that month, Gorbachev resigned as general secretary of the Communist party, and Russian president Boris Yeltsin ordered the seizure of Soviet property. Gorbachev clung to power as the President of the Soviet Union until December 1991, when the Soviet Union dissolved. Fifteen states emerged from the Soviet Union, and of those states, Russia was the largest and most populous. Bush and Yeltsin met in February 1992, declaring a new era of "friendship and partnership". In January 1993, Bush and Yeltsin agreed to START II, which provided for further nuclear arms reductions on top of the original START treaty. The collapse of the Soviet Union prompted reflections on the future of the world following the end of the Cold War; one political scientist, Francis Fukuyama, speculated that humanity had reached the "end of history" in that liberal, capitalist democracy had permanently triumphed over Communism and fascism. Meanwhile, the collapse of the Soviet Union and other Communist governments led to post-Soviet conflicts in Central Europe, Eastern Europe, Central Asia, and Africa that would continue long after Bush left office.
During the 1980s, the U.S. had provided aid to Panamanian leader Manuel Noriega, an anti-Communist dictator who engaged in drug trafficking. In May 1989, Noriega annulled the results of a democratic presidential election in which Guillermo Endara had been elected. Bush objected to the annulment of the election and worried about the status of the Panama Canal with Noriega still in office. Bush dispatched 2,000 soldiers to the country, where they began conducting regular military exercises in violation of prior treaties. After a U.S. serviceman was shot by Panamanian forces in December 1989, Bush ordered the United States invasion of Panama, known as "Operation Just Cause". The invasion was the first large-scale American military operation in more than 40 years that was not related to the Cold War. American forces quickly took control of the Panama Canal Zone and Panama City. Noriega surrendered on January 3, 1990, and was quickly transported to a prison in the United States. Twenty-three Americans died in the operation, while another 394 were wounded. Noriega was convicted and imprisoned on racketeering and drug trafficking charges in April 1992. Historian Stewart Brewer argues that the invasion "represented a new era in American foreign policy" because Bush did not justify the invasion under the Monroe Doctrine or the threat of Communism, but rather on the grounds that it was in the best interests of the United States.
Faced with massive debts and low oil prices in the aftermath of the Iran–Iraq War, Iraqi leader Saddam Hussein decided to conquer Kuwait, a small, oil-rich country situated on Iraq's southern border. After Iraq invaded Kuwait in August 1990, Bush imposed economic sanctions on Iraq and assembled a multi-national coalition opposed to the invasion. The administration feared that a failure to respond to the invasion would embolden Hussein to attack Saudi Arabia or Israel, and wanted to discourage other countries from similar aggression. Bush also wanted to ensure continued access to oil, as Iraq and Kuwait collectively accounted for 20 percent of the world's oil production, and Saudi Arabia produced another 26 percent of the world's oil supply.
At Bush's insistence, in November 1990, the United Nations Security Council approved a resolution authorizing the use of force if Iraq did not withdraw from Kuwait by January 15, 1991. Gorbachev's support, as well as China's abstention, helped ensure passage of the UN resolution. Bush convinced Britain, France, and other nations to commit soldiers to an operation against Iraq, and he won important financial backing from Germany, Japan, South Korea, Saudi Arabia, and the United Arab Emirates. In January 1991, Bush asked Congress to approve a joint resolution authorizing a war against Iraq. Bush believed that the UN resolution had already provided him with the necessary authorization to launch a military operation against Iraq, but he wanted to show that the nation was united behind a military action. Despite the opposition of a majority of Democrats in both the House and the Senate, Congress approved the Authorization for Use of Military Force Against Iraq Resolution of 1991.
After the January 15 deadline passed without an Iraqi withdrawal from Kuwait, U.S. and coalition forces began a bombing campaign that devastated Iraq's power grid and communications network, and resulted in the desertion of about 100,000 Iraqi soldiers. In retaliation, Iraq launched Scud missiles at Israel and Saudi Arabia, but most of the missiles did little damage. On February 23, coalition forces began a ground invasion into Kuwait, evicting Iraqi forces by the end of February 27. About 300 Americans, as well as approximately 65 soldiers from other coalition nations, died during the military action. A ceasefire was arranged on March 3, and the UN passed a resolution establishing a peacekeeping force in a demilitarized zone between Kuwait and Iraq. A March 1991 Gallup poll showed that Bush had an approval rating of 89 percent, the highest presidential approval rating in the history of Gallup polling. After 1991, the UN maintained economic sanctions against Iraq, and the United Nations Special Commission was assigned to ensure that Iraq did not revive its weapons of mass destruction program.
In 1987, the U.S. and Canada had reached a free trade agreement that eliminated many tariffs between the two countries. President Reagan had intended it as the first step towards a larger trade agreement to eliminate most tariffs among the United States, Canada, and Mexico. The Bush administration, along with the Progressive Conservative Canadian Prime Minister Brian Mulroney, spearheaded the negotiations of the North American Free Trade Agreement (NAFTA) with Mexico. In addition to lowering tariffs, the proposed treaty would affect patents, copyrights, and trademarks. In 1991, Bush sought fast track authority, which grants the president the power to submit an international trade agreement to Congress without the possibility of amendment. Despite congressional opposition led by House Majority Leader Dick Gephardt, both houses of Congress voted to grant Bush fast track authority. NAFTA was signed in December 1992, after Bush lost re-election, but President Clinton won ratification of NAFTA in 1993. NAFTA remains controversial for its impact on wages, jobs, and overall economic growth.
The U.S. economy had generally performed well since emerging from recession in late 1982, but it slipped into a mild recession in 1990. The unemployment rate rose from 5.9 percent in 1989 to a high of 7.8 percent in mid-1991. Large federal deficits, spawned during the Reagan years, rose from $152.1 billion in 1989 to $220 billion for 1990; the $220 billion deficit represented a threefold increase since 1980. As the public became increasingly concerned about the economy and other domestic affairs, Bush's well-received handling of foreign affairs became less of an issue for most voters. Bush's top domestic priority was to bring an end to federal budget deficits, which he saw as a liability for the country's long-term economic health and standing in the world. As he was opposed to major defense spending cuts and had pledged to not raise taxes, the president had major difficulties in balancing the budget.
Bush and congressional leaders agreed to avoid major changes to the budget for fiscal year 1990, which began in October 1989. However, both sides knew that spending cuts or new taxes would be necessary in the following year's budget in order to avoid the draconian automatic domestic spending cuts required by the Gramm–Rudman–Hollings Balanced Budget Act of 1987. Bush and other leaders also wanted to cut deficits because Federal Reserve Chair Alan Greenspan refused to lower interest rates, and thus stimulate economic growth, unless the federal budget deficit was reduced. In a statement released in late June 1990, Bush said that he would be open to a deficit reduction program that included spending cuts, incentives for economic growth, budget process reform, and tax increases. To fiscal conservatives in the Republican Party, Bush's statement represented a betrayal, and they heavily criticized him for compromising so early in the negotiations.
In September 1990, Bush and Congressional Democrats announced a compromise to cut funding for mandatory and discretionary programs while also raising revenue, partly through a higher gas tax. The compromise additionally included a "pay as you go" provision that required that new programs be paid for at the time of implementation. House Minority Whip Newt Gingrich led the conservative opposition to the bill, strongly opposing any form of tax increase. Some liberals also criticized the budget cuts in the compromise, and in October, the House rejected the deal, resulting in a brief government shutdown. Without the strong backing of the Republican Party, Bush agreed to another compromise bill, this one more favorable to Democrats. The Omnibus Budget Reconciliation Act of 1990 (OBRA-90), enacted on October 27, 1990, dropped much of the gasoline tax increase in favor of higher income taxes on top earners. It included cuts to domestic spending, but the cuts were not as deep as those that had been proposed in the original compromise. Bush's decision to sign the bill damaged his standing with conservatives and the general public, but it also laid the groundwork for the budget surpluses of the late 1990s.
The disabled had not received legal protections under the landmark Civil Rights Act of 1964, and many faced discrimination and segregation by the time Bush took office. In 1988, Lowell P. Weicker Jr. and Tony Coelho had introduced the Americans with Disabilities Act, which barred employment discrimination against qualified individuals with disabilities. The bill had passed the Senate but not the House, and it was reintroduced in 1989. Though some conservatives opposed the bill due to its costs and potential burdens on businesses, Bush strongly supported it, partly because his son, Neil, had struggled with dyslexia. After the bill passed both houses of Congress, Bush signed the Americans with Disabilities Act of 1990 into law in July 1990. The act required employers and public accommodations to make "reasonable accommodations" for the disabled, while providing an exception when such accommodations imposed an "undue hardship".
Senator Ted Kennedy later led the congressional passage of a separate civil rights bill designed to make it easier to bring employment discrimination lawsuits. Bush vetoed the bill, arguing that it would lead to racial quotas in hiring. In November 1991, Bush signed the Civil Rights Act of 1991, which was largely similar to the bill he had vetoed the previous year.
In August 1990, Bush signed the Ryan White CARE Act, the largest federally funded program dedicated to assisting persons living with HIV/AIDS. Throughout his presidency, the AIDS epidemic grew dramatically in the U.S. and around the world, and Bush often found himself at odds with AIDS activist groups who criticized him for not placing a high priority on HIV/AIDS research and funding. Frustrated by the administration's lack of urgency on the issue, the activist group ACT UP dumped the ashes of HIV/AIDS victims on the White House lawn during a viewing of the AIDS Quilt in 1992. By that time, HIV had become the leading cause of death in the U.S. for men aged 25–44.
In June 1989, the Bush administration proposed a bill to amend the Clean Air Act. Working with Senate Majority Leader George J. Mitchell, the administration won passage of the amendments over the opposition of business-aligned members of Congress who feared the impact of tougher regulations. The legislation sought to curb acid rain and smog by requiring decreased emissions of chemicals such as sulfur dioxide, and was the first major update to the Clean Air Act since 1977. Bush also signed the Oil Pollution Act of 1990 in response to the Exxon Valdez oil spill. However, the League of Conservation Voters criticized some of Bush's other environmental actions, including his opposition to stricter auto-mileage standards.
President Bush devoted attention to voluntary service as a means of solving some of America's most serious social problems. He often used the "thousand points of light" theme to describe the power of citizens to solve community problems. In his 1989 inaugural address, President Bush said, "I have spoken of a thousand points of light, of all the community organizations that are spread like stars throughout the Nation, doing good." During his presidency, Bush honored numerous volunteers with the Daily Point of Light Award, a tradition that was continued by his presidential successors. In 1990, the Points of Light Foundation was created as a nonprofit organization in Washington to promote this spirit of volunteerism. In 2007, the Points of Light Foundation merged with the Hands On Network to create a new organization, Points of Light.
Bush appointed two justices to the Supreme Court of the United States. In 1990, Bush appointed a largely unknown state appellate judge, David Souter, to replace liberal icon William Brennan. Souter was easily confirmed and served until 2009, but joined the liberal bloc of the court, disappointing Bush. In 1991, Bush nominated conservative federal judge Clarence Thomas to succeed Thurgood Marshall, a long-time liberal stalwart. Thomas, the former head of the Equal Employment Opportunity Commission (EEOC), faced heavy opposition in the Senate, as well as from pro-choice groups and the NAACP. His nomination faced another difficulty when Anita Hill accused Thomas of having sexually harassed her during his time as the chair of EEOC. Thomas won confirmation in a narrow 52–48 vote; 43 Republicans and 9 Democrats voted to confirm Thomas's nomination, while 46 Democrats and 2 Republicans voted against confirmation. Thomas became one of the most conservative justices of his era. In addition to his two Supreme Court appointments, Bush appointed 42 judges to the United States courts of appeals, and 148 judges to the United States district courts. Among these appointments were future Supreme Court Justice Samuel Alito, as well as Vaughn R. Walker, who was later revealed to be the earliest known gay federal judge.
Bush's education platform consisted mainly of offering federal support for a variety of innovations, such as open enrollment, incentive pay for outstanding teachers, and rewards for schools that improve performance with underprivileged children. Though Bush did not pass a major educational reform package during his presidency, his ideas influenced later reform efforts, including Goals 2000 and the No Child Left Behind Act. Bush signed the Immigration Act of 1990, which led to a 40 percent increase in legal immigration to the United States. The act more than doubled the number of visas given to immigrants on the basis of job skills. In the wake of the savings and loan crisis, Bush proposed a $50 billion package to rescue the savings and loans industry, and also proposed the creation of the Office of Thrift Supervision to regulate the industry. Congress passed the Financial Institutions Reform, Recovery, and Enforcement Act of 1989, which incorporated most of Bush's proposals.
Bush was widely seen as a "pragmatic caretaker" president who lacked a unified and compelling long-term theme in his efforts. Indeed, Bush's sound bite where he refers to the issue of overarching purpose as "the vision thing" has become a metonym applied to other political figures accused of similar difficulties. His ability to gain broad international support for the Gulf War and the war's result were seen as both a diplomatic and military triumph, rousing bipartisan approval, though his decision to withdraw without removing Saddam Hussein left mixed feelings, and attention returned to the domestic front and a souring economy. A "New York Times" article mistakenly depicted Bush as being surprised to see a supermarket barcode reader; the report of his reaction exacerbated the notion that he was "out of touch". Amid the early 1990s recession, his image shifted from "conquering hero" to "politician befuddled by economic matters".
Bush announced his reelection bid in early 1992; with a coalition victory in the Persian Gulf War and high approval ratings, Bush's reelection initially looked likely. As a result, many leading Democrats, including Mario Cuomo, Dick Gephardt, and Al Gore, declined to seek their party's presidential nomination. However, Bush's tax increase had angered many conservatives, who believed that Bush had strayed from the conservative principles of Ronald Reagan. He faced a challenge from conservative political columnist Pat Buchanan in the 1992 Republican primaries. Bush fended off Buchanan's challenge and won his party's nomination at the 1992 Republican National Convention, but the convention adopted a socially conservative platform strongly influenced by the Christian right.
Meanwhile, the Democrats nominated Governor Bill Clinton of Arkansas. A moderate who was affiliated with the Democratic Leadership Council (DLC), Clinton favored welfare reform, deficit reduction, and a tax cut for the middle class. In early 1992, the race took an unexpected twist when Texas billionaire H. Ross Perot launched a third party bid, claiming that neither Republicans nor Democrats could eliminate the deficit and make government more efficient. His message appealed to voters across the political spectrum disappointed with both parties' perceived fiscal irresponsibility. Perot also attacked NAFTA, which he claimed would lead to major job losses. National polling taken in mid-1992 showed Perot in the lead, but Clinton experienced a surge through effective campaigning and the selection of Senator Al Gore, a popular and relatively young Southerner, as his running mate.
Clinton won the election, taking 43 percent of the popular vote and 370 electoral votes, while Bush won 37.5 percent of the popular vote and 168 electoral votes. Perot won 19% of the popular vote, one of the highest totals for a third party candidate in U.S. history. Clinton performed well in the Northeast, the Midwest, and the West Coast, while also waging the strongest Democratic campaign in the South since the 1976 election. Several factors were important in Bush's defeat. The ailing economy, still recovering from recession, may have been the main factor in Bush's loss, as 7 in 10 voters said on election day that the economy was either "not so good" or "poor". On the eve of the 1992 election, the unemployment rate stood at 7.8%, which was the highest it had been since 1984. The president was also damaged by his alienation of many conservatives in his party. Bush blamed Perot in part for his defeat, though exit polls showed that Perot drew his voters about equally from Clinton and Bush.
Despite his defeat, Bush left office with a 56 percent job approval rating in January 1993. Like many of his predecessors, Bush issued a series of pardons during his last days in office. In December 1992, he granted executive clemency to six former senior government officials implicated in the Iran-Contra scandal, most prominently former Secretary of Defense Caspar Weinberger. The pardons effectively brought an end to special prosecutor Lawrence Walsh's investigation of the Iran-Contra scandal.
After leaving office, Bush and his wife built a retirement house in the community of West Oaks, Houston. He established a presidential office within the Park Laureate Building on Memorial Drive in Houston. He also frequently spent time at his vacation home in Kennebunkport, took annual cruises in Greece, went on fishing trips in Florida, and visited the Bohemian Club in Northern California. He declined to serve on corporate boards, but delivered numerous paid speeches and served as an adviser to The Carlyle Group, a private equity firm. He never published his memoirs, but he and Brent Scowcroft co-wrote "A World Transformed", a 1999 work on foreign policy. Portions of his letters and his diary were later published as "The China Diary of George H. W. Bush" and "All The Best, George Bush".
During a 1993 visit to Kuwait, Bush was targeted in an assassination plot directed by the Iraqi Intelligence Service. President Clinton retaliated when he ordered the firing of 23 cruise missiles at Iraqi Intelligence Service headquarters in Baghdad. Bush did not publicly comment on the assassination attempt or the missile strike, but privately spoke with Clinton shortly before the strike took place. In the 1994 gubernatorial elections, his sons George W. and Jeb concurrently ran for Governor of Texas and Governor of Florida. Concerning their political careers, he advised them both that "[a]t some point both of you may want to say 'Well, I don't agree with my Dad on that point' or 'Frankly I think Dad was wrong on that.' Do it. Chart your own course, not just on the issues but on defining yourselves". George W. won his race against Ann Richards while Jeb lost to Lawton Chiles. After the results came in, the elder Bush told ABC, "I have very mixed emotions. Proud father, is the way I would sum it all up." Jeb would again run for governor of Florida in 1998 and win at the same time that his brother George W. won re-election in Texas. It marked the second time in United States history that a pair of brothers served simultaneously as governors.
Bush supported his son's candidacy in the 2000 presidential election, but did not actively campaign in the election and did not deliver a speech at the 2000 Republican National Convention. George W. Bush defeated Al Gore in the 2000 election and was re-elected in 2004. Bush and his son thus became the second father–son pair to each serve as President of the United States, following John Adams and John Quincy Adams. Through previous administrations, the elder Bush had universally been known as "George Bush" or "President Bush", but following his son's election the need to distinguish between them made retronymic forms such as "George H. W. Bush" and "George Bush Sr." and colloquialisms such as "Bush 41" and "Bush the Elder" more common. Bush advised his son on some personnel choices, approving of the selection of Dick Cheney as running mate and the retention of George Tenet as CIA Director. However, he was not consulted on all appointments, including that of his old rival, Donald Rumsfeld, as Secretary of Defense. Though he avoided giving unsolicited advice, Bush and his son also discussed some matters of policy, especially regarding national security issues.
In his retirement, Bush generally avoided publicly expressing his opinion on political issues, instead using the public spotlight to support various charities. Despite earlier political differences with Bill Clinton, the two former presidents eventually became friends. They appeared together in television ads, encouraging aid for victims of Hurricane Katrina and the 2004 Indian Ocean earthquake and tsunami.
Bush supported Republican John McCain in the 2008 presidential election, and Republican Mitt Romney in the 2012 presidential election, but both were defeated by Democrat Barack Obama. In 2011, Obama awarded Bush the Presidential Medal of Freedom, the highest civilian honor in the United States.
Bush supported his son Jeb's bid in the 2016 presidential election. Jeb Bush's campaign struggled, however, and he withdrew from the race during the primaries. Neither George H. W. nor George W. Bush endorsed the eventual Republican nominee, Donald Trump; all three Bushes emerged as frequent critics of Trump's policies and speaking style, while Trump frequently criticized George W. Bush's presidency. George H. W. Bush later said that he voted for the Democratic nominee, Hillary Clinton, in the general election. After the election, Bush wrote a letter to president-elect Donald Trump in January 2017 to inform him that, because of his poor health, he would not be able to attend Trump's inauguration on January 20, and to give him his best wishes.
In August 2017, after the violence at the Unite the Right rally in Charlottesville, Presidents George H. W. and George W. Bush released a joint statement: "America must always reject racial bigotry, anti-Semitism, and hatred in all forms. As we pray for Charlottesville, we are all reminded of the fundamental truths recorded by that city's most prominent citizen in the Declaration of Independence: we are all created equal and endowed by our Creator with unalienable rights."
On April 17, 2018, Bush's wife, former First Lady Barbara Bush, died at the age of 92 at her home in Houston, Texas. Her funeral was held at St. Martin's Episcopal Church in Houston four days later. Bush attended alongside former Presidents Barack Obama, Bill Clinton, and his son George W. Bush, as well as First Lady Melania Trump and former First Ladies Michelle Obama, Laura Bush, and Hillary Clinton; a photo of the group taken after the service, seen as a sign of unity, went viral online.
On November 1, 2018, Bush went to the polls to vote early in the midterm elections. This would be his final public appearance.
George H. W. Bush died on November 30, 2018, aged 94 years, 171 days, at his home in Houston. At the time of his death he was the longest-lived U.S. president, a distinction now held by Jimmy Carter. He was also the third-oldest vice president. Bush lay in state in the Rotunda of the U.S. Capitol from December 3 through December 5; he was the 12th U.S. president to be accorded this honor. Then, on December 5, Bush's casket was transferred from the Capitol rotunda to Washington National Cathedral, where a state funeral was held. After the funeral, Bush's body was transported to the George H. W. Bush Presidential Library in College Station, Texas, where he was buried next to his wife Barbara and daughter Robin. At the funeral, former president George W. Bush eulogized his father, saying,
In 1991, "The New York Times" revealed that Bush was suffering from Graves' disease, a non-contagious thyroid condition that his wife Barbara also suffered from. Later in life, Bush suffered from vascular parkinsonism, a form of Parkinson's disease which forced him to use a motorized scooter or wheelchair.
Bush was raised in the Episcopal Church, though by the end of his life his religious beliefs were considered to be more in line with Evangelical Christian doctrine and practices. He cited various moments in his life as having deepened his faith, including his escape from Japanese forces in 1944 and the death of his three-year-old daughter Robin in 1953. His faith was reflected in his Thousand Points of Light speech, his support for prayer in schools, and his support for the pro-life movement (following his election as vice president).
Polls of historians and political scientists have ranked Bush in the top half of presidents. A 2018 poll of the American Political Science Association's Presidents and Executive Politics section ranked Bush as the 17th best president out of 44. A 2017 C-SPAN poll of historians also ranked Bush as the 20th best president out of 43. Richard Rose described Bush as a "guardian" president, and many other historians and political scientists have similarly described Bush as a passive, hands-off president who was "largely content with things as they were". Professor Steven Knott writes that "[g]enerally the Bush presidency is viewed as successful in foreign affairs but a disappointment in domestic affairs."
Biographer Jon Meacham writes that, after he left office, many Americans viewed Bush as "a gracious and underappreciated man who had many virtues but who had failed to project enough of a distinctive identity and vision to overcome the economic challenges of 1991–92 and to win a second term." Bush himself noted that his legacy was "lost between the glory of Reagan ... and the trials and tribulations of my sons." In the 2010s, Bush was fondly remembered for his willingness to compromise, which contrasted with the intensely partisan era that followed his presidency.
According to "USA Today", the legacy of Bush's presidency was defined by his victory over Iraq after the invasion of Kuwait, and for his presiding over the collapse of the USSR and the reunification of Germany. Michael Beschloss and Strobe Talbott praise Bush's handling of the USSR, especially how he prodded Gorbachev in terms of releasing control over the satellites and permitting German unification, and especially a united Germany in NATO. Andrew Bacevich judges the Bush administration as "morally obtuse" in the light of its "business-as-usual" attitude towards China after the massacre in Tiananmen Square and its uncritical support of Gorbachev as the Soviet Union disintegrated. David Rothkopf argues:
In 1990, "Time" magazine named him the Man of the Year. In 1997, the Houston Intercontinental Airport was renamed George Bush Intercontinental Airport. In 1999, the CIA headquarters in Langley, Virginia, was named the "George Bush Center for Intelligence" in his honor. In 2011, Bush, an avid golfer, was inducted into the World Golf Hall of Fame. The USS "George H.W. Bush" (CVN-77), the tenth and final "Nimitz"-class supercarrier of the United States Navy, was named for Bush. Bush is commemorated on a postage stamp that was issued by the United States Postal Service in 2019.
The George H.W. Bush Presidential Library and Museum, the tenth U.S. presidential library, was completed in 1997. It contains the presidential and vice presidential papers of Bush and the vice presidential papers of Dan Quayle. The library is located on a site on the west campus of Texas A&M University in College Station, Texas. Texas A&M University also hosts the Bush School of Government and Public Service, a graduate public policy school.
|
https://en.wikipedia.org/wiki?curid=11955
|
G. E. Moore
George Edward Moore (4 November 1873 – 24 October 1958), usually cited as G. E. Moore, was an English philosopher. He was, with Bertrand Russell, Ludwig Wittgenstein, and (before them) Gottlob Frege, one of the founders of the analytic tradition in philosophy. Along with Russell, he led the turn away from idealism in British philosophy, and became well known for his advocacy of common sense concepts, his contributions to ethics, epistemology, and metaphysics, and "his exceptional personality and moral character".
Moore was Professor of Philosophy at the University of Cambridge, highly influential among (though not a member of) the Bloomsbury Group, and the editor of the influential journal "Mind". He was elected a fellow of the British Academy in 1918. He was a member of the Cambridge Apostles, the intellectual secret society, from 1894 to 1901, and chairman of the Cambridge University Moral Sciences Club from 1912 to 1944. A humanist, he served as President of the British Ethical Union (now known as Humanists UK) from 1935 to 1936.
Moore was born in Upper Norwood, Croydon, Greater London, on 4 November 1873, the middle child of seven of Dr Daniel Moore and Henrietta Sturge. His grandfather was the author Dr George Moore. His eldest brother was Thomas Sturge Moore, a poet, writer and engraver.
He was educated at Dulwich College and in 1892 went up to Trinity College, Cambridge to study classics and moral sciences. He became a Fellow of Trinity in 1898, and went on to hold the University of Cambridge chair of Mental Philosophy and Logic, from 1925 to 1939.
Moore is best known today for his defence of ethical non-naturalism, his emphasis on common sense in philosophical method, and the paradox that bears his name. He was admired by and influential among other philosophers, and also by the Bloomsbury Group, but is (unlike his colleague and admirer Russell, who, for some years thought he fulfilled his "ideal of genius") mostly unknown today outside of academic philosophy. Moore's essays are known for their clear, circumspect writing style, and for his methodical and patient approach to philosophical problems. He was critical of modern philosophy for its lack of progress, which he believed was in stark contrast to the dramatic advances in the natural sciences since the Renaissance. Among Moore's most famous works are his book "Principia Ethica", and his essays, "The Refutation of Idealism", "A Defence of Common Sense", and "A Proof of the External World".
Moore was an important and much admired member of the secretive Cambridge Apostles, a discussion group with members drawn from the British intellectual elite. At the time another member, a 22-year-old Bertrand Russell, wrote "I almost worship him as if he were a god. I have never felt such an extravagant admiration for anybody."
From 1918 to 1919 he was president of the Aristotelian Society, a group committed to the systematic study of philosophy, its historical development and its methods and problems.
G. E. Moore died at the Evelyn Nursing Home on 24 October 1958; he was cremated at Cambridge Crematorium on 28 October 1958, and his ashes were interred at the Parish of the Ascension Burial Ground in Cambridge, where his wife, Dorothy Ely (1892–1977), was also buried. Together they had two sons, the poet Nicholas Moore and the composer Timothy Moore.
His influential work "Principia Ethica" is one of the main inspirations of the movement against ethical naturalism (see ethical non-naturalism) and is partly responsible for the twentieth-century concern with meta-ethics.
Moore asserted that philosophical arguments can suffer from a confusion between the use of a term in a particular argument and the definition of that term (in all arguments). He named this confusion the naturalistic fallacy. For example, an ethical argument may claim that if a thing has certain properties, then that thing is 'good.' A hedonist may argue that 'pleasant' things are 'good' things. Other theorists may argue that 'complex' things are 'good' things. Moore contends that, even if such arguments are correct, they do not provide definitions for the term 'good'. The property of 'goodness' cannot be defined. It can only be shown and grasped. Any attempt to define it (X is good if it has property Y) will simply shift the problem (Why is Y-ness good in the first place?).
Moore's argument for the indefinability of 'good' (and thus for the fallaciousness in the "naturalistic fallacy") is often called the open-question argument; it is presented in §13 of "Principia Ethica". The argument hinges on the nature of statements such as "Anything that is pleasant is also good" and the possibility of asking questions such as "Is it "good" that x is pleasant?". According to Moore, these questions are "open" and these statements are "significant"; and they will remain so no matter what is substituted for "pleasure". Moore concludes from this that any analysis of value is bound to fail. In other words, if value could be analysed, then such questions and statements would be trivial and obvious. Since they are anything but trivial and obvious, value must be indefinable.
Critics of Moore's arguments sometimes claim that he is appealing to general puzzles concerning analysis (cf. the paradox of analysis), rather than revealing anything special about value. The argument clearly depends on the assumption that if 'good' were definable, it would be an analytic truth about 'good', an assumption that many contemporary moral realists like Richard Boyd and Peter Railton reject. Other responses appeal to the Fregean distinction between sense and reference, allowing that value concepts are special and "sui generis", but insisting that value properties are nothing but natural properties (this strategy is similar to that taken by non-reductive materialists in philosophy of mind).
Moore contended that goodness cannot be analysed in terms of any other property. In "Principia Ethica", he writes:
Therefore, we cannot define 'good' by explaining it in other words. We can only point to an "action" or a "thing" and say "That is good." Similarly, we cannot describe to a person born totally blind exactly what yellow is. We can only show a sighted person a piece of yellow paper or a yellow scrap of cloth and say "That is yellow."
In addition to categorising 'good' as indefinable, Moore also emphasized that it is a non-natural property. This means that it cannot be empirically or scientifically tested or verified; it is not within the bounds of "natural science".
Moore argued that, once arguments based on the naturalistic fallacy had been discarded, questions of intrinsic goodness could be settled only by appeal to what he (following Sidgwick) called "moral intuitions": self-evident propositions which recommend themselves to moral reflection, but which are not susceptible to either direct proof or disproof (PE § 45). As a result of his view, he has often been described by later writers as an advocate of ethical intuitionism. Moore, however, wished to distinguish his view from the views usually described as "Intuitionist" when "Principia Ethica" was written:
Moore distinguished his view from the view of deontological intuitionists, who held that "intuitions" could determine questions about what "actions" are right or required by duty. Moore, as a consequentialist, argued that "duties" and moral rules could be determined by investigating the "effects" of particular actions or kinds of actions (PE § 89), and so were matters for empirical investigation rather than direct objects of intuition (PE § 90). On Moore's view, "intuitions" revealed not the rightness or wrongness of specific actions, but only what things were good in themselves, as "ends to be pursued".
One of the most important parts of Moore's philosophical development was his break from the idealism that dominated British philosophy (as represented in the works of his former teachers F. H. Bradley and John McTaggart), and his defence of what he regarded as a "common sense" form of realism. In his 1925 essay "A Defence of Common Sense", he argued against idealism and scepticism toward the external world, on the grounds that they could not give reasons to accept that their metaphysical premises were more plausible than the reasons we have to accept the common sense claims about our knowledge of the world, which sceptics and idealists must deny. He famously put the point into dramatic relief with his 1939 essay "Proof of an External World", in which he gave a common sense argument against scepticism by raising his right hand and saying "Here is one hand" and then raising his left and saying "And here is another", then concluding that there are at least two external objects in the world, and therefore that he knows (by this argument) that an external world exists. Not surprisingly, not everyone inclined to sceptical doubts found Moore's method of argument entirely convincing; Moore, however, defends his argument on the grounds that sceptical arguments seem invariably to require an appeal to "philosophical intuitions" that we have considerably less reason to accept than we have for the common sense claims that they supposedly refute. (In addition to fueling Moore's own work, the "Here is one hand" argument also deeply influenced Wittgenstein, who spent his last years working out a new approach to Moore's argument in the remarks that were published posthumously as "On Certainty".)
Moore is also remembered for drawing attention to the peculiar inconsistency involved in uttering a sentence such as "It is raining, but I do not believe it is raining," a puzzle now commonly called "Moore's paradox". The puzzle arises because it seems impossible for anyone to consistently "assert" such a sentence; but there doesn't seem to be any "logical contradiction" between "It is raining" and "I don't believe that it is raining", because the former is a statement about the weather and the latter a statement about a person's belief about the weather, and it is perfectly logically possible that it may rain whilst a person does not believe that it is raining.
In addition to Moore's own work on the paradox, the puzzle also inspired a great deal of work by Ludwig Wittgenstein, who described the paradox as the most impressive philosophical insight that Moore had ever introduced. It is said that when Wittgenstein first heard this paradox one evening (which Moore had earlier stated in a lecture), he rushed round to Moore's lodgings, got him out of bed and insisted that Moore repeat the entire lecture to him.
Moore's description of the principle of organic unity is nonetheless extremely straightforward, and a variant on a pattern that began with Aristotle:
According to Moore, a moral actor cannot survey the 'goodness' inherent in the various parts of a situation, assign a value to each of them, and then generate a sum in order to get an idea of its total value. A moral scenario is a complex assembly of parts, and its total value is often created by the relations between those parts, and not by their individual value. The organic metaphor is thus very appropriate: biological organisms seem to have emergent properties which cannot be found anywhere in their individual parts. For example, a human brain seems to exhibit a capacity for thought when none of its neurons exhibit any such capacity. In the same way, a moral scenario can have a value far greater than the sum of its component parts.
To understand the application of the organic principle to questions of value, it is perhaps best to consider Moore's primary example, that of a consciousness experiencing a beautiful object. To see how the principle works, a thinker engages in "reflective isolation", the act of isolating a given concept in a kind of null-context and determining its intrinsic value. In our example, we can easily see that, per se, beautiful objects and consciousnesses are not particularly valuable things. They might have some value, but when we consider the total value of a consciousness experiencing a beautiful object, it seems to exceed the simple sum of these values (Principia 18:2).
|
https://en.wikipedia.org/wiki?curid=11959
|
Genus–differentia definition
A genus–differentia definition is a type of intensional definition, and it is composed of two parts: a genus (an existing definition that serves as a portion of the new definition) and the differentia (the portion of the new definition that is not provided by the genus).
For example, consider these two definitions:
Those definitions can be expressed as one genus and two "differentiae":
The use of genus and differentia in constructing definitions goes back at least as far as Aristotle (384–322 BCE).
The process of producing new definitions by "extending" existing definitions is commonly known as differentiation (and also as derivation). The reverse process, by which just part of an existing definition is used itself as a new definition, is called abstraction; the new definition is called "an abstraction" and it is said to have been "abstracted away from" the existing definition.
For instance, consider the following:
A part of that definition may be singled out (using parentheses here):
and with that part, an abstraction may be formed:
Then, the definition of "a square" may be recast with that abstraction as its genus:
Similarly, the definition of "a square" may be rearranged and another portion singled out:
leading to the following abstraction:
Then, the definition of "a square" may be recast with that abstraction as its genus:
In fact, the definition of "a square" may be recast in terms of both of the abstractions, where one acts as the genus and the other acts as the differentia:
Hence, abstraction is crucial in simplifying definitions.
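The derivation and abstraction described above can be sketched in code. The following is a minimal, hypothetical illustration (the `Definition` class and the shape predicates are not from the article): each definition is modeled as a genus (another definition, or nothing) plus a differentia, and "a square" is recast with the abstraction "rectangle" as its genus.

```python
# A hypothetical sketch of genus-differentia definitions: each definition
# extends a genus (an existing definition) with a differentia (a predicate).

class Definition:
    def __init__(self, name, genus=None, differentia=lambda x: True):
        self.name = name
        self.genus = genus              # the existing definition being extended
        self.differentia = differentia  # the condition the genus does not supply

    def matches(self, thing):
        # A thing falls under the definition if it falls under the genus
        # (checked recursively) and satisfies the differentia.
        in_genus = self.genus.matches(thing) if self.genus else True
        return in_genus and self.differentia(thing)

# A most general definition, with no genus of its own:
quadrilateral = Definition("quadrilateral")

# Two abstractions singled out from the definition of a square:
rectangle = Definition("rectangle", quadrilateral,
                       lambda q: q["right_angles"])
rhombus = Definition("rhombus", quadrilateral,
                     lambda q: len(set(q["sides"])) == 1)

# "A square" recast with "rectangle" as its genus and equal sides as
# its differentia:
square = Definition("square", rectangle,
                    lambda q: len(set(q["sides"])) == 1)

shape = {"sides": [2, 2, 2, 2], "right_angles": True}
print(square.matches(shape))   # True
```

The same `square` could equally be defined with `rhombus` as its genus and a right-angles differentia, mirroring the article's point that either abstraction may serve as the genus.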
When multiple definitions could serve equally well, then all such definitions apply simultaneously. Thus, "a square" is a member of both the genus "[a] rectangle" and the genus "[a] rhombus". In such a case, it is notationally convenient to consolidate the definitions into one definition that is expressed with multiple genera (and possibly no differentia, as in the following):
or completely equivalently:
More generally, a collection of "n" equivalent definitions (each of which is expressed with one unique genus) can be recast as one definition that is expressed with "n" genera. Thus, the following:
could be recast as:
A genus of a definition provides a means by which to specify an "is-a relationship":
The non-genus portion of the differentia of a definition provides a means by which to specify a "has-a relationship":
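The two relationships map naturally onto familiar programming constructs; in the hypothetical sketch below (the class names are illustrative, not from the article), the genus becomes inheritance (is-a) and the non-genus portion of the differentia becomes attributes (has-a).

```python
# Hypothetical sketch: the genus yields an is-a relationship (inheritance),
# while the non-genus portion of the differentia yields has-a relationships
# (attributes of the defined thing).

class Quadrilateral:                  # genus
    pass

class Rectangle(Quadrilateral):       # a rectangle IS-A quadrilateral
    def __init__(self, width, height):
        self.width = width            # a rectangle HAS-A width
        self.height = height          # a rectangle HAS-A height

r = Rectangle(3, 4)
print(isinstance(r, Quadrilateral))   # True: is-a
print(hasattr(r, "width"))            # True: has-a
```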
When a system of definitions is constructed with genera and differentiae, the definitions can be thought of as nodes forming a hierarchy or—more generally—a directed acyclic graph; a node that has no predecessor is "a most general definition"; each node along a directed path is "more differentiated" (or "more derived") than any one of its predecessors, and a node with no successor is "a most differentiated" (or "a most derived") definition.
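Such a system of definitions can be represented directly as a graph. The sketch below (the particular definitions and dictionary layout are illustrative assumptions) maps each definition to its genera and then picks out the most general definitions (no predecessors) and the most derived ones (no successors).

```python
# Hypothetical sketch: a system of definitions as a directed acyclic graph,
# here a mapping from each definition to the genera it extends.

genera = {
    "quadrilateral": [],                  # no predecessor: most general
    "rectangle": ["quadrilateral"],
    "rhombus": ["quadrilateral"],
    "square": ["rectangle", "rhombus"],   # a definition with two genera
}

# A most general definition extends no genus at all.
most_general = [d for d, gs in genera.items() if not gs]

# A most derived definition is one that no other definition uses as a genus.
used_as_genus = {g for gs in genera.values() for g in gs}
most_derived = [d for d in genera if d not in used_as_genus]

print(most_general)  # ['quadrilateral']
print(most_derived)  # ['square']
```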
When a definition, "S", is the tail of each of its successors (that is, "S" has at least one successor and each direct successor of "S" is a most differentiated definition), then "S" is often called "the species" of each of its successors, and each direct successor of "S" is often called "an individual" (or "an entity") of the species "S"; that is, the genus of an individual is synonymously called "the species" of that individual. Furthermore, the differentia of an individual is synonymously called "the identity" of that individual. For instance, consider the following definition:
In this case:
As in that example, the identity itself (or some part of it) is often used to refer to the entire individual, a phenomenon that is known in linguistics as a "pars pro toto synecdoche".
|
https://en.wikipedia.org/wiki?curid=11964
|